CentOS7 install spark+ipython-notebook

ipython-notebook

  1. IPython Notebook has become an important tool for teaching, computing, and research with Python.

  2. IPython Notebook uses the browser as its interface: it sends requests to the backend IPython server and displays the results.

  3. The browser interface stores information in cells. Cells come in several types; the most common are Markdown cells for formatted text and Code cells for code.


This article walks through installing ipython-notebook (plus Spark) on a CentOS 7 minimal system.

1. install ifconfig

yum search ifconfig
yum install net-tools.x86_64

2. install vim

yum search vim
yum install vim-enhanced

3. install wget

[libin@centos-linux-1 x]$ yum search wget
Loaded plugins: fastestmirror
Loading mirror speeds from cached hostfile
* base: mirrors.skyshe.cn
* extras: mirrors.163.com
* updates: mirrors.163.com
========================== N/S matched: wget ==========================
wget.x86_64 : A utility for retrieving files using the HTTP or FTP protocols

 Name and summary matches only, use "search all" for everything.

[libin@centos-linux-1 x]$ yum install wget.x86_64

4. install Jdk

# portable install (just unpack) jdk-7u80-linux-x64.gz to /home/x/jdk
# edit /etc/profile and add:
## libin add ##

### JAVA ###
JAVA_HOME=/home/x/jdk
JAVA_BIN=$JAVA_HOME/bin
PATH=$JAVA_HOME/bin:$PATH
CLASSPATH=.:$JAVA_HOME/jre/lib/rt.jar:$JAVA_HOME/jre/lib/dt.jar:$JAVA_HOME/jre/lib/tools.jar
export JAVA_HOME JAVA_BIN PATH CLASSPATH

# /etc/profile: the first file the OS uses to set up the user environment at login; it applies to every user logging into the system ##
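After saving, the new variables can be applied to the current shell and sanity-checked without logging out again (assuming the JDK was unpacked to /home/x/jdk as above):

```shell
source /etc/profile        # re-read the login profile in the current shell
echo $JAVA_HOME            # should print /home/x/jdk
java -version              # should report the 1.7.0_80 JVM
```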

5. install Scala

# portable install (just unpack) scala-2.10.4.tgz to /home/x/scala
# edit /etc/profile and add:

### Scala ###
#export SCALA_HOME=/usr/local/xSoft/scala
export SCALA_HOME=/home/x/scala
export PATH=${SCALA_HOME}/bin:$PATH

6. install Spark (Standalone)

portable install (just unpack) spark-1.5.2-bin-hadoop2.6.tgz to /home/x/spark
cp conf/spark-env.sh.template conf/spark-env.sh

edit conf/spark-env.sh and add:

export JAVA_HOME=/home/x/jdk
export SCALA_HOME=/home/x/scala
export SPARK_HOME=/home/x/spark
export SPARK_MASTER_IP=192.168.181.113
export MASTER=spark://192.168.181.113:7077

export SPARK_EXECUTOR_INSTANCES=2
export SPARK_EXECUTOR_CORES=1

export SPARK_WORKER_MEMORY=1000m
export SPARK_EXECUTOR_MEMORY=300m

export SPARK_LIBRARY_PATH=${SPARK_HOME}/lib

#export SPARK_LAUNCH_WITH_SCALA=0
#export SCALA_LIBRARY_PATH=${SPARK_HOME}/lib


#export SPARK_LIBRARY_PATH=/home/deploy/spark/spark-1.5.2-bin-hadoop2.6/lib
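With spark-env.sh in place, the standalone master and a worker can be brought up with the scripts under sbin (a sketch; the paths and master URL match the config above, and start-slave.sh is the worker script name in Spark 1.5.x):

```shell
cd /home/x/spark
sbin/start-master.sh                               # master web UI defaults to port 8080
sbin/start-slave.sh spark://192.168.181.113:7077   # register one worker with the master
jps                                                # should list Master and Worker processes
```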

7. install ipython-notebook

openssh, zlib

yum -y install openssh-clients
yum install zlib

setuptools, pip

tar xvf setuptools-18.1.tar.gz
cd setuptools-18.1
sudo python setup.py build
sudo python setup.py install

tar xvf pip-8.1.0.tar.gz
cd pip-8.1.0
sudo python setup.py build
sudo python setup.py install

ipython, matplotlib

sudo pip install ipython
sudo pip install matplotlib

python-devel, g++

sudo yum install python-devel   # without the Python development headers, the build fails with a missing Python.h error
sudo yum install gcc-c++

install python-notebook

# everything Python-related installed above was in preparation for this step

sudo pip install notebook
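If the install succeeded, the package should be importable and report a version (a quick check; the exact version number depends on what pip pulled):

```shell
python -c "import notebook; print(notebook.__version__)"
```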

8. start the notebook

PYSPARK_DRIVER_PYTHON=ipython PYSPARK_DRIVER_PYTHON_OPTS="notebook --ip=192.168.181.113" /home/x/spark/bin/pyspark

Open http://192.168.181.113:8888/notebooks in a browser (the notebook server listens on port 8888 by default).

(screenshot)
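Once the page loads, a quick way to confirm the kernel is actually wired to Spark is to run a tiny job in the first cell. This is a sketch meant to be run inside the notebook: `sc` is the SparkContext that pyspark creates automatically, so it only exists in that session.

```python
# run inside a notebook cell started via pyspark; sc is predefined there
rdd = sc.parallelize(range(100), 2)      # an RDD split across two partitions
total = rdd.reduce(lambda a, b: a + b)   # distributed sum of 0..99
print(total)                             # equals sum(range(100)), i.e. 4950
```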

9. spark-notebook example 1

%matplotlib inline
import matplotlib.pyplot as plt

data = [33, 25, 20, 12, 10]
plt.figure(num=1, figsize=(6, 6))
plt.axes(aspect=1)                      # equal aspect ratio keeps the pie circular
plt.title('Plot 3', size=14)
plt.pie(data, labels=('Group 1', 'Group 2', 'Group 3', 'Group 4', 'Group 5'))
plt.savefig('/home/x/spark/test_libin/plot3.png', format='png')

(screenshot)

possible gotchas

python -V

# if the system default is python 2.6, upgrade to 2.7
tar xvf Python-2.7.tgz
cd Python-2.7
./configure --with-zlib=/usr/include --prefix=/usr/local/python27

make
make install
mv /usr/bin/python /usr/bin/python_old
ln -s /usr/local/python27/bin/python /usr/bin/
python
# at this point python 2.7 works normally,
# but yum still depends on the 2.6 interpreter, so one more fix is needed:
[root@wangyuelou Python-2.7.2]# vim /usr/bin/yum
#!/usr/bin/python   # change this line to point at the old 2.6 interpreter
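The shebang edit can also be scripted. A minimal sketch using sed on a throwaway copy (on the real system the file would be /usr/bin/yum, and the old interpreter was saved as /usr/bin/python_old above):

```shell
# make a stand-in for /usr/bin/yum so the edit can be demonstrated safely
printf '#!/usr/bin/python\nimport sys\n' > /tmp/yum_demo
# rewrite only the first line to point at the old 2.6 interpreter
sed -i '1s|#!/usr/bin/python|#!/usr/bin/python_old|' /tmp/yum_demo
head -1 /tmp/yum_demo
```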