Elam's Caffe notes, configuration part (6): Compiling Caffe and its Python 3.6 interface on CentOS 6.5


Configuration requirements:

System: CentOS 6.5
Goal: the Caffe framework built on CUDA 8.0 + OpenCV 3.1 + cuDNN v5.1 with a Python 3.6 interface


Overall, configuring Caffe is not as hard as you might imagine. As always, treat the official documentation as the authority; tutorials found online rarely match your setup exactly.
The official documentation for configuring Caffe on a CentOS system:
http://caffe.berkeleyvision.o...

1. Pre-installation preparation

General dependencies:

sudo yum install protobuf-devel leveldb-devel snappy-devel opencv-devel boost-devel hdf5-devel

Remaining dependencies:

sudo yum install gflags-devel glog-devel lmdb-devel

Those are the dependency packages Caffe needs. My approach here, however, was to install them all manually, which succeeds far more reliably than installing them directly with yum.

① Protobuf

Since I am building the Python 3.6 interface, the protobuf version must be higher than 3.0.
https://github.com/google/pro...
Download both the cpp package and the python package.
I chose version 3.2, so I downloaded protobuf-cpp-3.2.0.tar.gz and protobuf-python-3.2.0.tar.gz, then picked a working directory.

tar -zxvf protobuf-cpp-3.2.0.tar.gz
cd protobuf-3.2.0
./configure
make
make check
make install
ldconfig
tar -zxvf protobuf-python-3.2.0.tar.gz

After entering the extracted directory:

cd python
python setup.py build
python setup.py test
python setup.py install

After the build finishes, you can confirm the installation with the following command:

conda list | grep protobuf

② boost

http://www.boost.org/users/hi...
Download boost_1_65_0.tar.gz

tar -zxvf boost_1_65_0.tar.gz
cd boost_1_65_0
./bootstrap.sh
./b2
./b2 install

After this finishes, if no libboost_python was generated, rebuild:

cd boost_1_65_0
./bootstrap.sh
./b2 --with-python include="path/to/pyconfig.h"   # find this path with: locate pyconfig.h

In a terminal, run:

locate libboost_python3

and check whether the following exist under /usr/local/lib/:

/usr/local/lib/libboost_python3.a
/usr/local/lib/libboost_python3.so
/usr/local/lib/libboost_python3.so.1.65.0

If they do, just create the symlink:

ln -s /usr/local/lib/libboost_python3.so.1.65.0 /usr/local/lib/libboost_python3.so

If these three files are missing, copy them from the lib folder under boost's stage directory into /usr/local/lib/ and then create the symlink, as in the sketch below.
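A minimal sketch of that copy-and-link step, assuming the boost_1_65_0 source tree used above:

cd boost_1_65_0
cp stage/lib/libboost_python3* /usr/local/lib/
ln -sf /usr/local/lib/libboost_python3.so.1.65.0 /usr/local/lib/libboost_python3.so   # -f replaces any stale link
ldconfig   # refresh the shared-library cache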

③ glog gflags lmdb

These three dependencies can be built and installed directly with the commands from the official Caffe documentation:

# 1. glog
wget https://storage.googleapis.com/google-code-archive-downloads/v2/code.google.com/google-glog/glog-0.3.3.tar.gz
tar zxvf glog-0.3.3.tar.gz
cd glog-0.3.3
./configure
make && make install
# 2. gflags
wget https://github.com/schuhschuh/gflags/archive/master.zip
unzip master.zip
cd gflags-master
mkdir build && cd build
export CXXFLAGS="-fPIC" && cmake .. && make VERBOSE=1
make && make install
# 3. lmdb
git clone https://github.com/LMDB/lmdb
cd lmdb/libraries/liblmdb
make && make install

④ hdf5

I suggest installing version 1.8.17, because the hdf5 bundled with Anaconda is the same version.
http://download.csdn.net/down...

tar -zxvf hdf5-1.8.17.tar.gz
cd hdf5-1.8.17
./configure --prefix=/usr/local/hdf5-1.8.17/
make
make check                
make install
make check-install

⑤ snappy

yum install snappy

⑥ leveldb

http://download.csdn.net/down...

tar -zxvf leveldb-1.7.0.tar.gz
cd leveldb-1.7.0
make
cp libleveldb* /usr/lib/
cp -r include/leveldb /usr/local/include

⑦ atlas-devel

Install it directly with yum, as shown below.
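For reference, the command is below; on CentOS the ATLAS libraries end up under /usr/lib64/atlas, which is why that directory appears in LIBRARY_DIRS in Makefile.config later.

sudo yum install atlas-devel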

2. Compiling caffe

1. Download caffe

git clone https://github.com/bvlc/caffe.git

2. Build caffe

cd caffe
vi Makefile

Find the Configure build section and, below it:
append -I/usr/local/hdf5-1.8.17/include to the COMMON_FLAGS += line
append -L/usr/local/hdf5-1.8.17/lib to the LDFLAGS += line
Of course, if all the paths were configured correctly earlier, you can skip these additions. A sketch of the edit follows.
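A minimal sketch of the edit, assuming hdf5 was installed with --prefix=/usr/local/hdf5-1.8.17 as above (the surrounding Makefile content varies between caffe versions; appending extra += lines after the existing definitions has the same effect as editing them in place):

# in Makefile, in the "Configure build" section
COMMON_FLAGS += -I/usr/local/hdf5-1.8.17/include
LDFLAGS += -L/usr/local/hdf5-1.8.17/lib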
Then modify Makefile.config. If there is no Makefile.config:

cp Makefile.config.example Makefile.config
vi Makefile.config

Below is my complete modified Makefile.config; the left arrows (←) mark the parts that need to be changed.

## Refer to http://caffe.berkeleyvision.org/installation.html
# Contributions simplifying and improving our build system are welcome!

# cuDNN acceleration switch (uncomment to build with cuDNN).
 USE_CUDNN := 1←←←

# CPU-only switch (uncomment to build without GPU support).
# CPU_ONLY := 1

# uncomment to disable IO dependencies and corresponding data layers
# USE_OPENCV := 0
# USE_LEVELDB := 0
# USE_LMDB := 0

# uncomment to allow MDB_NOLOCK when reading LMDB files (only if necessary)
#       You should not set this flag if you will be reading LMDBs with any
#       possibility of simultaneous read and write
# ALLOW_LMDB_NOLOCK := 1

# Uncomment if you're using OpenCV 3
 OPENCV_VERSION := 3←←←

# To customize your choice of compiler, uncomment and set the following.
# N.B. the default for Linux is g++ and the default for OSX is clang++
# CUSTOM_CXX := g++

# CUDA directory contains bin/ and lib/ directories that we need.
CUDA_DIR := /usr/local/cuda-8.0←←←
# On Ubuntu 14.04, if cuda tools are installed via
# "sudo apt-get install nvidia-cuda-toolkit" then use this instead:
# CUDA_DIR := /usr

# CUDA architecture setting: going with all of them.
# For CUDA < 6.0, comment the *_50 through *_61 lines for compatibility.
# For CUDA < 8.0, comment the *_60 and *_61 lines for compatibility.
CUDA_ARCH := -gencode arch=compute_30,code=sm_30 \←←←
                -gencode arch=compute_35,code=sm_35 \
                -gencode arch=compute_50,code=sm_50 \
                -gencode arch=compute_52,code=sm_52 \
                -gencode arch=compute_60,code=sm_60 \
                -gencode arch=compute_61,code=sm_61 \
                -gencode arch=compute_61,code=compute_61

# BLAS choice:
# atlas for ATLAS (default)
# mkl for MKL
# open for OpenBlas
BLAS := atlas←←←
# Custom (MKL/ATLAS/OpenBLAS) include and lib directories.
# Leave commented to accept the defaults for your choice of BLAS
# (which should work)!
# BLAS_INCLUDE := /path/to/your/blas
# BLAS_LIB := /path/to/your/blas

# Homebrew puts openblas in a directory that is not on the standard search path
# BLAS_INCLUDE := $(shell brew --prefix openblas)/include
# BLAS_LIB := $(shell brew --prefix openblas)/lib

# This is required only if you will compile the matlab interface.
# MATLAB directory should contain the mex binary in /bin.
# MATLAB_DIR := /usr/local
# MATLAB_DIR := /Applications/MATLAB_R2012b.app

# NOTE: this is required only if you will compile the python interface.
# We need to be able to find Python.h and numpy/arrayobject.h.
# PYTHON_INCLUDE := /usr/include/python2.7 \←←←
                /usr/lib/python2.7/dist-packages/numpy/core/include←←←
# Anaconda Python distribution is quite popular. Include path:
# Verify anaconda location, sometimes it's in root.
ANACONDA_HOME := /root/anaconda3←←←
PYTHON_INCLUDE := $(ANACONDA_HOME)/include \←←←
                 $(ANACONDA_HOME)/include/python3.6m \←←←
                 $(ANACONDA_HOME)/lib/python3.6/site-packages/numpy/core/include←←←

# Uncomment to use Python 3 (default is Python 2)
PYTHON_LIBRARIES := boost_python3 python3.6m←←←
# PYTHON_INCLUDE := /usr/include/python3.6m \←←←
                 /usr/lib/python3.6/dist-packages/numpy/core/include←←←

# We need to be able to find libpythonX.X.so or .dylib.
# PYTHON_LIB := /usr/lib←←←
PYTHON_LIB := $(ANACONDA_HOME)/lib \←←←
                $(ANACONDA_HOME)/pkgs/python-3.6.1-2/lib←←←

# Homebrew installs numpy in a non standard path (keg only)
# PYTHON_INCLUDE += $(dir $(shell python -c 'import numpy.core; print(numpy.core.__file__)'))/include
# PYTHON_LIB += $(shell brew --prefix numpy)/lib

# Uncomment to support layers written in Python (will link against Python libs)
WITH_PYTHON_LAYER := 1←←←

# Whatever else you find you need goes here.
INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/local/include  /usr/local/hdf5-1.8.17/include←←←
LIBRARY_DIRS := $(PYTHON_LIB) /usr/local/lib /usr/lib /usr/lib64/atlas /usr/local/hdf5-1.8.17/lib←←←

# If Homebrew is installed at a non standard location (for example your home directory) and you use it for general dependencies
# INCLUDE_DIRS += $(shell brew --prefix)/include
# LIBRARY_DIRS += $(shell brew --prefix)/lib

# NCCL acceleration switch (uncomment to build with NCCL)
# https://github.com/NVIDIA/nccl (last tested version: v1.2.3-1+cuda8.0)
# USE_NCCL := 1

# Uncomment to use `pkg-config` to specify OpenCV library paths.
# (Usually not necessary -- OpenCV libraries are normally installed in one of the above $LIBRARY_DIRS.)
# USE_PKG_CONFIG := 1

# N.B. both build and distribute dirs are cleared on `make clean`
BUILD_DIR := build
DISTRIBUTE_DIR := distribute

# Uncomment for debugging. Does not work on OSX due to https://github.com/BVLC/caffe/issues/171
# DEBUG := 1

# The ID of the GPU that 'make runtest' will use to run unit tests.
TEST_GPUID := 0

# enable pretty build (comment to see full commands)
Q ?= @

Save and exit once the changes are made.
The main points to note:
PYTHON_LIBRARIES: also make sure the boost_python3 shared library can be found via ld.so.conf (remember to run ldconfig) or via LD_LIBRARY_PATH, as in the sketch below.
For INCLUDE_DIRS and LIBRARY_DIRS, the paths you need to add are those of dependencies whose include or lib folders are not under /usr or /usr/local.
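A minimal sketch of making boost_python3 resolvable at runtime, assuming it was installed to /usr/local/lib as in the boost step above:

echo "/usr/local/lib" >> /etc/ld.so.conf   # or drop a .conf file into /etc/ld.so.conf.d/
ldconfig
# alternatively, for the current shell only:
export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH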
Then build (replace the n in -jn with the number of parallel jobs):

make all -jn
make test -jn
make runtest -jn
make pycaffe -jn

After the build completes, to be able to import caffe's Python module, add the module path to your $PYTHONPATH environment variable, for example by adding the following line to your ~/.bashrc:

export PYTHONPATH=/path/to/caffe/python:$PYTHONPATH

Open a terminal:

python
import caffe

If no error appears, caffe's Python interface is configured.

Problems encountered

① During make runtest

[root@localhost caffe]# make runtest
.build_release/tools/caffe
.build_release/tools/caffe: /usr/lib64/libstdc++.so.6: version `GLIBCXX_3.4.15' not found (required by /home/HY/caffe/caffe/.build_release/tools/../lib/libcaffe.so.1.0.0)
.build_release/tools/caffe: /usr/lib64/libstdc++.so.6: version `GLIBCXX_3.4.15' not found (required by /usr/local/lib/libopencv_core.so.3.1)
.build_release/tools/caffe: /usr/lib64/libstdc++.so.6: version `GLIBCXX_3.4.15' not found (required by /usr/local/lib/libopencv_imgcodecs.so.3.1)
make: *** [runtest] Error 1

Cause: the dynamic library produced by the newly installed, newer gcc did not replace the older gcc's dynamic library.
For a solution, see this blog post: http://blog.chinaunix.net/uid... A sketch of the typical fix follows.
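A minimal sketch of the usual fix, assuming the newer gcc placed its runtime under /usr/local/lib64 (both the path and the .so version number below are placeholders; adjust them to your system):

strings /usr/lib64/libstdc++.so.6 | grep GLIBCXX                   # list the versions the current library provides
cp /usr/local/lib64/libstdc++.so.6.0.19 /usr/lib64/                # copy the newer runtime (placeholder version)
ln -sf /usr/lib64/libstdc++.so.6.0.19 /usr/lib64/libstdc++.so.6    # repoint the symlink
ldconfig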

② HDF5 library version mismatch

Warning! ***HDF5 library version mismatched error***
The HDF5 header files used to compile this application do not match
the version used by the HDF5 library to which this application is linked.
Data corruption or segmentation faults may occur if the application continues.
This can happen when an application was compiled by one version of HDF5 but
linked with a different version of static or shared HDF5 library.
You should recompile the application or check your shared library related
settings such as 'LD_LIBRARY_PATH'.
You can, at your own risk, disable this warning by setting the environment
variable 'HDF5_DISABLE_VERSION_CHECK' to a value of '1'.
Setting it to 2 or higher will suppress the warning messages totally.
Headers are 1.8.3, library is 1.8.17

This problem is caused by an hdf5 version mismatch: the system already has version 1.8.3 installed, but the hdf5 bundled with anaconda3 is 1.8.17, so the two versions clash during the build.
To resolve the conflict, rebuild hdf5 1.8.17 yourself; see the hdf5 build-and-install steps in the dependency section above. A small runtime-path sketch follows.
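If the rebuilt library still loses to another copy at load time, a minimal workaround (assuming the --prefix=/usr/local/hdf5-1.8.17 install above) is to put it first on the loader path:

export LD_LIBRARY_PATH=/usr/local/hdf5-1.8.17/lib:$LD_LIBRARY_PATH   # so the 1.8.17 library matches the 1.8.17 headers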

③ During make pycaffe

[root@localhost caffe]# make pycaffe
CXX/LD -o python/caffe/_caffe.so python/caffe/_caffe.cpp
python/caffe/_caffe.cpp:1:52: fatal error: Python.h: No such file or directory
 #include <Python.h>  // NOLINT(build/include_alpha)
                                                    ^
compilation terminated.
make: *** [python/caffe/_caffe.so] Error 1

As the error says, the Python.h header cannot be found, so I used

find / -name "Python.h"

to search for the file and found:

/root/anaconda2/include/python2.7/Python.h
/root/anaconda2/pkgs/python-2.7.13-0/include/python2.7/Python.h
/root/anaconda3/include/python3.6m/Python.h
/root/anaconda3/pkgs/python-3.6.1-2/include/python3.6m/Python.h

The third path is the one that PYTHON_INCLUDE in Makefile.config should have pointed to. Opening Makefile.config at that spot, I found that my path was indeed wrong: it pointed to python3.6 instead of python3.6m. After changing it to python3.6m and running make again, the error disappeared, so the cause of this problem was simply a misconfigured Python path.
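A quick way to confirm the correct include directory is to ask the interpreter itself; this only prints the path (it does not edit Makefile.config):

python -c "import sysconfig; print(sysconfig.get_paths()['include'])"
# with the anaconda3 setup above this prints something like /root/anaconda3/include/python3.6m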

④ Boost include/library path errors

The following directory should be added to compiler include paths:

    /home/HY/boost_1_59_0

The following directory should be added to linker library paths:

/home/HY/boost_1_59_0/stage/lib

This problem came up during my first few attempts at compiling caffe. After a lot of searching, it turned out that the python module had not been built when boost was compiled. If libboost_python3 was built correctly as described above, this problem will not appear. If you do hit it, go back into the boost folder and build the python module again (boost can be rebuilt repeatedly); the commands are the same as above, and a sketch follows.
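A minimal sketch of that rebuild, assuming the boost_1_65_0 source tree from the dependency section:

cd boost_1_65_0
./b2 --with-python include="path/to/pyconfig.h"   # find the path with: locate pyconfig.h
./b2 install                                      # reinstall so the new libboost_python3 lands in /usr/local/lib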

⑤ When compiling the boost library

...failed gcc.compile.c++ bin.v2/libs/python/build/gcc-4.8.2/release/link-static/threading-multi/numpy/scalars.o...
gcc.compile.c++ bin.v2/libs/python/build/gcc-4.8.2/release/link-static/threading-multi/numpy/ufunc.o
In file included from ./boost/python/detail/prefix.hpp:13:0,
                 from ./boost/python/args.hpp:8,
                 from ./boost/python.hpp:11,
                 from ./boost/python/numpy/internal.hpp:17,
                 from libs/python/src/numpy/ufunc.cpp:8:
./boost/python/detail/wrap_python.hpp:50:23: fatal error: pyconfig.h: No such file or directory
 # include <pyconfig.h>
                       ^
compilation terminated.

This problem shows up in roughly the same place as problem ④, while compiling boost's python module: the build cannot find pyconfig.h.
Solution: when compiling boost_python, run

./b2 --with-python include="path/to/pyconfig.h"

The path inside the quotes can be found with

locate pyconfig.h

⑥ During make runtest

error while loading shared libraries: libpython3.6m.so.1.0: cannot open shared object file: No such file or directory

As the message says, a shared library cannot be found, so first make sure /etc/ld.so.conf contains the path to your own shared libraries.
[screenshot of my /etc/ld.so.conf entries]
The error above means libpython3.6m.so.1.0 cannot be found.
Solution: use find / -name "XXX.so" to locate libpython3.6m.so.1.0,
then either copy libpython3.6m.so.1.0 into /usr/local/lib, or add the directory containing libpython3.6m.so.1.0 to /etc/ld.so.conf, and then run ldconfig. A sketch follows.
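A minimal sketch, assuming libpython3.6m.so.1.0 lives under the anaconda3 installation referenced in Makefile.config (/root/anaconda3/lib):

find / -name "libpython3.6m.so.1.0" 2>/dev/null   # locate the library
echo "/root/anaconda3/lib" >> /etc/ld.so.conf     # add its directory to the loader configuration
ldconfig                                          # rebuild the cache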

⑦ NVCC warnings

The cause seems to be that compute capability 2.0/2.1 was deprecated starting with CUDA 8.0, so the fix is simple:

vi Makefile.config

Under CUDA_ARCH, simply remove the lines -gencode arch=compute_20,code=sm_20 and -gencode arch=compute_20,code=sm_21. (The CUDA_ARCH shown in the Makefile.config above already omits them.)
