When submitting a job, Spark reports the same exception that Hadoop does.
The prerequisite is that the native libraries are already present under hadoop/lib/native; the fix is then to add hadoop/lib/native to the LD_LIBRARY_PATH environment variable.
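Before changing the environment variable, it is worth confirming that the libraries in that directory are actually present and loadable. A quick sanity check (the path below matches the CDH install used in this post; adjust it to your own layout):

```bash
# List which native components Hadoop can load; with the library path set
# correctly, each entry (hadoop, zlib, snappy, ...) should report "true".
hadoop checknative -a

# Confirm the directory really contains libhadoop.so and related files.
ls /home/bigdata/apps/hadoop-2.6.0-cdh5.5.2/lib/native/
```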
Add the following line to ~/.bashrc; to be safe, do this on every Spark node:
```bash
export LD_LIBRARY_PATH=/home/bigdata/apps/hadoop-2.6.0-cdh5.5.2/lib/native/:$LD_LIBRARY_PATH
```

Then reload the shell configuration:

```bash
source ~/.bashrc
```
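If you would rather keep the setting in Spark's own configuration than in each user's shell profile, the same export can go into conf/spark-env.sh on every node. This is a sketch assuming the same CDH path as above; spark-env.sh is sourced by Spark's launch scripts, so the variable is set regardless of the login shell:

```bash
# conf/spark-env.sh -- sourced by Spark's launch scripts on each node.
# Point the dynamic linker at Hadoop's native libraries so libhadoop.so
# (and compression codecs such as snappy) can be found at runtime.
export LD_LIBRARY_PATH=/home/bigdata/apps/hadoop-2.6.0-cdh5.5.2/lib/native/:$LD_LIBRARY_PATH
```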