1. Running the hdfs dfs -copyFromLocal command reports an error!
19/01/02 11:01:32 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
    at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1702)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1432)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1385)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:554)
19/01/02 11:01:32 INFO hdfs.DFSClient: Abandoning BP-719105237-127.0.0.1-1525595716995:blk_1073741854_1030
19/01/02 11:01:32 INFO hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[127.0.0.1:50010,DS-ababb49a-42c6-452b-9992-e0dc201a08b5,DISK]
19/01/02 11:01:32 WARN hdfs.DFSClient: DataStreamer Exception
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /tempdata/README.md._COPYING_ could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
    at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1628)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNewBlockTargets(FSNamesystem.java:3121)
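Before changing any settings, it is worth confirming that the DataNode transfer port really is being blocked. A minimal check, assuming the default port 50010 shown in the log above (whether nc is installed depends on your distribution):

```bash
# List the DataNodes the NameNode currently considers live
hdfs dfsadmin -report

# Test whether the DataNode transfer port (50010 in the log above) accepts connections
nc -zv 127.0.0.1 50010

# See whether firewalld is active and may be filtering the port
systemctl status firewalld.service
```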
Solution: stop the firewall:
systemctl stop firewalld.service
Permanently disable the firewall:
systemctl disable firewalld.service
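Disabling the firewall entirely is fine on an isolated test machine, but on a shared host a narrower fix is to open only the ports HDFS needs. A sketch using firewalld, assuming the DataNode transfer port 50010 from the log and a NameNode RPC port of 8020 (adjust both to match your fs.defaultFS and dfs.datanode.address settings):

```bash
# Open the DataNode transfer port and the NameNode RPC port, then reload the rules
firewall-cmd --permanent --add-port=50010/tcp
firewall-cmd --permanent --add-port=8020/tcp
firewall-cmd --reload
```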
You can also permanently disable SELinux: edit the /etc/selinux/config file with vim /etc/selinux/config and set "SELINUX=disabled". After that, running the upload again succeeds.
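Note that the SELINUX=disabled setting in /etc/selinux/config only takes effect after a reboot. To switch SELinux to permissive mode immediately for the running system (this does not persist across reboots):

```bash
# Show the current SELinux mode (Enforcing / Permissive / Disabled)
getenforce

# Switch to permissive mode for the running system only; reverts on reboot
setenforce 0
```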
2. How to handle the warning "Unable to load native-hadoop library for your platform"!
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Add the following at the top of hadoop-env.sh in $HADOOP_HOME/etc/hadoop/ (that is, Hadoop's configuration directory):
export HADOOP_COMMON_LIB_NATIVE_DIR="/usr/local/hadoop/lib/native/"
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib/native/"
Here /usr/local/hadoop is my Hadoop installation directory; set it according to your own installation path.
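After restarting the Hadoop daemons (or re-running the command in a fresh shell), you can check whether the native library is now being picked up; the checknative command is available in Hadoop 2.x and later:

```bash
# Report which native libraries Hadoop can load (hadoop, zlib, snappy, ...)
hadoop checknative -a
```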