Hadoop learning notes and pitfalls (updated over time)

1. Read the official documentation

http://hadoop.apache.org/docs/current/

2. start-dfs.sh reports rcmd: socket: Permission denied

Solution:
Create a file named rcmd_default under /etc/pdsh containing the single word ssh, followed by a newline. Be sure to press Enter so the line is terminated; otherwise pdsh fails with "ssh exit with code 1".
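The fix above can be sketched as shell commands (assuming pdsh reads its default rcmd module from /etc/pdsh/rcmd_default; paths may differ on your distribution):

```shell
# Create the pdsh config directory if it does not exist yet.
sudo mkdir -p /etc/pdsh
# printf 'ssh\n' guarantees the trailing newline that pdsh requires;
# echo works too, but a file written without the newline triggers
# the "ssh exit with code 1" error mentioned above.
printf 'ssh\n' | sudo tee /etc/pdsh/rcmd_default
```

Alternatively, exporting `PDSH_RCMD_TYPE=ssh` in the environment achieves the same effect for a single session.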

3. org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: server3/192.168.2.107:8020

Keep the configuration files identical across all nodes. Here the cause was a wrong fs.defaultFS value, so the datanode could not find the namenode.
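For reference, a minimal core-site.xml sketch of the setting in question (the hostname server3 and port 8020 are taken from the error message above; verify they match your actual NameNode address):

```xml
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://server3:8020</value>
  </property>
</configuration>
```

This file must be the same on every node, then the affected datanodes restarted.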

4. Running a jar with hadoop

Building an executable jar in IDEA: https://www.cnblogs.com/blog5277/p/5920560.html

Run it with hadoop: bin/hadoop jar xxx.jar arg1 arg2   (single node)

5. Hadoop Java API documentation

http://hadoop.apache.org/docs/r3.1.0/api/index.html

6.org.apache.hadoop.hdfs.server.common.InconsistentFSStateException: Directory /tmp/hadoop-kevin/dfs/name is in an inconsistent state: storage directory does not exist or is not accessible.

Clearly /tmp is cleaned out periodically, so dfs/name should be moved somewhere else.

core-site.xml

<property>
  <name>hadoop.tmp.dir</name>
  <value>/data/hadoop/hadoop-${user.name}</value>
  <description>A base for other temporary directories.</description>
</property>
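After changing hadoop.tmp.dir, the new directory has to exist and be writable, and the NameNode storage has to be re-initialized. A sketch, assuming /data/hadoop is the path chosen above:

```shell
# Create the new base directory for the current user.
mkdir -p /data/hadoop/hadoop-"$USER"
# Re-initialize the NameNode storage under the new location.
# WARNING: this wipes any existing HDFS metadata.
bin/hdfs namenode -format
```

If the directory is owned by root, `chown` it to the user running the daemons first.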
