Exception 1:
2016-12-31 22:39:45,304 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: NameNode/192.168.174.128:9090. Already tried 9 time(s).
2016-12-31 22:39:46,314 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException: Call to NameNode/192.168.174.128:9090 failed on local exception: java.net.NoRouteToHostException: No route to host
    at org.apache.hadoop.ipc.Client.wrapException(Client.java:775)
    at org.apache.hadoop.ipc.Client.call(Client.java:743)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
    at com.sun.proxy.$Proxy4.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:346)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:383)
    at org.apache.hadoop.ipc.RPC.waitForProxy(RPC.java:314)
    at org.apache.hadoop.ipc.RPC.waitForProxy(RPC.java:291)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:269)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:216)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1283)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1238)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1246)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1368)
Caused by: java.net.NoRouteToHostException: No route to host
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:304)
    at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:860)
    at org.apache.hadoop.ipc.Client.call(Client.java:720)
    ... 13 more
2016-12-31 22:39:46,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
************************************************************
SHUTDOWN_MSG: Shutting down DataNode at DataNode_01/192.168.174.129
************************************************************
Solution: The DataNode cannot connect to the NameNode because certain Linux firewall rules block communication between the master and slave nodes. Simply disabling the Linux firewall fixes it: sudo service iptables stop (run this on every node). Left unresolved, this problem causes the DataNode process to start and then shut itself down shortly afterwards.
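Before changing any Hadoop configuration, it can help to confirm that the problem really is network-level by probing the NameNode port with a plain TCP connection. A minimal sketch (the address 192.168.174.128:9090 comes from the log above; `canReach` is a hypothetical helper, not a Hadoop API):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class NameNodeProbe {
    // Returns true if a TCP connection to host:port succeeds within timeoutMs.
    // A NoRouteToHostException here points at firewall/routing rules,
    // not at the Hadoop configuration itself.
    static boolean canReach(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            System.err.println("Cannot reach " + host + ":" + port + " -> " + e);
            return false;
        }
    }

    public static void main(String[] args) {
        // NameNode address taken from the exception log above.
        System.out.println(canReach("192.168.174.128", 9090, 3000));
    }
}
```

If the probe fails with "No route to host" while ping works, iptables is the usual culprit, which matches the fix above.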
Exception 2:
Problem: After Hadoop starts, the DataNode on the slave nodes does not come up properly.
Solution: Check the data-storage settings in hdfs-site.xml; if a configured directory does not exist, the DataNode will not start.
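That check can be scripted. A minimal sketch, assuming the value of `dfs.datanode.data.dir` from hdfs-site.xml is pasted in by hand (`allDirsUsable` is a hypothetical helper; the property accepts a comma-separated list, and any `file://` prefix is stripped here for simplicity):

```java
import java.io.File;

public class DataDirCheck {
    // Every entry in dfs.datanode.data.dir must be an existing, writable
    // directory, otherwise the DataNode refuses to start.
    static boolean allDirsUsable(String dataDirs) {
        for (String entry : dataDirs.split(",")) {
            File dir = new File(entry.trim().replaceFirst("^file://", ""));
            if (!dir.isDirectory() || !dir.canWrite()) {
                System.err.println("Unusable data dir: " + entry);
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        // Example only: check the JVM temp dir, which should always pass.
        System.out.println(allDirsUsable(System.getProperty("java.io.tmpdir")));
    }
}
```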
Problem: A Hadoop MapReduce job runs the map phase fine, but the reduce phase inexplicably fails with NullPointerException and similar errors, even though the code and configuration are correct.
Solution: The hostnames of the cluster nodes may be badly formed: they must not contain an underscore ('_'). Remember this.
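A quick way to screen a hostname before adding a node is to test it against the standard hostname grammar (letters, digits, and hyphens only per segment), which is exactly what rules out underscores. A minimal sketch (`isValidHadoopHostname` is a hypothetical helper):

```java
public class HostnameCheck {
    // Valid hostname segments contain only letters, digits, and interior
    // hyphens (RFC 952 / RFC 1123); an underscore is not allowed and is
    // what trips up parts of the Hadoop stack.
    static boolean isValidHadoopHostname(String name) {
        return name.matches(
            "[A-Za-z0-9]([A-Za-z0-9-]*[A-Za-z0-9])?"
            + "(\\.[A-Za-z0-9]([A-Za-z0-9-]*[A-Za-z0-9])?)*");
    }

    public static void main(String[] args) {
        System.out.println(isValidHadoopHostname("datanode-01")); // ok
        System.out.println(isValidHadoopHostname("data_node_01")); // bad
    }
}
```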
Exception 3:
Could not locate executable null\bin\winutils.exe in the Hadoop binaries
Analysis: Hadoop currently ships no dedicated support package for the Windows platform, so it falls back to Linux-style handling, which triggers this exception.
Solution: the relevant check lives in org.apache.hadoop.util.Shell:
public static final String getQualifiedBinPath(String executable) throws IOException {
    // construct hadoop bin path to the specified executable
    String fullExeName = HADOOP_HOME_DIR + File.separator + "bin" + File.separator + executable;
    File exeFile = new File(fullExeName);
    if (!exeFile.exists()) {
        throw new IOException("Could not locate executable " + fullExeName
            + " in the Hadoop binaries.");
    }
    return exeFile.getCanonicalPath();
}

private static String checkHadoopHome() {
    // first check the Dflag hadoop.home.dir with JVM scope
    String home = System.getProperty("hadoop.home.dir");
    // fall back to the system/user-global env variable
    if (home == null) {
        home = System.getenv("HADOOP_HOME");
    }
    try {
        // couldn't find either setting for hadoop's home directory
        if (home == null) {
            throw new IOException("HADOOP_HOME or hadoop.home.dir are not set.");
        }
        if (home.startsWith("\"") && home.endsWith("\"")) {
            home = home.substring(1, home.length() - 1);
        }
        // check that the home setting is actually a directory that exists
        File homedir = new File(home);
        if (!homedir.isAbsolute() || !homedir.exists() || !homedir.isDirectory()) {
            throw new IOException("Hadoop home directory " + homedir
                + " does not exist, is not a directory, or is not an absolute path.");
        }
        home = homedir.getCanonicalPath();
    } catch (IOException ioe) {
        if (LOG.isDebugEnabled()) {
            LOG.debug("Failed to detect a valid hadoop home directory", ioe);
        }
        home = null;
    }
    return home;
}
Put winutils.exe into the bin directory of the Hadoop installation, then either set the HADOOP_HOME environment variable or set the system property programmatically via System.setProperty("hadoop.home.dir", "D:\\Tools\\hadoop-2.7.3"); (note the escaped backslashes in the Java string literal).
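The programmatic route can be sketched as follows. The property must be set before any Hadoop class touches Shell, so it belongs at the very top of main; the path is only an example, so point it at your own unpacked Hadoop directory containing bin\winutils.exe:

```java
public class HadoopHomeSetup {
    public static void main(String[] args) {
        // Set before any Hadoop code runs, otherwise Shell.checkHadoopHome()
        // (shown above) will already have failed to find a home directory.
        // "D:\\Tools\\hadoop-2.7.3" is an example path from this article.
        System.setProperty("hadoop.home.dir", "D:\\Tools\\hadoop-2.7.3");
        System.out.println(System.getProperty("hadoop.home.dir"));

        // ... create the Hadoop Configuration / FileSystem here ...
    }
}
```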