java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries

On a working cluster (CentOS 6.6 + Hadoop 2.7 + HBase 0.98 + Spark 1.3.1), I was debugging a Spark job that reads from HBase, using IntelliJ on Windows 7. The run fails immediately with:

15/06/11 15:35:50 ERROR Shell: Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
    at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:356)
    at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:371)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:364)
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80)
    at org.apache.hadoop.security.SecurityUtil.getAuthenticationMethod(SecurityUtil.java:611)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:272)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:260)
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:790)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:760)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:633)
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2001)
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2001)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2001)
    at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:207)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:218)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:269)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:272)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:154)
    at SparkFromHbase$.main(SparkFromHbase.scala:15)
    at SparkFromHbase.main(SparkFromHbase.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)

Looking at the Hadoop source code, I found this snippet:

  public static final String getQualifiedBinPath(String executable)
    throws IOException {
    // construct hadoop bin path to the specified executable
    String fullExeName = HADOOP_HOME_DIR + File.separator + "bin"
      + File.separator + executable;

    File exeFile = new File(fullExeName);
    if (!exeFile.exists()) {
      throw new IOException("Could not locate executable " + fullExeName
        + " in the Hadoop binaries.");
    }

    return exeFile.getCanonicalPath();
  }

  private static String HADOOP_HOME_DIR = checkHadoopHome();

  private static String checkHadoopHome() {
    // first check the Dflag hadoop.home.dir with JVM scope
    String home = System.getProperty("hadoop.home.dir");

    // fall back to the system/user-global env variable
    if (home == null) {
      home = System.getenv("HADOOP_HOME");
    }
    ...
  }
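
The failure mode follows directly from that lookup order. A minimal standalone sketch (class and method names are my own, for illustration only) of how an unset HADOOP_HOME turns into the exact path in the error message:

```java
// Sketch of the lookup order in the Hadoop snippet above: the JVM property
// "hadoop.home.dir" wins, then the HADOOP_HOME environment variable.
public class WinutilsPathDemo {
    static String resolveHadoopHome() {
        String home = System.getProperty("hadoop.home.dir");
        if (home == null) {
            home = System.getenv("HADOOP_HOME");
        }
        return home; // may still be null if neither is set
    }

    public static void main(String[] args) {
        String home = resolveHadoopHome();
        // Java string concatenation renders a null reference as the text
        // "null", which is exactly the "null\bin\winutils.exe" in the error.
        String fullExeName = home + java.io.File.separator + "bin"
            + java.io.File.separator + "winutils.exe";
        System.out.println(fullExeName);
    }
}
```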

Clearly the problem is HADOOP_HOME. If HADOOP_HOME is empty, fullExeName will inevitably be null\bin\winutils.exe. The fix is simple: configure the environment variable. If you don't want to restart your machine for it to take effect, you can add this to your program:

System.setProperty("hadoop.home.dir", "E:\\Program Files\\hadoop-2.7.0");

Note: E:\\Program Files\\hadoop-2.7.0 is the path where I unpacked Hadoop on my machine.
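
One detail matters here: the stack trace above shows the lookup happening in Shell.<clinit>, a static initializer, so the property must be set before the first Hadoop or Spark class is touched. A minimal sketch (the path is the example from this article; substitute your own):

```java
// Set hadoop.home.dir as the very first statement of main(), before any
// Hadoop/Spark class loads. Shell caches HADOOP_HOME in a static
// initializer, so setting the property later has no effect.
public class SparkDriverSketch {
    public static void main(String[] args) {
        System.setProperty("hadoop.home.dir", "E:\\Program Files\\hadoop-2.7.0");

        // ... only now create the SparkContext and read from HBase as usual
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```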

Run it again and you may still hit the same error, and at this point you might want to blame me. But go look inside your hadoop-x.x.x/bin directory: you will find there is no winutils.exe in there at all.

因而我告訴你,你能夠去github下載一個,地球人都知道的地址發你一個。 app

Address: https://github.com/srccodes/hadoop-common-2.2.0-bin

Don't worry about the version mismatch: I'm using the newer hadoop-2.7.0 and it works fine. After downloading, put winutils.exe into your hadoop-x.x.x/bin directory.
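
Once winutils.exe is in place, you can sanity-check the same path that getQualifiedBinPath above will compute. A small sketch (the class name CheckWinutils is my own):

```java
import java.io.File;

// Recompute the path that Hadoop's getQualifiedBinPath builds, and report
// whether the winutils.exe we just dropped into bin/ is actually there.
public class CheckWinutils {
    public static void main(String[] args) {
        String home = System.getProperty("hadoop.home.dir");
        if (home == null) {
            home = System.getenv("HADOOP_HOME");
        }
        File exe = new File(home, "bin" + File.separator + "winutils.exe");
        System.out.println(exe.getPath()
            + (exe.exists() ? " -- found" : " -- MISSING"));
    }
}
```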

That should solve the problem. If it still isn't solved for you, you're a rare case indeed; feel free to add me on QQ!
