【Environment】
Hadoop version: 2.4.0
Client OS: Windows Server 2008 R2
Server OS: CentOS 6.4
【Problem】
When submitting a Hadoop application from the Windows client to the Linux server, the following error is reported:
org.apache.hadoop.util.Shell$ExitCodeException: /bin/bash: line 0: fg: no job control
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:505)
    at org.apache.hadoop.util.Shell.run(Shell.java:418)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:650)
    at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:300)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:81)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
【Solution】
Add the following property to the client-side configuration:
<property>
  <name>mapreduce.app-submission.cross-platform</name>
  <value>true</value>
</property>
Note: the property must be added to the local client-side configuration file that the Hadoop program actually reads. Adding it to files such as "core-site.xml" or "mapred-site.xml" under the client's Hadoop installation path has no effect.
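Since the property only works if it reaches the Configuration object that the submitting program actually uses, an alternative (not from the original post) is to set it programmatically in the client code before the job is submitted. The sketch below assumes a plain MapReduce driver; the class name and job name are hypothetical placeholders.

// Minimal sketch: set mapreduce.app-submission.cross-platform directly on the
// Configuration used to submit the job from a Windows client.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class CrossPlatformSubmitExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Equivalent to the <property> entry above: ask the framework to
        // generate platform-independent container launch commands so a
        // Windows client can submit to a Linux cluster.
        conf.setBoolean("mapreduce.app-submission.cross-platform", true);

        Job job = Job.getInstance(conf, "cross-platform-example");
        // ... set mapper, reducer, input/output paths as usual ...
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

Setting the flag in code has the same effect as placing it in the client-local configuration file, because both end up in the Configuration that the job submission reads.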
Also, the following link does not provide the correct solution, but it gave me the idea that led to the fix and may be useful for reference: