Problem 1: Cannot run program "/bin/ls": error=11, Resource temporarily unavailable
15/04/22 14:46:46 INFO mapred.JobClient: Task Id : attempt_201504221017_0006_r_000077_0, Status : FAILED
java.lang.RuntimeException: Error while running command to get file permissions : java.io.IOException: Cannot run program "/bin/ls": error=11, Resource temporarily unavailable
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:200)
    at org.apache.hadoop.util.Shell.run(Shell.java:182)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:461)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:444)
    at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:712)
    at org.apache.hadoop.fs.RawLocalFileSystem$RawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:448)
    at org.apache.hadoop.fs.RawLocalFileSystem$RawLocalFileStatus.getOwner(RawLocalFileSystem.java:431)
    at org.apache.hadoop.mapred.TaskLog.obtainLogDirOwner(TaskLog.java:267)
    at org.apache.hadoop.mapred.TaskLogsTruncater.truncateLogs(TaskLogsTruncater.java:124)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:260)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.io.IOException: error=11, Resource temporarily unavailable
    at java.lang.UNIXProcess.forkAndExec(Native Method)
    at java.lang.UNIXProcess.<init>(UNIXProcess.java:186)
    at java.lang.ProcessImpl.start(ProcessImpl.java:130)
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
    ... 15 more
This problem is most likely caused by a resource limit on the user submitting the Hadoop job, mainly on the slave nodes: the fork fails with error=11 (EAGAIN). One common culprit is the limit on the number of files the user can have open; you can check it with `ulimit -n`.
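A quick way to inspect the limit and the current usage on a slave node is sketched below; it assumes `lsof` is installed, and the exact counts are illustrative rather than authoritative:

```shell
# Soft and hard open-file limits for the current user.
ulimit -n     # soft limit (the one that triggers EAGAIN/EMFILE)
ulimit -Hn    # hard limit (ceiling the soft limit can be raised to)

# Roughly how many file descriptors this user's processes hold right now.
lsof -u "$(whoami)" 2>/dev/null | wc -l

# To raise the limit persistently, append entries like these to
# /etc/security/limits.conf on the slave nodes (values are illustrative):
#   *  soft  nofile  65536
#   *  hard  nofile  65536
```

A new login session is required for `limits.conf` changes to take effect.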
Reference: http://mail-archives.apache.org/mod_mbox/nutch-user/201312.mbox/%3C1386546180.6104.5.camel@senf.fritz.box%3E
Problem 2: java.lang.OutOfMemoryError: unable to create new native thread
15/04/22 11:08:16 WARN hdfs.DFSClient: DataStreamer Exception: java.lang.OutOfMemoryError: unable to create new native thread
    at java.lang.Thread.start0(Native Method)
    at java.lang.Thread.start(Thread.java:714)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:3030)
Despite the name, this is not necessarily a lack of memory. It is most likely the limit on the number of processes the submitting user may create (again, mainly on the slave nodes); you can check that limit with `ulimit -u`.
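On Linux, every Java thread is a lightweight process and counts against the `nproc` limit, so a busy TaskTracker can hit it well before physical memory runs out. A sketch for comparing the limit against current usage (the `ps`/`awk` filter assumes `ps` prints the username in its first column):

```shell
# Max number of processes/threads this user may create.
ulimit -u

# How many the user is running right now, counting individual threads
# (-L lists one line per thread, which is what nproc actually counts).
ps -eLf | awk -v u="$(whoami)" '$1 == u' | wc -l
```

If the second number is close to the first when the error appears, raising `nproc` as described below is the fix.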
Reference: http://www.nosql.se/2011/10/hadoop-tasktracker-java-lang-outofmemoryerror/
How to raise the number of processes the current user may create:
On CentOS, edit /etc/security/limits.d/90-nproc.conf (e.g. with vim) and append the following lines at the end of the file:
*    soft    nproc    102400
*    hard    nproc    102400
After making the change, the user must log in again for it to take effect. (If you use a remote client such as Xshell, close that host's connection tab and reconnect.)
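After reconnecting, it is worth verifying that the new limit is actually in effect; if it is not, check that PAM applies resource limits to SSH sessions (the `grep` below is a sketch and the file name assumes the standard CentOS layout):

```shell
# Should report the raised nproc value after a fresh login.
ulimit -u

# Limits from limits.d are only applied if pam_limits is enabled for sshd.
grep pam_limits /etc/pam.d/sshd
```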