Running sqoop on a Hadoop cluster fails with the following error:
16/04/28 06:21:41 ERROR tool.ImportTool: Encountered IOException running import job:
java.io.FileNotFoundException: File does not exist: hdfs://mycluster/home/sqoop-1.4.6/lib/commons-codec-1.4.jar
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1072)
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1064)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1064)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:265)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:389)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
    at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196)
    at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
    at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
The key line in the error is:
hdfs://mycluster/home/sqoop-1.4.6/lib/commons-codec-1.4.jar
The "hdfs://mycluster" prefix shows that commons-codec-1.4.jar is being looked up on HDFS in the Hadoop cluster, not on the local disk. Since jars staged for a job are resources managed by the ResourceManager, the likely cause is an incorrect yarn-site.xml configuration.
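A quick way to confirm this reading is to split the URI in the message into its scheme and authority. A minimal shell sketch (the message string is copied verbatim from the log above; the sed patterns are just for illustration):

```shell
# Parse the missing-file URI out of the error message. The scheme tells you
# which filesystem the jar was resolved against; the authority names the
# HDFS HA nameservice (here "mycluster").
msg='File does not exist: hdfs://mycluster/home/sqoop-1.4.6/lib/commons-codec-1.4.jar'
scheme=$(echo "$msg" | sed -E 's#.*: ([a-z]+)://.*#\1#')
authority=$(echo "$msg" | sed -E 's#.*://([^/]+)/.*#\1#')
echo "$scheme $authority"   # prints: hdfs mycluster
```

A scheme of `hdfs` means the client went to the cluster looking for a jar that only exists on the local disk under the sqoop installation, which is exactly the symptom of the misconfiguration described next.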
The fix is as follows:
1. Edit yarn-site.xml in the Hadoop configuration directory as shown below. The cluster id, ResourceManager hostnames, and ZooKeeper address must match your own cluster:
<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.resourcemanager.ha.enabled</name>
    <value>true</value>
  </property>
  <property>
    <name>yarn.resourcemanager.cluster-id</name>
    <value>mycluster</value>
  </property>
  <property>
    <name>yarn.resourcemanager.ha.rm-ids</name>
    <value>rm1,rm2</value>
  </property>
  <property>
    <name>yarn.resourcemanager.hostname.rm1</name>
    <value>node5</value>
  </property>
  <property>
    <name>yarn.resourcemanager.hostname.rm2</name>
    <value>node8</value>
  </property>
  <property>
    <name>yarn.resourcemanager.zk-address</name>
    <value>node5,node6,node7</value>
  </property>
</configuration>
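After editing, it is worth listing the name/value pairs actually present in the file, since a mistyped value (like a truncated cluster id) is easy to miss by eye. A small sketch; it writes a trimmed stand-in to /tmp purely for illustration, and on a real node you would point the grep at $HADOOP_CONF_DIR/yarn-site.xml instead:

```shell
# Dump <name>/<value> pairs from a yarn-site.xml so cluster-id, RM hostnames
# and zk-address can be checked at a glance. The file below is a two-property
# stand-in; substitute your real $HADOOP_CONF_DIR/yarn-site.xml.
cat > /tmp/yarn-site-check.xml <<'EOF'
<configuration>
  <property>
    <name>yarn.resourcemanager.cluster-id</name>
    <value>mycluster</value>
  </property>
  <property>
    <name>yarn.resourcemanager.hostname.rm1</name>
    <value>node5</value>
  </property>
</configuration>
EOF
# strip the tags and pair each name with its value on one line
grep -E '<name>|<value>' /tmp/yarn-site-check.xml \
  | sed -e 's#<[^>]*>##g' -e 's#^[[:space:]]*##' \
  | paste - -
```

Each output line is a property name followed by its value, so a wrong hostname or a truncated cluster id stands out immediately.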
2. Edit mapred-site.xml so that MapReduce jobs run on YARN:
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>
After updating the two files above, restart the Hadoop cluster and re-run the sqoop command. Hope this helps.