Problems encountered when upgrading Apache Hadoop to CDH Hadoop 2.0, and how to solve them

1: Jar dependencies

Apache Hadoop 1.x ships a single hadoop-core jar; 2.x does not.

If you need HDFS, include \share\hadoop\common\lib plus:

hadoop-common-2.0.0-cdh4.6.0.jar

hadoop-hdfs-2.0.0-cdh4.6.0.jar

\share\hadoop\yarn\*

 

If you also need MapReduce, add:

\share\hadoop\mapreduce1\* or

\share\hadoop\mapreduce2\*
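A quick way to verify that the jars above actually ended up on the classpath is to probe for their key classes by name. This is a minimal sketch (the `ClasspathCheck` class name is mine; the two probed class names are the standard ones from hadoop-common and hadoop-hdfs):

```java
// Probe the classpath for the classes the CDH jars above should provide.
public class ClasspathCheck {

    // Returns true if the named class can be loaded from the current classpath.
    static boolean isOnClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // From hadoop-common-*.jar and hadoop-hdfs-*.jar respectively.
        String[] required = {
            "org.apache.hadoop.conf.Configuration",
            "org.apache.hadoop.hdfs.DistributedFileSystem"
        };
        for (String name : required) {
            System.out.println((isOnClasspath(name) ? "OK      " : "MISSING ") + name);
        }
    }
}
```

Run it with the same classpath as your job; any MISSING line points at a jar that was left out.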

 

2: When you run into permission problems, the simplest fix is to hand ownership to the user that actually runs the process:

chown hadoop:hadoop ./logs
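The same fix, demonstrated on a scratch directory so it is safe to run anywhere (the temp path stands in for the real Hadoop logs directory, and the current user stands in for the assumed "hadoop" user):

```shell
# Scratch demo of the ownership fix; in the real case the target was
# ./logs under the Hadoop install, owned by the wrong user after the upgrade.
LOGS_DIR="$(mktemp -d)/logs"
mkdir -p "$LOGS_DIR"
# Production form: chown -R hadoop:hadoop ./logs
chown -R "$(id -un):$(id -gn)" "$LOGS_DIR"
ls -ld "$LOGS_DIR"
```

Using -R matters when the daemon also writes into subdirectories it did not create itself.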

 

3: No FileSystem for scheme: hdfs / No FileSystem for scheme: file

http://blog.newitfarmer.com/big_data/big-data-platform/hadoop/13953/repost-no-filesystem-for-scheme-hdfsno-filesystem-for-scheme-file

My fix was to add the following to the client code:

Configuration config = new Configuration();
config.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
config.set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());
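This error commonly comes from shaded ("fat") jars: hadoop-common and hadoop-hdfs each ship a META-INF/services/org.apache.hadoop.fs.FileSystem file, and when the shade step keeps only one of them, the other scheme's implementation is never registered. Setting fs.hdfs.impl and fs.file.impl in code, as above, works around that at runtime; if you build with Maven, an alternative is to merge the service files at build time with maven-shade-plugin's ServicesResourceTransformer (fragment below, assuming a Maven shade build):

```xml
<!-- Inside maven-shade-plugin's <configuration>: merge, rather than
     overwrite, the META-INF/services files from hadoop-common and
     hadoop-hdfs so both the hdfs and file schemes stay registered. -->
<transformers>
  <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
</transformers>
```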
