1. Start HDFS:
$HADOOP_HOME/sbin/start-dfs.sh
2. Turn off the firewall:
Switch to the root user and run service iptables stop
3. Copy a local file to HDFS:
bin/hadoop fs -put <local-path> <hdfs-path>
4. List the files in the HDFS root directory:
bin/hadoop fs -ls /
From the extracted Hadoop package, copy all of the JAR files (marked in red in the screenshots) under hadoop-2.6.0\share\hadoop\common
and all of the JAR files under hadoop-2.6.0\share\hadoop\hdfs,
then add them to the build path of the Java project.
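If the project uses Maven, the same client libraries can be pulled in as a dependency instead of copying JARs by hand. A minimal sketch for the 2.6.0 version used above (add to the dependencies section of pom.xml):

```xml
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.6.0</version>
</dependency>
```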
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.junit.Before;
import org.junit.Test;

public class HdfsTest {

    FileSystem fileSystem;

    /*
     * Initialization: the URI scheme and port are determined by the
     * cluster's core-site.xml (fs.defaultFS)
     */
    @Before
    public void init() throws Exception {
        URI uri = new URI("hdfs://192.168.*.*:9000");
        Configuration conf = new Configuration();
        fileSystem = FileSystem.get(uri, conf);
    }

    /*
     * List a directory
     */
    @Test
    public void catalog() throws Exception {
        Path path = new Path("/poker");
        FileStatus fileStatus = fileSystem.getFileStatus(path);
        System.out.println("*************************************");
        System.out.println("Directory root: " + fileStatus.getPath());
        System.out.println("Entries in this directory:");
        for (FileStatus fs : fileSystem.listStatus(path)) {
            System.out.println(fs.getPath());
        }
    }

    /*
     * View a file
     */
    @Test
    public void look() throws Exception {
        Path path = new Path("/core-site.xml");
        FSDataInputStream fsDataInputStream = fileSystem.open(path);
        System.out.println("*************************************");
        System.out.println("File contents:");
        int c;
        while ((c = fsDataInputStream.read()) != -1) {
            System.out.print((char) c);
        }
        fsDataInputStream.close();
    }

    /*
     * Upload a file
     */
    @Test
    public void upload() throws Exception {
        Path srcPath = new Path("C:/Users/Administrator/Desktop/hadoop/hadoop.txt");
        Path dstPath = new Path("/");
        // first argument false: keep the local source file after copying
        fileSystem.copyFromLocalFile(false, srcPath, dstPath);
        System.out.println("*************************************");
        System.out.println("Upload succeeded!");
    }

    /*
     * Download a file
     */
    @Test
    public void download() throws Exception {
        InputStream in = fileSystem.open(new Path("/hadoop.txt"));
        OutputStream out = new FileOutputStream("E://hadoop.txt");
        // last argument true: close both streams when the copy finishes
        IOUtils.copyBytes(in, out, 4096, true);
    }

    /*
     * Delete a file
     */
    @Test
    public void delete() throws Exception {
        Path path = new Path("hdfs://192.168.*.*:9000/hadoop.txt");
        // second argument true: delete recursively if the path is a directory
        fileSystem.delete(path, true);
        System.out.println("*************************************");
        System.out.println("Delete succeeded!");
    }
}
If these operations fail with an HDFS permission error, there are three workarounds:
1. Change the permissions on the HDFS root directory.
2. Disable Hadoop's permission checking: put the hadoop.dll file into C:/windows/system32, then turn the check off in hdfs-site.xml:
<property>
<name>dfs.permissions</name>
<value>false</value>
</property>
3. Impersonate a user by passing -DHADOOP_USER_NAME=username to the JVM.
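A small sketch of why that flag works: -D only sets a JVM system property, which the Hadoop client reads (alongside the HADOOP_USER_NAME environment variable) when choosing the remote user name. The class name and the user "root" below are illustrative only:

```java
public class FakeUserDemo {
    public static void main(String[] args) {
        // Equivalent to launching the JVM with -DHADOOP_USER_NAME=root;
        // the Hadoop client picks this up when it resolves the remote user.
        System.setProperty("HADOOP_USER_NAME", "root");
        System.out.println(System.getProperty("HADOOP_USER_NAME")); // prints root
    }
}
```

Setting the property must happen before the first FileSystem.get call, because the resolved user is cached for the life of the JVM.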