Reading, writing, and deleting files in HDFS with the FileSystem Java API

The Hadoop file system
Basic file-system operations are available as shell commands; running hadoop fs -help prints detailed help for every command.

The Java abstract class org.apache.hadoop.fs.FileSystem defines Hadoop's file-system interface. Because the class is abstract, you obtain a concrete FileSystem instance through one of these static factory methods:
public static FileSystem get(Configuration conf) throws IOException
public static FileSystem get(URI uri, Configuration conf) throws IOException
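
A minimal sketch of the two factory methods (the hdfs://localhost:9000 NameNode address below is an assumption; use the fs.default.name value from your own cluster):

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class GetFileSystemDemo {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Variant 1: the default file system named by fs.default.name
        // in the configuration files on the classpath.
        FileSystem fs1 = FileSystem.get(conf);

        // Variant 2: name the file system explicitly by URI
        // (hdfs://localhost:9000 is an assumed NameNode address).
        FileSystem fs2 = FileSystem.get(URI.create("hdfs://localhost:9000"), conf);

        System.out.println(fs1.getUri());
        System.out.println(fs2.getUri());
    }
}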
 

Commonly used methods:
1. public boolean mkdirs(Path f) throws IOException
Creates all of the directories at once (including any missing parents); f is the full directory path.

2. public FSDataOutputStream create(Path f) throws IOException
Creates a file at the given Path and returns an output stream for writing data to it.
create() has several overloaded versions that let you specify whether to overwrite an existing file, the replication factor, the write buffer size, the block size, and the file permissions.
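
For example, a hedged sketch of one of the fuller overloads (the path and parameter values here are illustrative only, not recommendations):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CreateOverloadDemo {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());
        // Overwrite any existing file, 4 KB write buffer,
        // replication factor 3, 64 MB block size.
        FSDataOutputStream out = fs.create(new Path("/test/overload.txt"),
                true, 4096, (short) 3, 64L * 1024 * 1024);
        out.write("sample".getBytes());
        out.close();
        fs.close();
    }
}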

3. public void copyFromLocalFile(Path src, Path dst) throws IOException
Copies a local file to the file system.

4. public boolean exists(Path f) throws IOException
Checks whether a file or directory exists.

5. public boolean delete(Path f, boolean recursive) throws IOException
Permanently deletes the specified file or directory. If f is a file or an empty directory, the value of recursive is ignored; a non-empty directory and its contents are deleted only when recursive is true.

6. The FileStatus class encapsulates file and directory metadata, including file length, block size, replication, modification time, owner, and permission information.

經過"FileStatus.getPath()"可查看指定HDFS中某個目錄下全部文件。 html

package hdfsTest;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class OperatingFiles {
    // Initialization: load the cluster configuration once for all operations.
    static Configuration conf = new Configuration();
    static FileSystem hdfs;
    static {
        String path = "/usr/java/hadoop-1.0.3/conf/";
        conf.addResource(new Path(path + "core-site.xml"));
        conf.addResource(new Path(path + "hdfs-site.xml"));
        conf.addResource(new Path(path + "mapred-site.xml"));
        path = "/usr/java/hbase-0.90.3/conf/";
        conf.addResource(new Path(path + "hbase-site.xml"));
        try {
            hdfs = FileSystem.get(conf);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    // Create a directory (including any missing parents).
    public void createDir(String dir) throws IOException {
        Path path = new Path(dir);
        hdfs.mkdirs(path);
        System.out.println("new dir \t" + conf.get("fs.default.name") + dir);
    }

    // Copy a local file to HDFS.
    public void copyFile(String localSrc, String hdfsDst) throws IOException {
        Path src = new Path(localSrc);
        Path dst = new Path(hdfsDst);
        hdfs.copyFromLocalFile(src, dst);

        // List all the files in the destination directory.
        FileStatus files[] = hdfs.listStatus(dst);
        System.out.println("Upload to \t" + conf.get("fs.default.name") + hdfsDst);
        for (FileStatus file : files) {
            System.out.println(file.getPath());
        }
    }

    // Create a new file and write a string to it.
    public void createFile(String fileName, String fileContent) throws IOException {
        Path dst = new Path(fileName);
        byte[] bytes = fileContent.getBytes();
        FSDataOutputStream output = hdfs.create(dst);
        output.write(bytes);
        output.close(); // flush and release the stream
        System.out.println("new file \t" + conf.get("fs.default.name") + fileName);
    }

    // List all files under a directory.
    public void listFiles(String dirName) throws IOException {
        Path f = new Path(dirName);
        FileStatus[] status = hdfs.listStatus(f);
        System.out.println(dirName + " has all files:");
        for (int i = 0; i < status.length; i++) {
            System.out.println(status[i].getPath().toString());
        }
    }

    // Check whether a file exists, and delete it if it does.
    public void deleteFile(String fileName) throws IOException {
        Path f = new Path(fileName);
        boolean isExists = hdfs.exists(f);
        if (isExists) { // if it exists, delete it
            boolean isDel = hdfs.delete(f, true);
            System.out.println(fileName + "  delete? \t" + isDel);
        } else {
            System.out.println(fileName + "  exist? \t" + isExists);
        }
    }

    public static void main(String[] args) throws IOException {
        OperatingFiles ofs = new OperatingFiles();
        System.out.println("\n=======create dir=======");
        String dir = "/test";
        ofs.createDir(dir);
        System.out.println("\n=======copy file=======");
        String src = "/home/ictclas/Configure.xml";
        ofs.copyFile(src, dir);
        System.out.println("\n=======create a file=======");
        String fileContent = "Hello, world! Just a test.";
        ofs.createFile(dir + "/word.txt", fileContent);
    }
}

Using HDFS in Java (0.20.0)

Below is a code sample showing how to read from and write to HDFS in Java.

1. Creating a configuration object: To be able to read from or write to HDFS, you need to create a Configuration object and pass configuration parameters to it using the Hadoop configuration files.
  
    // The conf object will read the HDFS configuration parameters from these
    // XML files. You may override individual parameters yourself if you want.
 

    Configuration conf = new Configuration(); 
    conf.addResource(new Path("/opt/hadoop-0.20.0/conf/core-site.xml")); 
    conf.addResource(new Path("/opt/hadoop-0.20.0/conf/hdfs-site.xml")); 

    If you do not assign the configurations to the conf object (using the Hadoop XML files), your HDFS operations will be performed on the local file system, not on HDFS.
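
    A quick way to check which file system the conf object will use is to print fs.default.name; hdfs://localhost:9000 below is an assumed value, not taken from the article:

    Configuration conf = new Configuration();
    conf.addResource(new Path("/opt/hadoop-0.20.0/conf/core-site.xml"));

    // Prints something like hdfs://localhost:9000 when the XML files were
    // picked up, and file:/// when operations would hit the local disk.
    System.out.println(conf.get("fs.default.name"));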

2. Adding a file to HDFS: Create a FileSystem object and use a file stream to add the file.

    FileSystem fileSystem = FileSystem.get(conf);
    
    // Check if the file already exists

    Path path = new Path("/path/to/file.ext");
    if (fileSystem.exists(path)) {
        System.out.println("File " + dest + " already exists");
        return;
    }

    // Create a new file and write data to it.
    FSDataOutputStream out = fileSystem.create(path);
    InputStream in = new BufferedInputStream(new FileInputStream(
        new File(source)));


    byte[] b = new byte[1024];
    int numBytes = 0;
    while ((numBytes = in.read(b)) > 0) {
        out.write(b, 0, numBytes);
    }

    // Close all the file descriptors
    in.close();
    out.close();
    fileSystem.close();

3. Reading a file from HDFS: Open a stream to the file in HDFS and read it.

    FileSystem fileSystem = FileSystem.get(conf);

    Path path = new Path("/path/to/file.ext");
 
    if (!fileSystem.exists(path)) { 
        System.out.println("File does not exists"); 
        return; 
    }

    FSDataInputStream in = fileSystem.open(path);
 

    // "file" is the HDFS path string; keep only the part after the last '/'.
    String filename = file.substring(file.lastIndexOf('/') + 1,
        file.length());
 

    OutputStream out = new BufferedOutputStream(new FileOutputStream(
        new File(filename)));
 

    byte[] b = new byte[1024]; 
    int numBytes = 0; 
    while ((numBytes = in.read(b)) > 0) { 
        out.write(b, 0, numBytes); 
    } 

    in.close(); 
    out.close(); 
    fileSystem.close(); 

4. Deleting a file from HDFS: Check that the file exists in HDFS and delete it. 

    FileSystem fileSystem = FileSystem.get(conf); 

    Path path = new Path("/path/to/file.ext"); 
    if (!fileSystem.exists(path)) { 
        System.out.println("File does not exists"); 
        return; 
    }

    // Delete the file; "true" deletes recursively if it is a directory.
    fileSystem.delete(path, true);
 

    fileSystem.close(); 

5. Creating a directory in HDFS: Check that the directory does not already exist and create it. 

    FileSystem fileSystem = FileSystem.get(conf); 

    Path path = new Path(dir); 
    if (fileSystem.exists(path)) { 
        System.out.println("Dir " + dir + " already not exists"); 
        return; 
    }

    // Create directories
    fileSystem.mkdirs(path);
 

    fileSystem.close(); 

Code:

import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HDFSClient {
    public HDFSClient() {

    }

    public void addFile(String source, String dest) throws IOException {
        Configuration conf = new Configuration();

        // The conf object will read the HDFS configuration parameters from
        // these XML files.
        conf.addResource(new Path("/opt/hadoop-0.20.0/conf/core-site.xml"));
        conf.addResource(new Path("/opt/hadoop-0.20.0/conf/hdfs-site.xml"));

        FileSystem fileSystem = FileSystem.get(conf);

        // Get the filename out of the file path
        String filename = source.substring(source.lastIndexOf('/') + 1,
            source.length());

        // Create the destination path including the filename.
        if (dest.charAt(dest.length() - 1) != '/') {
            dest = dest + "/" + filename;
        } else {
            dest = dest + filename;
        }

        // System.out.println("Adding file to " + dest);

        // Check if the file already exists
        Path path = new Path(dest);
        if (fileSystem.exists(path)) {
            System.out.println("File " + dest + " already exists");
            return;
        }

        // Create a new file and write data to it.
        FSDataOutputStream out = fileSystem.create(path);
        InputStream in = new BufferedInputStream(new FileInputStream(
            new File(source)));

        byte[] b = new byte[1024];
        int numBytes = 0;
        while ((numBytes = in.read(b)) > 0) {
            out.write(b, 0, numBytes);
        }

        // Close all the file descriptors
        in.close();
        out.close();
        fileSystem.close();
    }

    public void readFile(String file) throws IOException {
        Configuration conf = new Configuration();
        conf.addResource(new Path("/opt/hadoop-0.20.0/conf/core-site.xml"));

        FileSystem fileSystem = FileSystem.get(conf);

        Path path = new Path(file);
        if (!fileSystem.exists(path)) {
            System.out.println("File " + file + " does not exist");
            return;
        }

        FSDataInputStream in = fileSystem.open(path);

        // Keep only the filename so the copy lands in the working directory.
        String filename = file.substring(file.lastIndexOf('/') + 1,
            file.length());

        OutputStream out = new BufferedOutputStream(new FileOutputStream(
            new File(filename)));

        byte[] b = new byte[1024];
        int numBytes = 0;
        while ((numBytes = in.read(b)) > 0) {
            out.write(b, 0, numBytes);
        }

        in.close();
        out.close();
        fileSystem.close();
    }

    public void deleteFile(String file) throws IOException {
        Configuration conf = new Configuration();
        conf.addResource(new Path("/opt/hadoop-0.20.0/conf/core-site.xml"));

        FileSystem fileSystem = FileSystem.get(conf);
        // Check that the file exists before deleting it.
        Path path = new Path(file);
        if (!fileSystem.exists(path)) {
            System.out.println("File " + file + " does not exist");
            return;
        }

        // Delete the file; "true" would delete a directory recursively.
        fileSystem.delete(path, true);

        fileSystem.close();
    }
}
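
A brief usage sketch, assuming the class above (the local and HDFS paths here are illustrative, not from the article):

import java.io.IOException;

public class HDFSClientDemo {
    public static void main(String[] args) throws IOException {
        HDFSClient client = new HDFSClient();
        // Copy an assumed local file into an assumed HDFS directory,
        // read it back to the local working directory, then delete it.
        client.addFile("/tmp/sample.txt", "/user/hadoop/");
        client.readFile("/user/hadoop/sample.txt");
        client.deleteFile("/user/hadoop/sample.txt");
    }
}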