Creating a Directory in HDFS from IDEA

1. First download Maven and configure the Maven environment variables. Then configure Maven (and Tomcat, if needed) in IDEA.

2. Create a Maven project

hdfs-project

    Import the required dependencies in the project's pom.xml:

<dependencies>
	<dependency>
		<groupId>junit</groupId>
		<artifactId>junit</artifactId>
		<version>RELEASE</version>
	</dependency>
	<dependency>
		<groupId>org.apache.logging.log4j</groupId>
		<artifactId>log4j-core</artifactId>
		<version>2.8.2</version>
	</dependency>
	<dependency>
		<groupId>org.apache.hadoop</groupId>
		<artifactId>hadoop-common</artifactId>
		<version>2.7.2</version>
	</dependency>
	<dependency>
		<groupId>org.apache.hadoop</groupId>
		<artifactId>hadoop-client</artifactId>
		<version>2.7.2</version>
	</dependency>
	<dependency>
		<groupId>org.apache.hadoop</groupId>
		<artifactId>hadoop-hdfs</artifactId>
		<version>2.7.2</version>
	</dependency>
	<dependency>
		<groupId>jdk.tools</groupId>
		<artifactId>jdk.tools</artifactId>
		<version>1.8</version>
		<scope>system</scope>
		<systemPath>${JAVA_HOME}/lib/tools.jar</systemPath>
	</dependency>
</dependencies>

    Create a new file named log4j.properties under the project's src/main/resources directory:

log4j.rootLogger=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d %p [%c] - %m%n
log4j.appender.logfile=org.apache.log4j.FileAppender
log4j.appender.logfile.File=target/spring.log
log4j.appender.logfile.layout=org.apache.log4j.PatternLayout
log4j.appender.logfile.layout.ConversionPattern=%d %p [%c] - %m%n

    Under the project's src/main/ directory, create a java folder, then create the package com.hadoop.demo and, inside it, the class HDFSClientDemo:

package com.hadoop.demo;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HDFSClientDemo {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Operate on the remote Hadoop cluster: get a file system handle
        Configuration conf = new Configuration();
        // Set the cluster address as a key-value pair (without a host mapping on this
        // machine, "hdfs://hadoop161:9000" cannot be resolved, so use the IP address
        // of the target machine instead)
        conf.set("fs.defaultFS", "hdfs://192.168.12.161:9000");
        // Override the local user and act as the cluster's hadoop user instead
        System.setProperty("HADOOP_USER_NAME", "hadoop");
        // Get the HDFS client object
        FileSystem fs = FileSystem.get(conf);
        // Create the directory on HDFS
        fs.mkdirs(new Path("/0300/abc"));
        // Close (release) resources
        fs.close();
        // Confirm the program ran to completion
        System.out.println("over");
    }
}
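Note that fs.mkdirs creates any missing parent directories in a single call (like mkdir -p), so /0300 does not need to exist beforehand. The same semantics can be sketched without a cluster using the local-filesystem analogue in java.nio (MkdirsSketch is a hypothetical class name, not part of the tutorial's project):

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class MkdirsSketch {
    public static void main(String[] args) throws Exception {
        // Like HDFS mkdirs, createDirectories creates all missing parents at once
        Path dir = Files.createDirectories(Paths.get("target", "0300", "abc"));
        System.out.println(Files.isDirectory(dir));
    }
}
```

If the directory (or any of its parents) already exists, createDirectories simply returns it, which mirrors mkdirs returning true on an existing path.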

3. Check the result on HDFS, for example by running hdfs dfs -ls /0300 on the cluster.
