Managing HDFS with the "hdfs dfs" Utility


                                   Author: Yin Zhengjie (尹正傑)

Copyright notice: This is an original work. Reproduction without permission is prohibited; violators will be held legally responsible.

 

 

1. The command line is the most common way to manage HDFS storage

Working with HDFS is one of the most common Hadoop administration tasks. Although HDFS can be accessed in many ways, the command line is the most common way to manage HDFS storage.

HDFS can be accessed in the following ways:
  (1) Through the Java API;
  (2) From the command line, using simple Linux-like file system commands (command-line access to HDFS is essentially the official Hadoop wrapper around the Java API);
  (3) Through the NameNode WebUI;
  (4) Through the web interface known as WebHDFS;
  (5) Through the HttpFS gateway, which lets you reach HDFS across a firewall;
  (6) Through Hue's file browser.

Although you can access HDFS in all of these ways, most of the time you will manage HDFS files and directories from the command line, using the hdfs dfs file system commands.

Tip:
  It is important to remember that HDFS is only one of the file system implementations available in Hadoop. Several other Java-based file systems also work with Hadoop, including the local file system (file), WebHDFS, HAR (Hadoop archive files), View (viewfs), and S3.
  For each file system, Hadoop uses a different URI scheme for the file system instance in order to connect to it. For example, use the file URI scheme to list files on the local file system, as shown below (this returns a listing of files stored on the local Linux file system).

[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls file:///
Found 20 items
dr-xr-xr-x   - root root      20480 2020-08-12 10:28 file:///bin
dr-xr-xr-x   - root root       4096 2020-01-20 04:21 file:///boot
drwxr-xr-x   - root root       3260 2020-08-14 00:06 file:///dev
drwxr-xr-x   - root root       8192 2020-08-12 10:28 file:///etc
drwxr-xr-x   - root root          6 2018-04-11 12:59 file:///home
dr-xr-xr-x   - root root       4096 2020-01-20 05:29 file:///lib
dr-xr-xr-x   - root root      24576 2020-08-12 10:28 file:///lib64
drwxr-xr-x   - root root          6 2018-04-11 12:59 file:///media
drwxr-xr-x   - root root          6 2018-04-11 12:59 file:///mnt
drwxr-xr-x   - root root          6 2018-04-11 12:59 file:///opt
dr-xr-xr-x   - root root          0 2020-08-14 00:05 file:///proc
dr-xr-x---   - root root        196 2020-08-14 03:11 file:///root
drwxr-xr-x   - root root        600 2020-08-14 00:06 file:///run
dr-xr-xr-x   - root root      12288 2020-08-06 06:33 file:///sbin
drwxr-xr-x   - root root          6 2018-04-11 12:59 file:///srv
dr-xr-xr-x   - root root          0 2020-08-14 00:05 file:///sys
drwxrwxrwt   - root root       4096 2020-08-14 03:32 file:///tmp
drwxr-xr-x   - root root        167 2020-01-21 01:45 file:///usr
drwxr-xr-x   - root root        267 2020-01-20 04:22 file:///var
drwxr-xr-x   - root root         35 2020-08-11 21:39 file:///yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 

 

2. Overview of the hdfs dfs command

Several Linux file and directory commands have counterparts in HDFS, such as ls, cp, and mv. One big difference between Linux file system commands and HDFS file system commands, however, is that HDFS has no notion of a current location in the directory tree: there is no pwd or cd command in HDFS.

As shown below, you can use the "hdfs dfs" utility to run HDFS commands in Hadoop. The following sections demonstrate how to use it.
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs
Usage: hadoop fs [generic options]
	[-appendToFile <localsrc> ... <dst>]
	[-cat [-ignoreCrc] <src> ...]
	[-checksum <src> ...]
	[-chgrp [-R] GROUP PATH...]
	[-chmod [-R] <MODE[,MODE]... | OCTALMODE> PATH...]
	[-chown [-R] [OWNER][:[GROUP]] PATH...]
	[-copyFromLocal [-f] [-p] [-l] [-d] <localsrc> ... <dst>]
	[-copyToLocal [-f] [-p] [-ignoreCrc] [-crc] <src> ... <localdst>]
	[-count [-q] [-h] [-v] [-t [<storage type>]] [-u] [-x] <path> ...]
	[-cp [-f] [-p | -p[topax]] [-d] <src> ... <dst>]
	[-createSnapshot <snapshotDir> [<snapshotName>]]
	[-deleteSnapshot <snapshotDir> <snapshotName>]
	[-df [-h] [<path> ...]]
	[-du [-s] [-h] [-x] <path> ...]
	[-expunge]
	[-find <path> ... <expression> ...]
	[-get [-f] [-p] [-ignoreCrc] [-crc] <src> ... <localdst>]
	[-getfacl [-R] <path>]
	[-getfattr [-R] {-n name | -d} [-e en] <path>]
	[-getmerge [-nl] [-skip-empty-file] <src> <localdst>]
	[-help [cmd ...]]
	[-ls [-C] [-d] [-h] [-q] [-R] [-t] [-S] [-r] [-u] [<path> ...]]
	[-mkdir [-p] <path> ...]
	[-moveFromLocal <localsrc> ... <dst>]
	[-moveToLocal <src> <localdst>]
	[-mv <src> ... <dst>]
	[-put [-f] [-p] [-l] [-d] <localsrc> ... <dst>]
	[-renameSnapshot <snapshotDir> <oldName> <newName>]
	[-rm [-f] [-r|-R] [-skipTrash] [-safely] <src> ...]
	[-rmdir [--ignore-fail-on-non-empty] <dir> ...]
	[-setfacl [-R] [{-b|-k} {-m|-x <acl_spec>} <path>]|[--set <acl_spec> <path>]]
	[-setfattr {-n name [-v value] | -x name} <path>]
	[-setrep [-R] [-w] <rep> <path> ...]
	[-stat [format] <path> ...]
	[-tail [-f] <file>]
	[-test -[defsz] <path>]
	[-text [-ignoreCrc] <src> ...]
	[-touchz <path> ...]
	[-truncate [-w] <length> <path> ...]
	[-usage [cmd ...]]

Generic options supported are:
-conf <configuration file>            specify an application configuration file
-D <property=value>                   define a value for a given property
-fs <file:///|hdfs://namenode:port>   specify default filesystem URL to use, overrides 'fs.defaultFS' property from configurations.
-jt <local|resourcemanager:port>      specify a ResourceManager
-files <file1,...>                    specify a comma-separated list of files to be copied to the map reduce cluster
-libjars <jar1,...>                   specify a comma-separated list of jar files to be included in the classpath
-archives <archive1,...>              specify a comma-separated list of archives to be unarchived on the compute machines

The general command line syntax is:
command [genericOptions] [commandOptions]

[root@hadoop101.yinzhengjie.com ~]#
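All of the subcommands above follow the same shape: `hdfs dfs -<subcommand> [options] [paths]`. When scripting against a cluster it is common to shell out to the utility. The following is a minimal, hypothetical Python sketch (the helper names are invented, and `run_hdfs_dfs` naturally requires a reachable cluster with `hdfs` on the PATH):

```python
import subprocess

def hdfs_dfs_args(subcommand, *args):
    """Build the argument vector for an 'hdfs dfs' invocation."""
    return ["hdfs", "dfs", "-" + subcommand, *args]

def run_hdfs_dfs(subcommand, *args):
    """Run the command and return its stdout (requires a reachable cluster)."""
    result = subprocess.run(hdfs_dfs_args(subcommand, *args),
                            capture_output=True, text=True, check=True)
    return result.stdout

# Argument vector for 'hdfs dfs -ls -R /':
print(hdfs_dfs_args("ls", "-R", "/"))  # ['hdfs', 'dfs', '-ls', '-R', '/']
```

Building the argument list instead of a single shell string avoids quoting problems when paths contain spaces.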

 

3. hdfs dfs hands-on examples

1>. Viewing help for a given command

[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -usage ls          #Show usage for the ls command; this help text is terser than that of (-help).
Usage: hadoop fs [generic options] -ls [-C] [-d] [-h] [-q] [-R] [-t] [-S] [-r] [-u] [<path> ...]
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help ls          #Show usage for the ls command; this help text is more detailed than that of (-usage).
-ls [-C] [-d] [-h] [-q] [-R] [-t] [-S] [-r] [-u] [<path> ...] :
  List the contents that match the specified file pattern. If path is not
  specified, the contents of /user/<currentUser> will be listed. For a directory a
  list of its direct children is returned (unless -d option is specified).
  
  Directory entries are of the form:
      permissions - userId groupId sizeOfDirectory(in bytes)
  modificationDate(yyyy-MM-dd HH:mm) directoryName
  
  and file entries are of the form:
      permissions numberOfReplicas userId groupId sizeOfFile(in bytes)
  modificationDate(yyyy-MM-dd HH:mm) fileName
  
    -C  Display the paths of files and directories only.
    -d  Directories are listed as plain files.
    -h  Formats the sizes of files in a human-readable fashion
        rather than a number of bytes.
    -q  Print ? instead of non-printable characters.
    -R  Recursively list the contents of directories.
    -t  Sort files by modification time (most recent first).
    -S  Sort files by size.
    -r  Reverse the order of the sort.
    -u  Use time of last access instead of modification for
        display and sorting.
[root@hadoop101.yinzhengjie.com ~]# 

2>. Listing files and directories

[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /                 #List only the files and directories under the HDFS root (note that a newly built cluster has no files or directories by default)
Found 3 items
drwxr-xr-x   - root admingroup          0 2020-08-14 07:08 /bigdata
-rw-r--r--   3 root admingroup        371 2020-08-14 07:09 /hosts
drwxr-xr-x   - root admingroup          0 2020-08-14 07:07 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls -R /               #Recursively list the files and directories under the HDFS root
drwxr-xr-x   - root admingroup          0 2020-08-14 07:08 /bigdata
-rw-r--r--   3 root admingroup        371 2020-08-14 07:09 /hosts
drwxr-xr-x   - root admingroup          0 2020-08-14 07:13 /yinzhengjie
drwxr-xr-x   - root admingroup          0 2020-08-14 07:07 /yinzhengjie/data
drwxr-xr-x   - root admingroup          0 2020-08-14 07:07 /yinzhengjie/data/hadoop
drwxr-xr-x   - root admingroup          0 2020-08-14 07:07 /yinzhengjie/data/hadoop/hdfs
drwxr-xr-x   - root admingroup          0 2020-08-14 07:13 /yinzhengjie/yum.repos.d
-rw-r--r--   3 root admingroup       1664 2020-08-14 07:13 /yinzhengjie/yum.repos.d/CentOS-Base.repo
-rw-r--r--   3 root admingroup       1309 2020-08-14 07:13 /yinzhengjie/yum.repos.d/CentOS-CR.repo
-rw-r--r--   3 root admingroup        649 2020-08-14 07:13 /yinzhengjie/yum.repos.d/CentOS-Debuginfo.repo
-rw-r--r--   3 root admingroup        630 2020-08-14 07:13 /yinzhengjie/yum.repos.d/CentOS-Media.repo
-rw-r--r--   3 root admingroup       1331 2020-08-14 07:13 /yinzhengjie/yum.repos.d/CentOS-Sources.repo
-rw-r--r--   3 root admingroup       5701 2020-08-14 07:13 /yinzhengjie/yum.repos.d/CentOS-Vault.repo
-rw-r--r--   3 root admingroup        314 2020-08-14 07:13 /yinzhengjie/yum.repos.d/CentOS-fasttrack.repo
-rw-r--r--   3 root admingroup       1050 2020-08-14 07:13 /yinzhengjie/yum.repos.d/epel-testing.repo
-rw-r--r--   3 root admingroup        951 2020-08-14 07:13 /yinzhengjie/yum.repos.d/epel.repo
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls hdfs://hadoop101.yinzhengjie.com:9000/        #Specify an HDFS URI when listing files
Found 3 items
drwxr-xr-x   - root admingroup          0 2020-08-14 07:08 hdfs://hadoop101.yinzhengjie.com:9000/bigdata
-rw-r--r--   3 root admingroup        371 2020-08-14 07:09 hdfs://hadoop101.yinzhengjie.com:9000/hosts
drwxr-xr-x   - root admingroup          0 2020-08-14 07:13 hdfs://hadoop101.yinzhengjie.com:9000/yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /yinzhengjie/data/ /yinzhengjie/yum.repos.d      #You can specify several files or directories when listing
Found 1 items
drwxr-xr-x   - root admingroup          0 2020-08-14 07:07 /yinzhengjie/data/hadoop
Found 9 items
-rw-r--r--   3 root admingroup       1664 2020-08-14 07:13 /yinzhengjie/yum.repos.d/CentOS-Base.repo
-rw-r--r--   3 root admingroup       1309 2020-08-14 07:13 /yinzhengjie/yum.repos.d/CentOS-CR.repo
-rw-r--r--   3 root admingroup        649 2020-08-14 07:13 /yinzhengjie/yum.repos.d/CentOS-Debuginfo.repo
-rw-r--r--   3 root admingroup        630 2020-08-14 07:13 /yinzhengjie/yum.repos.d/CentOS-Media.repo
-rw-r--r--   3 root admingroup       1331 2020-08-14 07:13 /yinzhengjie/yum.repos.d/CentOS-Sources.repo
-rw-r--r--   3 root admingroup       5701 2020-08-14 07:13 /yinzhengjie/yum.repos.d/CentOS-Vault.repo
-rw-r--r--   3 root admingroup        314 2020-08-14 07:13 /yinzhengjie/yum.repos.d/CentOS-fasttrack.repo
-rw-r--r--   3 root admingroup       1050 2020-08-14 07:13 /yinzhengjie/yum.repos.d/epel-testing.repo
-rw-r--r--   3 root admingroup        951 2020-08-14 07:13 /yinzhengjie/yum.repos.d/epel.repo
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /             #Note the file entry below: when listing files, each file's replication factor is shown; clearly, the default replication is 3.
Found 3 items
drwxr-xr-x   - root admingroup          0 2020-08-14 07:08 /bigdata
-rw-r--r--   3 root admingroup        371 2020-08-14 07:09 /hosts          
drwxr-xr-x   - root admingroup          0 2020-08-14 07:13 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
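The listing format is regular enough to parse mechanically: permissions, replication factor ('-' for directories), owner, group, size in bytes, modification date, time, and path. As a hypothetical sketch (the helper name is invented, and it assumes the default output format shown above):

```python
def parse_ls_line(line):
    """Split one 'hdfs dfs -ls' entry into its columns: permissions,
    replication factor ('-' for directories), owner, group, size in
    bytes, modification date, time, and path."""
    perms, repl, owner, group, size, date, time, path = line.split(None, 7)
    return {
        "permissions": perms,
        "replication": None if repl == "-" else int(repl),
        "owner": owner,
        "group": group,
        "size": int(size),
        "modified": date + " " + time,
        "path": path,
    }

entry = parse_ls_line("-rw-r--r--   3 root admingroup        371 2020-08-14 07:09 /hosts")
print(entry["replication"], entry["size"], entry["path"])  # 3 371 /hosts
```

Using `split(None, 7)` keeps any spaces in the final path column intact.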
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls -d /yinzhengjie/ /bigdata                #View information about the directories themselves
drwxr-xr-x   - root admingroup          0 2020-08-14 07:13 /yinzhengjie
drwxr-xr-x   - root admingroup          0 2020-08-14 07:08 /bigdata
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls file:///root                      #List the files and directories under "/root" on the local file system
Found 14 items
drwx------   - root root         27 2020-08-12 10:47 file:///root/.ansible
-rw-------   1 root root      22708 2020-08-14 21:46 file:///root/.bash_history
-rw-r--r--   1 root root         18 2013-12-29 10:26 file:///root/.bash_logout
-rw-r--r--   1 root root        176 2013-12-29 10:26 file:///root/.bash_profile
-rw-r--r--   1 root root        176 2013-12-29 10:26 file:///root/.bashrc
-rw-r--r--   1 root root        100 2013-12-29 10:26 file:///root/.cshrc
drwxr-----   - root root         19 2020-08-12 10:27 file:///root/.pki
drwx------   - root root         80 2020-08-12 10:39 file:///root/.ssh
-rw-r--r--   1 root root        129 2013-12-29 10:26 file:///root/.tcshrc
-rw-------   1 root root      11632 2020-08-14 23:11 file:///root/.viminfo
-rw-r--r--   1 root root  392115733 2020-08-10 15:42 file:///root/hadoop-2.10.0.tar.gz
-rw-r--r--   1 root root         26 2020-08-14 23:42 file:///root/hostname
-rw-r--r--   1 root root        371 2020-08-14 23:41 file:///root/hosts
-rw-r--r--   1 root root        397 2020-08-14 23:44 file:///root/res.log
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls hdfs:///                    #List the files and directories under "/" on the HDFS file system
Found 3 items
--w-------   2 jason yinzhengjie        371 2020-08-14 21:42 hdfs:///hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 hdfs:///user
drwxr-xr-x   - root  admingroup           0 2020-08-14 23:22 hdfs:///yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /                        #As you can see, the "hdfs://" scheme is used by default
Found 3 items
--w-------   2 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
drwxr-xr-x   - root  admingroup           0 2020-08-14 23:22 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 

3>. Getting detailed information about files

[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help stat          #View help for the stat command
-stat [format] <path> ... :
  Print statistics about the file/directory at <path>
  in the specified format. Format accepts permissions in
  octal (%a) and symbolic (%A), filesize in
  bytes (%b), type (%F), group name of owner (%g),
  name (%n), block size (%o), replication (%r), user name
  of owner (%u), access date (%x, %X).
  modification date (%y, %Y).
  %x and %y show UTC date as "yyyy-MM-dd HH:mm:ss" and
  %X and %Y show milliseconds since January 1, 1970 UTC.
  If the format is not specified, %y is used by default.
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 3 items
drwxr-xr-x   - root admingroup          0 2020-08-14 07:08 /bigdata
-rw-r--r--   3 root admingroup        371 2020-08-14 07:09 /hosts
drwxr-xr-x   - root admingroup          0 2020-08-14 07:13 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -stat "%a | %A | %b | %F | %g | %n | %o  | %r | %u | %y | %Y" /hosts    #Get detailed information about a file
644 | rw-r--r-- | 371 | regular file | admingroup | hosts | 536870912  | 3 | root | 2020-08-13 23:09:55 | 1597360195058
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 3 items
drwxr-xr-x   - root admingroup          0 2020-08-14 07:08 /bigdata
-rw-r--r--   3 root admingroup        371 2020-08-14 07:09 /hosts
drwxr-xr-x   - root admingroup          0 2020-08-14 07:13 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -stat "%a | %A | %b | %F | %g | %n | %o  | %r | %u | %y | %Y" /yinzhengjie/      #If you run "stat" on a directory, it reports that it is indeed a directory.
755 | rwxr-xr-x | 0 | directory | admingroup | yinzhengjie | 0  | 0 | root | 2020-08-13 23:13:11 | 1597360391417
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -stat "%a %A %b %F %g %n %o %r %u %y %Y" /bigdata
755 rwxr-xr-x 0 directory admingroup bigdata 0 0 root 2020-08-13 23:08:33 1597360113877
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
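The %a and %A formats are two views of the same permission bits: octal and symbolic. As a sanity check, the mapping can be reproduced in a few lines of Python (a hypothetical helper, not part of Hadoop):

```python
def octal_to_symbolic(octal):
    """Expand a 3-digit octal permission string (stat %a) into the
    symbolic rwx form (stat %A)."""
    symbolic = ""
    for digit in octal:
        n = int(digit, 8)          # each digit covers one class: user, group, other
        symbolic += "r" if n & 4 else "-"
        symbolic += "w" if n & 2 else "-"
        symbolic += "x" if n & 1 else "-"
    return symbolic

print(octal_to_symbolic("644"))  # rw-r--r--
print(octal_to_symbolic("755"))  # rwxr-xr-x
```

These match the `644 | rw-r--r--` and `755 | rwxr-xr-x` pairs in the stat output above.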

4>. Creating directories

[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help mkdir
-mkdir [-p] <path> ... :
  Create a directory in specified location.
                                                  
  -p  Do not fail if the directory already exists 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 3 items
drwxr-xr-x   - root admingroup          0 2020-08-14 07:08 /bigdata
-rw-r--r--   3 root admingroup        371 2020-08-14 07:09 /hosts
drwxr-xr-x   - root admingroup          0 2020-08-14 07:13 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -mkdir /test          #Create a "test" directory under the root path
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 4 items
drwxr-xr-x   - root admingroup          0 2020-08-14 07:08 /bigdata
-rw-r--r--   3 root admingroup        371 2020-08-14 07:09 /hosts
drwxr-xr-x   - root admingroup          0 2020-08-14 19:03 /test
drwxr-xr-x   - root admingroup          0 2020-08-14 07:13 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 4 items
drwxr-xr-x   - root admingroup          0 2020-08-14 07:08 /bigdata
-rw-r--r--   3 root admingroup        371 2020-08-14 07:09 /hosts
drwxr-xr-x   - root admingroup          0 2020-08-14 19:03 /test
drwxr-xr-x   - root admingroup          0 2020-08-14 07:13 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -mkdir -p /test2/sub1/sub2        #Create a directory tree recursively
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls  /
Found 5 items
drwxr-xr-x   - root admingroup          0 2020-08-14 07:08 /bigdata
-rw-r--r--   3 root admingroup        371 2020-08-14 07:09 /hosts
drwxr-xr-x   - root admingroup          0 2020-08-14 19:03 /test
drwxr-xr-x   - root admingroup          0 2020-08-14 19:04 /test2
drwxr-xr-x   - root admingroup          0 2020-08-14 07:13 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls -R /test2
drwxr-xr-x   - root admingroup          0 2020-08-14 19:04 /test2/sub1
drwxr-xr-x   - root admingroup          0 2020-08-14 19:04 /test2/sub1/sub2
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -mkdir hdfs://hadoop101.yinzhengjie.com:9000/test3        #You can also specify a full URI when creating a directory
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls  /
Found 6 items
drwxr-xr-x   - root admingroup          0 2020-08-14 07:08 /bigdata
-rw-r--r--   3 root admingroup        371 2020-08-14 07:09 /hosts
drwxr-xr-x   - root admingroup          0 2020-08-14 19:03 /test
drwxr-xr-x   - root admingroup          0 2020-08-14 19:04 /test2
drwxr-xr-x   - root admingroup          0 2020-08-14 19:09 /test3
drwxr-xr-x   - root admingroup          0 2020-08-14 07:13 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -mkdir /dir001 /dir002 /dir003            #You can pass several space-separated paths to create multiple directories at once.
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls  /
Found 9 items
drwxr-xr-x   - root admingroup          0 2020-08-14 07:08 /bigdata
drwxr-xr-x   - root admingroup          0 2020-08-14 19:10 /dir001
drwxr-xr-x   - root admingroup          0 2020-08-14 19:10 /dir002
drwxr-xr-x   - root admingroup          0 2020-08-14 19:10 /dir003
-rw-r--r--   3 root admingroup        371 2020-08-14 07:09 /hosts
drwxr-xr-x   - root admingroup          0 2020-08-14 19:03 /test
drwxr-xr-x   - root admingroup          0 2020-08-14 19:04 /test2
drwxr-xr-x   - root admingroup          0 2020-08-14 19:09 /test3
drwxr-xr-x   - root admingroup          0 2020-08-14 07:13 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
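The -p flag behaves like Linux's mkdir -p: every missing ancestor of the target path is created, from the top of the tree down. A hypothetical Python sketch of the list of directories it would produce for an absolute path:

```python
def dirs_created_by_mkdir_p(path):
    """List every directory 'hdfs dfs -mkdir -p' would create for an
    absolute path, from the top of the tree down."""
    parts = [p for p in path.split("/") if p]   # drop empty segments
    return ["/" + "/".join(parts[:i + 1]) for i in range(len(parts))]

print(dirs_created_by_mkdir_p("/test2/sub1/sub2"))
# ['/test2', '/test2/sub1', '/test2/sub1/sub2']
```

This matches the recursive listing shown earlier: /test2, /test2/sub1, and /test2/sub1/sub2 all exist after one command.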

5>. Creating files

[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help touchz 
-touchz <path> ... :
  Creates a file of zero length at <path> with current time as the timestamp of
  that <path>. An error is returned if the file exists with non-zero length
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 3 items
drwxr-xr-x   - root  admingroup           0 2020-08-14 07:08 /bigdata
--w-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -touchz /hdfs.log          #Create an empty file
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 4 items
drwxr-xr-x   - root  admingroup           0 2020-08-14 07:08 /bigdata
-rw-r--r--   3 root  admingroup           0 2020-08-14 22:58 /hdfs.log
--w-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
[root@hadoop101.yinzhengjie.com ~]# 

6>. Deleting files and directories

[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help rmr            #rmr has been officially deprecated; use "rm -r" instead
-rmr :
  (DEPRECATED) Same as '-rm -r'
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help rm
-rm [-f] [-r|-R] [-skipTrash] [-safely] <src> ... :
  Delete all files that match the specified file pattern. Equivalent to the Unix
  command "rm <src>"
                                                                                 
  -f          If the file does not exist, do not display a diagnostic message or 
              modify the exit status to reflect an error.                        
  -[rR]       Recursively deletes directories.                                   
  -skipTrash  option bypasses trash, if enabled, and immediately deletes <src>.  
  -safely     option requires safety confirmation, if enabled, requires          
              confirmation before deleting large directory with more than        
              <hadoop.shell.delete.limit.num.files> files. Delay is expected when
              walking over large directory recursively to count the number of    
              files to be deleted before the confirmation.                       
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls  /
Found 6 items
drwxr-xr-x   - root admingroup          0 2020-08-14 07:08 /bigdata
-rw-r--r--   3 root admingroup        371 2020-08-14 07:09 /hosts
drwxr-xr-x   - root admingroup          0 2020-08-14 19:03 /test
drwxr-xr-x   - root admingroup          0 2020-08-14 19:04 /test2
drwxr-xr-x   - root admingroup          0 2020-08-14 19:09 /test3
drwxr-xr-x   - root admingroup          0 2020-08-14 07:13 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -rmdir /test3                    #Delete a single empty directory
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls  /
Found 5 items
drwxr-xr-x   - root admingroup          0 2020-08-14 07:08 /bigdata
-rw-r--r--   3 root admingroup        371 2020-08-14 07:09 /hosts
drwxr-xr-x   - root admingroup          0 2020-08-14 19:03 /test
drwxr-xr-x   - root admingroup          0 2020-08-14 19:04 /test2
drwxr-xr-x   - root admingroup          0 2020-08-14 07:13 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls  /
Found 9 items
drwxr-xr-x   - root admingroup          0 2020-08-14 07:08 /bigdata
drwxr-xr-x   - root admingroup          0 2020-08-14 19:10 /dir001
drwxr-xr-x   - root admingroup          0 2020-08-14 19:10 /dir002
drwxr-xr-x   - root admingroup          0 2020-08-14 19:10 /dir003
-rw-r--r--   3 root admingroup        371 2020-08-14 07:09 /hosts
drwxr-xr-x   - root admingroup          0 2020-08-14 19:03 /test
drwxr-xr-x   - root admingroup          0 2020-08-14 19:04 /test2
drwxr-xr-x   - root admingroup          0 2020-08-14 19:09 /test3
drwxr-xr-x   - root admingroup          0 2020-08-14 07:13 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -rmdir /dir001 /dir002 /dir003          #Delete several empty directories at once
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls  /
Found 6 items
drwxr-xr-x   - root admingroup          0 2020-08-14 07:08 /bigdata
-rw-r--r--   3 root admingroup        371 2020-08-14 07:09 /hosts
drwxr-xr-x   - root admingroup          0 2020-08-14 19:03 /test
drwxr-xr-x   - root admingroup          0 2020-08-14 19:04 /test2
drwxr-xr-x   - root admingroup          0 2020-08-14 19:09 /test3
drwxr-xr-x   - root admingroup          0 2020-08-14 07:13 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls  /yinzhengjie/
Found 2 items
drwxr-xr-x   - root admingroup          0 2020-08-14 07:07 /yinzhengjie/data
drwxr-xr-x   - root admingroup          0 2020-08-14 07:13 /yinzhengjie/yum.repos.d
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls  /yinzhengjie/yum.repos.d
Found 9 items
-rw-r--r--   3 root admingroup       1664 2020-08-14 07:13 /yinzhengjie/yum.repos.d/CentOS-Base.repo
-rw-r--r--   3 root admingroup       1309 2020-08-14 07:13 /yinzhengjie/yum.repos.d/CentOS-CR.repo
-rw-r--r--   3 root admingroup        649 2020-08-14 07:13 /yinzhengjie/yum.repos.d/CentOS-Debuginfo.repo
-rw-r--r--   3 root admingroup        630 2020-08-14 07:13 /yinzhengjie/yum.repos.d/CentOS-Media.repo
-rw-r--r--   3 root admingroup       1331 2020-08-14 07:13 /yinzhengjie/yum.repos.d/CentOS-Sources.repo
-rw-r--r--   3 root admingroup       5701 2020-08-14 07:13 /yinzhengjie/yum.repos.d/CentOS-Vault.repo
-rw-r--r--   3 root admingroup        314 2020-08-14 07:13 /yinzhengjie/yum.repos.d/CentOS-fasttrack.repo
-rw-r--r--   3 root admingroup       1050 2020-08-14 07:13 /yinzhengjie/yum.repos.d/epel-testing.repo
-rw-r--r--   3 root admingroup        951 2020-08-14 07:13 /yinzhengjie/yum.repos.d/epel.repo
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -rm -R  /yinzhengjie/yum.repos.d          #Recursively delete a non-empty directory. If the trash feature is enabled, the data is not removed immediately but moved to the trash.
20/08/14 19:19:46 INFO fs.TrashPolicyDefault: Moved: 'hdfs://hadoop101.yinzhengjie.com:9000/yinzhengjie/yum.repos.d' to trash at: hdfs://hadoop101.yinzhengjie.com:9000/user/root/.Trash/Current/yinzhengjie/yum.repos.d
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls  /yinzhengjie/
Found 1 items
drwxr-xr-x   - root admingroup          0 2020-08-14 07:07 /yinzhengjie/data
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls  /user/root/.Trash/Current/yinzhengjie/yum.repos.d        #From the message above, it is easy to tell the trash path the data was moved to
Found 9 items
-rw-r--r--   3 root admingroup       1664 2020-08-14 07:13 /user/root/.Trash/Current/yinzhengjie/yum.repos.d/CentOS-Base.repo
-rw-r--r--   3 root admingroup       1309 2020-08-14 07:13 /user/root/.Trash/Current/yinzhengjie/yum.repos.d/CentOS-CR.repo
-rw-r--r--   3 root admingroup        649 2020-08-14 07:13 /user/root/.Trash/Current/yinzhengjie/yum.repos.d/CentOS-Debuginfo.repo
-rw-r--r--   3 root admingroup        630 2020-08-14 07:13 /user/root/.Trash/Current/yinzhengjie/yum.repos.d/CentOS-Media.repo
-rw-r--r--   3 root admingroup       1331 2020-08-14 07:13 /user/root/.Trash/Current/yinzhengjie/yum.repos.d/CentOS-Sources.repo
-rw-r--r--   3 root admingroup       5701 2020-08-14 07:13 /user/root/.Trash/Current/yinzhengjie/yum.repos.d/CentOS-Vault.repo
-rw-r--r--   3 root admingroup        314 2020-08-14 07:13 /user/root/.Trash/Current/yinzhengjie/yum.repos.d/CentOS-fasttrack.repo
-rw-r--r--   3 root admingroup       1050 2020-08-14 07:13 /user/root/.Trash/Current/yinzhengjie/yum.repos.d/epel-testing.repo
-rw-r--r--   3 root admingroup        951 2020-08-14 07:13 /user/root/.Trash/Current/yinzhengjie/yum.repos.d/epel.repo
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -rm -R /user/root/.Trash/Current/yinzhengjie/yum.repos.d       #Only at this point is the directory's data truly deleted
Deleted /user/root/.Trash/Current/yinzhengjie/yum.repos.d
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 6 items
drwxr-xr-x   - root admingroup          0 2020-08-14 07:08 /bigdata
-rw-r--r--   3 root admingroup        371 2020-08-14 07:09 /hosts
drwxr-xr-x   - root admingroup          0 2020-08-14 19:03 /test
drwxr-xr-x   - root admingroup          0 2020-08-14 19:04 /test2
drwx------   - root admingroup          0 2020-08-14 19:19 /user
drwxr-xr-x   - root admingroup          0 2020-08-14 19:19 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -rm /hosts          #Delete a file; with trash enabled, deleting a file simply moves it into the trash
20/08/14 19:26:07 INFO fs.TrashPolicyDefault: Moved: 'hdfs://hadoop101.yinzhengjie.com:9000/hosts' to trash at: hdfs://hadoop101.yinzhengjie.com:9000/user/root/.Trash/Current/hosts
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 5 items
drwxr-xr-x   - root admingroup          0 2020-08-14 07:08 /bigdata
drwxr-xr-x   - root admingroup          0 2020-08-14 19:03 /test
drwxr-xr-x   - root admingroup          0 2020-08-14 19:04 /test2
drwx------   - root admingroup          0 2020-08-14 19:19 /user
drwxr-xr-x   - root admingroup          0 2020-08-14 19:19 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /user/root/.Trash/Current/hosts
-rw-r--r--   3 root admingroup        371 2020-08-14 07:09 /user/root/.Trash/Current/hosts
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -rm  /user/root/.Trash/Current/hosts
Deleted /user/root/.Trash/Current/hosts
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 4 items
drwxr-xr-x   - root admingroup          0 2020-08-14 07:08 /bigdata
-rw-r--r--   3 root admingroup        540 2020-08-14 19:33 /limits.conf
drwx------   - root admingroup          0 2020-08-14 19:19 /user
drwxr-xr-x   - root admingroup          0 2020-08-14 19:19 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -rm -skipTrash /limits.conf            # even with the trash enabled, "-skipTrash" bypasses the HDFS trash and deletes the specified file or directory immediately
Deleted /limits.conf
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 3 items
drwxr-xr-x   - root admingroup          0 2020-08-14 07:08 /bigdata
drwx------   - root admingroup          0 2020-08-14 19:19 /user
drwxr-xr-x   - root admingroup          0 2020-08-14 19:19 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 

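The trash behavior shown above depends on the trash being enabled in core-site.xml. A minimal sketch with illustrative values (`fs.trash.interval` is the retention period in minutes; its default of 0 disables the trash entirely):

```xml
<!-- core-site.xml: enable the HDFS trash (values are illustrative) -->
<property>
  <name>fs.trash.interval</name>
  <!-- keep deleted files for 1440 minutes (24 hours); 0 disables the trash -->
  <value>1440</value>
</property>
<property>
  <name>fs.trash.checkpoint.interval</name>
  <!-- how often trash checkpoints are created; must not exceed fs.trash.interval -->
  <value>60</value>
</property>
```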
7>. Emptying the trash

[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help expunge
-expunge :
  Delete files from the trash that are older than the retention threshold
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /user/root/.Trash/Current          # list the current trash contents
Found 4 items
-rw-r--r--   3 root admingroup        490 2020-08-14 19:31 /user/root/.Trash/Current/fstab
-rw-r--r--   3 root admingroup      10779 2020-08-14 19:32 /user/root/.Trash/Current/sysctl.conf
drwxr-xr-x   - root admingroup          0 2020-08-14 19:04 /user/root/.Trash/Current/test2
drwx------   - root admingroup          0 2020-08-14 19:21 /user/root/.Trash/Current/yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -expunge               # deletes trash checkpoints older than the configured retention interval and checkpoints the current trash contents
20/08/14 19:37:33 INFO fs.TrashPolicyDefault: TrashPolicyDefault#deleteCheckpoint for trashRoot: hdfs://hadoop101.yinzhengjie.com:9000/user/root/.Trash
20/08/14 19:37:33 INFO fs.TrashPolicyDefault: TrashPolicyDefault#deleteCheckpoint for trashRoot: hdfs://hadoop101.yinzhengjie.com:9000/user/root/.Trash
20/08/14 19:37:33 INFO fs.TrashPolicyDefault: TrashPolicyDefault#createCheckpoint for trashRoot: hdfs://hadoop101.yinzhengjie.com:9000/user/root/.Trash
20/08/14 19:37:33 INFO fs.TrashPolicyDefault: Created trash checkpoint: /user/root/.Trash/200814193733
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /user/root/.Trash/Current      # the Current directory is gone; its contents were moved into a checkpoint
ls: `/user/root/.Trash/Current': No such file or directory
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /user/root/.Trash/
Found 1 items
drwx------   - root admingroup          0 2020-08-14 19:32 /user/root/.Trash/200814193733
[root@hadoop101.yinzhengjie.com ~]# 

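The checkpoint directory created by -expunge is named with a yyMMddHHmmss timestamp: 200814193733 above decodes to 2020-08-14 19:37:33. A quick shell sketch of the same naming scheme:

```shell
# Build a trash-checkpoint-style name from the current time (yyMMddHHmmss),
# matching names like /user/root/.Trash/200814193733 in the transcript above.
ts=$(date +%y%m%d%H%M%S)
echo "/user/root/.Trash/$ts"
```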
8>. Renaming files or directories

[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help mv
-mv <src> ... <dst> :
  Move files that match the specified file pattern <src> to a destination <dst>. 
  When moving multiple files, the destination must be a directory.
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 4 items
drwxr-xr-x   - root  admingroup           0 2020-08-14 07:08 /bigdata
-rw-r--r--   3 root  admingroup           0 2020-08-14 22:58 /hdfs.log
--w-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -mv /hdfs.log /hdfs2020.log      # rename the file "hdfs.log" to "hdfs2020.log"
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 4 items
drwxr-xr-x   - root  admingroup           0 2020-08-14 07:08 /bigdata
-rw-r--r--   3 root  admingroup           0 2020-08-14 22:58 /hdfs2020.log
--w-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 4 items
drwxr-xr-x   - root  admingroup           0 2020-08-14 07:08 /bigdata
-rw-r--r--   3 root  admingroup           0 2020-08-14 22:58 /hdfs2020.log
--w-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -mv /bigdata /yinzhengjie        # rename the "/bigdata" directory to "/yinzhengjie"
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 4 items
-rw-r--r--   3 root  admingroup           0 2020-08-14 22:58 /hdfs2020.log
--w-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
drwxr-xr-x   - root  admingroup           0 2020-08-14 07:08 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 

9>. Copying files or directories

[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help cp 
-cp [-f] [-p | -p[topax]] [-d] <src> ... <dst> :
  Copy files that match the file pattern <src> to a destination.  When copying
  multiple files, the destination must be a directory. Passing -p preserves status
  [topax] (timestamps, ownership, permission, ACLs, XAttr). If -p is specified
  with no <arg>, then preserves timestamps, ownership, permission. If -pa is
  specified, then preserves permission also because ACL is a super-set of
  permission. Passing -f overwrites the destination if it already exists. raw
  namespace extended attributes are preserved if (1) they are supported (HDFS
  only) and, (2) all of the source and target pathnames are in the /.reserved/raw
  hierarchy. raw namespace xattr preservation is determined solely by the presence
  (or absence) of the /.reserved/raw prefix and not by the -p option. Passing -d
  will skip creation of temporary file(<dst>._COPYING_).
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 6 items
-rw-r--r--   3 root  admingroup   392115733 2020-08-14 23:25 /hadoop-2.10.0.tar.gz
-rw-r--r--   3 root  admingroup          26 2020-08-14 23:42 /hostname
--w-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
-rw-r--r--   3 root  admingroup          69 2020-08-14 23:14 /wc.txt.gz
drwxr-xr-x   - root  admingroup           0 2020-08-14 23:22 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -cp /yinzhengjie/ /yinzhengjie2020          # copy a directory within the HDFS cluster
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 7 items
-rw-r--r--   3 root  admingroup   392115733 2020-08-14 23:25 /hadoop-2.10.0.tar.gz
-rw-r--r--   3 root  admingroup          26 2020-08-14 23:42 /hostname
--w-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
-rw-r--r--   3 root  admingroup          69 2020-08-14 23:14 /wc.txt.gz
drwxr-xr-x   - root  admingroup           0 2020-08-14 23:22 /yinzhengjie
drwxr-xr-x   - root  admingroup           0 2020-08-14 23:48 /yinzhengjie2020
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 7 items
-rw-r--r--   3 root  admingroup   392115733 2020-08-14 23:25 /hadoop-2.10.0.tar.gz
-rw-r--r--   3 root  admingroup          26 2020-08-14 23:42 /hostname
--w-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
-rw-r--r--   3 root  admingroup          69 2020-08-14 23:14 /wc.txt.gz
drwxr-xr-x   - root  admingroup           0 2020-08-14 23:22 /yinzhengjie
drwxr-xr-x   - root  admingroup           0 2020-08-14 23:48 /yinzhengjie2020
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -cp /hosts /hosts2020          # copy an HDFS file
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 8 items
-rw-r--r--   3 root  admingroup   392115733 2020-08-14 23:25 /hadoop-2.10.0.tar.gz
-rw-r--r--   3 root  admingroup          26 2020-08-14 23:42 /hostname
--w-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
-rw-r--r--   3 root  admingroup         371 2020-08-14 23:49 /hosts2020
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
-rw-r--r--   3 root  admingroup          69 2020-08-14 23:14 /wc.txt.gz
drwxr-xr-x   - root  admingroup           0 2020-08-14 23:22 /yinzhengjie
drwxr-xr-x   - root  admingroup           0 2020-08-14 23:48 /yinzhengjie2020
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 

10>. Uploading local files or directories to the HDFS cluster

[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help put
-put [-f] [-p] [-l] [-d] <localsrc> ... <dst> :
  Copy files from the local file system into fs. Copying fails if the file already
  exists, unless the -f flag is given.
  Flags:
                                                                       
  -p  Preserves access and modification times, ownership and the mode. 
  -f  Overwrites the destination if it already exists.                 
  -l  Allow DataNode to lazily persist the file to disk. Forces        
         replication factor of 1. This flag will result in reduced
         durability. Use with care.
                                                        
  -d  Skip creation of temporary file(<dst>._COPYING_). 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help copyFromLocal        # essentially the same as put
-copyFromLocal [-f] [-p] [-l] [-d] <localsrc> ... <dst> :
  Identical to the -put command.
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help moveFromLocal        # like put, except the local source file is deleted after the upload
-moveFromLocal <localsrc> ... <dst> :
  Same as -put, except that the source is deleted after it's copied.
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 3 items
--w-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
drwxr-xr-x   - root  admingroup           0 2020-08-14 07:08 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /yinzhengjie/
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -put /etc/yum.repos.d/ /yinzhengjie/        # upload the local yum repo config directory to "/yinzhengjie" on HDFS
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /yinzhengjie/
Found 1 items
drwxr-xr-x   - root admingroup          0 2020-08-14 23:13 /yinzhengjie/yum.repos.d
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /yinzhengjie/yum.repos.d
Found 9 items
-rw-r--r--   3 root admingroup       1664 2020-08-14 23:13 /yinzhengjie/yum.repos.d/CentOS-Base.repo
-rw-r--r--   3 root admingroup       1309 2020-08-14 23:13 /yinzhengjie/yum.repos.d/CentOS-CR.repo
-rw-r--r--   3 root admingroup        649 2020-08-14 23:13 /yinzhengjie/yum.repos.d/CentOS-Debuginfo.repo
-rw-r--r--   3 root admingroup        630 2020-08-14 23:13 /yinzhengjie/yum.repos.d/CentOS-Media.repo
-rw-r--r--   3 root admingroup       1331 2020-08-14 23:13 /yinzhengjie/yum.repos.d/CentOS-Sources.repo
-rw-r--r--   3 root admingroup       5701 2020-08-14 23:13 /yinzhengjie/yum.repos.d/CentOS-Vault.repo
-rw-r--r--   3 root admingroup        314 2020-08-14 23:13 /yinzhengjie/yum.repos.d/CentOS-fasttrack.repo
-rw-r--r--   3 root admingroup       1050 2020-08-14 23:13 /yinzhengjie/yum.repos.d/epel-testing.repo
-rw-r--r--   3 root admingroup        951 2020-08-14 23:13 /yinzhengjie/yum.repos.d/epel.repo
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 3 items
--w-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
drwxr-xr-x   - root  admingroup           0 2020-08-14 23:13 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# ll
total 382932
-rw-r--r-- 1 root root 392115733 Aug 10 15:42 hadoop-2.10.0.tar.gz
-rw-r--r-- 1 root root        69 Aug 14 23:11 wc.txt.gz
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -put wc.txt.gz /              # upload a local Linux file to "/" on HDFS
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 4 items
--w-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
-rw-r--r--   3 root  admingroup          69 2020-08-14 23:14 /wc.txt.gz
drwxr-xr-x   - root  admingroup           0 2020-08-14 23:13 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /yinzhengjie/
Found 1 items
drwxr-xr-x   - root admingroup          0 2020-08-14 23:13 /yinzhengjie/yum.repos.d
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# ll
total 382932
-rw-r--r-- 1 root root 392115733 Aug 10 15:42 hadoop-2.10.0.tar.gz
-rw-r--r-- 1 root root        69 Aug 14 23:11 wc.txt.gz
[root@hadoop101.yinzhengjie.com ~]#   
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -moveFromLocal wc.txt.gz /yinzhengjie/      # after uploading to HDFS, the local source file is deleted
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# ll
total 382928
-rw-r--r-- 1 root root 392115733 Aug 10 15:42 hadoop-2.10.0.tar.gz
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /yinzhengjie/
Found 2 items
-rw-r--r--   3 root admingroup         69 2020-08-14 23:22 /yinzhengjie/wc.txt.gz
drwxr-xr-x   - root admingroup          0 2020-08-14 23:13 /yinzhengjie/yum.repos.d
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 4 items
--w-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
-rw-r--r--   3 root  admingroup          69 2020-08-14 23:14 /wc.txt.gz
drwxr-xr-x   - root  admingroup           0 2020-08-14 23:22 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# ll
total 382928
-rw-r--r-- 1 root root 392115733 Aug 10 15:42 hadoop-2.10.0.tar.gz
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -copyFromLocal hadoop-2.10.0.tar.gz /        # upload a local Linux file to HDFS
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# ll
total 382928
-rw-r--r-- 1 root root 392115733 Aug 10 15:42 hadoop-2.10.0.tar.gz
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 5 items
-rw-r--r--   3 root  admingroup   392115733 2020-08-14 23:25 /hadoop-2.10.0.tar.gz
--w-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
-rw-r--r--   3 root  admingroup          69 2020-08-14 23:14 /wc.txt.gz
drwxr-xr-x   - root  admingroup           0 2020-08-14 23:22 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 

11>. Downloading files to the local filesystem

[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help get
-get [-f] [-p] [-ignoreCrc] [-crc] <src> ... <localdst> :
  Copy files that match the file pattern <src> to the local name.  <src> is kept. 
  When copying multiple files, the destination must be a directory. Passing -f
  overwrites the destination if it already exists and -p preserves access and
  modification times, ownership and the mode.
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help copyToLocal        # identical to the "-get" command
-copyToLocal [-f] [-p] [-ignoreCrc] [-crc] <src> ... <localdst> :
  Identical to the -get command.
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help getmerge           # download multiple files at once and merge them into a single local file
-getmerge [-nl] [-skip-empty-file] <src> <localdst> :
  Get all the files in the directories that match the source file pattern and
  merge and sort them to only one file on local fs. <src> is kept.
                                                                     
  -nl               Add a newline character at the end of each file. 
  -skip-empty-file  Do not add new line character for empty file.    
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 5 items
-rw-r--r--   3 root  admingroup   392115733 2020-08-14 23:25 /hadoop-2.10.0.tar.gz
--w-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
-rw-r--r--   3 root  admingroup          69 2020-08-14 23:14 /wc.txt.gz
drwxr-xr-x   - root  admingroup           0 2020-08-14 23:22 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# ll
total 382928
-rw-r--r-- 1 root root 392115733 Aug 10 15:42 hadoop-2.10.0.tar.gz
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -get /hosts                # download a file to the local filesystem
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# ll
total 382932
-rw-r--r-- 1 root root 392115733 Aug 10 15:42 hadoop-2.10.0.tar.gz
-rw-r--r-- 1 root root       371 Aug 14 23:41 hosts
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 6 items
-rw-r--r--   3 root  admingroup   392115733 2020-08-14 23:25 /hadoop-2.10.0.tar.gz
-rw-r--r--   3 root  admingroup          26 2020-08-14 23:42 /hostname
--w-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
-rw-r--r--   3 root  admingroup          69 2020-08-14 23:14 /wc.txt.gz
drwxr-xr-x   - root  admingroup           0 2020-08-14 23:22 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# ll
total 382932
-rw-r--r-- 1 root root 392115733 Aug 10 15:42 hadoop-2.10.0.tar.gz
-rw-r--r-- 1 root root       371 Aug 14 23:41 hosts
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -copyToLocal /hostname        # like get, downloads a file locally; get is the recommended form
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# ll
total 382936
-rw-r--r-- 1 root root 392115733 Aug 10 15:42 hadoop-2.10.0.tar.gz
-rw-r--r-- 1 root root        26 Aug 14 23:42 hostname
-rw-r--r-- 1 root root       371 Aug 14 23:41 hosts
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 6 items
-rw-r--r--   3 root  admingroup   392115733 2020-08-14 23:25 /hadoop-2.10.0.tar.gz
-rw-r--r--   3 root  admingroup          26 2020-08-14 23:42 /hostname
--w-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
-rw-r--r--   3 root  admingroup          69 2020-08-14 23:14 /wc.txt.gz
drwxr-xr-x   - root  admingroup           0 2020-08-14 23:22 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# ll
total 382936
-rw-r--r-- 1 root root 392115733 Aug 10 15:42 hadoop-2.10.0.tar.gz
-rw-r--r-- 1 root root        26 Aug 14 23:42 hostname
-rw-r--r-- 1 root root       371 Aug 14 23:41 hosts
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -getmerge /hosts /hostname res.log        # download "/hosts" and "/hostname" and merge their contents into res.log
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# ll
total 382940
-rw-r--r-- 1 root root 392115733 Aug 10 15:42 hadoop-2.10.0.tar.gz
-rw-r--r-- 1 root root        26 Aug 14 23:42 hostname
-rw-r--r-- 1 root root       371 Aug 14 23:41 hosts
-rw-r--r-- 1 root root       397 Aug 14 23:44 res.log
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 

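Under the hood -getmerge just concatenates the sources into one local file, which is why res.log above is 397 bytes: 371 (/hosts) plus 26 (/hostname). A purely local sketch with mock files:

```shell
# Mock local stand-ins for /hosts and /hostname (contents are made up).
printf 'hosts file contents\n'    > hosts.local
printf 'hostname file contents\n' > hostname.local

# What "hdfs dfs -getmerge src1 src2 dst" does, minus the HDFS reads:
cat hosts.local hostname.local > res.log.local

wc -c hosts.local hostname.local res.log.local   # merged size = sum of the parts
```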
12>. Viewing the contents of a text file

[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help cat            # mostly used for viewing plain-text file contents
-cat [-ignoreCrc] <src> ... :
  Fetch all files that match the file pattern <src> and display their content on
  stdout.
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help text             # takes a source file (allowed formats: zip, TextRecordInputStream, and Avro) and outputs it in text format
-text [-ignoreCrc] <src> ... :
  Takes a source file and outputs the file in text format.
  The allowed formats are zip and TextRecordInputStream and Avro.
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 5 items
-rw-r--r--   3 root  admingroup   392115733 2020-08-14 23:25 /hadoop-2.10.0.tar.gz
--w-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
-rw-r--r--   3 root  admingroup          69 2020-08-14 23:14 /wc.txt.gz
drwxr-xr-x   - root  admingroup           0 2020-08-14 23:22 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -cat /hosts            # view a plain-text file's contents
127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4
::1         localhost localhost.localdomain localhost6 localhost6.localdomain6

#Hadoop 2.x
172.200.6.101 hadoop101.yinzhengjie.com
172.200.6.102 hadoop102.yinzhengjie.com
172.200.6.103 hadoop103.yinzhengjie.com
172.200.6.104 hadoop104.yinzhengjie.com
172.200.6.105 hadoop105.yinzhengjie.com
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 5 items
-rw-r--r--   3 root  admingroup   392115733 2020-08-14 23:25 /hadoop-2.10.0.tar.gz
--w-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
-rw-r--r--   3 root  admingroup          69 2020-08-14 23:14 /wc.txt.gz
drwxr-xr-x   - root  admingroup           0 2020-08-14 23:22 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -text /hosts
127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4
::1         localhost localhost.localdomain localhost6 localhost6.localdomain6

#Hadoop 2.x
172.200.6.101 hadoop101.yinzhengjie.com
172.200.6.102 hadoop102.yinzhengjie.com
172.200.6.103 hadoop103.yinzhengjie.com
172.200.6.104 hadoop104.yinzhengjie.com
172.200.6.105 hadoop105.yinzhengjie.com
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -text /wc.txt.gz        # text can display not only plain-text files, but also Hadoop-supported serialized or compressed files
hadoop spark flink
hive imapla
clickhouse
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -cat /wc.txt.gz          # cat cannot display compressed file contents
©6_wc.txtʈLʏ/P(.H,ɖH̉͋狽,KUɌM,lj㋎ȌώƯ-NァE´¢*[root@hadoop101.yinzhengjie.com ~]# [root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 

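The garbled -cat output above is just the raw gzip bytes; -text decompresses them first. The same contrast can be reproduced locally with gzip (sample data, not the real wc.txt.gz):

```shell
# Create a small gzip file locally (sample contents).
printf 'hadoop spark flink\nhive impala\nclickhouse\n' > wc.txt
gzip -c wc.txt > wc.txt.gz

cat wc.txt.gz        # raw compressed bytes: garbled, like "hdfs dfs -cat"
gunzip -c wc.txt.gz  # decompressed text: what "hdfs dfs -text" shows
```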
13>. Changing file and directory owner and group

[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help chown
-chown [-R] [OWNER][:[GROUP]] PATH... :
  Changes owner and group of a file. This is similar to the shell's chown command
  with a few exceptions.
                                                                                 
  -R  modifies the files recursively. This is the only option currently          
      supported.                                                                 
  
  If only the owner or group is specified, then only the owner or group is
  modified. The owner and group names may only consist of digits, alphabet, and
  any of [-_./@a-zA-Z0-9]. The names are case sensitive.
  
  WARNING: Avoid using '.' to separate user name and group though Linux allows it.
  If user names have dots in them and you are using local file system, you might
  see surprising results since the shell command 'chown' is used for local files.
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 4 items
drwxr-xr-x   - root admingroup          0 2020-08-14 07:08 /bigdata
-rw-r--r--   3 root admingroup        371 2020-08-14 21:42 /hosts
drwx------   - root admingroup          0 2020-08-14 19:19 /user
drwxr-xr-x   - root admingroup          0 2020-08-14 19:19 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -chown jason:jason /hosts          # change the owner and group of "/hosts"; this succeeds even though the local Linux OS has no such user
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 4 items
drwxr-xr-x   - root  admingroup          0 2020-08-14 07:08 /bigdata
-rw-r--r--   3 jason jason             371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup          0 2020-08-14 19:19 /user
drwxr-xr-x   - root  admingroup          0 2020-08-14 19:19 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# id jason
id: jason: no such user
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# ll
total 382928
-rw-r--r-- 1 root root 392115733 Aug 10 15:42 hadoop-2.10.0.tar.gz
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# chown jason:jason hadoop-2.10.0.tar.gz 
chown: invalid user: ‘jason:jason’
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 4 items
drwxr-xr-x   - root  admingroup          0 2020-08-14 07:08 /bigdata
-rw-r--r--   3 jason jason             371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup          0 2020-08-14 19:19 /user
drwxr-xr-x   - root  admingroup          0 2020-08-14 21:46 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /yinzhengjie/
Found 2 items
drwxr-xr-x   - root admingroup          0 2020-08-14 07:07 /yinzhengjie/data
drwxr-xr-x   - root admingroup          0 2020-08-14 21:46 /yinzhengjie/softwares
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -chown -R jason:yinzhengjie  /yinzhengjie/                # recursively change the owner and group of the "/yinzhengjie" directory
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 4 items
drwxr-xr-x   - root  admingroup           0 2020-08-14 07:08 /bigdata
-rw-r--r--   3 jason jason              371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
drwxr-xr-x   - jason yinzhengjie          0 2020-08-14 21:46 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /yinzhengjie/
Found 2 items
drwxr-xr-x   - jason yinzhengjie          0 2020-08-14 07:07 /yinzhengjie/data
drwxr-xr-x   - jason yinzhengjie          0 2020-08-14 21:46 /yinzhengjie/softwares
[root@hadoop101.yinzhengjie.com ~]# 

14>. Changing file and directory permissions

[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help chmod
-chmod [-R] <MODE[,MODE]... | OCTALMODE> PATH... :
  Changes permissions of a file. This works similar to the shell's chmod command
  with a few exceptions.
                                                                                 
  -R           modifies the files recursively. This is the only option currently 
               supported.                                                        
  <MODE>       Mode is the same as mode used for the shell's command. The only   
               letters recognized are 'rwxXt', e.g. +t,a+r,g-w,+rwx,o=r.         
  <OCTALMODE>  Mode specifed in 3 or 4 digits. If 4 digits, the first may be 1 or
               0 to turn the sticky bit on or off, respectively.  Unlike the     
               shell command, it is not possible to specify only part of the     
               mode, e.g. 754 is same as u=rwx,g=rx,o=r.                         
  
  If none of 'augo' is specified, 'a' is assumed and unlike the shell command, no
  umask is applied.
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 4 items
drwxr-xr-x   - root  admingroup           0 2020-08-14 07:08 /bigdata
-rw-r--r--   3 jason jason              371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
drwxr-xr-x   - jason yinzhengjie          0 2020-08-14 21:46 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -chmod 600 /hosts              # change the permissions on "/hosts"; like Linux, HDFS files default to 644, changed here to 600
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 4 items
drwxr-xr-x   - root  admingroup           0 2020-08-14 07:08 /bigdata
-rw-------   3 jason jason              371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
drwxr-xr-x   - jason yinzhengjie          0 2020-08-14 21:46 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 4 items
drwxr-xr-x   - root  admingroup           0 2020-08-14 07:08 /bigdata
-rw-------   3 jason jason              371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
drwxr-xr-x   - jason yinzhengjie          0 2020-08-14 21:46 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /yinzhengjie/
Found 2 items
drwxr-xr-x   - jason yinzhengjie          0 2020-08-14 07:07 /yinzhengjie/data
drwxr-xr-x   - jason yinzhengjie          0 2020-08-14 21:46 /yinzhengjie/softwares
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -chmod -R 700 /yinzhengjie/          # Recursively change the permissions on the "/yinzhengjie" directory; HDFS directories default to 755, and here I change it to 700.
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 4 items
drwxr-xr-x   - root  admingroup           0 2020-08-14 07:08 /bigdata
-rw-------   3 jason jason              371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
drwx------   - jason yinzhengjie          0 2020-08-14 21:46 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /yinzhengjie/
Found 2 items
drwx------   - jason yinzhengjie          0 2020-08-14 07:07 /yinzhengjie/data
drwx------   - jason yinzhengjie          0 2020-08-14 21:46 /yinzhengjie/softwares
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -chmod -R a+x /yinzhengjie/        # Add execute permission for the owner, the group, and others
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 4 items
drwxr-xr-x   - root  admingroup           0 2020-08-14 07:08 /bigdata
-rw-------   3 jason jason              371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
drwx--x--x   - jason yinzhengjie          0 2020-08-14 21:46 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /yinzhengjie/
Found 2 items
drwx--x--x   - jason yinzhengjie          0 2020-08-14 07:07 /yinzhengjie/data
drwx--x--x   - jason yinzhengjie          0 2020-08-14 21:46 /yinzhengjie/softwares
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]#
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -chmod -R g-x /yinzhengjie/          # Remove execute permission from the group
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /yinzhengjie/
Found 2 items
drwx-----x   - jason yinzhengjie          0 2020-08-14 07:07 /yinzhengjie/data
drwx-----x   - jason yinzhengjie          0 2020-08-14 21:46 /yinzhengjie/softwares
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -chmod -R o+r /yinzhengjie/          # Add read permission for others
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /yinzhengjie/
Found 2 items
drwx---r-x   - jason yinzhengjie          0 2020-08-14 07:07 /yinzhengjie/data
drwx---r-x   - jason yinzhengjie          0 2020-08-14 21:46 /yinzhengjie/softwares
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -chmod -R g+w /yinzhengjie/            # Add write permission for the group
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /yinzhengjie/
Found 2 items
drwx-w-r-x   - jason yinzhengjie          0 2020-08-14 07:07 /yinzhengjie/data
drwx-w-r-x   - jason yinzhengjie          0 2020-08-14 21:46 /yinzhengjie/softwares
[root@hadoop101.yinzhengjie.com ~]# 
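Octal and symbolic modes combine naturally in scripts. As a minimal sketch (the `lockdown` helper and its paths are examples of mine, not part of the hdfs toolset), the following applies 700 recursively to a directory but 600 to a plain file, using `-test -d` to pick the branch:

```shell
# Hypothetical helper (not an hdfs built-in): lock down an HDFS path.
# A directory gets 700 recursively; a plain file gets 600.
# Assumes 'hdfs' is on PATH and the cluster is reachable.
lockdown() {
  local path="$1"
  if hdfs dfs -test -d "$path"; then
    hdfs dfs -chmod -R 700 "$path"
  else
    hdfs dfs -chmod 600 "$path"
  fi
}
```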

15>.Changing the group of files and directories

[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help chgrp
-chgrp [-R] GROUP PATH... :
  This is equivalent to -chown ... :GROUP ...
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 4 items
drwxr-xr-x   - root  admingroup           0 2020-08-14 07:08 /bigdata
-rw-------   3 jason jason              371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
drwx-w-r-x   - jason yinzhengjie          0 2020-08-14 21:46 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -chgrp yinzhengjie /hosts            # Change the group of the "/hosts" file to "yinzhengjie"
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 4 items
drwxr-xr-x   - root  admingroup           0 2020-08-14 07:08 /bigdata
-rw-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
drwx-w-r-x   - jason yinzhengjie          0 2020-08-14 21:46 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /yinzhengjie/
Found 2 items
drwx-w-r-x   - jason yinzhengjie          0 2020-08-14 07:07 /yinzhengjie/data
drwx-w-r-x   - jason yinzhengjie          0 2020-08-14 21:46 /yinzhengjie/softwares
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -chgrp -R admingroup /yinzhengjie                # Recursively change the group of the "/yinzhengjie" directory to "admingroup"
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 4 items
drwxr-xr-x   - root  admingroup           0 2020-08-14 07:08 /bigdata
-rw-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
drwx-w-r-x   - jason admingroup           0 2020-08-14 21:46 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /yinzhengjie/
Found 2 items
drwx-w-r-x   - jason admingroup          0 2020-08-14 07:07 /yinzhengjie/data
drwx-w-r-x   - jason admingroup          0 2020-08-14 21:46 /yinzhengjie/softwares
[root@hadoop101.yinzhengjie.com ~]# 

16>.Checking HDFS free space

[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help df
-df [-h] [<path> ...] :
  Shows the capacity, free and used space of the filesystem. If the filesystem has
  multiple partitions, and no path to a particular partition is specified, then
  the status of the root partitions will be shown.
                                                                                 
  -h  Formats the sizes of files in a human-readable fashion rather than a number
      of bytes.                                                                  
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -df                # Show the configured capacity, used space, available space, and percentage used of HDFS.
Filesystem                                       Size    Used       Available  Use%
hdfs://hadoop101.yinzhengjie.com:9000  24740939366400  282624  24740939083776    0%
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -df -h              # The "-h" option prints the sizes in human-readable form
Filesystem                               Size   Used  Available  Use%
hdfs://hadoop101.yinzhengjie.com:9000  22.5 T  276 K     22.5 T    0%
[root@hadoop101.yinzhengjie.com ~]# 
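In monitoring scripts the Use% column is usually what matters. This is a hedged sketch (the `check_usage` name and the 80% threshold are my own examples): it parses the second line of `hdfs dfs -df` output with awk and prints a warning when usage crosses the threshold.

```shell
# Hypothetical monitoring snippet: parse the Use% column (5th field) of
# 'hdfs dfs -df' and warn when usage exceeds an example threshold of 80%.
check_usage() {
  hdfs dfs -df | awk 'NR==2 {
    gsub("%", "", $5)
    if ($5 + 0 > 80) print "WARN: " $5 "% used"
    else             print "OK: " $5 "% used"
  }'
}
```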

17>.Checking HDFS used space

[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help du
-du [-s] [-h] [-x] <path> ... :
  Show the amount of space, in bytes, used by the files that match the specified
  file pattern. The following flags are optional:
                                                                                 
  -s  Rather than showing the size of each individual file that matches the      
      pattern, shows the total (summary) size.                                   
  -h  Formats the sizes of files in a human-readable fashion rather than a number
      of bytes.                                                                  
  -x  Excludes snapshots from being counted.                                     
  
  Note that, even without the -s option, this only shows size summaries one level
  deep into a directory.
  
  The output is in the form 
      size    name(full path)
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 4 items
drwxr-xr-x   - root  admingroup           0 2020-08-14 07:08 /bigdata
-rw-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
drwx-w-r-x   - jason admingroup           0 2020-08-14 21:46 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -du /            # Show the storage used across the whole HDFS filesystem
0      /bigdata
371    /hosts
11269  /user
0      /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 4 items
drwxr-xr-x   - root  admingroup           0 2020-08-14 07:08 /bigdata
-rw-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
drwx-w-r-x   - jason admingroup           0 2020-08-14 21:46 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -du -h /          # The "-h" option shows the space used by files and directories in human-readable form; the default unit is bytes
0       /bigdata
371     /hosts
11.0 K  /user
0       /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 4 items
drwxr-xr-x   - root  admingroup           0 2020-08-14 07:08 /bigdata
-rw-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
drwx-w-r-x   - jason admingroup           0 2020-08-14 21:46 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -du -h /
0       /bigdata
371     /hosts
11.0 K  /user
0       /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -du -s -h /          # Show only the total space used under the "/" directory (a summary)
11.4 K  /
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
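Because plain `-du` prints raw bytes, its output sorts numerically, which makes it easy to find the biggest consumers. A minimal sketch (`du_top` is a hypothetical helper of mine; do not apply this to `-du -h` output, whose unit suffixes break numeric sorting):

```shell
# Hypothetical helper: list the three largest top-level consumers under a
# path. Plain '-du' output is in bytes, so a plain numeric sort is safe.
du_top() {
  hdfs dfs -du "$1" | sort -k1,1 -rn | head -n 3
}
```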

18>.Testing files

[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help test 
-test -[defsz] <path> :
  Answer various questions about <path>, with result via exit status.
    -d  return 0 if <path> is a directory.
    -e  return 0 if <path> exists.
    -f  return 0 if <path> is a file.
    -s  return 0 if file <path> is greater than zero bytes in size.
    -w  return 0 if file <path> exists and write permission is granted.
    -r  return 0 if file <path> exists and read permission is granted.
    -z  return 0 if file <path> is zero bytes in size, else return 1.
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 3 items
drwxr-xr-x   - root  admingroup           0 2020-08-14 07:08 /bigdata
-rw-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -test -e /hosts        # Returns "0" if the path "/hosts" exists, and "1" if it does not.
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# echo $?
0
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -test -e /hosts2020
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# echo $?
1
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 3 items
drwxr-xr-x   - root  admingroup           0 2020-08-14 07:08 /bigdata
-rw-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -test -f /hosts         # Returns "0" if "/hosts" is a file; returns "1" if it does not exist or is a directory.
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# echo $?
0
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -test -f /bigdata
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# echo $?
1
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -test -f /bigdata2020
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# echo $?
1
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 3 items
drwxr-xr-x   - root  admingroup           0 2020-08-14 07:08 /bigdata
-rw-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -test -d /bigdata        # Returns "0" if "/bigdata" is a directory; returns "1" if it does not exist or is a file.
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# echo $?
0
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -test -d /hosts
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# echo $?
1
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -test -d /bigdata2020
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# echo $?
1
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 4 items
-rw-r--r--   3 root  admingroup           0 2020-08-14 22:47 /a.txt
drwxr-xr-x   - root  admingroup           0 2020-08-14 07:08 /bigdata
-rw-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -test -z /a.txt          # Returns "0" if the size is zero (an empty file, or a directory); returns "1" if the path does not exist or the file is larger than zero bytes.
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# echo $?
0
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -test -z /hosts
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# echo $?
1
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -test -z /bigdata
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# echo $?
0
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -test -z /bigdata2020
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# echo $?
1
[root@hadoop101.yinzhengjie.com ~]# 
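These exit codes exist precisely so scripts can branch on them. As a hedged sketch (the `upload_if_absent` helper and its paths are examples of mine), the following uploads a local file only when the destination does not already exist on HDFS:

```shell
# Hypothetical helper built on the '-test -e' exit code: upload a local
# file only when the destination path is not already present on HDFS.
upload_if_absent() {
  local src="$1" dst="$2"
  if hdfs dfs -test -e "$dst"; then
    echo "skip: $dst already exists"
    return 1
  fi
  hdfs dfs -put "$src" "$dst"
}
```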

19>.Viewing a file's checksum

[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help checksum
-checksum <src> ... :
  Dump checksum information for files that match the file pattern <src> to stdout.
  Note that this requires a round-trip to a datanode storing each block of the
  file, and thus is not efficient to run on a large number of files. The checksum
  of a file depends on its content, block size and the checksum algorithm and
  parameters used for creating the file.
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 5 items
-rw-r--r--   3 root  admingroup   392115733 2020-08-14 23:25 /hadoop-2.10.0.tar.gz
--w-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
-rw-r--r--   3 root  admingroup          69 2020-08-14 23:14 /wc.txt.gz
drwxr-xr-x   - root  admingroup           0 2020-08-14 23:22 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -checksum /wc.txt.gz        # Show the checksum of an HDFS file; it depends on the file's content, its block size, and the checksum algorithm and parameters used when creating it.
/wc.txt.gz    MD5-of-0MD5-of-512CRC32C    00000200000000000000000081c79e60ede6f33e67d79a84e77eebeb
[root@hadoop101.yinzhengjie.com ~]# 
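One practical use is comparing two HDFS files without downloading either. This is a sketch (`same_checksum` is a hypothetical helper of mine), and the caveat above applies: files with identical content can still produce different checksums if their block sizes or checksum parameters differ.

```shell
# Hypothetical helper: compare two HDFS files by their checksum strings.
# The hash is the last whitespace-separated field of '-checksum' output.
same_checksum() {
  local a b
  a=$(hdfs dfs -checksum "$1" | awk '{print $NF}')
  b=$(hdfs dfs -checksum "$2" | awk '{print $NF}')
  [ -n "$a" ] && [ "$a" = "$b" ]
}
```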

20>.Setting the replication factor of files in HDFS

[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help setrep
-setrep [-R] [-w] <rep> <path> ... :
  Set the replication level of a file. If <path> is a directory then the command
  recursively changes the replication factor of all files under the directory tree
  rooted at <path>.
                                                                                 
  -w  It requests that the command waits for the replication to complete. This   
      can potentially take a very long time.                                     
  -R  It is accepted for backwards compatibility. It has no effect.              
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 3 items
--w-------   3 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
drwxr-xr-x   - root  admingroup           0 2020-08-14 23:22 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -setrep -R -w 2 /hosts          # Set the replication factor of "/hosts" to 2. The "-w" flag makes the command wait until replication completes before returning; "-R" has no effect and can be omitted.
Replication 2 set: /hosts
Waiting for /hosts ...
WARNING: the waiting time may be long for DECREASING the number of replications.
. done
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls /
Found 3 items
--w-------   2 jason yinzhengjie        371 2020-08-14 21:42 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
drwxr-xr-x   - root  admingroup           0 2020-08-14 23:22 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls -R /yinzhengjie/
-rw-r--r--   3 root admingroup         69 2020-08-14 23:22 /yinzhengjie/wc.txt.gz
drwxr-xr-x   - root admingroup          0 2020-08-14 23:13 /yinzhengjie/yum.repos.d
-rw-r--r--   3 root admingroup       1664 2020-08-14 23:13 /yinzhengjie/yum.repos.d/CentOS-Base.repo
-rw-r--r--   3 root admingroup       1309 2020-08-14 23:13 /yinzhengjie/yum.repos.d/CentOS-CR.repo
-rw-r--r--   3 root admingroup        649 2020-08-14 23:13 /yinzhengjie/yum.repos.d/CentOS-Debuginfo.repo
-rw-r--r--   3 root admingroup        630 2020-08-14 23:13 /yinzhengjie/yum.repos.d/CentOS-Media.repo
-rw-r--r--   3 root admingroup       1331 2020-08-14 23:13 /yinzhengjie/yum.repos.d/CentOS-Sources.repo
-rw-r--r--   3 root admingroup       5701 2020-08-14 23:13 /yinzhengjie/yum.repos.d/CentOS-Vault.repo
-rw-r--r--   3 root admingroup        314 2020-08-14 23:13 /yinzhengjie/yum.repos.d/CentOS-fasttrack.repo
-rw-r--r--   3 root admingroup       1050 2020-08-14 23:13 /yinzhengjie/yum.repos.d/epel-testing.repo
-rw-r--r--   3 root admingroup        951 2020-08-14 23:13 /yinzhengjie/yum.repos.d/epel.repo
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -setrep 2 /yinzhengjie     # If the path is a directory, setrep recursively changes the replication factor of every file under it (without "-w" the command returns immediately, so the terminal does not block).
Replication 2 set: /yinzhengjie/wc.txt.gz
Replication 2 set: /yinzhengjie/yum.repos.d/CentOS-Base.repo
Replication 2 set: /yinzhengjie/yum.repos.d/CentOS-CR.repo
Replication 2 set: /yinzhengjie/yum.repos.d/CentOS-Debuginfo.repo
Replication 2 set: /yinzhengjie/yum.repos.d/CentOS-Media.repo
Replication 2 set: /yinzhengjie/yum.repos.d/CentOS-Sources.repo
Replication 2 set: /yinzhengjie/yum.repos.d/CentOS-Vault.repo
Replication 2 set: /yinzhengjie/yum.repos.d/CentOS-fasttrack.repo
Replication 2 set: /yinzhengjie/yum.repos.d/epel-testing.repo
Replication 2 set: /yinzhengjie/yum.repos.d/epel.repo
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls -R /yinzhengjie/
-rw-r--r--   2 root admingroup         69 2020-08-14 23:22 /yinzhengjie/wc.txt.gz
drwxr-xr-x   - root admingroup          0 2020-08-14 23:13 /yinzhengjie/yum.repos.d
-rw-r--r--   2 root admingroup       1664 2020-08-14 23:13 /yinzhengjie/yum.repos.d/CentOS-Base.repo
-rw-r--r--   2 root admingroup       1309 2020-08-14 23:13 /yinzhengjie/yum.repos.d/CentOS-CR.repo
-rw-r--r--   2 root admingroup        649 2020-08-14 23:13 /yinzhengjie/yum.repos.d/CentOS-Debuginfo.repo
-rw-r--r--   2 root admingroup        630 2020-08-14 23:13 /yinzhengjie/yum.repos.d/CentOS-Media.repo
-rw-r--r--   2 root admingroup       1331 2020-08-14 23:13 /yinzhengjie/yum.repos.d/CentOS-Sources.repo
-rw-r--r--   2 root admingroup       5701 2020-08-14 23:13 /yinzhengjie/yum.repos.d/CentOS-Vault.repo
-rw-r--r--   2 root admingroup        314 2020-08-14 23:13 /yinzhengjie/yum.repos.d/CentOS-fasttrack.repo
-rw-r--r--   2 root admingroup       1050 2020-08-14 23:13 /yinzhengjie/yum.repos.d/epel-testing.repo
-rw-r--r--   2 root admingroup        951 2020-08-14 23:13 /yinzhengjie/yum.repos.d/epel.repo
[root@hadoop101.yinzhengjie.com ~]# 
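The replication factor multiplies a file's raw disk footprint: a 371-byte file at factor 2 occupies roughly 742 bytes of raw capacity across the cluster. A hedged sketch (the `raw_footprint` helper is my own; it assumes `hdfs dfs -stat` supports the `%b` size and `%r` replication format specifiers, as in Hadoop 2.x):

```shell
# Hypothetical helper: report a file's raw storage footprint, i.e. its
# logical size times its replication factor, read via 'hdfs dfs -stat'.
raw_footprint() {
  local size rep
  size=$(hdfs dfs -stat %b "$1")   # length in bytes
  rep=$(hdfs dfs -stat %r "$1")    # replication factor
  echo "$(( size * rep )) bytes across $rep replicas"
}
```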

21>.Appending local Linux file content to an existing file in the HDFS cluster

[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help appendToFile 
-appendToFile <localsrc> ... <dst> :
  Appends the contents of all the given local files to the given dst file. The dst
  file will be created if it does not exist. If <localSrc> is -, then the input is
  read from stdin.
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# vim host.txt
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# cat host.txt
#Scale-out nodes
172.200.6.106 hadoop106.yinzhengjie.com
172.200.6.107 hadoop107.yinzhengjie.com
172.200.6.108 hadoop108.yinzhengjie.com
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -cat /hosts                    # View the contents before appending
127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4
::1         localhost localhost.localdomain localhost6 localhost6.localdomain6

#Hadoop 2.x
172.200.6.101 hadoop101.yinzhengjie.com
172.200.6.102 hadoop102.yinzhengjie.com
172.200.6.103 hadoop103.yinzhengjie.com
172.200.6.104 hadoop104.yinzhengjie.com
172.200.6.105 hadoop105.yinzhengjie.com
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# ll
total 382932
-rw-r--r-- 1 root root 392115733 Aug 10 15:42 hadoop-2.10.0.tar.gz
-rw-r--r-- 1 root root       134 Aug 15 00:23 host.txt
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -appendToFile host.txt /hosts            # Append the local file's content to the "/hosts" file on HDFS
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# ll
total 382932
-rw-r--r-- 1 root root 392115733 Aug 10 15:42 hadoop-2.10.0.tar.gz
-rw-r--r-- 1 root root       134 Aug 15 00:23 host.txt
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -text /hosts                            # As you can see, the append succeeded
127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4
::1         localhost localhost.localdomain localhost6 localhost6.localdomain6

#Hadoop 2.x
172.200.6.101 hadoop101.yinzhengjie.com
172.200.6.102 hadoop102.yinzhengjie.com
172.200.6.103 hadoop103.yinzhengjie.com
172.200.6.104 hadoop104.yinzhengjie.com
172.200.6.105 hadoop105.yinzhengjie.com
#Scale-out nodes
172.200.6.106 hadoop106.yinzhengjie.com
172.200.6.107 hadoop107.yinzhengjie.com
172.200.6.108 hadoop108.yinzhengjie.com
[root@hadoop101.yinzhengjie.com ~]# 
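The help text above also notes that when the local source is `-`, the input is read from stdin, so you can append without a temporary file. A minimal sketch (the `append_line` wrapper is a hypothetical helper of mine):

```shell
# Hypothetical wrapper: append a single line to an HDFS file by piping it
# to '-appendToFile -', where '-' means read the local source from stdin.
append_line() {
  printf '%s\n' "$1" | hdfs dfs -appendToFile - "$2"
}
```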

22>.Showing the end of a file (the last 1KB)

[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help tail
-tail [-f] <file> :
  Show the last 1KB of the file.
                                             
  -f  Shows appended data as the file grows. 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -tail /hosts            # View the tail of the file (only the last 1KB by default)
127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4
::1         localhost localhost.localdomain localhost6 localhost6.localdomain6

#Hadoop 2.x
172.200.6.101 hadoop101.yinzhengjie.com
172.200.6.102 hadoop102.yinzhengjie.com
172.200.6.103 hadoop103.yinzhengjie.com
172.200.6.104 hadoop104.yinzhengjie.com
172.200.6.105 hadoop105.yinzhengjie.com
#Scale-out nodes
172.200.6.106 hadoop106.yinzhengjie.com
172.200.6.107 hadoop107.yinzhengjie.com
172.200.6.108 hadoop108.yinzhengjie.com
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -tail -f /hosts           # The "-f" option works like "tail -f" on Linux: when data is appended to the file, the new content appears in the terminal
127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4
::1         localhost localhost.localdomain localhost6 localhost6.localdomain6

#Hadoop 2.x
172.200.6.101 hadoop101.yinzhengjie.com
172.200.6.102 hadoop102.yinzhengjie.com
172.200.6.103 hadoop103.yinzhengjie.com
172.200.6.104 hadoop104.yinzhengjie.com
172.200.6.105 hadoop105.yinzhengjie.com
#Scale-out nodes
172.200.6.106 hadoop106.yinzhengjie.com
172.200.6.107 hadoop107.yinzhengjie.com
172.200.6.108 hadoop108.yinzhengjie.com
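Because `-tail -f` streams new lines continuously, it can feed a pipeline, for example a simple log watcher. This is only a sketch (the `watch_hosts` helper and the `new:` prefix are examples of mine); against a real cluster the loop runs until interrupted with Ctrl+C.

```shell
# Hypothetical sketch: prefix each line that 'hdfs dfs -tail -f' emits,
# e.g. to mark newly appended records for a downstream consumer.
watch_hosts() {
  hdfs dfs -tail -f "$1" | while IFS= read -r line; do
    printf 'new: %s\n' "$line"
  done
}
```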

23>.Counting the directories, files, and bytes under paths that match the specified file pattern

[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -help count
-count [-q] [-h] [-v] [-t [<storage type>]] [-u] [-x] <path> ... :
  Count the number of directories, files and bytes under the paths
  that match the specified file pattern.  The output columns are:
  DIR_COUNT FILE_COUNT CONTENT_SIZE PATHNAME
  or, with the -q option:
  QUOTA REM_QUOTA SPACE_QUOTA REM_SPACE_QUOTA
        DIR_COUNT FILE_COUNT CONTENT_SIZE PATHNAME
  The -h option shows file sizes in human readable format.
  The -v option displays a header line.
  The -x option excludes snapshots from being calculated. 
  The -t option displays quota by storage types.
  It should be used with -q or -u option, otherwise it will be ignored.
  If a comma-separated list of storage types is given after the -t option, 
  it displays the quota and usage for the specified types. 
  Otherwise, it displays the quota and usage for all the storage 
  types that support quota. The list of possible storage types(case insensitive):
  ram_disk, ssd, disk and archive.
  It can also pass the value '', 'all' or 'ALL' to specify all the storage types.
  The -u option shows the quota and 
  the usage against the quota without the detailed content summary.
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -ls -h /
Found 3 items
--w-------   2 jason yinzhengjie    309.5 K 2020-08-16 11:37 /hosts
drwx------   - root  admingroup           0 2020-08-14 19:19 /user
drwxr-xr-x   - root  admingroup           0 2020-08-14 23:22 /yinzhengjie
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -count -h /hosts
           0            1            309.5 K /hosts
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -count -h /          # Count the root ("/") path in human-readable form (the "-h" option); the output columns are: directory count, file count, total size, path name.
          19           29            374.3 M /
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -count -h -v /          # The "-v" option prints a header line
   DIR_COUNT   FILE_COUNT       CONTENT_SIZE PATHNAME
          19           29            374.3 M /
[root@hadoop101.yinzhengjie.com ~]# 
[root@hadoop101.yinzhengjie.com ~]# hdfs dfs -count -h -v -q /user/root             # Check quota information with the "-q" option
       QUOTA       REM_QUOTA     SPACE_QUOTA REM_SPACE_QUOTA    DIR_COUNT   FILE_COUNT       CONTENT_SIZE PATHNAME
          88              51            66 G          64.5 G           16           21            748.2 M /user/root
[root@hadoop101.yinzhengjie.com ~]# 


The columns have the following meanings:
  QUOTA:
    The name quota, i.e. the limit on the number of files and directories.
  REM_QUOTA:
    The number of files and directories this user can still create within the quota.
  SPACE_QUOTA:
    The space quota granted to this user.
  REM_SPACE_QUOTA:
    This user's remaining space quota.
  DIR_COUNT:
    The number of directories.
  FILE_COUNT:
    The number of files.
  CONTENT_SIZE:
    The total size of the content.
  PATHNAME:
    The path name.
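The QUOTA and REM_QUOTA columns are what a quota check script would extract. A hedged sketch (the `quota_left` helper is my own example; it assumes the column order shown in the header above):

```shell
# Hypothetical helper: report the remaining name quota from
# 'hdfs dfs -count -q' output. Field 1 is QUOTA, field 2 is REM_QUOTA.
quota_left() {
  hdfs dfs -count -q "$1" | awk '{ print $2 " of " $1 " names left" }'
}
```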

24>.Permission management

  Recommended reading:
    https://www.cnblogs.com/yinzhengjie2020/p/13308791.html

25>.Snapshot management

  Recommended reading:
    https://www.cnblogs.com/yinzhengjie2020/p/13303008.html