[Repost] Cleaning up a Hadoop cluster installed with Ambari

This article applies to Red Hat and CentOS.

For a test cluster, if you have installed a Hadoop cluster through Ambari and want to start over, you need to clean up the cluster first.

If many Hadoop components were installed, this is tedious work. What follows is the cleanup procedure I have put together.

1. Stop all components in the cluster through Ambari. If a component cannot be stopped that way, kill its process directly with kill -9 XXX, for example as sketched below.
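A minimal sketch for finding and force-killing stray component processes; the grep patterns are assumptions and should be adjusted to the services actually installed on the node:

    # List Hadoop-related processes that are still running (adjust the patterns as needed)
    ps -ef | egrep -i 'hadoop|hdfs|yarn|hbase|zookeeper|ambari' | grep -v egrep

    # Once the PIDs are confirmed, force-kill them (destructive; use with care)
    ps -ef | egrep -i 'hadoop|hdfs|yarn|hbase|zookeeper|ambari' | grep -v egrep \
      | awk '{print $2}' | xargs -r kill -9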

2. Stop ambari-server and ambari-agent.

 

    ambari-server stop
    ambari-agent stop

 

3. Uninstall the installed packages.

 

    yum remove hadoop_2* hdp-select* ranger_2* zookeeper_* bigtop* atlas-metadata* ambari* postgresql spark* slider* storm* snappy*

 

 

The command above may not cover everything. After running it, execute

 

    yum list | grep @HDP

 

to check whether any packages are still installed. If there are, keep removing them with yum remove XXX. A sketch that automates this check-and-remove step is shown below.
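A minimal sketch, assuming the leftover packages show up with the @HDP repository tag in the yum output, that removes whatever is still listed:

    # Collect packages still installed from the HDP repositories and remove them
    leftover=$(yum list installed 2>/dev/null | grep '@HDP' | awk '{print $1}')
    if [ -n "$leftover" ]; then
        yum remove -y $leftover
    fi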

4. Delete the PostgreSQL data.

      After the PostgreSQL package is uninstalled, its data is still left on disk and must be deleted. If it is not, a newly installed ambari-server may pick up data from the previous installation, and that data is now invalid, so it has to be removed.

 

    rm -rf /var/lib/pgsql

 

5. Delete the users.

     Installing a Hadoop cluster with Ambari creates a number of service users. When cleaning up the cluster, these users should be removed and their home directories deleted; this avoids file-permission errors when the cluster is rebuilt. A loop-based sketch covering both lists follows them below.

 

    userdel oozie
    userdel hive
    userdel ambari-qa
    userdel flume
    userdel hdfs
    userdel knox
    userdel storm
    userdel mapred
    userdel hbase
    userdel tez
    userdel zookeeper
    userdel kafka
    userdel falcon
    userdel sqoop
    userdel yarn
    userdel hcat
    userdel atlas
    userdel spark
    userdel ams

 

 

    rm -rf /home/atlas
    rm -rf /home/accumulo
    rm -rf /home/hbase
    rm -rf /home/hive
    rm -rf /home/oozie
    rm -rf /home/storm
    rm -rf /home/yarn
    rm -rf /home/ambari-qa
    rm -rf /home/falcon
    rm -rf /home/hcat
    rm -rf /home/kafka
    rm -rf /home/mahout
    rm -rf /home/spark
    rm -rf /home/tez
    rm -rf /home/zookeeper
    rm -rf /home/flume
    rm -rf /home/hdfs
    rm -rf /home/knox
    rm -rf /home/mapred
    rm -rf /home/sqoop
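A minimal sketch equivalent to the two lists above: it loops over the same users, and userdel -r removes each account together with its home directory (accumulo and mahout appear only in the directory list, so their directories are removed separately):

    # Service accounts created by the Ambari installation (same set as the list above)
    for u in oozie hive ambari-qa flume hdfs knox storm mapred hbase tez \
             zookeeper kafka falcon sqoop yarn hcat atlas spark ams; do
        # -r also removes the user's home directory under /home
        userdel -r "$u" 2>/dev/null
    done

    # Home directories with no matching account in the list above
    rm -rf /home/accumulo /home/mahout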

 

6. Delete leftover Ambari data.

 

    rm -rf /var/lib/ambari*
    rm -rf /usr/lib/python2.6/site-packages/ambari_*
    rm -rf /usr/lib/python2.6/site-packages/resource_management
    rm -rf /usr/lib/ambari-*

 

7. Delete leftover data from other Hadoop components.

 

    rm -rf /etc/falcon
    rm -rf /etc/knox
    rm -rf /etc/hive-webhcat
    rm -rf /etc/kafka
    rm -rf /etc/slider
    rm -rf /etc/storm-slider-client
    rm -rf /etc/spark
    rm -rf /var/run/spark
    rm -rf /var/run/hadoop
    rm -rf /var/run/hbase
    rm -rf /var/run/zookeeper
    rm -rf /var/run/flume
    rm -rf /var/run/storm
    rm -rf /var/run/webhcat
    rm -rf /var/run/hadoop-yarn
    rm -rf /var/run/hadoop-mapreduce
    rm -rf /var/run/kafka
    rm -rf /var/log/hadoop
    rm -rf /var/log/hbase
    rm -rf /var/log/flume
    rm -rf /var/log/storm
    rm -rf /var/log/hadoop-yarn
    rm -rf /var/log/hadoop-mapreduce
    rm -rf /var/log/knox
    rm -rf /usr/lib/flume
    rm -rf /usr/lib/storm
    rm -rf /var/lib/hive
    rm -rf /var/lib/oozie
    rm -rf /var/lib/flume
    rm -rf /var/lib/hadoop-hdfs
    rm -rf /var/lib/knox
    rm -rf /var/log/hive
    rm -rf /var/log/oozie
    rm -rf /var/log/zookeeper
    rm -rf /var/log/falcon
    rm -rf /var/log/webhcat
    rm -rf /var/log/spark
    rm -rf /var/tmp/oozie
    rm -rf /tmp/ambari-qa
    rm -rf /var/hadoop
    rm -rf /hadoop/falcon
    rm -rf /tmp/hadoop
    rm -rf /tmp/hadoop-hdfs
    rm -rf /usr/hdp
    rm -rf /usr/hadoop
    rm -rf /opt/hadoop
    rm -rf /opt/hadoop2
    rm -rf /tmp/hadoop
    rm -rf /var/hadoop
    rm -rf /hadoop
    rm -rf /etc/ambari-metrics-collector
    rm -rf /etc/ambari-metrics-monitor
    rm -rf /var/run/ambari-metrics-collector
    rm -rf /var/run/ambari-metrics-monitor
    rm -rf /var/log/ambari-metrics-collector
    rm -rf /var/log/ambari-metrics-monitor
    rm -rf /var/lib/hadoop-yarn
    rm -rf /var/lib/hadoop-mapreduce
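After removing the directories above, a quick scan can confirm nothing obvious was missed; a minimal sketch, where the search roots and name patterns are assumptions to adjust as needed:

    # Scan common locations for remaining HDP/Ambari/Hadoop-related files
    find /etc /var /usr /opt /tmp -maxdepth 3 \
         \( -name '*hdp*' -o -name '*ambari*' -o -name '*hadoop*' \) 2>/dev/null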

 

8. Clean up the yum data sources.

 

    yum clean all