A programmer's job is to take everything done by hand and make the computer do it instead, so you can be a little lazy yourself.
Below is a very crude hive folder sync script. If you have more than 100 or 1,000 nodes, just add a loop.
#!/bin/sh
#=============== hive install sync ================#
# This script syncs the hive folder from the name  #
# node to the data nodes. Whenever the hive        #
# install changes, the data nodes must be          #
# re-synced; otherwise, when oozie invokes hive    #
# through a shell action, the job fails because    #
# the hive install on the assigned node is out of  #
# sync.                                            #
#==================================================#

# 1. Remove the old hive
ssh -t hadoop@dwprod-dataslave1 rm -r /opt/local/hive
ssh -t hadoop@dwprod-dataslave2 rm -r /opt/local/hive
ssh -t hadoop@dwprod-dataslave3 rm -r /opt/local/hive
ssh -t hadoop@dwprod-dataslave4 rm -r /opt/local/hive
ssh -t hadoop@dwprod-dataslave5 rm -r /opt/local/hive
ssh -t hadoop@dwprod-dataslave6 rm -r /opt/local/hive
ssh -t hadoop@dwprod-dataslave7 rm -r /opt/local/hive
ssh -t hadoop@dwprod-dataslave8 rm -r /opt/local/hive
ssh -t hadoop@dwprod-dataslave9 rm -r /opt/local/hive
ssh -t hadoop@dwprod-dataslave10 rm -r /opt/local/hive

# 2. Copy the new hive
scp -r -q /opt/local/hive hadoop@dwprod-dataslave1:/opt/local/
scp -r -q /opt/local/hive hadoop@dwprod-dataslave2:/opt/local/
scp -r -q /opt/local/hive hadoop@dwprod-dataslave3:/opt/local/
scp -r -q /opt/local/hive hadoop@dwprod-dataslave4:/opt/local/
scp -r -q /opt/local/hive hadoop@dwprod-dataslave5:/opt/local/
scp -r -q /opt/local/hive hadoop@dwprod-dataslave6:/opt/local/
scp -r -q /opt/local/hive hadoop@dwprod-dataslave7:/opt/local/
scp -r -q /opt/local/hive hadoop@dwprod-dataslave8:/opt/local/
scp -r -q /opt/local/hive hadoop@dwprod-dataslave9:/opt/local/
scp -r -q /opt/local/hive hadoop@dwprod-dataslave10:/opt/local/
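The loop variant mentioned above could look like the following sketch. It assumes the data nodes keep the dwprod-dataslaveN naming; the NODES count is a placeholder, and the ssh/scp commands are prefixed with echo as a dry run — drop the echo to actually execute them.

```shell
#!/bin/sh
NODES=10   # hypothetical node count; bump this for 100 or 1000 nodes

# Build the list of target hosts from the naming pattern.
hosts=""
i=1
while [ "$i" -le "$NODES" ]; do
    hosts="$hosts hadoop@dwprod-dataslave${i}"
    i=$((i + 1))
done

for host in $hosts; do
    # Dry run: echo prints the commands; remove "echo" to run for real.
    echo ssh -t "$host" rm -r /opt/local/hive
    echo scp -r -q /opt/local/hive "${host}:/opt/local/"
done
```

Sticking to POSIX sh constructs (while with arithmetic expansion rather than seq or brace expansion) keeps the script portable across the nodes' default shells.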