1. Basic tools
yum, scp, curl, wget, pdsh, ssh
2. Environment preparation
2.1 System environment
CentOS 6.5 64-bit
Ambari 1.4.3.38
2.2 Passwordless root SSH between the ambari server and the ambari agent machines
On the Ambari server:
Run ssh-keygen to generate the private/public key pair id_rsa and id_rsa.pub.
On the Ambari agent machines:
Upload the id_rsa.pub generated by root on the ambari server to every cluster machine and append it to authorized_keys:
cat id_rsa.pub >> authorized_keys
On the Ambari server, verify passwordless login:
ssh root@ambariagent
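A minimal end-to-end sketch of the key distribution, assuming root on both machines and a single agent host named ambariagent (replace the host name with your own):
# on the ambari server, generate the key pair (accept the defaults)
ssh-keygen -t rsa
# copy the public key to the agent machine
scp ~/.ssh/id_rsa.pub root@ambariagent:/root/
# on the agent machine, append it and fix the permissions
mkdir -p ~/.ssh && chmod 700 ~/.ssh
cat /root/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
# back on the ambari server, this should now log in without a password
ssh root@ambariagent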
2.3 Synchronize time on all cluster machines
Install the NTP service.
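A minimal sketch for CentOS 6, to be run on every cluster machine (assuming the default upstream servers in ntp.conf are reachable):
yum install -y ntp
# start ntpd now and enable it on boot
service ntpd start
chkconfig ntpd on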
2.4 Disable SELinux on all cluster machines
setenforce 0
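setenforce 0 only turns SELinux off until the next reboot. To make it permanent, also edit /etc/selinux/config; a one-liner sketch, assuming the file still contains the default SELINUX=enforcing line:
sed -i 's/^SELINUX=enforcing/SELINUX=disabled/' /etc/selinux/config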
2.5 Disable the firewall on all cluster machines
/etc/init.d/iptables stop
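This only stops the running firewall; to keep it off after a reboot, also disable the service on CentOS 6:
chkconfig iptables off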
2.6 Disable PackageKit on all cluster machines (CentOS)
vim /etc/yum/pluginconf.d/refresh-packagekit.conf
enabled=0
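If you prefer not to open vim, the same change can be made with a one-liner, assuming the file currently contains enabled=1:
sed -i 's/enabled=1/enabled=0/' /etc/yum/pluginconf.d/refresh-packagekit.conf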
3. Installation preparation
Note: this section only needs to be done on the ambari server machine.
3.1 Install the repository
wget http://public-repo-1.hortonworks.com/ambari/centos6/1.x/GA/ambari.repo
cp ambari.repo /etc/yum.repos.d
This repository can be very slow. If it is, you can deploy a local repository (the following is optional; the ambari repository needs few resources, so a local repository is usually unnecessary). The steps are as follows:
1. Download the pre-packaged ambari repository from Baidu cloud.
2. Deploy the repository on the ambari-server machine by extracting the downloaded file into a directory such as /var/www/html/ (a deployment sketch follows the repository definitions below).
3. Modify the ambari.repo file as follows; only the baseurl lines (the highlighted parts in the original) need to change:
[ambari-1.x]
name=Ambari 1.x
baseurl=file:///var/www/html/ambari/
gpgcheck=0
gpgkey=http://public-repo-1.hortonworks.com/ambari/centos6/RPM-GPG-KEY/RPM-GPG-KEY-Jenkins
enabled=1
priority=1
[HDP-UTILS-1.1.0.16]
name=Hortonworks Data Platform Utils Version - HDP-UTILS-1.1.0.16
baseurl=file:///var/www/html/ambari/
gpgcheck=0
gpgkey=http://public-repo-1.hortonworks.com/ambari/centos6/RPM-GPG-KEY/RPM-GPG-KEY-Jenkins
enabled=1
priority=1
[Updates-ambari-1.x]
name=ambari-1.x - Updates
baseurl=file:///var/www/html/ambari/
gpgcheck=0
gpgkey=http://public-repo-1.hortonworks.com/ambari/centos6/RPM-GPG-KEY/RPM-GPG-KEY-Jenkins
enabled=1
priority=1
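A sketch of deploying the local repository described in steps 1-3 above; the archive name is a placeholder, use whatever the Baidu cloud download is actually called:
# extract the packaged ambari repository into the web root (archive name is hypothetical)
tar -xzf ambari-repo.tar.gz -C /var/www/html/
# install the modified repo file and confirm yum can see the repository
cp ambari.repo /etc/yum.repos.d/
yum clean all
yum repolist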
3.2 Install the EPEL repository
yum install epel-release
yum repolist
3.3 Install ambari-server
yum install ambari-server
3.4 Set up ambari-server
ambari-server setup
Just accept the defaults all the way through. ambari-server needs a database; by default it uses PostgreSQL. If you choose MySQL instead, you also need to put the JDBC driver into /usr/lib/ambari-server. Setup will also download jdk-6u31-linux-x64.bin to /var/lib/ambari-server/resources automatically, or you can download it yourself.
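If you do choose MySQL, placing the driver is just a copy; the connector jar name below is an assumption, use the version you actually downloaded:
# hypothetical connector version, adjust to your jar
cp mysql-connector-java-5.1.28-bin.jar /usr/lib/ambari-server/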
After installation, start ambari-server:
ambari-server start
If startup fails, check the log files under /var/log/ambari-server.
Check the ambari-server status:
ambari-server status
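If the start fails, a quick look at the logs usually shows the cause; the file names below are the standard ambari-server logs, adjust if your layout differs:
tail -n 100 /var/log/ambari-server/ambari-server.log
tail -n 100 /var/log/ambari-server/ambari-server.out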
Note: if you are not using your own locally built repository, skip the following steps.
ambari-server runs its own web server with document root /usr/lib/ambari-server/web on port 8080, reachable at http://ambari-server-hostname:8080/. We will use this web server (you can also use an existing one of your own) to host our ambari and HDP repositories. The steps are as follows:
1. Copy /var/www/html/ambari/ from 3.1 into /usr/lib/ambari-server/web:
cp -r /var/www/html/ambari/ /usr/lib/ambari-server/web
2. Modify ambari.repo at the same time:
[ambari-1.x]
name=Ambari 1.x
baseurl=http://ambari-server-hostname:8080/ambari/
gpgcheck=0
gpgkey=http://public-repo-1.hortonworks.com/ambari/centos6/RPM-GPG-KEY/RPM-GPG-KEY-Jenkins
enabled=1
priority=1
[HDP-UTILS-1.1.0.16]
name=Hortonworks Data Platform Utils Version - HDP-UTILS-1.1.0.16
baseurl=http://ambari-server-hostname:8080/ambari/
gpgcheck=0
gpgkey=http://public-repo-1.hortonworks.com/ambari/centos6/RPM-GPG-KEY/RPM-GPG-KEY-Jenkins
enabled=1
priority=1
[Updates-ambari-1.x]
name=ambari-1.x - Updates
baseurl=http://ambari-server-hostname:8080/ambari/
gpgcheck=0
gpgkey=http://public-repo-1.hortonworks.com/ambari/centos6/RPM-GPG-KEY/RPM-GPG-KEY-Jenkins
enabled=1
priority=1
3. Create the HDP repository
Download the HDP repository (the one I packaged myself) from Baidu cloud and extract it into /usr/lib/ambari-server/web; you will then see an hdp folder.
4. Modify the repoinfo.xml file
cd /var/lib/ambari-server/resources/stacks/HDP/2.0.6/repos
Modify the following sections. I am using CentOS 6.5, so I changed the corresponding entry.
<os type="centos6">
<repo>
<baseurl>http://ambari-server-hostname:8080/hdp/</baseurl>
<repoid>HDP-2.0.6</repoid>
<reponame>HDP</reponame>
</repo>
</os>
<os type="oraclelinux6">
<repo>
<baseurl>http://ambari-server-hostname:8080/hdp/</baseurl>
<repoid>HDP-2.0.6</repoid>
<reponame>HDP</reponame>
</repo>
</os>
<os type="redhat6">
<repo>
<baseurl>http://ambari-server-hostname:8080/hdp/</baseurl>
<repoid>HDP-2.0.6</repoid>
<reponame>HDP</reponame>
</repo>
</os>
During installation, ambari creates an HDP.repo file on each cluster machine whose contents are exactly the information we modified here. A quick verification sketch follows.
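Before starting the install wizard, it is worth checking that both repositories are actually being served; repodata/repomd.xml is the standard yum metadata file, so a 200 response means yum will be able to use the repository (assuming the packaged repositories keep their repodata directly under these paths):
curl -I http://ambari-server-hostname:8080/ambari/repodata/repomd.xml
curl -I http://ambari-server-hostname:8080/hdp/repodata/repomd.xml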
4. Installation
Open http://ambari-server-hostname:8080/ in a browser.
Follow the wizard step by step. At the stack-selection step, be careful to choose the version matching your setup; I used 2.0.6, which is what I configured in the previous sections, and other versions need similar changes.
The installation may fail several times. If it still fails after many retries, you need to clean up the related libraries, users, directories, and configuration on every machine, so it is best to start from a freshly installed, clean system.
5. Reinstallation
Note: the following comes from the internet; I did the same during my installation. Some details may differ from your environment, but the idea is simply to remove the related packages, users, and configuration.
1. Stop ambari
On all cluster machines:
ambari-agent stop
On the Ambari-server machine:
ambari-server stop
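pdsh (listed among the basic tools at the top) makes it easy to stop the agents on every machine at once; the host list below is a placeholder, substitute your own agent host names:
pdsh -w agent1,agent2,agent3 'ambari-agent stop'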
2. Remove installed packages
# Use yum list installed | grep HDP to check which hadoop-related packages are installed
yum remove -y sqoop.noarch
yum remove -y lzo-devel.x86_64
yum remove -y hadoop-libhdfs.x86_64
yum remove -y rrdtool.x86_64
yum remove -y hbase.noarch
yum remove -y pig.noarch
yum remove -y lzo.x86_64
yum remove -y ambari-log4j.noarch
yum remove -y oozie.noarch
yum remove -y oozie-client.noarch
yum remove -y gweb.noarch
yum remove -y snappy-devel.x86_64
yum remove -y hcatalog.noarch
yum remove -y python-rrdtool.x86_64
yum remove -y nagios.x86_64
yum remove -y webhcat-tar-pig.noarch
yum remove -y snappy.x86_64
yum remove -y libconfuse.x86_64
yum remove -y webhcat-tar-hive.noarch
yum remove -y ganglia-gmetad.x86_64
yum remove -y extjs.noarch
yum remove -y hive.noarch
yum remove -y hadoop-lzo.x86_64
yum remove -y hadoop-lzo-native.x86_64
yum remove -y hadoop-native.x86_64
yum remove -y hadoop-pipes.x86_64
yum remove -y nagios-plugins.x86_64
yum remove -y hadoop.x86_64
yum remove -y zookeeper.noarch
yum remove -y hadoop-sbin.x86_64
yum remove -y ganglia-gmond.x86_64
yum remove -y libganglia.x86_64
yum remove -y perl-rrdtool.x86_64
yum remove -y epel-release.noarch
yum remove -y compat-readline5*
yum remove -y fping.x86_64
yum remove -y perl-Crypt-DES.x86_64
yum remove -y exim.x86_64
yum remove -y ganglia-web.noarch
yum remove -y perl-Digest-HMAC.noarch
yum remove -y perl-Digest-SHA1.x86_64
3. Delete users
userdel nagios
userdel hive
userdel ambari-qa
userdel hbase
userdel oozie
userdel hcat
userdel mapred
userdel hdfs
userdel rrdcached
userdel zookeeper
userdel sqoop
userdel puppet
4. Delete the alternatives symlinks
cd /etc/alternatives
rm -rf hadoop-etc
rm -rf zookeeper-conf
rm -rf hbase-conf
rm -rf hadoop-log
rm -rf hadoop-lib
rm -rf hadoop-default
rm -rf oozie-conf
rm -rf hcatalog-conf
rm -rf hive-conf
rm -rf hadoop-man
rm -rf sqoop-conf
rm -rf hadoop-conf
5. Delete directories
rm -rf /var/lib/pgsql
rm -rf /hadoop
rm -rf /etc/hadoop
rm -rf /etc/hbase
rm -rf /etc/hcatalog
rm -rf /etc/hive
rm -rf /etc/ganglia
rm -rf /etc/nagios
rm -rf /etc/oozie
rm -rf /etc/sqoop
rm -rf /etc/zookeeper
rm -rf /var/run/hadoop
rm -rf /var/run/hbase
rm -rf /var/run/hive
rm -rf /var/run/ganglia
rm -rf /var/run/nagios
rm -rf /var/run/oozie
rm -rf /var/run/zookeeper
rm -rf /var/log/hadoop
rm -rf /var/log/hbase
rm -rf /var/log/hive
rm -rf /var/log/nagios
rm -rf /var/log/oozie
rm -rf /var/log/zookeeper
rm -rf /usr/lib/hadoop
rm -rf /usr/lib/hbase
rm -rf /usr/lib/hcatalog
rm -rf /usr/lib/hive
rm -rf /usr/lib/oozie
rm -rf /usr/lib/sqoop
rm -rf /usr/lib/zookeeper
rm -rf /var/lib/hive
rm -rf /var/lib/ganglia
rm -rf /var/lib/oozie
rm -rf /var/lib/zookeeper
rm -rf /var/tmp/oozie
rm -rf /tmp/hive
rm -rf /tmp/nagios
rm -rf /tmp/ambari-qa
rm -rf /tmp/sqoop-ambari-qa
rm -rf /var/nagios
rm -rf /hadoop/oozie
rm -rf /hadoop/zookeeper
rm -rf /hadoop/mapred
rm -rf /hadoop/hdfs
rm -rf /tmp/hadoop-hive
rm -rf /tmp/hadoop-nagios
rm -rf /tmp/hadoop-hcat
rm -rf /tmp/hadoop-ambari-qa
rm -rf /tmp/hsperfdata_hbase
rm -rf /tmp/hsperfdata_hive
rm -rf /tmp/hsperfdata_nagios
rm -rf /tmp/hsperfdata_oozie
rm -rf /tmp/hsperfdata_zookeeper
rm -rf /tmp/hsperfdata_mapred
rm -rf /tmp/hsperfdata_hdfs
rm -rf /tmp/hsperfdata_hcat
rm -rf /tmp/hsperfdata_ambari-qa
6. Remove the ambari packages
# Check with yum list installed | grep ambari
yum remove -y ambari-*
yum remove -y postgresql
rm -rf /var/lib/ambari*
rm -rf /var/log/ambari*
rm -rf /etc/ambari*
7. Delete HDP.repo and ambari.repo
cd /etc/yum.repos.d/
rm -rf HDP*
rm -rf ambari*