Not much preamble; straight to the useful part.
My cluster consists of bigdatamaster (192.168.80.10), bigdataslave1 (192.168.80.11) and bigdataslave2 (192.168.80.12).
The installation directory is /home/hadoop/app.
The official documentation recommends installing Hue on the master machine, and I do the same here: it goes on bigdatamaster.
Hue version: hue-3.9.0-cdh5.5.4
It has to be compiled before it can be used (an internet connection is required).
A word to readers: if your machine has the resources, do install Cloudera Manager, since Hue and CDH come from the same family. I am a graduate student whose laptop tops out at 8 GB of RAM, so I do everything by hand here for practice.
This write-up is purely for students with modest hardware; on the lab cluster I already have both Cloudera Manager and Ambari set up.
This post uses CentOS as the example.
http://archive.cloudera.com/cdh5/cdh/5/hue-3.9.0-cdh5.5.4/manual.html#_install_hue
For other systems such as Ubuntu, only the dependency installation differs slightly; see the official manual above.
1. Where to download hue-3.9.0-cdh5.5.4.tar.gz
http://archive.cloudera.com/cdh5/cdh/5/hue-3.9.0-cdh5.5.4.tar.gz
2. Dependencies that must be installed before installing Hue
ant
asciidoc
cyrus-sasl-devel
cyrus-sasl-gssapi
gcc
gcc-c++
krb5-devel
libtidy (for unit tests only)
libxml2-devel
libxslt-devel
make
mvn (from ``maven`` package or maven3 tarball)
mysql (optional here; I already installed MySQL on bigdatamaster when setting up Hive)
mysql-devel (optional here for the same reason)
openldap-devel
python-devel
sqlite-devel
openssl-devel (for version 7+)
gmp-devel
Or, as a single command line:
ant asciidoc cyrus-sasl-devel cyrus-sasl-gssapi gcc gcc-c++ krb5-devel libtidy (for unit tests only) libxml2-devel libxslt-devel
make mvn (from maven package or maven3 tarball) mysql (skipped here, already installed with Hive) mysql-devel (skipped here, already installed with Hive) openldap-devel python-devel sqlite-devel openssl-devel (for version 7+)
gmp-devel
Check whether those packages are already present on the system:
rpm -qa | grep package_name
Note: do not run it literally with package_name; substitute each concrete package, as below.
[hadoop@bigdatamaster app]$ rpm -qa | grep ant
wpa_supplicant-0.7.3-4.el6_3.x86_64
anthy-9100h-10.1.el6.x86_64
ibus-anthy-1.2.1-3.el6.x86_64
enchant-1.5.0-4.el6.x86_64
[hadoop@bigdatamaster app]$ rpm -qa | grep asciidoc
[hadoop@bigdatamaster app]$ rpm -qa | grep cyrus-sasl-devel
[hadoop@bigdatamaster app]$ rpm -qa | grep gcc
libgcc-4.4.7-4.el6.x86_64
[hadoop@bigdatamaster app]$ rpm -qa | grep gcc-c++
[hadoop@bigdatamaster app]$ rpm -qa | grep krb5-devel
krb5-devel-1.10.3-65.el6.x86_64
[hadoop@bigdatamaster app]$ rpm -qa | grep libtidy
[hadoop@bigdatamaster app]$ rpm -qa | grep libxml2-devel
[hadoop@bigdatamaster app]$ rpm -qa | grep libxslt-devel
[hadoop@bigdatamaster app]$ rpm -qa | grep make
make-3.81-20.el6.x86_64
[hadoop@bigdatamaster app]$ rpm -qa | grep mvn
[hadoop@bigdatamaster app]$ rpm -qa | grep mysql-devel
mysql-devel-5.1.73-8.el6_8.x86_64
[hadoop@bigdatamaster app]$ rpm -qa | grep openldap-devel
[hadoop@bigdatamaster app]$ rpm -qa | grep python-devel
[hadoop@bigdatamaster app]$ rpm -qa | grep sqlite-devel
[hadoop@bigdatamaster app]$ rpm -qa | grep openssl-devel
openssl-devel-1.0.1e-57.el6.x86_64
[hadoop@bigdatamaster app]$ rpm -qa | grep gmp-devel
[hadoop@bigdatamaster app]$
For this step, some references say the preinstalled versions of these packages should be removed first, otherwise they can cause version conflicts later.
If a straight yum install reports dependency or version conflicts, you can remove the already-installed version and then install again.
Be careful, though: rpm -qa | grep matches substrings, so "grep ant" also turns up wpa_supplicant, anthy and enchant, which have nothing to do with Apache Ant, and libgcc is a core system library that must not be removed (as I found out the hard way below).
Uninstall a bundled package with:
rpm -e --nodeps ***
Check for and remove the bundled ant:
[hadoop@bigdatamaster app]$ su root
Password:
[root@bigdatamaster app]# rpm -qa | grep ant
wpa_supplicant-0.7.3-4.el6_3.x86_64
anthy-9100h-10.1.el6.x86_64
ibus-anthy-1.2.1-3.el6.x86_64
enchant-1.5.0-4.el6.x86_64
[root@bigdatamaster app]# rpm -e --nodeps wpa_supplicant-0.7.3-4.el6_3.x86_64
[root@bigdatamaster app]# rpm -e --nodeps anthy-9100h-10.1.el6.x86_64
[root@bigdatamaster app]# rpm -e --nodeps ibus-anthy-1.2.1-3.el6.x86_64
[root@bigdatamaster app]# rpm -e --nodeps enchant-1.5.0-4.el6.x86_64
[root@bigdatamaster app]#
Check for and remove the bundled asciidoc, cyrus-sasl-devel and gcc:
[root@bigdatamaster app]# rpm -qa | grep asciidoc
[root@bigdatamaster app]# rpm -qa | grep cyrus-sasl-devel
[root@bigdatamaster app]# rpm -qa | grep gcc
libgcc-4.4.7-4.el6.x86_64
[root@bigdatamaster app]# rpm -e --nodeps libgcc-4.4.7-4.el6.x86_64    (I regretted this command the moment I ran it)
[root@bigdatamaster app]#
I won't walk through the remaining packages one by one; while removing them you may run into the following problem part-way through.
[root@bigdatamaster app]# rpm -qa | grep krb5-devel
rpm: error while loading shared libraries: libgcc_s.so.1: cannot open shared object file: No such file or directory
[root@bigdatamaster app]# rpm -qa | grep libtidy
rpm: error while loading shared libraries: libgcc_s.so.1: cannot open shared object file: No such file or directory
Workaround
First, check whether the libgcc_s.so.1 shared library is still present somewhere; locate said it was:
[root@bigdatamaster app]# locate libgcc_s.so.1
/lib64/libgcc_s.so.1
Since the library appeared to still be under /lib64, I first looked at posts like the following:
"error while loading shared libraries: xxx.so.x": causes and fixes
That post's item 1) says: if the shared library was installed under /lib or /usr/lib, running ldconfig should be enough.
Honestly, none of that advice worked in my case.
The simplest fix: we are running a cluster anyway, so just copy libgcc_s-4.4.6-20110824.so.1 from another node into /lib64 and everything goes back to normal.
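Assuming another node (say bigdataslave1) still has the library, and noting that the exact file name and path below come from my machines and may differ on yours, the recovery boils down to something like:

scp root@bigdataslave1:/lib64/libgcc_s-4.4.6-20110824.so.1 /lib64/
ln -sf /lib64/libgcc_s-4.4.6-20110824.so.1 /lib64/libgcc_s.so.1
ldconfig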
Add the Maven yum repository:
wget http://repos.fedorapeople.org/repos/dchen/apache-maven/epel-apache-maven.repo -O /etc/yum.repos.d/epel-apache-maven.repo
[root@bigdatamaster app]# wget http://repos.fedorapeople.org/repos/dchen/apache-maven/epel-apache-maven.repo -O /etc/yum.repos.d/epel-apache-maven.repo
--2017-05-05 19:52:01--  http://repos.fedorapeople.org/repos/dchen/apache-maven/epel-apache-maven.repo
Resolving repos.fedorapeople.org... 152.19.134.199, 2610:28:3090:3001:5054:ff:fea7:9474
Connecting to repos.fedorapeople.org|152.19.134.199|:80... connected.
HTTP request sent, awaiting response... 302 Found
Location: https://repos.fedorapeople.org/repos/dchen/apache-maven/epel-apache-maven.repo [following]
--2017-05-05 19:52:02--  https://repos.fedorapeople.org/repos/dchen/apache-maven/epel-apache-maven.repo
Connecting to repos.fedorapeople.org|152.19.134.199|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 445
Saving to: "/etc/yum.repos.d/epel-apache-maven.repo"

100%[==========================================>] 445  --.-K/s  in 0s

2017-05-05 19:52:04 (11.8 MB/s) - "/etc/yum.repos.d/epel-apache-maven.repo" saved [445/445]

[root@bigdatamaster app]#
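With that repo in place, the Maven package it provides is named apache-maven (there is no package literally called mvn), so if you do want Maven on this node you would install and check it roughly like this:

yum install -y apache-maven
mvn -version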
Install the dependencies. (Note: mysql and mysql-devel are not needed here because they were already installed along with Hive on bigdatamaster; installing a different version again can cause conflicts. Also, "mvn" in the command below is not an actual package name, so yum will just report that no such package is available; Maven comes from the repo added above.)
yum install -y ant asciidoc cyrus-sasl-devel cyrus-sasl-gssapi gcc gcc-c++ krb5-devel libtidy libxml2-devel libxslt-devel make mvn openldap-devel python-devel sqlite-devel openssl-devel gmp-devel
  Verifying : python-libs-2.6.6-51.el6.x86_64            75/79
  Verifying : gmp-4.3.1-7.el6_2.2.x86_64                 76/79
  Verifying : python-2.6.6-51.el6.x86_64                 77/79
  Verifying : libxml2-python-2.7.6-14.el6.x86_64         78/79
  Verifying : sqlite-3.6.20-1.el6.x86_64                 79/79

Installed:
  ant.x86_64 0:1.7.1-15.el6   asciidoc.noarch 0:8.4.5-4.1.el6   cyrus-sasl-devel.x86_64 0:2.1.23-15.el6_6.2
  gcc.x86_64 0:4.4.7-18.el6   gcc-c++.x86_64 0:4.4.7-18.el6   gmp-devel.x86_64 0:4.3.1-12.el6
  libtidy.x86_64 0:0.99.0-19.20070615.1.el6   libxml2-devel.x86_64 0:2.7.6-21.el6_8.1
  libxslt-devel.x86_64 0:1.1.26-2.el6_3.1   openldap-devel.x86_64 0:2.4.40-16.el6
  python-devel.x86_64 0:2.6.6-66.el6_8   sqlite-devel.x86_64 0:3.6.20-1.el6_7.2

Dependency Installed:
  cloog-ppl.x86_64 0:0.15.7-1.2.el6   cpp.x86_64 0:4.4.7-18.el6   docbook-style-xsl.noarch 0:1.75.2-6.el6
  java-1.5.0-gcj.x86_64 0:1.5.0.0-29.1.el6   java-1.7.0-openjdk.x86_64 1:1.7.0.131-2.6.9.0.el6_8
  java-1.7.0-openjdk-devel.x86_64 1:1.7.0.131-2.6.9.0.el6_8   java_cup.x86_64 1:0.10k-5.el6
  libgcc.x86_64 0:4.4.7-18.el6   libgcj.x86_64 0:4.4.7-18.el6   libgcrypt-devel.x86_64 0:1.4.5-12.el6_8
  libgpg-error-devel.x86_64 0:1.7-4.el6   libstdc++-devel.x86_64 0:4.4.7-18.el6
  lksctp-tools.x86_64 0:1.0.10-7.el6   mpfr.x86_64 0:2.4.1-6.el6   pcsc-lite-libs.x86_64 0:1.5.2-16.el6
  ppl.x86_64 0:0.10.2-11.el6   sinjdoc.x86_64 0:0.5-9.1.el6   tzdata-java.noarch 0:2017b-1.el6
  xerces-j2.x86_64 0:2.7.1-12.7.el6_5   xml-commons-apis.x86_64 0:1.3.04-3.6.el6
  xml-commons-resolver.x86_64 0:1.1-4.18.el6

Updated:
  cyrus-sasl-gssapi.x86_64 0:2.1.23-15.el6_6.2   make.x86_64 1:3.81-23.el6

Dependency Updated:
  cyrus-sasl.x86_64 0:2.1.23-15.el6_6.2   cyrus-sasl-lib.x86_64 0:2.1.23-15.el6_6.2
  cyrus-sasl-md5.x86_64 0:2.1.23-15.el6_6.2   cyrus-sasl-plain.x86_64 0:2.1.23-15.el6_6.2
  gmp.x86_64 0:4.3.1-12.el6   libgcrypt.x86_64 0:1.4.5-12.el6_8   libgomp.x86_64 0:4.4.7-18.el6
  libstdc++.x86_64 0:4.4.7-18.el6   libxml2.x86_64 0:2.7.6-21.el6_8.1   libxml2-python.x86_64 0:2.7.6-21.el6_8.1
  nspr.x86_64 0:4.13.1-1.el6   nss.x86_64 0:3.28.4-1.el6_9   nss-softokn.x86_64 0:3.14.3-23.3.el6_8
  nss-softokn-freebl.x86_64 0:3.14.3-23.3.el6_8   nss-sysinit.x86_64 0:3.28.4-1.el6_9
  nss-tools.x86_64 0:3.28.4-1.el6_9   nss-util.x86_64 0:3.28.4-1.el6_9   openldap.x86_64 0:2.4.40-16.el6
  python.x86_64 0:2.6.6-66.el6_8   python-libs.x86_64 0:2.6.6-66.el6_8   sqlite.x86_64 0:3.6.20-1.el6_7.2

Complete!
[root@bigdatamaster app]#
Upload hue-3.9.0-cdh5.5.4.tar.gz (I downloaded it beforehand and upload it here with rz).
Alternatively, you can clone and build the Hue 3.9 source yourself; the compilation takes quite a while:
git clone -b branch-3.9 https://github.com/cloudera/hue.git
cd hue
make apps
After building, you can optionally install it:
make install
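According to the Hue manual, make install puts Hue under /usr/local/hue by default and honors a PREFIX variable to change the installation root, so a source install into the layout used in this post would look something like (the path is just my assumption for this cluster):

PREFIX=/home/hadoop/app make install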
I chose the upload route here.
[hadoop@bigdatamaster app]$ pwd
/home/hadoop/app
[hadoop@bigdatamaster app]$ ll
total 60
drwxr-xr-x  8 hadoop hadoop 4096 Apr 26  2016 apache-flume-1.6.0-cdh5.5.4-bin
...  (the other already-installed components and their symlinks: elasticsearch, filebeat, flume, hadoop, hbase, hive, jdk, kafka, kibana, logstash, scala, spark, sqoop, zookeeper)
[hadoop@bigdatamaster app]$ rz
[hadoop@bigdatamaster app]$ ll
total 70956
...  (same entries as above, plus the uploaded tarball)
-rw-r--r--  1 hadoop hadoop 72594458 May  4 00:14 hue-3.9.0-cdh5.5.4.tar.gz
[hadoop@bigdatamaster app]$
Extract it:
[hadoop@bigdatamaster app]$ ll
total 70956
...  (installed components as above, plus hue-3.9.0-cdh5.5.4.tar.gz)
[hadoop@bigdatamaster app]$ tar -zxvf hue-3.9.0-cdh5.5.4.tar.gz
[hadoop@bigdatamaster app]$ ll
total 70960
drwxr-xr-x  9 hadoop hadoop     4096 Apr 26  2016 hue-3.9.0-cdh5.5.4
-rw-r--r--  1 hadoop hadoop 72594458 May  4 00:14 hue-3.9.0-cdh5.5.4.tar.gz
...  (other components unchanged)
[hadoop@bigdatamaster app]$ rm hue-3.9.0-cdh5.5.4.tar.gz
Create a symlink:
[hadoop@bigdatamaster app]$ ll
total 64
drwxr-xr-x  9 hadoop hadoop 4096 Apr 26  2016 hue-3.9.0-cdh5.5.4
...  (other components unchanged)
[hadoop@bigdatamaster app]$ ln -s hue-3.9.0-cdh5.5.4/ hue
[hadoop@bigdatamaster app]$ ll
total 64
lrwxrwxrwx  1 hadoop hadoop   19 May  5 20:44 hue -> hue-3.9.0-cdh5.5.4/
drwxr-xr-x  9 hadoop hadoop 4096 Apr 26  2016 hue-3.9.0-cdh5.5.4
...  (other components unchanged)
[hadoop@bigdatamaster app]$
Go into the Hue installation directory (/home/hadoop/app/hue) and compile:
make apps
Depending on your network speed, this step takes several minutes.
Why run make apps? It builds the Python virtual environment and compiles the bundled Hue apps under build/, which everything that follows depends on.
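A quick sanity check that the build succeeded is to look for the generated virtualenv and its entry-point scripts (the path assumes the /home/hadoop/app/hue location used in this post):

ls /home/hadoop/app/hue/build/env/bin
# the listing should include the hue and supervisor scripts used later to start Hue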
First of all, make sure you have my 3-node cluster layout clearly in mind (otherwise the values below will not make sense for your setup).
(I put the configuration into the table below purely for readability; the values track your own machines, so adjust them accordingly.)
Hue section | Hue property | Value used here | Notes |
desktop | default_hdfs_superuser | hadoop | HDFS superuser |
desktop | http_host | 192.168.80.10 | host/IP the Hue web server listens on |
desktop | http_port | 8000 | Hue web server port |
desktop | server_user | hue | user the Hue web server process runs as |
desktop | server_group | hue | group the Hue web server process runs as |
desktop | default_user | hue | Hue administrator |
desktop | default_hdfs_superuser | hadoop | Change this to your own Hadoop user. Some guides instead edit desktop/libs/hadoop/src/hadoop/fs/webhdfs.py and change DEFAULT_HDFS_SUPERUSER = 'hdfs' to their Hadoop user. In hue.ini this means changing default_hdfs_superuser=hdfs to your HDFS superuser; mine is hadoop (in other write-ups it is often root). |
hadoop/hdfs_clusters | fs_defaultfs | hdfs://bigdatamaster:9000 | matches fs.defaultFS in core-site.xml |
hadoop/hdfs_clusters | hadoop_conf_dir | /home/hadoop/app/hadoop/etc/hadoop/conf | Hadoop configuration directory |
hadoop/yarn_clusters | resourcemanager_host | bigdatamaster | matches yarn.resourcemanager.hostname in yarn-site.xml |
hadoop/yarn_clusters | resourcemanager_port | 8032 | ResourceManager IPC port |
hadoop/yarn_clusters | resourcemanager_api_url | http://bigdatamaster:23188 | matches yarn.resourcemanager.webapp.address (I moved it to 23188 to avoid a port clash with Spark; any free port works) |
hadoop/yarn_clusters | proxy_api_url | http://bigdatamaster:8888 | matches yarn.web-proxy.address |
hadoop/yarn_clusters | history_server_api_url | http://bigdatamaster:19888 | matches mapreduce.jobhistory.webapp.address in mapred-site.xml |
zookeeper | host_ports | bigdatamaster:2181,bigdataslave1:2181,bigdataslave2:2181 | ZooKeeper ensemble |
beeswax | hive_server_host | bigdatamaster | host running HiveServer2 |
beeswax | hive_server_port | 10000 | HiveServer2 Thrift port |
beeswax | hive_conf_dir | /home/hadoop/app/hive/conf | Hive configuration directory |
My Hue only needs to be installed on the bigdatamaster (192.168.80.10) machine.
Edit the Hue configuration file (this is the crucial part; be careful):
$HUE_HOME/desktop/conf/hue.ini
[hadoop@bigdatamaster conf]$ pwd
/home/hadoop/app/hue/desktop/conf
[hadoop@bigdatamaster conf]$ ll
total 52
-rw-r--r-- 1 hadoop hadoop 41572 Apr 26  2016 hue.ini
-rw-r--r-- 1 hadoop hadoop  1843 Apr 26  2016 log4j.properties
-rw-r--r-- 1 hadoop hadoop  1721 Apr 26  2016 log.conf
[hadoop@bigdatamaster conf]$ vim hue.ini
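Before touching anything, it does not hurt to keep a copy of the stock file so you can diff your changes later:

cp hue.ini hue.ini.bak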
http://archive.cloudera.com/cdh5/cdh/5/hue-3.9.0-cdh5.5.4/manual.html#_install_hue
The [desktop] block is configured as follows.

[desktop]
# Hue web server address and port
secret_key=jFE93j;2[290-eiw.KEiwN2s3['d;/.q[eIW^y#e=+Iei*@Mn<qW5o
http_host=192.168.80.10
http_port=8888

time_zone=Asia/Shanghai

# Webserver runs as this user
server_user=hue
server_group=hue

# This should be the Hue admin and proxy user
default_user=hue

# This should be the hadoop cluster admin
default_hdfs_superuser=hadoop

Note: these can also be left at their defaults.
http://archive.cloudera.com/cdh5/cdh/5/hue-3.9.0-cdh5.5.4/manual.html#_install_hue
The [hadoop] block is configured as follows. (Note: the official manual says Hue talks to HDFS through either WebHdfs or HttpFS. WebHdfs is the usual choice for a non-HA cluster; for an HA cluster you must also configure HttpFS.)
I cover the detailed configuration and the reasoning in a separate post.
Since this post is based on the non-HA 3-node cluster made up of bigdatamaster, bigdataslave1 and bigdataslave2, WebHdfs is used.
Make the same hdfs-site.xml change on bigdataslave1 and bigdataslave2 as well; no need to repeat it here.
After hdfs-site.xml has been modified on all three machines, modify core-site.xml too.
Again apply it on bigdataslave1 and bigdataslave2; the properties involved are sketched right below.
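As a sketch of what those edits usually look like (the property names follow the standard Hadoop/Hue documentation; adjust the proxy user if your Hue web server runs as a different user than hue):

In hdfs-site.xml (all three nodes), enable WebHDFS:

<property>
  <name>dfs.webhdfs.enabled</name>
  <value>true</value>
</property>

In core-site.xml (all three nodes), let the Hue server user impersonate other users:

<property>
  <name>hadoop.proxyuser.hue.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hue.groups</name>
  <value>*</value>
</property>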
The [hadoop] section
[hadoop]

  # Configuration for HDFS NameNode
  # ------------------------------------------------------------------------
  [[hdfs_clusters]]
    # HA support by using HttpFs

    [[[default]]]
      # Enter the filesystem uri
      fs_defaultfs=hdfs://bigdatamaster:9000

      # NameNode logical name.
      ## logical_name=

      # Use WebHdfs/HttpFs as the communication mechanism.
      # Domain should be the NameNode or HttpFs host.
      # Default port is 14000 for HttpFs.
      webhdfs_url=http://bigdatamaster:50070/webhdfs/v1

      # Change this if your HDFS cluster is Kerberos-secured
      ## security_enabled=false

      # In secure mode (HTTPS), if SSL certificates from YARN Rest APIs
      # have to be verified against certificate authority
      ## ssl_cert_ca_verify=True

      # Directory of the Hadoop configuration
      hadoop_conf_dir=/home/hadoop/app/hadoop/etc/hadoop/conf
Note that I use fs_defaultfs=hdfs://bigdatamaster:9000; configure this for your own cluster and think it through rather than blindly copying someone else's blog.
You will see values such as fs_defaultfs=hdfs://mycluster or fs_defaultfs=hdfs://master:8020 elsewhere; those match their clusters, not necessarily yours.
In short, keep it identical to the fs.defaultFS property in your own core-site.xml.
Here bigdatamaster is the hostname of the machine Hue is installed on, and 192.168.80.10 is its static IP.
To understand why it is configured this way, see the Hadoop documentation:
http://hadoop.apache.org/docs/r2.5.2/
The [yarn_clusters] block
  [[yarn_clusters]]

    [[[default]]]
      # Enter the host on which you are running the ResourceManager
      resourcemanager_host=192.168.80.10

      # The port where the ResourceManager IPC listens on
      resourcemanager_port=8032

      # Whether to submit jobs to this cluster
      submit_to=True

      # Resource Manager logical name (required for HA)
      ## logical_name=

      # Change this if your YARN cluster is Kerberos-secured
      ## security_enabled=false

      # URL of the ResourceManager API
      resourcemanager_api_url=http://192.168.80.10:8088

      # URL of the ProxyServer API
      proxy_api_url=http://192.168.80.10:8088

      # URL of the HistoryServer API
      history_server_api_url=http://192.168.80.10:19888
To dig deeper, see my other posts.
The [zookeeper] block
[zookeeper]
  host_ports=bigdatamaster:2181,bigdataslave1:2181,bigdataslave2:2181
The [beeswax] (Hive) block
[beeswax]

  # Host where HiveServer2 is running.
  # If Kerberos security is enabled, use fully-qualified domain name (FQDN).
  hive_server_host=bigdatamaster

  # Port where HiveServer2 Thrift server runs on.
  hive_server_port=10000

  # Hive configuration directory, where hive-site.xml is located
  hive_conf_dir=/home/hadoop/app/hive/conf
My Hive is installed on bigdatamaster; be sure to adapt this to your own machines. The relevant hive-site.xml properties are:
<property>
  <name>hive.server2.thrift.port</name>
  <value>10000</value>
</property>
<property>
  <name>hive.server2.thrift.bind.host</name>
  <value>bigdatamaster</value>
</property>
That is, hive-site.xml must also set the hive.server2.thrift.port and hive.server2.thrift.bind.host properties shown above; again, my Hive runs on the bigdatamaster machine.
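HiveServer2 itself must also be running on bigdatamaster before Hue can connect to it. It can be started with the standard Hive launcher, for example (the log file name is arbitrary):

nohup $HIVE_HOME/bin/hiveserver2 > hiveserver2.log 2>&1 &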
For a deeper dive, see my other posts.
The database ([librdbms]) section
###########################################################################
# Settings for the RDBMS application
###########################################################################

[librdbms]
  # The RDBMS app can have any number of databases configured in the databases
  # section. A database is known by its section name
  # (IE sqlite, mysql, psql, and oracle in the list below).

  [[databases]]
    # sqlite configuration.
    [[[sqlite]]]
      # Name to show in the UI.
      nice_name=SQLite

      # For SQLite, name defines the path to the database.
      name=/home/hadoop/app/hue/desktop/desktop.db

      # Database backend to use.
      engine=sqlite
hive> show databases;
OK
default
hive
Time taken: 0.074 seconds, Fetched: 2 row(s)
hive>
    # mysql, oracle, or postgresql configuration.
    [[[mysql]]]
      # Name to show in the UI.
      nice_name="My SQL DB"

      # For MySQL and PostgreSQL, name is the name of the database.
      # For Oracle, Name is instance of the Oracle server. For express edition
      # this is 'xe' by default.
      name=hive

      # Database backend to use. This can be:
      # 1. mysql
      # 2. postgresql
      # 3. oracle
      engine=mysql

      # IP or hostname of the database to connect to.
      host=bigdatamaster

      # Port the database server is listening to. Defaults are:
      # 1. MySQL: 3306
      # 2. PostgreSQL: 5432
      # 3. Oracle Express Edition: 1521
      port=3306

      # Username to authenticate with when connecting to the database.
      user=hive

      # Password matching the username to authenticate with when
      # connecting to the database.
      password=hive

      # Database options to send to the server when connecting.
      # https://docs.djangoproject.com/en/1.4/ref/databases/
      ## options={}
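Before restarting Hue it is worth confirming that the connection details in this block actually work, for example with the mysql command-line client (host, user and password here are simply the values from my config above):

mysql -h bigdatamaster -P 3306 -u hive -phive -e "show tables;" hive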
Configuration of the pig, zookeeper, spark, impala, liboozie/oozie and sqoop modules is covered in separate posts.
HBase module configuration (I ran into a small problem here for now)
1. Configure HBase
Hue reads HBase data through the Thrift interface. The HBase Thrift service is not started by default, so it has to be started manually.
The Thrift service listens on port 9090 by default; check whether that port is already in use:
[hadoop@bigdatamaster conf]$ netstat -nl | grep 9090
[hadoop@bigdatamaster conf]$
Here it is best to keep the default port.
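The Thrift server can be started with the stock HBase daemon script; a sketch, assuming the /home/hadoop/app/hbase layout used earlier:

$HBASE_HOME/bin/hbase-daemon.sh start thrift
# afterwards, netstat -nl | grep 9090 should show the port listening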
My HBase configuration still has a small problem at this point.
Starting Hue
In other words, every service you referenced in the Hue configuration file has to be up and running before you start Hue.
build/env/bin/supervisor
[hadoop@bigdatamaster hue]$ pwd
/home/hadoop/app/hue
[hadoop@bigdatamaster hue]$ ll
total 92
-rw-rw-r--  1 hadoop hadoop  2782 May  5 21:03 app.reg
drwxr-xr-x 22 hadoop hadoop  4096 May  5 21:03 apps
drwxrwxr-x  4 hadoop hadoop  4096 May  5 21:03 build
drwxr-xr-x  3 hadoop hadoop  4096 Apr 26  2016 cloudera
drwxr-xr-x  6 hadoop hadoop  4096 May  7 10:13 desktop
drwxr-xr-x  6 hadoop hadoop  4096 Apr 26  2016 docs
drwxr-xr-x  3 hadoop hadoop  4096 Apr 26  2016 ext
-rw-r--r--  1 hadoop hadoop 11358 Apr 26  2016 LICENSE.txt
drwxrwxr-x  2 hadoop hadoop  4096 May  7 09:27 logs
-rw-r--r--  1 hadoop hadoop  4742 Apr 26  2016 Makefile
-rw-r--r--  1 hadoop hadoop  8505 Apr 26  2016 Makefile.sdk
-rw-r--r--  1 hadoop hadoop  3531 Apr 26  2016 Makefile.vars
-rw-r--r--  1 hadoop hadoop  2192 Apr 26  2016 Makefile.vars.priv
drwxr-xr-x  2 hadoop hadoop  4096 Apr 26  2016 maven
-rw-r--r--  1 hadoop hadoop   801 Apr 26  2016 NOTICE.txt
-rw-r--r--  1 hadoop hadoop  1305 Apr 26  2016 README
drwxr-xr-x  5 hadoop hadoop  4096 Apr 26  2016 tools
-rw-r--r--  1 hadoop hadoop   932 Apr 26  2016 VERSION
[hadoop@bigdatamaster hue]$ build/env/bin/supervisor
[hadoop@bigdatamaster hue]$ build/env/bin/supervisor
[INFO] Not running as root, skipping privilege drop
starting server with options: {'daemonize': False, 'host': '192.168.80.10', 'pidfile': None, 'port': 8888, 'server_group': 'hue', 'server_name': 'localhost', 'server_user': 'hue', 'ssl_certificate': None, 'ssl_certificate_chain': None, 'ssl_cipher_list': 'ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:DHE-DSS-AES128-GCM-SHA256:kEDH+AESGCM:ECDHE-RSA-AES128-SHA256:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA:ECDHE-ECDSA-AES128-SHA:ECDHE-RSA-AES256-SHA384:ECDHE-ECDSA-AES256-SHA384:ECDHE-RSA-AES256-SHA:ECDHE-ECDSA-AES256-SHA:DHE-RSA-AES128-SHA256:DHE-RSA-AES128-SHA:DHE-DSS-AES128-SHA256:DHE-RSA-AES256-SHA256:DHE-DSS-AES256-SHA:DHE-RSA-AES256-SHA:AES128-GCM-SHA256:AES256-GCM-SHA384:AES128-SHA256:AES256-SHA256:AES128-SHA:AES256-SHA:AES:CAMELLIA:DES-CBC3-SHA:!aNULL:!eNULL:!EXPORT:!DES:!RC4:!MD5:!PSK:!aECDH:!EDH-DSS-DES-CBC3-SHA:!EDH-RSA-DES-CBC3-SHA:!KRB5-DES-CBC3-SHA', 'ssl_private_key': None, 'threads': 40, 'workdir': None}
/home/hadoop/app/hue-3.9.0-cdh5.5.4/build/env/lib/python2.6/site-packages/django_axes-1.4.0-py2.6.egg/axes/decorators.py:210: DeprecationWarning: The use of AUTH_PROFILE_MODULE to define user profiles has been deprecated.
  profile = user.get_profile()
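supervisor runs in the foreground here; to keep Hue running after you log out, one simple option is to background it with nohup (or run it inside screen/tmux):

nohup build/env/bin/supervisor > supervisor.log 2>&1 &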
Then open http://bigdatamaster:8888 in a browser.
I won't go into more detail here.
If you run into problems after starting Hue, see the follow-up posts I have written on troubleshooting it.