Installing Hadoop in Pseudo-Distributed Mode on a Mac

After switching to a MacBook Pro I had to reinstall Hadoop, and since the JDK on a Mac is laid out differently from the one on Windows, it took me quite a while to get right. I'm sharing the steps here in the hope that they help others.

I: Download the JDK

Download the latest version from: http://www.oracle.com/technetwork/java/javase/downloads/index.html

After installing, open a terminal and run java -version. Output similar to the following means the installation succeeded:

java version "1.8.0"

Java(TM) SE Runtime Environment (build 1.8.0-b132)

Java HotSpot(TM) 64-Bit Server VM (build 25.0-b70, mixed mode)

II: Configure Hadoop

Download Hadoop from the official site. Pick a stable release rather than the newest one, since the newest may not be stable.

Four files inside Hadoop's conf folder need to be configured: hadoop-env.sh, core-site.xml, mapred-site.xml, and hdfs-site.xml.

After downloading Hadoop, extract it to whatever folder you want to keep it in, then go into Hadoop's conf directory.
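As a rough sketch (the download location ~/Documents/apache is an assumption, chosen to match the paths used in the configs below), the extraction looks like this:

cd ~/Documents/apache
tar xzf hadoop-2.6.0.tar.gz
cd hadoop-2.6.0/conf    # note: in newer 2.x layouts the config files live under etc/hadoop instead of conf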

1. Configure hadoop-env.sh

Open the file and find:

#export JAVA_HOME=
#export HADOOP_HEAPSIZE=2000
#export HADOOP_OPTS=-server

Change these to:

export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home

export HADOOP_HEAPSIZE=2000

#export HADOOP_OPTS=-server

export HADOOP_OPTS="-Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"

 

That is, uncomment the relevant lines (remove the leading #) and fill in the values shown.

Important note:

JAVA_HOME should be set to a path like the one above. Some articles found on Baidu say to use the path printed by running whereis java in the terminal; that is not correct.

On a Mac the JDK is installed under the Library folder at the root of the filesystem.
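As a quick sanity check (an extra step, not part of the original write-up), macOS ships a helper that prints the home directory of the active JDK, which is exactly what JAVA_HOME should be set to:

/usr/libexec/java_home
# prints something like /Library/Java/JavaVirtualMachines/jdk1.8.0.jdk/Contents/Home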

2. Configure core-site.xml

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/Users/username/Documents/apache/hadoop-2.6.0/hadoop_tmp</value>
    <description>A base for other temporary directories.</description>
  </property>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
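Since hadoop.tmp.dir points at a directory that does not exist yet, it is worth creating it up front (the path is taken from the value above; substitute your own username):

mkdir -p /Users/username/Documents/apache/hadoop-2.6.0/hadoop_tmp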

 

3. Configure mapred-site.xml

 

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
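Depending on the release you downloaded, this file may only exist as mapred-site.xml.template (this is the case in many 2.x tarballs); if so, copy the template first and then edit the copy:

cp mapred-site.xml.template mapred-site.xml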

 

4. Configure hdfs-site.xml

 

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

 

III: Configure SSH for passwordless login

1. macOS already comes with SSH. In the terminal, run ssh-keygen -t rsa and simply press Enter whenever you are asked for a passphrase. Output like the following means it succeeded:

Enter passphrase (empty for no passphrase): 
Enter same passphrase again: 
Your identification has been saved in /Users/jia/.ssh/id_rsa.
Your public key has been saved in /Users/jia/.ssh/id_rsa.pub.
The key fingerprint is:
d4:85:aa:83:ae:db:50:48:0c:5b:dd:80:bb:fa:26:a7 jia@JIAS-MacBook-Pro.local
The key's randomart image is:
+--[ RSA 2048]----+
|. .o.o     ..    |
| =. . .  ...     |
|. o.    ...      |
| ...   ..        |
|  .... .S        |
|  ... o          |
| ...   .         |
|o oo.            |
|E*+o.            |
+-----------------+

 

In the terminal, run cd .ssh to go into the .ssh directory, then run:

cp id_rsa.pub authorized_keys

and that is all.
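If ssh still prompts for a password later on, the usual culprit is overly loose permissions on the key files; tightening them is a common extra step (not in the original notes):

chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys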

2. On the Mac, run:

ssh localhost

If you see:

ssh: connect to host localhost port 22: Connection refused

it means the current user is not allowed to log in over SSH. This is probably the system default, set for security reasons.

Change the setting as follows: open System Preferences --> Sharing --> tick Remote Login and set "Allow access for: All users". Run "ssh localhost" again, enter your password and confirm, and you should see the SSH login succeed.
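Remote Login can also be toggled from the terminal instead of System Preferences, if that is more convenient (requires administrator rights):

sudo systemsetup -setremotelogin on
sudo systemsetup -getremotelogin    # should report: Remote Login: On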

If instead you see:

No such file or directory...

1. Start the sshd service:
sudo launchctl load -w /System/Library/LaunchDaemons/ssh.plist

2. Check that the service has started, filtering the list:
sudo launchctl list | grep ssh

IV: Configure environment variables

sudo vim /etc/profile

Add:

export HADOOP_HOME=/Users/username/Documents/apache/hadoop-2.6.0
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
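After saving /etc/profile, reload it and do a quick check that the variables took effect (a minimal sanity check; the hadoop command is on the PATH thanks to the exports above):

source /etc/profile
echo $HADOOP_HOME
hadoop version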

V: Start Hadoop

1. Go into the Hadoop folder and format the NameNode with:

bin/hadoop namenode -format

Output like the following means it succeeded:

/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = JIAS-MacBook-Pro.local/192.168.1.3
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 0.20.2
STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20 -r 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
************************************************************/
Re-format filesystem in /tmp/hadoop-jia/dfs/name ? (Y or N) Y
14/07/14 13:55:17 INFO namenode.FSNamesystem: fsOwner=jia,staff,everyone,localaccounts,_appserverusr,admin,_appserveradm,_lpadmin,com.apple.sharepoint.group.1,_appstore,_lpoperator,_developer,com.apple.access_screensharing,com.apple.access_ssh
14/07/14 13:55:17 INFO namenode.FSNamesystem: supergroup=supergroup
14/07/14 13:55:17 INFO namenode.FSNamesystem: isPermissionEnabled=true
14/07/14 13:55:17 INFO common.Storage: Image file of size 93 saved in 0 seconds.
14/07/14 13:55:17 INFO common.Storage: Storage directory /tmp/hadoop-jia/dfs/name has been successfully formatted.
14/07/14 13:55:17 INFO namenode.NameNode: SHUTDOWN_MSG: 
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at JIAS-MacBook-Pro.local/192.168.1.3
************************************************************/

 

2. Start the Hadoop daemons:

bin/start-all.sh
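Once the daemons are up, you can check which of them are actually running with jps (it ships with the JDK); the exact list depends on the Hadoop version:

jps
# expect processes such as NameNode, DataNode and SecondaryNameNode,
# plus JobTracker/TaskTracker (0.20.x) or ResourceManager/NodeManager (2.x)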

3. Stop the Hadoop daemons:

bin/stop-all.sh

VI: Troubleshooting

1. I used to keep running into the following errors:

cat: Documents/hadoop-0.20.2/conf/slaves: No such file or directory
cat: Documents/hadoop-0.20.2/conf/masters: No such file or directory

These were actually caused by the environment variable below, which I had added while setting up Mahout:

export HADOOP_CONF_DIR=Documents/hadoop-0.20.2/conf

My environment variables used to look like this (run open /etc/ in the terminal, find the profile file, open it, and edit the environment variables there):

export HADOOP_HOME=Documents/hadoop-0.20.0
export MAHOUT_HOME=Documents/hadoop-0.20.2/mahout-distribution-0.9
export MAVEN_HOME=Documents/apache-maven-3.2.2


export PATH=$PATH:$HADOOP_HOME/bin:$MAHOUT_HOME/bin:$MAVEN_HOME/bin

export HADOOP_CONF_DIR=Documents/hadoop-0.20.2/conf
export MAHOUT_CONF_DIR=Documents/hadoop-0.20.2/mahout-distribution-0.9/conf

export classpath=$classpath:$MAHOUT_HOME/lib:$HADOOP_CONF_DIR:$MAHOUT_CONF_DIR

Changing the environment configuration to the following made everything run normally:

export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0.jdk/Contents/Home
export HADOOP_HOME=Documents/hadoop-0.20.0
export MAHOUT_HOME=Documents/hadoop-0.20.2/mahout-distribution-0.9
export MAVEN_HOME=Documents/apache-maven-3.2.2

export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$MAVEN_HOME/bin:$MAHOUT_HOME/bin

 

export MAHOUT_CONF_DIR=Documents/hadoop-0.20.2/mahout-distribution-0.9/conf

export classpath=$classpath:$JAVA_HOME/lib:$MAHOUT_HOME/lib:$MAHOUT_CONF_DIR

Be sure to add the JAVA_HOME entries (the export line plus the $JAVA_HOME additions to PATH and classpath), and remove:

export HADOOP_CONF_DIR=Documents/hadoop-0.20.2/conf

Note:

After modifying the environment configuration in /etc/profile, apply the changes with:

source /etc/profile

2. If you build the Hadoop source code yourself, the build may fail with an org.apache.maven.plugin.MojoExecutionException error:

 Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:2.6.2:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: 'protoc --version' did not return a version -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:2.6.2:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: 'protoc --version' did not return a version
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:212)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
    at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
    at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
    at org.apache.maven.cli.MavenCli.execute(MavenCli.java:863)
    at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:288)
    at org.apache.maven.cli.MavenCli.main(MavenCli.java:199)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.plugin.MojoExecutionException: org.apache.maven.plugin.MojoExecutionException: 'protoc --version' did not return a version
    at org.apache.hadoop.maven.plugin.protoc.ProtocMojo.execute(ProtocMojo.java:105)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:207)
    ... 20 more
Caused by: org.apache.maven.plugin.MojoExecutionException: 'protoc --version' did not return a version
    at org.apache.hadoop.maven.plugin.protoc.ProtocMojo.execute(ProtocMojo.java:68)
    ... 22 more

Download protobuf-2.4.1.tar.gz:

http://vdisk.weibo.com/s/tYlk6JrNUYTY

Build and install protobuf:

tar xzf protobuf-2.4.1.tar.gz
cd protobuf-2.4.1
./configure
make
sudo make install
sudo ldconfig    # Linux only; on macOS this step can simply be skipped

Then re-run the Hadoop build.
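Before kicking off Maven again, it is worth confirming that protoc is now on the PATH and reports a version (the build command below is the common one from Hadoop's BUILDING.txt for a binary distribution, shown here only as an example):

protoc --version
mvn package -Pdist -DskipTests -Dtar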

3. The NameNode does not start

Solution: format the NameNode first, then start the daemons again:

bin/hadoop namenode -format
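If the NameNode still refuses to start after formatting, the reason is usually in its log under the logs directory; the file name follows a hadoop-<user>-namenode-<hostname>.log pattern:

tail -n 50 logs/hadoop-*-namenode-*.log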