Compiling Hadoop 1.0.2: Build Notes and Problem Log

1. Install Eclipse 3.6.2; avoid 3.7, which has quite a few problems.

2. Install the Eclipse IvyDE plugin

You can install Apache IvyDE plugins from the IvyDE update site: http://www.apache.org/dist/ant/ivyde/updatesite.

First you have to configure Eclipse by adding the IvyDE update site. To do that, follow these steps (note that for Eclipse 3.4 they may differ):

  • Open the update manager in Eclipse: Help > Software Updates > Find and Install...
  • In the popup window, select Search for features to install, and click Next
  • Then click on New Remote Site...
  • Name: Apache Ivy update site
  • URL: http://www.apache.org/dist/ant/ivyde/updatesite
  • Click OK
  • A new entry "Apache Ivy update site" will appear in the list of update sites

3. The installation steps below are quoted from another blog; if you're interested, you can follow the link below to read the original.

http://gushuizerotoone.iteye.com/blog/638480

-------------------start ref-----

1. Edit $HADOOP_HOME/src/contrib/build-contrib.xml
Add one line: <property name="eclipse.home" location="/home/gushui/eclipse"/>
Replace /home/gushui/eclipse in that line with your own $ECLIPSE_HOME.

2. Edit $HADOOP_HOME/src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/launch/HadoopApplicationLaunchShortcut.java
Comment out the original import org.eclipse.jdt.internal.debug.ui.launcher.JavaApplicationLaunchShortcut;
and change it to import org.eclipse.jdt.debug.ui.launchConfigurations.JavaApplicationLaunchShortcut;
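This import swap can also be scripted. Below is a minimal sketch using sed, demonstrated on a throwaway scratch file so it can run anywhere; to apply it for real, point SRC at the actual HadoopApplicationLaunchShortcut.java path given above.

```shell
# Scratch demo of the import swap; SRC here is a temp file standing in for
# HadoopApplicationLaunchShortcut.java (set SRC to the real path to apply it).
SRC=$(mktemp)
printf 'import org.eclipse.jdt.internal.debug.ui.launcher.JavaApplicationLaunchShortcut;\n' > "$SRC"
# Replace the internal JDT import with the public launchConfigurations API.
sed -i 's|org\.eclipse\.jdt\.internal\.debug\.ui\.launcher\.JavaApplicationLaunchShortcut|org.eclipse.jdt.debug.ui.launchConfigurations.JavaApplicationLaunchShortcut|' "$SRC"
cat "$SRC"
```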

3. Run:

  • cd $HADOOP_HOME
  • ant compile
  • ln -sf $HADOOP_HOME/docs $HADOOP_HOME/build/docs
  • ant package -Djava5.home=/usr/lib/jvm/java-1.5.0-sun-1.5.0.19 -Dforrest.home=/home/gushui/src/apache-forrest-0.8


Note: install apache-forrest-0.8 (download from http://forrest.apache.org/mirrors.cgi) and put it at /home/gushui/src/apache-forrest-0.8.

Note: here I used JDK 1.5.0.22 and apache-forrest-0.9.

Note that the java5 and apache-forrest paths above must be set according to your own installation paths.
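The build commands above can be collected into one small script so all the paths live in one place. This is only a sketch: the default paths are placeholders you must change to match your installs, and by default it just prints the commands (set RUN to an empty value to actually execute them).

```shell
# Sketch of the build sequence; the HADOOP_HOME, JAVA5_HOME and FORREST_HOME
# defaults below are placeholders, not real paths on your machine.
HADOOP_HOME="${HADOOP_HOME:-$HOME/hadoop-1.0.2}"
JAVA5_HOME="${JAVA5_HOME:-/usr/lib/jvm/java-1.5.0-sun-1.5.0.19}"
FORREST_HOME="${FORREST_HOME:-$HOME/src/apache-forrest-0.8}"
RUN="${RUN-echo}"   # dry-run by default; set RUN= to run the commands for real

cd "$HADOOP_HOME" 2>/dev/null || echo "note: $HADOOP_HOME does not exist yet"
$RUN ant compile
$RUN ln -sf "$HADOOP_HOME/docs" "$HADOOP_HOME/build/docs"
$RUN ant package "-Djava5.home=$JAVA5_HOME" "-Dforrest.home=$FORREST_HOME"
```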
OK, the build should now produce $HADOOP_HOME/build/contrib/eclipse-plugin/hadoop-0.20.3-dev-eclipse-plugin.jar.

Rename it to hadoop-0.20.2-eclipse-plugin.jar and you're done. Why the rename is needed I'm not really sure; the version itself is 0.20.2, yet the build pops out 0.20.3.
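The rename itself is a single mv. A sketch, demonstrated in a scratch directory standing in for $HADOOP_HOME/build/contrib/eclipse-plugin (the version strings are the ones from the quoted post):

```shell
PLUGIN_DIR=$(mktemp -d)                                   # stand-in for build/contrib/eclipse-plugin
touch "$PLUGIN_DIR/hadoop-0.20.3-dev-eclipse-plugin.jar"  # the name ant emitted
mv "$PLUGIN_DIR/hadoop-0.20.3-dev-eclipse-plugin.jar" \
   "$PLUGIN_DIR/hadoop-0.20.2-eclipse-plugin.jar"         # the name Eclipse expects
ls "$PLUGIN_DIR"
```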
Note: I built 1.0.2. If you build 0.20.203 you need to change the file above, otherwise the resulting Eclipse plugin cannot connect to the DFS server.

5. A few things to note:
(1) Put this jar into Eclipse's plugins directory and restart Eclipse. That didn't seem to work for me, so I used the clumsiest method: deleted Eclipse, re-extracted the tar file, and reinstalled; after that it worked.

(2) Quoting the original post (mine behaved the same way): if Eclipse's run as -> run on hadoop still does nothing when clicked, first do run as -> java application, and then run as -> run on hadoop will work.

-----------------end ref

4. Problems I ran into while carrying out the steps above:

         a. Execute failed: java.io.IOException: Cannot run program "autoreconf" (in directory ...

This was solved by installing the tools:

"sudo apt-get install automake autoconf" 

         b. Another breakage; as my motto goes, we are never that lucky.

     [exec] * [15/35]   [0/0]     0.086s 0b      hdfs_user_guide.pdf
     [exec] Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/fop/messaging/MessageHandler
     [exec]     at org.apache.cocoon.serialization.FOPSerializer.configure(FOPSerializer.java:122)
     [exec]     at org.apache.avalon.framework.container.ContainerUtil.configure(ContainerUtil.java:201)
     [exec]     at org.apache.avalon.excalibur.component.DefaultComponentFactory.newInstance(DefaultComponentFactory.java:289)

Fix: ant clean, then repeat the steps above.

task-controller:
     [exec] Can't exec "libtoolize": No such file or directory at /usr/bin/autoreconf line 196.

Fix: sudo apt-get install libtool

[exec] /usr/include/fts.h:41:3: error: #error "<fts.h> cannot be used with -D_FILE_OFFSET_BITS==64"
     [exec] make: *** [impl/task-controller.o] Error 1
Fix: finally found the answer for this one. A fix has already been provided; see the MAPREDUCE-2178 fix link.

 In short, the task-controller does not need large-file operations, so AC_SYS_LARGEFILE can be removed. Steps:

1. Find the file $HADOOP_HOME/src/c++/task-controller/configure.ac

2. Find the line AC_SYS_LARGEFILE and comment it out.
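Steps 1 and 2 can be done with a single sed command. A sketch on a scratch stand-in file (the three-line content is made up for the demo; for the real edit, set CONFIGURE_AC to $HADOOP_HOME/src/c++/task-controller/configure.ac):

```shell
CONFIGURE_AC=$(mktemp)
printf 'AC_INIT\nAC_SYS_LARGEFILE\nAC_OUTPUT\n' > "$CONFIGURE_AC"  # made-up minimal stand-in
# 'dnl' is the m4/autoconf comment marker, so this line comments the macro out.
sed -i 's/^AC_SYS_LARGEFILE/dnl AC_SYS_LARGEFILE/' "$CONFIGURE_AC"
cat "$CONFIGURE_AC"
```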

Re-ran the ant package step. Oh yeah, it passed!!!!
