Hadoop 2.7.3 (Hadoop 2.x): building the Eclipse plugin hadoop-eclipse-plugin-2.7.3.jar with Ant

  To do MapReduce development in Eclipse you need the matching Hadoop plugin, hadoop-eclipse-plugin-2.7.3.jar. Some background first: up to Hadoop 1.x the official Hadoop distribution shipped with an Eclipse plugin. As developers' Eclipse versions multiplied and diverged, the plugin had to match the specific IDE version, and no single bundled plugin could be compatible with all of them. To keep things simple, current Hadoop distributions no longer include an Eclipse plugin; you have to build one yourself against your own Eclipse.

1. Environment preparation

  Here are the environment and tools I used to build my own Eclipse plugin with Ant (adjust the install paths to your own setup):

  OS: 64-bit Ubuntu 14.04 (the OS doesn't really matter; Windows works too and the steps are the same)

  JDK: jdk-7u80-linux-x64.tar.gz, installed at /usr/lib/jvm

  Eclipse: eclipse-jee-mars-2-linux-gtk-x86_64.tar.gz, installed at /home/hadoop/

  Hadoop: hadoop-2.7.3.tar.gz, installed at /usr/local

  Ant: any recent version is fine, installed either from the binary tarball or via apt-get, as long as the environment variables are set up. My Ant version is 1.9.3; yours may be 1.9.7.

  export ANT_HOME=/usr/local/apache-ant-1.9.3
  export PATH=$PATH:$ANT_HOME/bin

  If Ant complains that it cannot find ant-launcher.jar, add it to the classpath via an environment variable:

  export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/jre/lib:$JAVA_HOME/lib/tools.jar:$ANT_HOME/lib/ant-launcher.jar

$ ant -version
Apache Ant(TM) version 1.9.3 compiled on April 8 2014
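
  To make these settings permanent, they can be appended to ~/.bashrc and reloaded; a minimal sketch, assuming the Ant 1.9.3 binary distribution is unpacked under /usr/local as above:

$ cat >> ~/.bashrc <<'EOF'
export ANT_HOME=/usr/local/apache-ant-1.9.3
export PATH=$PATH:$ANT_HOME/bin
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/jre/lib:$JAVA_HOME/lib/tools.jar:$ANT_HOME/lib/ant-launcher.jar
EOF
$ source ~/.bashrc
$ ant -version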

  I used hadoop2x-eclipse-plugin-master.zip from the hadoop2x-eclipse-plugin project on GitHub (https://github.com/winghc/hadoop2x-eclipse-plugin). This is the latest snapshot; it corresponds to hadoop2x-eclipse-plugin-2.6.0.zip, i.e. it is configured for Hadoop 2.6.0.

  Download it as a zip and extract it to a suitable location, making sure the path's permissions and directory ownership belong to the current user.
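
  A minimal sketch of the download and extraction, assuming the same paths as above (the URL uses GitHub's standard archive pattern for the master branch):

$ cd /home/hadoop
$ wget -O hadoop2x-eclipse-plugin-master.zip https://github.com/winghc/hadoop2x-eclipse-plugin/archive/master.zip
$ unzip hadoop2x-eclipse-plugin-master.zip
$ ls -ld hadoop2x-eclipse-plugin-master    # verify the directory is owned by the current user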

  The three build tools and resources live at the following paths:

eclipse : /home/hadoop/eclipse
hadoop : /usr/local/hadoop-2.7.3
hadoop2x-eclipse-plugin-master : /home/hadoop/hadoop2x-eclipse-plugin-master

2. Building hadoop-eclipse-plugin-2.7.3.jar

  Since my Hadoop version is 2.7.3, the plugin has to be built separately; fortunately the GitHub project explains how to build it.

  Extract the downloaded hadoop2x-eclipse-plugin archive and go into the directory hadoop2x-eclipse-plugin-master/src/contrib/eclipse-plugin/ to do the work. The build instructions given by the GitHub project are as follows:

eclipse plugin for hadoop 2.x.x
How to build  
[hdpusr@demo hadoop2x-eclipse-plugin]$ cd src/contrib/eclipse-plugin
# Assume hadoop installation directory is /usr/share/hadoop 
[hdpusr@apclt eclipse-plugin]$ ant jar -Dversion=2.4.1 -Dhadoop.version=2.4.1 -Declipse.home=/opt/eclipse -Dhadoop.home=/usr/share/hadoop
final jar will be generated at directory
${hadoop2x-eclipse-plugin}/build/contrib/eclipse-plugin/hadoop-eclipse-plugin-2.4.1.jar

2.1 Modify the relevant files. There are two of them: hadoop2x-eclipse-plugin-master/src/contrib/eclipse-plugin/build.xml and hadoop2x-eclipse-plugin-master/ivy/libraries.properties

  What I need now is the Eclipse plugin for Hadoop 2.7.3, but the hadoop2x-eclipse-plugin downloaded from GitHub is configured for a Hadoop 2.6 build environment, so before running ant the build.xml configuration file and its related files have to be modified.

  The first file, hadoop2x-eclipse-plugin-master/src/contrib/eclipse-plugin/build.xml, originally looks like this:

  1 <?xml version="1.0" encoding="UTF-8" standalone="no"?>
  2 
  3 <!--
  4    Licensed to the Apache Software Foundation (ASF) under one or more
  5    contributor license agreements.  See the NOTICE file distributed with
  6    this work for additional information regarding copyright ownership.
  7    The ASF licenses this file to You under the Apache License, Version 2.0
  8    (the "License"); you may not use this file except in compliance with
  9    the License.  You may obtain a copy of the License at
 10 
 11        http://www.apache.org/licenses/LICENSE-2.0
 12 
 13    Unless required by applicable law or agreed to in writing, software
 14    distributed under the License is distributed on an "AS IS" BASIS,
 15    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 16    See the License for the specific language governing permissions and
 17    limitations under the License.
 18 -->
 19 
 20 <project default="jar" name="eclipse-plugin">
 21 
 22   <import file="../build-contrib.xml"/>
 23 
 24   <path id="eclipse-sdk-jars">
 25     <fileset dir="${eclipse.home}/plugins/">
 26       <include name="org.eclipse.ui*.jar"/>
 27       <include name="org.eclipse.jdt*.jar"/>
 28       <include name="org.eclipse.core*.jar"/>
 29       <include name="org.eclipse.equinox*.jar"/>
 30       <include name="org.eclipse.debug*.jar"/>
 31       <include name="org.eclipse.osgi*.jar"/>
 32       <include name="org.eclipse.swt*.jar"/>
 33       <include name="org.eclipse.jface*.jar"/>
 34 
 35       <include name="org.eclipse.team.cvs.ssh2*.jar"/>
 36       <include name="com.jcraft.jsch*.jar"/>
 37     </fileset> 
 38   </path>
 39 
 40   <path id="hadoop-sdk-jars">
 41     <fileset dir="${hadoop.home}/share/hadoop/mapreduce">
 42       <include name="hadoop*.jar"/>
 43     </fileset> 
 44     <fileset dir="${hadoop.home}/share/hadoop/hdfs">
 45       <include name="hadoop*.jar"/>
 46     </fileset> 
 47     <fileset dir="${hadoop.home}/share/hadoop/common">
 48       <include name="hadoop*.jar"/>
 49     </fileset> 
 50   </path>
 51 
 52 
 53 
 54   <!-- Override classpath to include Eclipse SDK jars -->
 55   <path id="classpath">
 56     <pathelement location="${build.classes}"/>
 57     <!--pathelement location="${hadoop.root}/build/classes"/-->
 58     <path refid="eclipse-sdk-jars"/>
 59     <path refid="hadoop-sdk-jars"/>
 60   </path>
 61 
 62   <!-- Skip building if eclipse.home is unset. -->
 63   <target name="check-contrib" unless="eclipse.home">
 64     <property name="skip.contrib" value="yes"/>
 65     <echo message="eclipse.home unset: skipping eclipse plugin"/>
 66   </target>
 67 
 68  <target name="compile" depends="init, ivy-retrieve-common" unless="skip.contrib">
 69     <echo message="contrib: ${name}"/>
 70     <javac
 71      encoding="${build.encoding}"
 72      srcdir="${src.dir}"
 73      includes="**/*.java"
 74      destdir="${build.classes}"
 75      debug="${javac.debug}"
 76      deprecation="${javac.deprecation}">
 77      <classpath refid="classpath"/>
 78     </javac>
 79   </target>
 80 
 81   <!-- Override jar target to specify manifest -->
 82   <target name="jar" depends="compile" unless="skip.contrib">
 83     <mkdir dir="${build.dir}/lib"/>
 84     <copy  todir="${build.dir}/lib/" verbose="true">
 85           <fileset dir="${hadoop.home}/share/hadoop/mapreduce">
 86            <include name="hadoop*.jar"/>
 87           </fileset>
 88     </copy>
 89     <copy  todir="${build.dir}/lib/" verbose="true">
 90           <fileset dir="${hadoop.home}/share/hadoop/common">
 91            <include name="hadoop*.jar"/>
 92           </fileset>
 93     </copy>
 94     <copy  todir="${build.dir}/lib/" verbose="true">
 95           <fileset dir="${hadoop.home}/share/hadoop/hdfs">
 96            <include name="hadoop*.jar"/>
 97           </fileset>
 98     </copy>
 99     <copy  todir="${build.dir}/lib/" verbose="true">
100           <fileset dir="${hadoop.home}/share/hadoop/yarn">
101            <include name="hadoop*.jar"/>
102           </fileset>
103     </copy>
104 
105     <copy  todir="${build.dir}/classes" verbose="true">
106           <fileset dir="${root}/src/java">
107            <include name="*.xml"/>
108           </fileset>
109     </copy>
110 
111 
112 
113     <copy file="${hadoop.home}/share/hadoop/common/lib/protobuf-java-${protobuf.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
114     <copy file="${hadoop.home}/share/hadoop/common/lib/log4j-${log4j.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
115     <copy file="${hadoop.home}/share/hadoop/common/lib/commons-cli-${commons-cli.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
116     <copy file="${hadoop.home}/share/hadoop/common/lib/commons-configuration-${commons-configuration.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
117     <copy file="${hadoop.home}/share/hadoop/common/lib/commons-lang-${commons-lang.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
118     <copy file="${hadoop.home}/share/hadoop/common/lib/commons-collections-${commons-collections.version}.jar"  todir="${build.dir}/lib" verbose="true"/>  
119     <copy file="${hadoop.home}/share/hadoop/common/lib/jackson-core-asl-${jackson.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
120     <copy file="${hadoop.home}/share/hadoop/common/lib/jackson-mapper-asl-${jackson.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
121     <copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-log4j12-${slf4j-log4j12.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
122     <copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-api-${slf4j-api.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
123     <copy file="${hadoop.home}/share/hadoop/common/lib/guava-${guava.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
124     <copy file="${hadoop.home}/share/hadoop/common/lib/hadoop-auth-${hadoop.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
125     <copy file="${hadoop.home}/share/hadoop/common/lib/commons-cli-${commons-cli.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
126     <copy file="${hadoop.home}/share/hadoop/common/lib/netty-${netty.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
127     <copy file="${hadoop.home}/share/hadoop/common/lib/htrace-core-${htrace.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
128 
129     <jar
130       jarfile="${build.dir}/hadoop-${name}-${hadoop.version}.jar"
131       manifest="${root}/META-INF/MANIFEST.MF">
132       <manifest>
133    <attribute name="Bundle-ClassPath" 
134     value="classes/, 
135  lib/hadoop-mapreduce-client-core-${hadoop.version}.jar,
136  lib/hadoop-mapreduce-client-common-${hadoop.version}.jar,
137  lib/hadoop-mapreduce-client-jobclient-${hadoop.version}.jar,
138  lib/hadoop-auth-${hadoop.version}.jar,
139  lib/hadoop-common-${hadoop.version}.jar,
140  lib/hadoop-hdfs-${hadoop.version}.jar,
141  lib/protobuf-java-${protobuf.version}.jar,
142  lib/log4j-${log4j.version}.jar,
143  lib/commons-cli-${commons-cli.version}.jar,
144  lib/commons-configuration-${commons-configuration.version}.jar,
145  lib/commons-httpclient-${commons-httpclient.version}.jar,
146  lib/commons-lang-${commons-lang.version}.jar,  
147  lib/commons-collections-${commons-collections.version}.jar,  
148  lib/jackson-core-asl-${jackson.version}.jar,
149  lib/jackson-mapper-asl-${jackson.version}.jar,
150  lib/slf4j-log4j12-${slf4j-log4j12.version}.jar,
151  lib/slf4j-api-${slf4j-api.version}.jar,
152  lib/guava-${guava.version}.jar,
153  lib/netty-${netty.version}.jar,
154  lib/htrace-core-${htrace.version}.jar"/>
155    </manifest>
156       <fileset dir="${build.dir}" includes="classes/ lib/"/>
157       <!--fileset dir="${build.dir}" includes="*.xml"/-->
158       <fileset dir="${root}" includes="resources/ plugin.xml"/>
159     </jar>
160   </target>
161 
162 </project>

  In the original build.xml, find the comment <!-- Override jar target to specify manifest --> at line 81 and the tag <target name="jar" depends="compile" unless="skip.contrib"> at line 82, then add to and adjust the <copy> child elements just below line 127. Delete line 127,

<copy file="${hadoop.home}/share/hadoop/common/lib/htrace-core-${htrace.version}.jar"  todir="${build.dir}/lib" verbose="true"/>

and add the following three lines in its place:

<copy file="${hadoop.home}/share/hadoop/common/lib/htrace-core-${htrace.version}-incubating.jar"  todir="${build.dir}/lib" verbose="true"/>  
<copy file="${hadoop.home}/share/hadoop/common/lib/servlet-api-${servlet-api.version}.jar"  todir="${build.dir}/lib" verbose="true"/>  
<copy file="${hadoop.home}/share/hadoop/common/lib/commons-io-${commons-io.version}.jar"  todir="${build.dir}/lib" verbose="true"/> 

  Then find the <attribute name="Bundle-ClassPath" element (line 133 of the unmodified build.xml) and make the corresponding additions and changes to the jar entries in its value list. Delete line 154, lib/htrace-core-${htrace.version}.jar, and add the following three entries in its place:

lib/servlet-api-${servlet-api.version}.jar,  
lib/commons-io-${commons-io.version}.jar,  
lib/htrace-core-${htrace.version}-incubating.jar"/> 

  Save and exit. Note that without these changes the jar will still build, but once you put it into Eclipse the connection configuration will fail with errors. The modified section looks like this:

  <!-- Override jar target to specify manifest -->
  <target name="jar" depends="compile" unless="skip.contrib">
    <mkdir dir="${build.dir}/lib"/>
    <copy  todir="${build.dir}/lib/" verbose="true">
          <fileset dir="${hadoop.home}/share/hadoop/mapreduce">
           <include name="hadoop*.jar"/>
          </fileset>
    </copy>
    <copy  todir="${build.dir}/lib/" verbose="true">
          <fileset dir="${hadoop.home}/share/hadoop/common">
           <include name="hadoop*.jar"/>
          </fileset>
    </copy>
    <copy  todir="${build.dir}/lib/" verbose="true">
          <fileset dir="${hadoop.home}/share/hadoop/hdfs">
           <include name="hadoop*.jar"/>
          </fileset>
    </copy>
    <copy  todir="${build.dir}/lib/" verbose="true">
          <fileset dir="${hadoop.home}/share/hadoop/yarn">
           <include name="hadoop*.jar"/>
          </fileset>
    </copy>

    <copy  todir="${build.dir}/classes" verbose="true">
          <fileset dir="${root}/src/java">
           <include name="*.xml"/>
          </fileset>
    </copy>



    <copy file="${hadoop.home}/share/hadoop/common/lib/protobuf-java-${protobuf.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/log4j-${log4j.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-cli-${commons-cli.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-configuration-${commons-configuration.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-lang-${commons-lang.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-collections-${commons-collections.version}.jar"  todir="${build.dir}/lib" verbose="true"/>  
    <copy file="${hadoop.home}/share/hadoop/common/lib/jackson-core-asl-${jackson.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/jackson-mapper-asl-${jackson.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-log4j12-${slf4j-log4j12.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-api-${slf4j-api.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/guava-${guava.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/hadoop-auth-${hadoop.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-cli-${commons-cli.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/netty-${netty.version}.jar"  todir="${build.dir}/lib" verbose="true"/>

    <!--my added, 3 lines-->
    <copy file="${hadoop.home}/share/hadoop/common/lib/htrace-core-${htrace.version}-incubating.jar"  todir="${build.dir}/lib" verbose="true"/>  
    <copy file="${hadoop.home}/share/hadoop/common/lib/servlet-api-${servlet-api.version}.jar"  todir="${build.dir}/lib" verbose="true"/>  
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-io-${commons-io.version}.jar"  todir="${build.dir}/lib" verbose="true"/> 

    <jar
      jarfile="${build.dir}/hadoop-${name}-${hadoop.version}.jar"
      manifest="${root}/META-INF/MANIFEST.MF">
      <manifest>
   <attribute name="Bundle-ClassPath" 
    value="classes/, 
 lib/hadoop-mapreduce-client-core-${hadoop.version}.jar,
 lib/hadoop-mapreduce-client-common-${hadoop.version}.jar,
 lib/hadoop-mapreduce-client-jobclient-${hadoop.version}.jar,
 lib/hadoop-auth-${hadoop.version}.jar,
 lib/hadoop-common-${hadoop.version}.jar,
 lib/hadoop-hdfs-${hadoop.version}.jar,
 lib/protobuf-java-${protobuf.version}.jar,
 lib/log4j-${log4j.version}.jar,
 lib/commons-cli-${commons-cli.version}.jar,
 lib/commons-configuration-${commons-configuration.version}.jar,
 lib/commons-httpclient-${commons-httpclient.version}.jar,
 lib/commons-lang-${commons-lang.version}.jar,  
 lib/commons-collections-${commons-collections.version}.jar,  
 lib/jackson-core-asl-${jackson.version}.jar,
 lib/jackson-mapper-asl-${jackson.version}.jar,
 lib/slf4j-log4j12-${slf4j-log4j12.version}.jar,
 lib/slf4j-api-${slf4j-api.version}.jar,
 lib/guava-${guava.version}.jar,
 lib/netty-${netty.version}.jar,
 lib/servlet-api-${servlet-api.version}.jar,  
 lib/commons-io-${commons-io.version}.jar,  
 lib/htrace-core-${htrace.version}-incubating.jar"/>
   </manifest>
      <fileset dir="${build.dir}" includes="classes/ lib/"/>
      <!--fileset dir="${build.dir}" includes="*.xml"/-->
      <fileset dir="${root}" includes="resources/ plugin.xml"/>
    </jar>
  </target>

  Adding and changing these libs alone is not enough, though: many of the jar versions under share/hadoop/common/lib/ differ between Hadoop 2.6 and Hadoop 2.7, so the corresponding version numbers also have to be updated. This cost me half a day, matching them up one by one.

  The file that pins these versions lives in the ivy directory under the hadoop2x-eclipse-plugin-master root, i.e. hadoop2x-eclipse-plugin-master/ivy/libraries.properties. Each entry in libraries.properties must match the version of the corresponding jar shipped with Hadoop (found under the subdirectories of share/hadoop/); a quick way to check the shipped versions is sketched below.
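
  A quick way to see which versions your Hadoop distribution actually ships, so the properties below can be checked against them (a sketch, assuming Hadoop is installed at /usr/local/hadoop-2.7.3):

$ ls /usr/local/hadoop-2.7.3/share/hadoop/common/lib/ | grep -E 'htrace|servlet-api|commons-io|slf4j|commons-collections'
# for Hadoop 2.7.3 this lists e.g. htrace-core-3.1.0-incubating.jar, servlet-api-2.5.jar, commons-io-2.4.jar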

  To save you the trouble, here is my copy; simply overwrite the original file with it. The commented-out lines marked "modify" show the original values that were changed.

#   Licensed under the Apache License, Version 2.0 (the "License");
#   you may not use this file except in compliance with the License.
#   You may obtain a copy of the License at
#
#       http://www.apache.org/licenses/LICENSE-2.0
#
#   Unless required by applicable law or agreed to in writing, software
#   distributed under the License is distributed on an "AS IS" BASIS,
#   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#   See the License for the specific language governing permissions and
#   limitations under the License.

#This properties file lists the versions of the various artifacts used by hadoop and components.
#It drives ivy and the generation of a maven POM

# This is the version of hadoop we are generating
#hadoop.version=2.6.0    modify
hadoop.version=2.7.3
hadoop-gpl-compression.version=0.1.0

#These are the versions of our dependencies (in alphabetical order)
apacheant.version=1.7.0
ant-task.version=2.0.10

asm.version=3.2
aspectj.version=1.6.5
aspectj.version=1.6.11

checkstyle.version=4.2

commons-cli.version=1.2
commons-codec.version=1.4
#commons-collections.version=3.2.1    modify
commons-collections.version=3.2.2
commons-configuration.version=1.6
commons-daemon.version=1.0.13
#commons-httpclient.version=3.0.1    modify
commons-httpclient.version=3.1
commons-lang.version=2.6
#commons-logging.version=1.0.4        modify
commons-logging.version=1.1.3
#commons-logging-api.version=1.0.4    modify
commons-logging-api.version=1.1.3
#commons-math.version=2.1    modify
commons-math.version=3.1.1
commons-el.version=1.0
commons-fileupload.version=1.2
#commons-io.version=2.1        modify
commons-io.version=2.4
commons-net.version=3.1
core.version=3.1.1
coreplugin.version=1.3.2

#hsqldb.version=1.8.0.10    modify
hsqldb.version=2.0.0
#htrace.version=3.0.4    modify
htrace.version=3.1.0

ivy.version=2.1.0

jasper.version=5.5.12
jackson.version=1.9.13
#not able to figureout the version of jsp & jsp-api version to get it resolved throught ivy
# but still declared here as we are going to have a local copy from the lib folder
jsp.version=2.1
jsp-api.version=5.5.12
jsp-api-2.1.version=6.1.14
jsp-2.1.version=6.1.14
#jets3t.version=0.6.1    modify
jets3t.version=0.9.0
jetty.version=6.1.26
jetty-util.version=6.1.26
#jersey-core.version=1.8    modify
#jersey-json.version=1.8    modify
#jersey-server.version=1.8    modify
jersey-core.version=1.9
jersey-json.version=1.9
jersey-server.version=1.9
#junit.version=4.5    modify
junit.version=4.11
jdeb.version=0.8
jdiff.version=1.0.9
json.version=1.0

kfs.version=0.1

log4j.version=1.2.17
lucene-core.version=2.3.1

mockito-all.version=1.8.5
jsch.version=0.1.42

oro.version=2.0.8

rats-lib.version=0.5.1

servlet.version=4.0.6
servlet-api.version=2.5
#slf4j-api.version=1.7.5    modify
#slf4j-log4j12.version=1.7.5    modify
slf4j-api.version=1.7.10
slf4j-log4j12.version=1.7.10

wagon-http.version=1.0-beta-2
xmlenc.version=0.52
#xerces.version=1.4.4    modify
xerces.version=2.9.1

protobuf.version=2.5.0
guava.version=11.0.2
netty.version=3.6.2.Final

  With these modifications done, everything is ready: time to run ant.

2.2 Running ant

  Go into src/contrib/eclipse-plugin/ and run the ant command as follows (if you lack permissions, you can switch to root with su root):

$ cd /home/hadoop/hadoop2x-eclipse-plugin-master/src/contrib/eclipse-plugin/
$ su root
$ ant jar -Dhadoop.version=2.7.3 -Declipse.home=/home/hadoop/eclipse -Dhadoop.home=/usr/local/hadoop-2.7.3

  The first run of this process is a bit slow; subsequent runs are much faster.

  When the output ends as shown below, the ant build has succeeded:

Buildfile: /home/hadoop/hadoop2x-eclipse-plugin-master/src/contrib/eclipse-plugin/build.xml

check-contrib:

init:
     [echo] contrib: eclipse-plugin

init-contrib:

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ ::
[ivy:configure] :: loading settings :: file = /home/hadoop/hadoop2x-eclipse-plugin-master/ivy/ivysettings.xml

ivy-resolve-common:

ivy-retrieve-common:
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = /home/hadoop/hadoop2x-eclipse-plugin-master/ivy/ivysettings.xml

compile:
     [echo] contrib: eclipse-plugin
    [javac] /home/hadoop/hadoop2x-eclipse-plugin-master/src/contrib/eclipse-plugin/build.xml:76: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds

jar:
     [copy] Copying 1 file to /home/hadoop/hadoop2x-eclipse-plugin-master/build/contrib/eclipse-plugin/lib
     [copy] Copying /usr/local/hadoop-2.7.3/share/hadoop/common/lib/htrace-core-3.1.0-incubating.jar to /home/hadoop/hadoop2x-eclipse-plugin-master/build/contrib/eclipse-plugin/lib/htrace-core-3.1.0-incubating.jar
     [copy] Copying 1 file to /home/hadoop/hadoop2x-eclipse-plugin-master/build/contrib/eclipse-plugin/lib
     [copy] Copying /usr/local/hadoop-2.7.3/share/hadoop/common/lib/servlet-api-2.5.jar to /home/hadoop/hadoop2x-eclipse-plugin-master/build/contrib/eclipse-plugin/lib/servlet-api-2.5.jar
     [copy] Copying 1 file to /home/hadoop/hadoop2x-eclipse-plugin-master/build/contrib/eclipse-plugin/lib
     [copy] Copying /usr/local/hadoop-2.7.3/share/hadoop/common/lib/commons-io-2.4.jar to /home/hadoop/hadoop2x-eclipse-plugin-master/build/contrib/eclipse-plugin/lib/commons-io-2.4.jar
      [jar] Building jar: /home/hadoop/hadoop2x-eclipse-plugin-master/build/contrib/eclipse-plugin/hadoop-eclipse-plugin-2.7.3.jar

BUILD SUCCESSFUL
Total time: 4 seconds

  You can then go to /home/hadoop/hadoop2x-eclipse-plugin-master/build/contrib/eclipse-plugin/, where the freshly built plugin hadoop-eclipse-plugin-2.7.3.jar is located.
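
  A quick check that the jar exists and that the three added libraries actually made it into the bundle (paths as above):

$ cd /home/hadoop/hadoop2x-eclipse-plugin-master/build/contrib/eclipse-plugin
$ ls -lh hadoop-eclipse-plugin-2.7.3.jar
$ unzip -l hadoop-eclipse-plugin-2.7.3.jar | grep -E 'htrace|servlet-api|commons-io'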

3. Put the plugin you built into Eclipse's plugins directory and configure the Hadoop-Eclipse-Plugin (this step requires a pseudo-distributed or fully distributed Hadoop setup on the local machine)
       You can refer to the article 使用Eclipse編譯運行MapReduce程序 Hadoop2.6.0_Ubuntu/CentOS (compiling and running MapReduce programs with Eclipse on Hadoop 2.6.0, Ubuntu/CentOS).
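
  Before configuring the plugin's HDFS connection, make sure the cluster is actually running; a minimal check, assuming the pseudo-distributed installation at /usr/local/hadoop-2.7.3:

$ /usr/local/hadoop-2.7.3/sbin/start-dfs.sh
$ jps    # NameNode, DataNode and SecondaryNameNode should appear in the output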

  Then copy the plugin you built into Eclipse's plugins directory and restart Eclipse, or refresh it from the shell as shown below. Starting Eclipse from the shell also lets you watch its console output, so if something goes wrong the cause is easy to spot:

$ cp /home/hadoop/hadoop2x-eclipse-plugin-master/build/contrib/eclipse-plugin/hadoop-eclipse-plugin-2.7.3.jar /home/hadoop/eclipse/plugins/
$ /home/hadoop/eclipse/eclipse -clean 