Yesterday I was wrestling with the Hadoop 2.x Eclipse plugin. After pulling the source down from https://github.com/winghc/hadoop2x-eclipse-plugin, I had a plugin built fairly quickly, but... when I clicked New Hadoop Location, the dialog never appeared. The reason is explained further down. The source is hard to download, so I will attach the source package at the end.
After reading the ant and ivy files carefully, I noticed something ridiculous: why should compiling a plugin require Ivy?!
I have been using Ant for a long time; better to just write my own build file.
I checked the jars: every one of them already exists in the Hadoop installation directory, so there is no need to go online to resolve anything. Ivy's debug output shows it is just that same handful of jars, so I dropped Ivy entirely.
I rewrote build.xml so that you only need to drop it into hadoop2x-eclipse-plugin-master/src/contrib/eclipse-plugin, change the values of the first five properties, and run "ant jar".
Alternatively, create a plain project in Eclipse and copy
META-INF
resources
src
build.properties
plugin.xml
into the project, drop build.xml in alongside them, change the first five property values, and run it.
The contents of build.xml are as follows:
<?xml version="1.0"?> <project name="eclipse-plugin" default="jar"> <property name="jdk.home" value="G:/jdk/jdk1.6.0" /> <property name="hadoop.version" value="2.4.1" /> <property name="hadoop.home" value="G:/hadoop/hadoop-2.4.1" /> <property name="eclipse.version" value="3.7" /> <property name="eclipse.home" value="G:/MyEclipse10/Common" /> <property name="root" value="${basedir}" /> <property file="${root}/build.properties" /> <property name="name" value="${ant.project.name}" /> <property name="src.dir" location="${root}/src/java" /> <property name="build.contrib.dir" location="${root}/build/contrib" /> <property name="build.dir" location="${build.contrib.dir}/${name}" /> <property name="build.classes" location="${build.dir}/classes" /> <!-- all jars together --> <property name="javac.deprecation" value="off" /> <property name="javac.debug" value="on" /> <property name="build.encoding" value="ISO-8859-1" /> <path id="eclipse-sdk-jars"> <fileset dir="${eclipse.home}/plugins/"> <include name="org.eclipse.ui*.jar" /> <include name="org.eclipse.jdt*.jar" /> <include name="org.eclipse.core*.jar" /> <include name="org.eclipse.equinox*.jar" /> <include name="org.eclipse.debug*.jar" /> <include name="org.eclipse.osgi*.jar" /> <include name="org.eclipse.swt*.jar" /> <include name="org.eclipse.jface*.jar" /> <include name="org.eclipse.team.cvs.ssh2*.jar" /> <include name="com.jcraft.jsch*.jar" /> </fileset> </path> <path id="project-jars"> <fileset file="${build.dir}/lib/*.jar" /> </path> <target name="init" unless="skip.contrib"> <echo message="contrib: ${name}" /> <mkdir dir="${build.dir}" /> <mkdir dir="${build.classes}" /> <mkdir dir="${build.dir}/lib" /> <copy todir="${build.dir}/lib/" verbose="true"> <fileset dir="${hadoop.home}/share/hadoop/mapreduce"> <include name="hadoop*.jar" /> <exclude name="*test*"/> <exclude name="*example*"/> </fileset> <fileset dir="${hadoop.home}/share/hadoop/common"> <include name="hadoop*.jar" /> <exclude name="*test*"/> <exclude name="*example*"/> </fileset> <fileset dir="${hadoop.home}/share/hadoop/hdfs"> <include name="hadoop*.jar" /> <exclude name="*test*"/> <exclude name="*example*"/> </fileset> <fileset dir="${hadoop.home}/share/hadoop/yarn"> <include name="hadoop*.jar" /> <exclude name="*test*"/> <exclude name="*example*"/> </fileset> <fileset dir="${hadoop.home}/share/hadoop/common/lib"> <include name="protobuf-java-*.jar" /> <include name="log4j-*.jar" /> <include name="commons-cli-*.jar" /> <include name="commons-collections-*.jar" /> <include name="commons-configuration-*.jar" /> <include name="commons-lang-*.jar" /> <include name="jackson-core-asl-*.jar" /> <include name="jackson-mapper-asl-*.jar" /> <include name="slf4j-log4j12-*.jar" /> <include name="slf4j-api-*.jar" /> <include name="guava-*.jar" /> <include name="hadoop-annotations-*.jar" /> <include name="hadoop-auth-*.jar" /> <include name="commons-cli-*.jar" /> <include name="netty-*.jar" /> </fileset> </copy> </target> <target name="compile" depends="init" unless="skip.contrib"> <echo message="contrib: ${name}" /> <javac fork="true" executable="${jdk.home}/bin/javac" encoding="${build.encoding}" srcdir="${src.dir}" includes="**/*.java" destdir="${build.classes}" debug="${javac.debug}" deprecation="${javac.deprecation}" includeantruntime="on"> <classpath refid="eclipse-sdk-jars" /> <classpath refid="project-jars" /> </javac> </target> <target name="jar" depends="compile" unless="skip.contrib"> <pathconvert property="mf.classpath" pathsep=",lib/"> <path refid="project-jars" /> <flattenmapper 
/> </pathconvert> <jar jarfile="${build.dir}/hadoop-${hadoop.version}-eclipse-${eclipse.version}-plugin.jar" manifest="${root}/META-INF/MANIFEST.MF"> <manifest> <attribute name="Bundle-ClassPath" value="classes/,lib/${mf.classpath}" /> </manifest> <fileset dir="${build.dir}" includes="classes/ lib/" /> <fileset dir="${root}" includes="resources/ plugin.xml" /> </jar> </target> <target name="clean"> <echo message="contrib: ${name}" /> <delete dir="${build.dir}" /> </target> </project>
Of the first five properties, jdk.home points to the JDK used for compilation; hadoop.version and eclipse.version are only used to name the resulting jar; hadoop.home is the Hadoop installation directory, from which the needed jars are pulled; and eclipse.home is the Eclipse installation directory (for MyEclipse, that is the Common directory).
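For example, on a machine where Hadoop and Eclipse are installed elsewhere, the first five properties might be edited like this (the paths below are hypothetical; substitute your own):

    <!-- hypothetical values; point these at your own installations -->
    <property name="jdk.home" value="C:/Java/jdk1.6.0_45" />
    <property name="hadoop.version" value="2.4.1" />
    <property name="hadoop.home" value="C:/hadoop/hadoop-2.4.1" />
    <property name="eclipse.version" value="4.4" />
    <property name="eclipse.home" value="C:/eclipse" />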
The hard-coded jar version numbers in the original build are simply ignored, and I did not bother editing the classpath value in MANIFEST.MF either; the Bundle-ClassPath is now generated automatically. The automatic list may contain a few extra entries, but that is better than missing one.
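To give a sense of what that automation produces, the generated attribute ends up looking roughly like the line below (illustrative only; the exact jar names and versions depend on what gets copied into build/contrib/eclipse-plugin/lib):

    Bundle-ClassPath: classes/,lib/hadoop-common-2.4.1.jar,lib/hadoop-hdfs-2.4.1.jar,lib/hadoop-mapreduce-client-core-2.4.1.jar,lib/commons-collections-3.2.1.jar,...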
Back to the problem from the beginning: the New Hadoop Location dialog failed to appear because the commons-collections jar was missing, so I pulled that jar in from the Hadoop installation directory and packaged it into the plugin, and the problem was solved.
I compiled two plugin jars with JDK 1.6, one for MyEclipse 10 and one for Eclipse 4.4, and both work. As for the permission errors you hit when uploading files because of your local user name, that is just a matter of changing the configuration or the code; I added two lines to the source before compiling, since the plugin can be rebuilt in no time anyway.
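For reference, the configuration-side alternative (as opposed to the two-line source change mentioned above) is usually to turn off HDFS permission checking on the cluster; a minimal sketch of the hdfs-site.xml change, assuming you can restart the NameNode and that relaxed permissions are acceptable in your environment:

    <!-- hdfs-site.xml on the NameNode: disable permission checking (test environments only) -->
    <property>
        <name>dfs.permissions.enabled</name>
        <value>false</value>
    </property>

Another common route is to have the client identify itself as the HDFS user, for example via the HADOOP_USER_NAME environment variable, without touching the cluster configuration at all.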
Finally, here is the official source. I removed the pre-built plugin that was bundled inside; it was outdated and enormous anyway. Download the source package