Frankly speaking!!! With Maven, rookies play with dependencies, masters play with plugins

Packaging is sacred, solemn work. `package` means we are very close to production: it condenses a large amount of our earlier work into one or more files, which the ops team can then take and deploy across production.

Yet this solemn, sacred work gets little attention from most people. We habitually copy a chunk of pom XML from somewhere online, drop it into our own project, and run `package`. Most of us only know that Maven manages JAR dependencies and downloads them automatically, which is convenient. Indeed, it is so convenient that much of the time we barely notice it exists; whenever we think of some feature, we just use it.

But build work is actually not simple! For example:

  • The packaged program conflicts with JARs in the production environment
  • The dependency tree contains multiple versions of the same dependency: how to select or exclude them
  • Compiling Scala, where calls into certain JARs have compatibility problems
  • How to load different configuration for different environments, e.g. local vs. cluster
  • An open-source project fails to compile and you have no idea where to start
  • ...

Once we move a little closer to production, many of these problems surface. When they do, we can of course search for answers right away. But to locate problems more precisely, and to avoid digging holes for others when we package, it is well worth learning some of Maven's details.
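As a concrete example of the version-conflict item above, conflicting transitive dependencies are usually handled with an `<exclusions>` block. A minimal sketch (the artifact below is illustrative, not from this article):

```xml
<dependency>
  <groupId>org.example</groupId>
  <artifactId>some-library</artifactId>
  <version>1.0.0</version>
  <exclusions>
    <!-- drop the transitive slf4j binding so the version already
         present in the production environment wins -->
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```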


Rookies play with dependencies, masters play with plugins

When we use Maven, 95% of our attention goes to dependencies; very few people really spend time studying Maven plugins. But let me tell you: the core of how Maven works is the plugin, not the dependency. Fine, let me be blunt: rookies play with dependencies, masters play with plugins. Itching to argue? Take a look at where Plugins sit in the official Maven documentation, and think about what that implies.

Soul-searching question: have you noticed? Did you just download Maven from the official site, skim a random tutorial, and start using it?


Analyzing how the Hadoop Examples module is packaged

One of the best ways to learn is to borrow from excellent open-source projects and see how others do things. So next, let's look at how Hadoop packages itself. To make the effect clearer, I will walk through compiling and packaging with Maven.

操做步驟:apache

  1. Find the apache/hadoop project on GitHub (https://github.com/apache/hadoop)
  2. Locate the hadoop-mapreduce-project / hadoop-mapreduce-examples module.
  3. Open its pom.xml.
<project xmlns="http://maven.apache.org/POM/4.0.0"
  xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
                      https://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <parent>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-project</artifactId>
    <version>3.4.0-SNAPSHOT</version>
    <relativePath>../../hadoop-project</relativePath>
  </parent>
  <artifactId>hadoop-mapreduce-examples</artifactId>
  <version>3.4.0-SNAPSHOT</version>
  <description>Apache Hadoop MapReduce Examples</description>
  <name>Apache Hadoop MapReduce Examples</name>
  <packaging>jar</packaging>

  <properties>
    <mr.examples.basedir>${basedir}</mr.examples.basedir>
  </properties>

  <dependencies>
    <dependency>
      <groupId>commons-cli</groupId>
      <artifactId>commons-cli</artifactId>
    </dependency>
    <dependency>
      <groupId>commons-logging</groupId>
      <artifactId>commons-logging</artifactId>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
      <scope>test</scope>
      <type>test-jar</type>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
      <scope>test</scope>
      <type>test-jar</type>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-hdfs-client</artifactId>
      <scope>runtime</scope>
    </dependency>
    <dependency>
       <groupId>org.apache.hadoop</groupId>
       <artifactId>hadoop-hdfs</artifactId>
       <scope>test</scope>
       <type>test-jar</type>
     </dependency>
     <dependency>
       <groupId>org.apache.hadoop</groupId>
       <artifactId>hadoop-yarn-server-tests</artifactId>
       <scope>test</scope>
       <type>test-jar</type>
     </dependency>
     <dependency>
       <groupId>org.apache.hadoop</groupId>
       <artifactId>hadoop-mapreduce-client-app</artifactId>
       <scope>provided</scope>
     </dependency>
     <dependency>
       <groupId>org.apache.hadoop</groupId>
       <artifactId>hadoop-mapreduce-client-app</artifactId>
       <type>test-jar</type>
       <scope>test</scope>
     </dependency>
    <dependency>
      <groupId>com.sun.jersey.jersey-test-framework</groupId>
      <artifactId>jersey-test-framework-grizzly2</artifactId>
      <scope>test</scope>
    </dependency>
     <dependency>
       <groupId>org.apache.hadoop</groupId>
       <artifactId>hadoop-mapreduce-client-hs</artifactId>
       <scope>test</scope>
     </dependency>
     <dependency>
       <groupId>org.hsqldb</groupId>
       <artifactId>hsqldb</artifactId>
       <scope>provided</scope>
     </dependency>
     <dependency>
      <groupId>org.apache.hadoop.thirdparty</groupId>
      <artifactId>hadoop-shaded-guava</artifactId>
      <scope>provided</scope>
     </dependency>
    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-api</artifactId>
    </dependency>
    <dependency>
      <groupId>org.assertj</groupId>
      <artifactId>assertj-core</artifactId>
      <scope>test</scope>
    </dependency>
  </dependencies>
  
  <build>
   <plugins>
    <plugin>
    <groupId>org.apache.maven.plugins</groupId>
     <artifactId>maven-jar-plugin</artifactId>
      <configuration>
       <archive>
         <manifest>
           <mainClass>org.apache.hadoop.examples.ExampleDriver</mainClass>
         </manifest>
       </archive>
     </configuration>
    </plugin>

      <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>findbugs-maven-plugin</artifactId>
         <configuration>
          <findbugsXmlOutput>true</findbugsXmlOutput>
          <xmlOutput>true</xmlOutput>
          <excludeFilterFile>${mr.examples.basedir}/dev-support/findbugs-exclude.xml</excludeFilterFile>
          <effort>Max</effort>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.apache.rat</groupId>
        <artifactId>apache-rat-plugin</artifactId>
        <configuration>
          <excludes>
            <exclude>src/main/java/org/apache/hadoop/examples/dancing/puzzle1.dta</exclude>
          </excludes>
        </configuration>
      </plugin>
   </plugins>
   </build>
</project>

Browsing the hadoop-mapreduce-examples pom.xml, we notice the following:

  1. All dependencies are defined in the parent project's pom.xml (hadoop-project). Not a single version number appears in the examples module itself.


  2. Hadoop uses three plugins: maven-jar-plugin, findbugs-maven-plugin, and apache-rat-plugin.

Let's cd into the directory containing the examples module's pom.xml and run mvn package to see what happens.

[root@compile hadoop-mapreduce-examples]# mvn package
[INFO] Scanning for projects...
[INFO] 
[INFO] ------------< org.apache.hadoop:hadoop-mapreduce-examples >-------------
[INFO] Building Apache Hadoop MapReduce Examples 3.2.1
[INFO] --------------------------------[ jar ]---------------------------------
Downloading from apache.snapshots.https: https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-mapreduce-client-app/3.2.1/hadoop-mapreduce-client-app-3.2.1-tests.jar
.....

[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce-examples ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hadoop-mapreduce-examples ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /opt/hadoop-3.2.1-src/hadoop-mapreduce-project/hadoop-mapreduce-examples/src/main/resources
[INFO] 

[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hadoop-mapreduce-examples ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 

[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ hadoop-mapreduce-examples ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /opt/hadoop-3.2.1-src/hadoop-mapreduce-project/hadoop-mapreduce-examples/src/test/resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hadoop-mapreduce-examples ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-surefire-plugin:3.0.0-M1:test (default-test) @ hadoop-mapreduce-examples ---
Downloading from central: http://maven.aliyun.com/nexus/content/groups/public/org/apache/maven/surefire/surefire-junit4/3.0.0-M1/surefire-junit4-3.0.0-M1.jar

..........

[INFO] 
[INFO] -------------------------------------------------------
[INFO]  T E S T S
[INFO] -------------------------------------------------------
[INFO] Running org.apache.hadoop.examples.TestBaileyBorweinPlouffe
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.536 s - in org.apache.hadoop.examples.TestBaileyBorweinPlouffe

..........

[INFO] 
[INFO] Results:
[INFO] 
[INFO] Tests run: 11, Failures: 0, Errors: 0, Skipped: 0
[INFO] 
[INFO] 
[INFO] --- maven-jar-plugin:2.5:jar (default-jar) @ hadoop-mapreduce-examples ---
[INFO] 
[INFO] --- maven-site-plugin:3.6:attach-descriptor (attach-descriptor) @ hadoop-mapreduce-examples ---
[INFO] Skipping because packaging 'jar' is not pom.
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  01:11 min
[INFO] Finished at: 2021-02-06T10:49:18+08:00
[INFO] ------------------------------------------------------------------------

The build succeeds quickly. Let's look at what Maven did:

1. It ran the maven-antrun-plugin's run goal (execution create-testdirs). Oddly, the examples module does not declare this plugin; we will see shortly where it is configured.

[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce-examples ---
[INFO] Executing tasks

2. It ran the maven-resources-plugin's resources goal, which copies the resources directory into target.

[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hadoop-mapreduce-examples ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /opt/hadoop-3.2.1-src/hadoop-mapreduce-project/hadoop-mapreduce-examples/src/main/resources

3. It ran the maven-compiler-plugin's compile goal. Note: only now does code compilation start. When the classes from a previous run are up to date, Maven skips recompiling them.

[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hadoop-mapreduce-examples ---
[INFO] Compiling 47 source files to /opt/hadoop-3.2.1-src/hadoop-mapreduce-project/hadoop-mapreduce-examples/target/classes

4. It ran the maven-resources-plugin's testResources goal which, as the name suggests, copies the test resources directory into target.

[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ hadoop-mapreduce-examples ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /opt/hadoop-3.2.1-src/hadoop-mapreduce-project/hadoop-mapreduce-examples/src/test/resources

5. It ran the maven-compiler-plugin's testCompile goal, compiling the unit-test sources in the same way.

[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hadoop-mapreduce-examples ---
[INFO] Compiling 7 source files to /opt/hadoop-3.2.1-src/hadoop-mapreduce-project/hadoop-mapreduce-examples/target/test-classes

6. It ran the maven-surefire-plugin's test goal, executing the unit tests to make sure the compiled code is sound.

[INFO] -------------------------------------------------------
[INFO]  T E S T S
[INFO] -------------------------------------------------------
[INFO] Running org.apache.hadoop.examples.TestBaileyBorweinPlouffe
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.436 s - in org.apache.hadoop.examples.TestBaileyBorweinPlouffe
[INFO] Running org.apache.hadoop.examples.TestWordStats
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.332 s - in org.apache.hadoop.examples.TestWordStats
[INFO] Running org.apache.hadoop.examples.pi.math.TestLongLong
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.163 s - in org.apache.hadoop.examples.pi.math.TestLongLong
[INFO] Running org.apache.hadoop.examples.pi.math.TestModular
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.164 s - in org.apache.hadoop.examples.pi.math.TestModular
[INFO] Running org.apache.hadoop.examples.pi.math.TestSummation
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.091 s - in org.apache.hadoop.examples.pi.math.TestSummation
[INFO] Running org.apache.hadoop.examples.terasort.TestTeraSort
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.449 s - in org.apache.hadoop.examples.terasort.TestTeraSort

7. It ran the maven-jar-plugin's jar goal, which packages everything into a JAR file.

[INFO] --- maven-jar-plugin:2.5:jar (default-jar) @ hadoop-mapreduce-examples ---
[INFO] Building jar: /opt/hadoop-3.2.1-src/hadoop-mapreduce-project/hadoop-mapreduce-examples/target/hadoop-mapreduce-examples-3.2.1.jar

8. It ran the maven-site-plugin's attach-descriptor goal. This goal only applies to projects with pom packaging; it adds site.xml (the site descriptor) to the list of files to deploy.

[INFO] --- maven-site-plugin:3.6:attach-descriptor (attach-descriptor) @ hadoop-mapreduce-examples ---
[INFO] Skipping because packaging 'jar' is not pom.

From this we can see that whenever we run package, compile, or clean, what actually executes behind the scenes is a Maven plugin. Some plugins ship with Maven and can be used directly; when we need to customize a plugin's behavior, we must configure it explicitly in pom.xml.

Maven offers a large, rich catalog of plugins for developers to use.


URL: https://maven.apache.org/plugins/

We can click any plugin to see its details.


The maven-antrun-plugin

咱們發如今example模塊的父模塊hadoop-project中有一個pom.xml。

<plugin>
     <groupId>org.apache.maven.plugins</groupId>
     <artifactId>maven-antrun-plugin</artifactId>
     <executions>
         <execution>
             <id>create-testdirs</id>
             <phase>validate</phase>
             <goals>
                 <goal>run</goal>
             </goals>
             <configuration>
                 <target>
                     <mkdir dir="${test.build.dir}"/>
                     <mkdir dir="${test.build.data}"/>
                 </target>
             </configuration>
         </execution>
     </executions>
</plugin>

Several plugins are configured here, among them maven-antrun-plugin. It runs its run goal in the create-testdirs execution, bound to the validate phase, where it performs two mkdir tasks.
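The same execution pattern (id, phase, goal, target) works for any lifecycle phase and any Ant task. A hedged sketch, not from Hadoop, that echoes a message during package instead:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <executions>
    <execution>
      <id>announce-package</id>
      <!-- bind this execution to the package phase instead of validate -->
      <phase>package</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <echo message="packaging ${project.artifactId}"/>
        </target>
      </configuration>
    </execution>
  </executions>
</plugin>
```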

The maven-jar-plugin

The plugin is configured as follows:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <configuration>
        <archive>
            <manifest>
                <mainClass>org.apache.hadoop.examples.ExampleDriver</mainClass>
            </manifest>
        </archive>
    </configuration>
</plugin>

As we saw earlier, this runs automatically during the package phase, and the main class is set to ExampleDriver. Inspecting the resulting JAR shows that the jar plugin only packages the project's own class files; it does not bundle any dependencies.
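Because the jar plugin leaves dependencies out, a common companion configuration is to write a Class-Path entry into the manifest so the JAR can find them in a sibling lib/ directory. A sketch (the lib/ layout is an assumption; the dependencies still have to be copied there separately, e.g. with maven-dependency-plugin):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jar-plugin</artifactId>
  <configuration>
    <archive>
      <manifest>
        <mainClass>org.apache.hadoop.examples.ExampleDriver</mainClass>
        <!-- list each dependency in the manifest Class-Path, prefixed with lib/ -->
        <addClasspath>true</addClasspath>
        <classpathPrefix>lib/</classpathPrefix>
      </manifest>
    </archive>
  </configuration>
</plugin>
```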


Also, in the JAR's META-INF (metadata) directory, we can see that the MANIFEST.MF file now records the main class:


For details on this plugin, see the official docs: https://maven.apache.org/plugins/maven-jar-plugin/

The META-INF directory in a JAR

Every JAR contains a META-INF directory which, as the name suggests, holds the JAR's metadata. Java uses the files in META-INF to configure applications, class loaders, and other services. It can contain the following:

MANIFEST.MF

Defines extension and packaging-related manifest entries.

Manifest-Version: 1.0
Archiver-Version: Plexus Archiver
Built-By: China
Created-By: Apache Maven 3.5.0
Build-Jdk: 1.8.0_241
Main-Class: cn.monkey.StreamingJob

This file records the manifest version, which user built it, which tool created it, the JDK used for the build, and, very importantly, the Main-Class.
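These manifest attributes can also be set from the maven-jar-plugin. A hedged sketch adding a custom entry alongside Main-Class (the entry name is illustrative):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jar-plugin</artifactId>
  <configuration>
    <archive>
      <manifest>
        <mainClass>cn.monkey.StreamingJob</mainClass>
      </manifest>
      <!-- extra key/value pairs written verbatim into MANIFEST.MF -->
      <manifestEntries>
        <Build-Environment>dev</Build-Environment>
      </manifestEntries>
    </archive>
  </configuration>
</plugin>
```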

INDEX.LIST

Generated by the jar tool's -i option; it lists the locations of the packages defined in the application or extension, and is used by class loaders to speed up class loading.

xxx.SF

The JAR's signature file.

xxx.DSA

The signature block file associated with the .SF file; it stores the digital signature corresponding to the signature file.

Maven plugins

The Maven build lifecycle

Maven is a project management tool that divides a build into the following main phases:

  • validate: verify the project is correct and all necessary information is available
  • compile: compile the project's source code
  • test: run the unit tests
  • package: package the compiled code
  • verify: check the results of integration tests to ensure quality standards are met
  • install: install the package into the local repository
  • deploy: copy the final package to a remote repository to share with other developers

In other words, any Maven project goes through this series of build phases on the way from source code to a runnable program. Behind each phase, Maven provides a core build execution engine, and that engine delegates the concrete tasks to a large number of plugins.

Redefining Maven

Let's redefine Maven from a technical angle: a framework that hosts many plugins. Everything Maven actually does is done by plugins. For example:

  • building a JAR
  • building a WAR
  • compiling code
  • running unit tests
  • generating project documentation

and so on. Whatever operation you can think of running against a project, a plugin implements it behind the scenes.

Plugins are Maven's core feature: once a plugin is defined, it can be reused across many projects. Think about it: don't we configure the packaging plugin, the compiler plugin, and so on in every pom.xml, then cd to where the pom.xml lives and run package, compile, clean, and friends? When we run:

mvn compile

Maven has to know: aha, time to compile. So Maven's plugins are driven by the arguments passed to mvn, and those arguments identify a plugin goal (or Mojo).

Mojo

The simplest Maven plugin is the Clean plugin, responsible only for deleting a Maven project's target directory. When you run mvn clean, Maven executes the clean goal defined in the Clean plugin and deletes that directory. The Clean plugin also defines parameters for customizing its behavior, such as outputDirectory, which defaults to ${project.build.directory}.

A Mojo is really a single goal within a Maven plugin, and one plugin can contain many goals. We can define a Mojo with an annotated Java class or a BeanShell script; it specifies the goal's metadata: its name, the lifecycle phase it runs in, and its parameters.

Reading the clean plugin's source

Via the plugins page on the Maven site, we can find the clean plugin's GitHub repository.



Seeing a src directory and a pom.xml, we realize that a Maven plugin is itself a standard Maven project. Let's look at the pom.xml first; for readability I removed some code.

<project>
  <modelVersion>4.0.0</modelVersion>

  <parent>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-plugins</artifactId>
    <version>34</version>
    <relativePath/>
  </parent>

  <properties>
    <mavenVersion>3.1.1</mavenVersion>
    <javaVersion>7</javaVersion>
    <surefire.version>2.22.2</surefire.version>
    <mavenPluginToolsVersion>3.6.0</mavenPluginToolsVersion>
    <project.build.outputTimestamp>2020-04-07T21:04:00Z</project.build.outputTimestamp>
  </properties>

  <dependencies>
    <dependency>
      <groupId>org.apache.maven</groupId>
      <artifactId>maven-plugin-api</artifactId>
      <version>${mavenVersion}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.maven.shared</groupId>
      <artifactId>maven-shared-utils</artifactId>
      <version>3.2.1</version>
    </dependency>

    <!-- dependencies to annotations -->
    <dependency>
      <groupId>org.apache.maven.plugin-tools</groupId>
      <artifactId>maven-plugin-annotations</artifactId>
      <scope>provided</scope>
    </dependency>
  </dependencies>

  <build>
    <pluginManagement>
      <plugins>
        <!-- remove with next parent upgrade -->
        <plugin>
          <artifactId>maven-project-info-reports-plugin</artifactId>
          <version>3.1.1</version>
        </plugin>
        <plugin>
          <artifactId>maven-enforcer-plugin</artifactId>
          <version>3.0.0-M3</version>
        </plugin>
        <plugin>
          <artifactId>maven-javadoc-plugin</artifactId>
          <version>3.2.0</version>
        </plugin>
        <plugin>
          <artifactId>maven-site-plugin</artifactId>
          <version>3.9.1</version>
        </plugin>
      </plugins>
    </pluginManagement>
  </build>
      
  <profiles>
    <profile>
      <id>run-its</id>
      <build>
        <pluginManagement>
          <plugins>
            <plugin>
              <groupId>org.apache.maven.plugins</groupId>
              <artifactId>maven-invoker-plugin</artifactId>
              <configuration>
                <debug>true</debug>
                <addTestClassPath>true</addTestClassPath>
                <projectsDirectory>src/it</projectsDirectory>
                <cloneProjectsTo>${project.build.directory}/it</cloneProjectsTo>
                <pomIncludes>
                  <pomInclude>*/pom.xml</pomInclude>
                </pomIncludes>
                <preBuildHookScript>setup</preBuildHookScript>
                <postBuildHookScript>verify</postBuildHookScript>
                <localRepositoryPath>${project.build.directory}/local-repo</localRepositoryPath>
                <settingsFile>src/it/settings.xml</settingsFile>
                <goals>
                  <goal>clean</goal>
                </goals>
              </configuration>
            </plugin>
          </plugins>
        </pluginManagement>
      </build>
    </profile>
  </profiles>
</project>

As we can see, the pom.xml pulls in some necessary dependencies, defines versions for several other plugins, and, inside a profile, configures the maven-invoker-plugin.

The Invoker plugin runs a set of Maven projects; it can determine whether each project's build succeeds, and optionally verify the output generated by each project. It is well suited to integration testing.


In the source we find a CleanMojo source file, which uses annotations to define the plugin's goal and parameters.



We can see that execute carries out the clean goal, delegating to a Cleaner to remove the files. Now look at the install plugin's Mojo:


The install goal is bound to the install phase by default.

Having read this source, we now know: when using a plugin, we can read its Mojo files to learn its goals and parameters, and we can even troubleshoot errors by reading the plugin's source.

A Scala project's pom.xml is more complex than a Java one, because Maven builds Java by default; building Scala requires extra configuration. For example, generate a Flink quickstart project:

mvn archetype:generate \
-DarchetypeGroupId=org.apache.flink \
-DarchetypeArtifactId=flink-quickstart-scala \
-DarchetypeVersion=1.12.1

Flink generates the following pom.xml:

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>

	<groupId>cn.monkey</groupId>
	<artifactId>flink_scala_demo_1.12.1</artifactId>
	<version>1.0-SNAPSHOT</version>
	<packaging>jar</packaging>

	<name>Flink Quickstart Job</name>

	<repositories>
		<repository>
			<id>apache.snapshots</id>
			<name>Apache Development Snapshot Repository</name>
			<url>https://repository.apache.org/content/repositories/snapshots/</url>
			<releases>
				<enabled>false</enabled>
			</releases>
			<snapshots>
				<enabled>true</enabled>
			</snapshots>
		</repository>
	</repositories>

	<properties>
		<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
		<flink.version>1.12.1</flink.version>
		<scala.binary.version>2.11</scala.binary.version>
		<scala.version>2.11.12</scala.version>
		<log4j.version>2.12.1</log4j.version>
	</properties>

	<dependencies>
		<!-- Apache Flink dependencies -->
		<!-- These dependencies are provided, because they should not be packaged into the JAR file. -->
		<dependency>
			<groupId>org.apache.flink</groupId>
			<artifactId>flink-scala_${scala.binary.version}</artifactId>
			<version>${flink.version}</version>
			<scope>provided</scope>
		</dependency>
		<dependency>
			<groupId>org.apache.flink</groupId>
			<artifactId>flink-streaming-scala_${scala.binary.version}</artifactId>
			<version>${flink.version}</version>
			<scope>provided</scope>
		</dependency>
		<dependency>
			<groupId>org.apache.flink</groupId>
			<artifactId>flink-clients_${scala.binary.version}</artifactId>
			<version>${flink.version}</version>
			<scope>provided</scope>
		</dependency>

		<!-- Scala Library, provided by Flink as well. -->
		<dependency>
			<groupId>org.scala-lang</groupId>
			<artifactId>scala-library</artifactId>
			<version>${scala.version}</version>
			<scope>provided</scope>
		</dependency>

		<!-- Add connector dependencies here. They must be in the default scope (compile). -->

		<!-- Example:

		<dependency>
			<groupId>org.apache.flink</groupId>
			<artifactId>flink-connector-kafka_${scala.binary.version}</artifactId>
			<version>${flink.version}</version>
		</dependency>
		-->

		<!-- Add logging framework, to produce console output when running in the IDE. -->
		<!-- These dependencies are excluded from the application JAR by default. -->
		<dependency>
			<groupId>org.apache.logging.log4j</groupId>
			<artifactId>log4j-slf4j-impl</artifactId>
			<version>${log4j.version}</version>
			<scope>runtime</scope>
		</dependency>
		<dependency>
			<groupId>org.apache.logging.log4j</groupId>
			<artifactId>log4j-api</artifactId>
			<version>${log4j.version}</version>
			<scope>runtime</scope>
		</dependency>
		<dependency>
			<groupId>org.apache.logging.log4j</groupId>
			<artifactId>log4j-core</artifactId>
			<version>${log4j.version}</version>
			<scope>runtime</scope>
		</dependency>
	</dependencies>

	<build>
		<plugins>
			<!-- We use the maven-shade plugin to create a fat jar that contains all necessary dependencies. -->
			<!-- Change the value of <mainClass>...</mainClass> if your program entry point changes. -->
			<plugin>
				<groupId>org.apache.maven.plugins</groupId>
				<artifactId>maven-shade-plugin</artifactId>
				<version>3.1.1</version>
				<executions>
					<!-- Run shade goal on package phase -->
					<execution>
						<phase>package</phase>
						<goals>
							<goal>shade</goal>
						</goals>
						<configuration>
							<artifactSet>
								<excludes>
									<exclude>org.apache.flink:force-shading</exclude>
									<exclude>com.google.code.findbugs:jsr305</exclude>
									<exclude>org.slf4j:*</exclude>
									<exclude>org.apache.logging.log4j:*</exclude>
								</excludes>
							</artifactSet>
							<filters>
								<filter>
									<!-- Do not copy the signatures in the META-INF folder.
									Otherwise, this might cause SecurityExceptions when using the JAR. -->
									<artifact>*:*</artifact>
									<excludes>
										<exclude>META-INF/*.SF</exclude>
										<exclude>META-INF/*.DSA</exclude>
										<exclude>META-INF/*.RSA</exclude>
									</excludes>
								</filter>
							</filters>
							<transformers>
								<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
									<mainClass>cn.monkey.StreamingJob</mainClass>
								</transformer>
							</transformers>
						</configuration>
					</execution>
				</executions>
			</plugin>

			<!-- Java Compiler -->
			<plugin>
				<groupId>org.apache.maven.plugins</groupId>
				<artifactId>maven-compiler-plugin</artifactId>
				<version>3.1</version>
				<configuration>
					<source>1.8</source>
					<target>1.8</target>
				</configuration>
			</plugin>

			<!-- Scala Compiler -->
			<plugin>
				<groupId>net.alchim31.maven</groupId>
				<artifactId>scala-maven-plugin</artifactId>
				<version>3.2.2</version>
				<executions>
					<execution>
						<goals>
							<goal>compile</goal>
							<goal>testCompile</goal>
						</goals>
					</execution>
				</executions>
				<configuration>
					<args>
						<arg>-nobootcp</arg>
					</args>
				</configuration>
			</plugin>

			<!-- Eclipse Scala Integration -->
			<plugin>
				<groupId>org.apache.maven.plugins</groupId>
				<artifactId>maven-eclipse-plugin</artifactId>
				<version>2.8</version>
				<configuration>
					<downloadSources>true</downloadSources>
					<projectnatures>
						<projectnature>org.scala-ide.sdt.core.scalanature</projectnature>
						<projectnature>org.eclipse.jdt.core.javanature</projectnature>
					</projectnatures>
					<buildcommands>
						<buildcommand>org.scala-ide.sdt.core.scalabuilder</buildcommand>
					</buildcommands>
					<classpathContainers>
						<classpathContainer>org.scala-ide.sdt.launching.SCALA_CONTAINER</classpathContainer>
						<classpathContainer>org.eclipse.jdt.launching.JRE_CONTAINER</classpathContainer>
					</classpathContainers>
					<excludes>
						<exclude>org.scala-lang:scala-library</exclude>
						<exclude>org.scala-lang:scala-compiler</exclude>
					</excludes>
					<sourceIncludes>
						<sourceInclude>**/*.scala</sourceInclude>
						<sourceInclude>**/*.java</sourceInclude>
					</sourceIncludes>
				</configuration>
			</plugin>
			<plugin>
				<groupId>org.codehaus.mojo</groupId>
				<artifactId>build-helper-maven-plugin</artifactId>
				<version>1.7</version>
				<executions>
					<!-- Add src/main/scala to eclipse build path -->
					<execution>
						<id>add-source</id>
						<phase>generate-sources</phase>
						<goals>
							<goal>add-source</goal>
						</goals>
						<configuration>
							<sources>
								<source>src/main/scala</source>
							</sources>
						</configuration>
					</execution>
					<!-- Add src/test/scala to eclipse build path -->
					<execution>
						<id>add-test-source</id>
						<phase>generate-test-sources</phase>
						<goals>
							<goal>add-test-source</goal>
						</goals>
						<configuration>
							<sources>
								<source>src/test/scala</source>
							</sources>
						</configuration>
					</execution>
				</executions>
			</plugin>
		</plugins>
	</build>
</project>

This is the standard Flink Maven archetype. Flink tells us clearly which dependencies should be provided and which runtime, and that if you need a connector you must add the one matching your storage system yourself. Let's focus on the plugins:

  • maven-shade-plugin: Flink uses the shade plugin to build the fat JAR; the mainClass parameter sets the JAR's entry point.
  • maven-compiler-plugin: configures the Java compiler; Flink compiles with 1.8 by default.
  • scala-maven-plugin: configures the Scala compiler.
  • maven-eclipse-plugin: configures Eclipse integration for compiling the Scala and Java sources.

Let's focus on the shade plugin.

Note:

If multiple plugins are bound to the package phase, they execute in the order they appear in pom.xml.
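A hedged illustration of that ordering rule: both executions below bind to package, so the antrun echo runs before shade simply because it is declared first in pom.xml.

```xml
<plugins>
  <!-- declared first: runs first within the package phase -->
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-antrun-plugin</artifactId>
    <executions>
      <execution>
        <phase>package</phase>
        <goals><goal>run</goal></goals>
        <configuration>
          <target><echo message="before shading"/></target>
        </configuration>
      </execution>
    </executions>
  </plugin>
  <!-- declared second: shade runs afterwards -->
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <executions>
      <execution>
        <phase>package</phase>
        <goals><goal>shade</goal></goals>
      </execution>
    </executions>
  </plugin>
</plugins>
```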

The Shade plugin

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.1.1</version>
    <executions>
        <!-- Run shade goal on package phase -->
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <artifactSet>
                    <excludes>
                        <exclude>org.apache.flink:force-shading</exclude>
                        <exclude>com.google.code.findbugs:jsr305</exclude>
                        <exclude>org.slf4j:*</exclude>
                        <exclude>org.apache.logging.log4j:*</exclude>
                    </excludes>
                </artifactSet>
                <filters>
                    <filter>
                        <!-- Do not copy the signatures in the META-INF folder.
         Otherwise, this might cause SecurityExceptions when using the JAR. -->
                        <artifact>*:*</artifact>
                        <excludes>
                            <exclude>META-INF/*.SF</exclude>
                            <exclude>META-INF/*.DSA</exclude>
                            <exclude>META-INF/*.RSA</exclude>
                        </excludes>
                    </filter>
                </filters>
                <transformers>
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                        <mainClass>cn.monkey.StreamingJob</mainClass>
                    </transformer>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>

The Shade plugin packages all artifacts into one uber-jar (an uber-jar contains the project itself plus all of its dependencies in a single JAR file). The Shade plugin has only one goal: shade:shade.

Mojo source: https://github.com/apache/maven-shade-plugin/blob/master/src/main/java/org/apache/maven/plugins/shade/mojo/ShadeMojo.java


You can see the plugin's execution is bound to the package phase, so it runs automatically when you run package.

The configuration excludes certain artifacts, and the filter strips the signature files from all JARs. We can use artifactSet and filter to resolve JAR conflicts.
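Beyond excluding artifacts, shade can also relocate packages, which is often the cleanest fix for class conflicts. A sketch (the shaded namespace below is illustrative):

```xml
<configuration>
  <relocations>
    <!-- rewrite com.google.common.* classes, and all bytecode references
         to them, into a shaded namespace so they cannot clash with the
         Guava version already on the cluster -->
    <relocation>
      <pattern>com.google.common</pattern>
      <shadedPattern>cn.monkey.shaded.com.google.common</shadedPattern>
    </relocation>
  </relocations>
</configuration>
```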

The Assembly plugin

Overview

We often need to package a project as a tar.gz, just like many Apache components. With the Assembly plugin we can bundle the program, documentation, configuration files, and so on into an "assembly", with an assembly descriptor describing the whole process. The plugin can produce the following archive types:

zip
tar
tar.gz (or tgz)
tar.bz2 (or tbz2)
tar.snappy
tar.xz (or txz)
jar
dir
war

If you need an uber-jar, the Assembly plugin offers some basic support, but the official recommendation is to use the Shade plugin instead. Using the Assembly plugin takes three steps:

  1. Choose or write an assembly descriptor
  2. Configure the assembly plugin in pom.xml
  3. Run mvn assembly:single
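Step 2 can be as small as referencing a built-in descriptor. A minimal sketch using jar-with-dependencies (the phase binding is optional but common):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <executions>
    <execution>
      <id>make-assembly</id>
      <!-- bind single to package so a plain `mvn package` builds the assembly too -->
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
      <configuration>
        <descriptorRefs>
          <descriptorRef>jar-with-dependencies</descriptorRef>
        </descriptorRefs>
      </configuration>
    </execution>
  </executions>
</plugin>
```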

What is an assembly

An assembly is a group of files, directories, and related dependencies organized into a zip or tar.gz style archive for easy installation, deployment, and distribution. For example, a Maven project might contain both a console application and an FX desktop client; we could define two assemblies that bundle each application with its own scripts and dependencies.

Each assembly needs an assembly descriptor, a file that describes, for instance, which files get copied into the bin directory, and that can also change the permissions of the files in a directory.
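For instance, a hedged descriptor fragment (the src/main/bin layout is an assumption) that copies start scripts into bin/ and marks them executable:

```xml
<fileSet>
  <directory>src/main/bin</directory>
  <outputDirectory>bin</outputDirectory>
  <includes>
    <include>*.sh</include>
  </includes>
  <!-- octal permissions applied to every copied file -->
  <fileMode>0755</fileMode>
</fileSet>
```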

Goal

Like every Maven plugin, the Assembly plugin exposes goals; it has exactly one, single, which builds all configured assemblies.

Analyzing Hadoop's use of the Assembly plugin

Plugin configuration

Let's look at how Hadoop uses this plugin.

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-assembly-plugin</artifactId>
    <inherited>false</inherited>
    <executions>
        <execution>
            <id>src-dist</id>
            <phase>package</phase>
            <goals>
                <goal>single</goal>
            </goals>
            <configuration>
                <appendAssemblyId>false</appendAssemblyId>
                <attach>false</attach>
                <finalName>hadoop-${project.version}-src</finalName>
                <outputDirectory>hadoop-dist/target</outputDirectory>
                <!-- Not using descriptorRef and hadoop-assembly dependency -->
                <!-- to avoid making hadoop-main to depend on a module      -->
                <descriptors>
                    <descriptor>hadoop-assemblies/src/main/resources/assemblies/hadoop-src.xml</descriptor>
                </descriptors>
            </configuration>
        </execution>
    </executions>
</plugin>

The meaning of each XML element:

Option            Description
appendAssemblyId  Set to false to omit the assembly id (src-dist) from the final output name
attach            Controls whether the plugin attaches the generated assembly to the project
finalName         The final file name of the assembly distribution
outputDirectory   The output directory for the assembly file
descriptors       The built-in descriptors bin, jar-with-dependencies, src, and project are loaded by default; here a custom descriptor file is referenced

The built-in descriptors are loaded from the assembly JAR itself.


See https://maven.apache.org/plugins/maven-assembly-plugin/descriptor-refs.html for descriptions of all the built-in descriptors. Hadoop configures its own descriptor here.

For reference, see the definitions in the plugin's AbstractAssemblyMojo.java.

The hadoop-src.xml assembly descriptor

<assembly xmlns="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.3"
  xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.3 http://maven.apache.org/xsd/assembly-1.1.3.xsd">
  <id>hadoop-src</id>
  <formats>
    <format>tar.gz</format>
  </formats>
  <includeBaseDirectory>true</includeBaseDirectory>
  <fileSets>
    <fileSet>
      <directory>.</directory>
      <includes>
        <include>LICENCE.txt</include>
        <include>README.txt</include>
        <include>NOTICE.txt</include>
      </includes>
    </fileSet>
    <fileSet>
      <directory>./licenses</directory>
      <includes>
        <include>*</include>
      </includes>
    </fileSet>
    <fileSet>
      <directory>.</directory>
      <useDefaultExcludes>true</useDefaultExcludes>
      <excludes>
        <exclude>.git/**</exclude>
        <exclude>**/.gitignore</exclude>
        <exclude>**/.svn</exclude>
        <exclude>**/*.iws</exclude>
        <exclude>**/*.ipr</exclude>
        <exclude>**/*.iml</exclude>
        <exclude>**/.classpath</exclude>
        <exclude>**/.project</exclude>
        <exclude>**/.settings</exclude>
        <exclude>**/target/**</exclude>
        <!-- until the code that does this is fixed -->
        <exclude>**/*.log</exclude>
        <exclude>**/build/**</exclude>
        <exclude>**/file:/**</exclude>
        <exclude>**/SecurityAuth.audit*</exclude>
        <exclude>patchprocess/**</exclude>
      </excludes>
    </fileSet>
  </fileSets>
</assembly>

All assembly descriptor options are documented at https://maven.apache.org/plugins/maven-assembly-plugin/assembly.html.

Option (value as configured): Description
id (hadoop-src): sets the assembly id.
formats/format* (tar.gz): the final format(s) of the assembly. Hadoop builds a tar.gz here; multiple formats (zip, tar, jar, ...) may be listed.
includeBaseDirectory (true): include a base directory inside the tar.gz, i.e. the archive contains a single top-level folder rather than dumping files at its root. (Always set this to true, or unpacking on the target machine gets messy.)
fileSets (fileset): the file sets to include in the assembly, i.e. which files are copied into the tar.gz.
fileSet/useDefaultExcludes (true): whether the standard exclusion patterns apply. Defaults to true for backward compatibility.
fileSet/directory (.): an absolute or module-relative location; for example, "src/main/bin" selects that subdirectory of the project. This decides which directory's files get packaged.
includes/include* (LICENCE.txt): the files and directories to include. If omitted, everything under the directory is included.
excludes/exclude* (**/.settings): the files and directories to exclude.
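To see why includeBaseDirectory=true matters, here is a plain-tar illustration (throwaway layout under /tmp, nothing Maven-specific): with a base directory the archive extracts into one folder instead of spraying files into the current directory.

```shell
# build a tiny tree and archive it with its top-level folder included,
# which is what includeBaseDirectory=true produces:
mkdir -p /tmp/ibd-demo/myapp-1.0
touch /tmp/ibd-demo/myapp-1.0/README.txt
cd /tmp/ibd-demo
tar -czf with-base.tar.gz myapp-1.0
# every entry is prefixed with myapp-1.0/, so extraction stays contained:
tar -tzf with-base.tar.gz
```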

製做一個屬於咱們本身的打包程序

需求

咱們經過編寫一個簡單的代碼,而後將代碼打包成相似於Apache的軟件包。代碼很是簡單:

public class HelloMaven {
    public static void main(String[] args) {
        System.out.println("Hello! Maven Assembly Plugin!");
    }
}

There is also a configuration file that needs to be packaged:

version=1.1.0
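At runtime the packaged conf/config.properties would typically be read from the classpath. A hedged sketch (ConfigLoader is a hypothetical helper, not part of the project above; it falls back to "unknown" when the file is absent):

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class ConfigLoader {
    // Load the "version" key from /config.properties on the classpath,
    // returning "unknown" if the file is missing or unreadable.
    public static String loadVersion() {
        Properties props = new Properties();
        try (InputStream in = ConfigLoader.class.getResourceAsStream("/config.properties")) {
            if (in != null) {
                props.load(in);
            }
        } catch (IOException e) {
            // fall through and use the default below
        }
        return props.getProperty("version", "unknown");
    }

    public static void main(String[] args) {
        // prints "version=1.1.0" when config.properties is on the classpath
        System.out.println("version=" + loadVersion());
    }
}
```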

(screenshot: project layout with config.properties under src/main/resources)

Adding a test dependency

Add the following to pom.xml:

<dependencies>
    <dependency>
        <groupId>commons-cli</groupId>
        <artifactId>commons-cli</artifactId>
        <version>1.2</version>
    </dependency>
</dependencies>

In a moment we'll use the shade plugin to fold this dependency straight into an uber-jar.

Configuring the shade plugin

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.1.1</version>
    <executions>
        <!-- Run shade goal on package phase -->
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <filters>
                    <filter>
                        <artifact>*:*</artifact>
                        <excludes>
                            <exclude>META-INF/*.SF</exclude>
                            <exclude>META-INF/*.DSA</exclude>
                            <exclude>META-INF/*.RSA</exclude>
                            <exclude>shell-scripts/*</exclude>
                        </excludes>
                    </filter>
                </filters>
                <transformers>
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                        <mainClass>cn.monkey.HelloMaven</mainClass>
                    </transformer>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>

When package runs, maven-jar-plugin executes first, then the shade plugin. Note: because we will later use assembly to pack the shaded uber-jar into the tar.gz, shade must be configured before the assembly plugin.

Note the mainClass setting: it is the main class the JAR will run.

Configuring the Assembly plugin

The pom.xml part

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-assembly-plugin</artifactId>
    <inherited>false</inherited>
    <executions>
        <execution>
            <id>assembly-test</id>
            <phase>package</phase>
            <goals>
                <goal>single</goal>
            </goals>
            <configuration>
                <appendAssemblyId>false</appendAssemblyId>
                <attach>false</attach>
                <finalName>${project.artifactId}-bin</finalName>
                <outputDirectory>${project.build.directory}</outputDirectory>
                <!-- Not using descriptorRef and hadoop-assembly dependency -->
                <!-- to avoid making hadoop-main to depend on a module      -->
                <descriptors>
                    <descriptor>test-assemblies/test-descriptor.xml</descriptor>
                </descriptors>
            </configuration>
        </execution>
    </executions>
</plugin>
  • finalName is the final package name, here the artifactId with a -bin suffix
  • outputDirectory is set to Maven's default output directory, so after packaging the tar.gz lands straight in target

The assembly descriptor

<assembly xmlns="http://maven.apache.org/ASSEMBLY/2.1.0"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/ASSEMBLY/2.1.0 http://maven.apache.org/xsd/assembly-2.1.0.xsd">
<id>assembly-test</id>
<formats>
    <format>tar.gz</format>
</formats>
<fileSets>
    <fileSet>
        <directory>${project.build.directory}</directory>
        <useDefaultExcludes>true</useDefaultExcludes>
        <outputDirectory>lib</outputDirectory>
        <includes>
            <include>*-shaded.jar</include>
        </includes>
    </fileSet>
    <fileSet>
        <directory>${project.build.directory}/classes</directory>
        <outputDirectory>conf</outputDirectory>
        <includes>
            <include>**/config.properties</include>
        </includes>
    </fileSet>
    <fileSet>
        <directory>${project.build.directory}/classes/shell-scripts</directory>
        <outputDirectory>bin</outputDirectory>
        <fileMode>755</fileMode>
        <includes>
            <include>**/start.sh</include>
        </includes>
    </fileSet>
</fileSets>
</assembly>
  • The final artifact is packaged as a tar.gz
  • The first fileset packs lib: our program goes into the archive's lib folder as a JAR
  • The second fileset packs the configuration, here config.properties, into conf
  • The third fileset packs the launch shell script into bin with 755 (executable) permissions

Creating the launch script

Add shell-scripts/start.sh under the main directory; running the program is then just a matter of executing start.sh:

#!/bin/bash
java -jar lib/${artifact.name}
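One caveat: the one-liner above uses a relative lib/ path, so it only works when invoked from the install root. A hedged sketch of a cwd-independent variant, demonstrated with a throwaway /tmp layout that mirrors the tar.gz (bin/ next to lib/); the jar name is a placeholder:

```shell
# assumed scratch layout mirroring the extracted tar.gz
mkdir -p /tmp/assembly-demo/bin /tmp/assembly-demo/lib
cat > /tmp/assembly-demo/bin/start.sh <<'EOF'
#!/bin/bash
# resolve the install root from the script's own location,
# so the launcher works from any working directory
BASE_DIR="$(cd "$(dirname "$0")/.." && pwd)"
echo "would run: java -jar $BASE_DIR/lib/<name>-shaded.jar"
EOF
chmod 755 /tmp/assembly-demo/bin/start.sh
cd / && /tmp/assembly-demo/bin/start.sh
# prints: would run: java -jar /tmp/assembly-demo/lib/<name>-shaded.jar
```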

Packaging the resources

<build>
    <resources>
        <resource>
            <directory>src/main/resources</directory>
            <includes>
                <include>**/*</include>
            </includes>
        </resource>
        <resource>
            <targetPath>${project.build.outputDirectory}/shell-scripts</targetPath>
            <directory>src/main/shell-scripts</directory>
            <filtering>true</filtering>
            <includes>
                <include>**/*</include>
            </includes>
        </resource>
    </resources>
    ....
</build>

We need variable substitution applied to the scripts under shell-scripts, which is why filtering is enabled for that resource directory.
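Conceptually, filtering replaces every ${...} placeholder with the matching Maven property while copying the resource. A minimal sketch of that substitution in plain Java (illustrative only; Maven uses its own interpolation engine, and FilterDemo is hypothetical):

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class FilterDemo {
    // Replace ${key} occurrences with values from props;
    // unknown keys are left untouched, as Maven filtering does.
    static String filter(String text, Map<String, String> props) {
        Matcher m = Pattern.compile("\\$\\{([^}]+)}").matcher(text);
        StringBuffer sb = new StringBuffer();
        while (m.find()) {
            m.appendReplacement(sb, Matcher.quoteReplacement(
                    props.getOrDefault(m.group(1), m.group(0))));
        }
        m.appendTail(sb);
        return sb.toString();
    }

    public static void main(String[] args) {
        String script = "java -jar lib/${artifact.name}";
        // prints: java -jar lib/assembly-test-1.0-SNAPSHOT-shaded.jar
        System.out.println(filter(script, Map.of(
                "artifact.name", "assembly-test-1.0-SNAPSHOT-shaded.jar")));
    }
}
```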

Configuring a profile

<project>
	<profiles>
        <profile>
            <id>pro</id>
            <properties>
                <artifact.name>${project.artifactId}-${project.version}-shaded.jar</artifact.name>
            </properties>
            <activation>
                <!-- activate this profile by default -->
                <activeByDefault>true</activeByDefault>
            </activation>
        </profile>
    </profiles>
</project>

The default active profile here is pro; at package time its artifact.name property is substituted into the scripts under shell-scripts.

Running the package build

Let me walk through Maven's execution output.

# note: the profile pro is selected here (-P pro)

C:\opt\jdk1.8.0_241\bin\java.exe -Dmaven.multiModuleProjectDirectory=C:\Users\China\Desktop\assembly-test -Dmaven.multiModuleProjectDirectory=$MAVEN_HOME -Dmaven.wagon.http.ssl.insecure=true -Dmaven.wagon.http.ssl.allowall=true -Dmaven.wagon.http.ssl.ignore.validity.dates=true -Dmaven.home=C:\Java\apache-maven-3.5.0 -Dclassworlds.conf=C:\Java\apache-maven-3.5.0\bin\m2.conf "-Dmaven.ext.class.path=C:\Program Files\JetBrains\IntelliJ IDEA Community Edition 2020.3.2\plugins\maven\lib\maven-event-listener.jar" "-javaagent:C:\Program Files\JetBrains\IntelliJ IDEA Community Edition 2020.3.2\lib\idea_rt.jar=58840:C:\Program Files\JetBrains\IntelliJ IDEA Community Edition 2020.3.2\bin" -Dfile.encoding=UTF-8 -classpath C:\Java\apache-maven-3.5.0\boot\plexus-classworlds-2.5.2.jar org.codehaus.classworlds.Launcher -Didea.version=2020.3.2 package -P pro
[INFO] Scanning for projects...
[INFO] 
[INFO] ------------------------------------------------------------------------
[INFO] Building assembly-test 1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 

# copy resource files
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ assembly-test ---
[WARNING] File encoding has not been set, using platform encoding UTF-8, i.e. build is platform dependent!
[WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] Copying 1 resource
[INFO] Copying 1 resource to C:\Users\China\Desktop\assembly-test\target\classes/shell-scripts
[INFO] 

# run the compiler plugin
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ assembly-test ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 

# copy test resources (none here)
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ assembly-test ---
[WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] skip non existing resourceDirectory C:\Users\China\Desktop\assembly-test\src\test\resources
[INFO] 

# compile test sources
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ assembly-test ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 

# run unit tests
[INFO] --- maven-surefire-plugin:2.12.4:test (default-test) @ assembly-test ---
[INFO] No tests to run.
[INFO] 

# run Maven's default jar packaging
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ assembly-test ---
[INFO] Building jar: C:\Users\China\Desktop\assembly-test\target\assembly-test-1.0-SNAPSHOT.jar
[INFO] 

# run the shade plugin
[INFO] --- maven-shade-plugin:3.1.1:shade (default) @ assembly-test ---
[INFO] Including commons-cli:commons-cli:jar:1.2 in the shaded jar.
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing C:\Users\China\Desktop\assembly-test\target\assembly-test-1.0-SNAPSHOT.jar with C:\Users\China\Desktop\assembly-test\target\assembly-test-1.0-SNAPSHOT-shaded.jar
[INFO] Dependency-reduced POM written at: C:\Users\China\Desktop\assembly-test\dependency-reduced-pom.xml
[INFO] 

# run the assembly plugin
[INFO] --- maven-assembly-plugin:2.2-beta-5:single (assembly-test) @ assembly-test ---
[INFO] Reading assembly descriptor: test-assemblies/test-descriptor.xml
[INFO] Building tar : C:\Users\China\Desktop\assembly-test\target\assembly-test-bin.tar.gz
[WARNING] Assembly file: C:\Users\China\Desktop\assembly-test\target\assembly-test-bin.tar.gz is not a regular file (it may be a directory). It cannot be attached to the project build for installation or deployment.

[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2.202 s
[INFO] Finished at: 2021-02-06T21:19:21+08:00
[INFO] Final Memory: 15M/491M
[INFO] ------------------------------------------------------------------------

Process finished with exit code 0

Clear enough, isn't it?

After all that, we do indeed end up with a tar.gz.

(screenshot: assembly-test-bin.tar.gz generated in the target directory)

Opening it with an archive tool:

(screenshot: the contents of assembly-test-bin.tar.gz)

Everything is in place, and the shell script has had its variables substituted.

Deploying on Linux

# upload to the Linux server
[root@compile assembly-test]# ll
total 40
-rw-r--r--. 1 root root 39611 Feb  6 21:19 assembly-test-bin.tar.gz

# extract
[root@compile assembly-test]# tar -xvzf assembly-test-bin.tar.gz 
assembly-test-bin/lib/assembly-test-1.0-SNAPSHOT-shaded.jar
assembly-test-bin/conf/config.properties
assembly-test-bin/bin/start.sh

# run
[root@compile assembly-test-bin]# pwd
/root/assembly-test/assembly-test-bin
[root@compile assembly-test-bin]# bin/start.sh 
Hello! Maven Assembly Plugin!

Neat, isn't it? This is what a real, properly shipped program looks like.

Packaging the source code

Finally, to make it easy for readers to try things out, let's package the project source as well, still using the assembly plugin.

Add a new execution to the Assembly plugin

<execution>
    <id>test-source-descriptor</id>
    <phase>compile</phase>
    <goals>
        <goal>single</goal>
    </goals>
    <configuration>
        <appendAssemblyId>false</appendAssemblyId>
        <attach>false</attach>
        <finalName>${project.artifactId}-source</finalName>
        <outputDirectory>${project.build.directory}</outputDirectory>
        <!-- Not using descriptorRef and hadoop-assembly dependency -->
        <!-- to avoid making hadoop-main to depend on a module      -->
        <descriptors>
            <descriptor>test-assemblies/test-source-descriptor.xml</descriptor>
        </descriptors>
    </configuration>
</execution>

Note: this execution's phase is compile, so the source archive is produced as early as the compile phase.

The assembly descriptor for the source package

<assembly xmlns="http://maven.apache.org/ASSEMBLY/2.1.0"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/ASSEMBLY/2.1.0 http://maven.apache.org/xsd/assembly-2.1.0.xsd">
<id>test-source-descriptor</id>
<formats>
    <format>zip</format>
</formats>
<fileSets>
    <fileSet>
        <directory>${project.basedir}</directory>
        <useDefaultExcludes>true</useDefaultExcludes>
        <outputDirectory>.</outputDirectory>
        <includes>
            <include>src/**/*</include>
            <include>test-assemblies/**/*</include>
            <include>pom.xml</include>
        </includes>
    </fileSet>
</fileSets>
</assembly>

This packs the src directory, everything under test-assemblies, and pom.xml together for readers.

Running compile

[INFO] Scanning for projects...
[INFO] 
[INFO] ------------------------------------------------------------------------
[INFO] Building assembly-test 1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ assembly-test ---
[WARNING] File encoding has not been set, using platform encoding UTF-8, i.e. build is platform dependent!
[WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] Copying 1 resource
[INFO] Copying 1 resource to C:\Users\China\Desktop\assembly-test\target\classes/shell-scripts
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ assembly-test ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-assembly-plugin:2.2-beta-5:single (test-source-descriptor) @ assembly-test ---
[INFO] Reading assembly descriptor: test-assemblies/test-source-descriptor.xml
[INFO] Building zip: C:\Users\China\Desktop\assembly-test\target\assembly-test-source.zip
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1.274 s
[INFO] Finished at: 2021-02-06T21:38:37+08:00
[INFO] Final Memory: 13M/491M
[INFO] ------------------------------------------------------------------------

Process finished with exit code 0

You can see the assembly plugin ran and packaged the source code.

(screenshot: assembly-test-source.zip generated in the target directory)

This is how a developer should really be playing with Maven. Reply maven-plugin on my WeChat official account to get the download link for this article's example code. Help yourselves.

See you next time!

That's all.

References

[1] https://maven.apache.org/plugins/

[2] https://github.com/apache/hadoop

[3] https://maven.apache.org/guides/introduction/introduction-to-plugins.html

[4] https://github.com/apache/spark

[5] https://stackoverflow.com/questions/11947037/what-is-an-uber-jar

[6] https://docs.oracle.com/javase/8/docs/technotes/guides/jar/jar.html

[7] https://maven.apache.org/plugins/maven-assembly-plugin/index.html

[8] https://github.com/cko/predefined_maven_properties/blob/master/README.md
