ELK Notes for Beginners (1): Environment Setup + Logging with log4j

1. Background

In the era of big data, log collection and management have become especially important.

With the traditional approach of writing logs to plain files, searching is inconvenient and the logs end up scattered across many servers, which makes them quite troublesome to manage.

Finding a particular piece of information in the logs by keyword is very difficult.

This is where ELK comes in.

What is ELK?

In short: it is a complete platform for log collection and analysis.

 

2. The Technology Stack

ELK = Elasticsearch + Logstash + Kibana

2-1) Elasticsearch

Elasticsearch is a distributed, open-source search engine based on Apache Lucene, released under the Apache 2.0 license (which means it can be downloaded, used, and modified free of charge).

It provides horizontal scalability, reliability, and multi-tenant capability for real-time search, and its features are exposed as JSON over a RESTful API.

The search capability is backed by the schema-less Apache Lucene engine, which allows data to be indexed dynamically without knowing its structure beforehand; because the text is indexed up front, search responses are fast.
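To make the "JSON over a RESTful API" point concrete, here is a minimal sketch using nothing but the JDK. It assumes an Elasticsearch instance is running on localhost:9200; the index name demo-index, the type demo-type, and the document itself are made up for illustration:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

/** Indexes one JSON document into Elasticsearch over its REST API. */
public class EsIndexExample {
    public static void main(String[] args) throws Exception {
        // PUT /<index>/<type>/<id> creates or replaces a document; the index and its
        // mapping are created on the fly if they do not exist yet (schema-less indexing).
        URL url = new URL("http://localhost:9200/demo-index/demo-type/1");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("PUT");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);

        String json = "{\"user\":\"KG\",\"message\":\"hello elasticsearch\"}";
        try (OutputStream out = conn.getOutputStream()) {
            out.write(json.getBytes(StandardCharsets.UTF_8));
        }

        // Elasticsearch answers with a JSON body describing the result of the operation.
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}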

2-2) Logstash

Logstash is a data pipeline that helps collect, parse, and analyze a large variety of structured and unstructured data and events generated across various systems.

It provides plugins for connecting to many types of input sources and platforms, and is designed to efficiently process logs, events, and unstructured data, distributing the results to a variety of outputs through its output plugins, such as file, stdout (printed on the console running Logstash), or Elasticsearch.

 2-3) Kibana

(Kibana is an open source Apache 2.0 licensed data visualization platform that helps in

visualizing any kind of structured and unstructured data stored in Elasticsearch indexes.

Kibana is entirely written in HTML and JavaScript. )

Kibana是一個基於Apache 2.0協議的開源可視化平臺,它用來可視化任何結構化的和非結構化的存儲在Elasticsearch索引

中的數據。Kibana徹底用HTML和Javascript編寫。

 

3. Download / Install / Configure / Start

Enough preamble; now for the hands-on part every programmer is interested in. Let's start!

3-1) Installing Elasticsearch

Download:

https://www.elastic.co/downloads/elasticsearch

a) Unpack the archive elasticsearch-5.2.2.tar.gz

    (I am on a Mac.) I extracted it under /usr/local, so the full path is:

    /usr/local/elasticsearch-5.2.2

b) Edit the configuration file

   cd config

   vi elasticsearch.yml

   The relevant settings are as follows (the /tmp paths below are convenient for a local trial, but keep in mind that /tmp is usually cleared on reboot):

# ---------------------------------- Cluster -----------------------------------
#
# Use a descriptive name for your cluster:
#
cluster.name: my-application
#
# ------------------------------------ Node ------------------------------------
#
# Use a descriptive name for the node:
#
node.name: node-1
#
# Add custom attributes to the node:
#
#node.attr.rack: r1
#
# ----------------------------------- Paths ------------------------------------
#
# Path to directory where to store the data (separate multiple locations by comma):
#
path.data: /tmp/elasticsearch/data
#
# Path to log files:
#
path.logs: /tmp/elasticsearch/logs
# ---------------------------------- Network -----------------------------------
#
# Set the bind address to a specific IP (IPv4 or IPv6):
#
network.host: localhost
#
# Set a custom port for HTTP:
#
http.port: 9200

 

c) Start Elasticsearch

./bin/elasticsearch

If no errors occur, you can move on to verification.

d) Verify

You can use a cURL command:

curl 'http://localhost:9200/?pretty'

or open the following URL directly in a browser:

http://localhost:9200
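If everything is running, Elasticsearch answers with a small JSON document. The exact values (node name, cluster UUID, build details) will differ on your machine, but the response looks roughly like this:

{
  "name" : "node-1",
  "cluster_name" : "my-application",
  "cluster_uuid" : "...",
  "version" : {
    "number" : "5.2.2",
    ...
  },
  "tagline" : "You Know, for Search"
}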

 

3-2) Installing Logstash

a) Unpack the archive logstash-5.2.2.tar.gz

    The full path is:

    /usr/local/logstash-5.2.2

b) Create the configuration file

   cd config

   Create a new configuration file named log4j_es.conf

   vi log4j_es.conf

   The contents are as follows (the log4j input plugin listens on TCP port 4560 for serialized LoggingEvent objects sent by log4j's SocketAppender; the output section prints each event to the console and also indexes it into Elasticsearch, one index per day):

input {
    log4j {
        host => "127.0.0.1"
        port => 4560
    }
}

output {
    stdout {
      codec => rubydebug
    }
    elasticsearch{
        hosts => ["localhost:9200"]
        index => "log4j-%{+YYYY.MM.dd}"
        document_type => "log4j_type"
    }
}

c) Start Logstash

./bin/logstash -f config/log4j_es.conf

 

3-3) Installing Kibana

a) Unpack the archive kibana-5.2.2.tar.gz
The full path is:
/usr/local/kibana-5.2.2


b) Edit the configuration file
cd config
vi kibana.yml
The relevant settings are as follows:

# Kibana is served by a back end server. This setting specifies the port to use.
server.port: 5601

server.host: "localhost"

# The URL of the Elasticsearch instance to use for all your queries.
elasticsearch.url: "http://localhost:9200"

# Kibana uses an index in Elasticsearch to store saved searches, visualizations and
# dashboards. Kibana creates a new index if the index doesn't already exist.
kibana.index: ".kibana"

c) Start Kibana

./bin/kibana

 

d) Verify
Open:
http://localhost:5601/

You should see the Kibana logo.

 

4. Sending log4j Logs to Logstash

4-1) Create a new Maven project

The key dependency in the pom is as follows (log4j 1.2.x already includes the SocketAppender used below, so nothing else is needed for the Logstash integration):

<dependency>
    <groupId>log4j</groupId>
    <artifactId>log4j</artifactId>
    <version>1.2.17</version>
</dependency>

4-2) log4j.properties (placed under the resources folder)

### root logger configuration ###
log4j.rootLogger = debug,stdout,D,E,logstash

### print messages to the console ###
log4j.appender.stdout = org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target = System.out
log4j.appender.stdout.layout = org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern = [%-5p] %d{yyyy-MM-dd HH:mm:ss,SSS} method:%l%n%m%n

### write logs of DEBUG level and above to /Users/KG/Documents/logs/elk/debug.log ###
log4j.appender.D = org.apache.log4j.DailyRollingFileAppender
log4j.appender.D.File = /Users/KG/Documents/logs/elk/debug.log
log4j.appender.D.Append = true
log4j.appender.D.Threshold = DEBUG
log4j.appender.D.layout = org.apache.log4j.PatternLayout
log4j.appender.D.layout.ConversionPattern = %-d{yyyy-MM-dd HH:mm:ss}  [ %t:%r ] - [ %p ]  %m%n

### write logs of ERROR level and above to /Users/KG/Documents/logs/elk/error.log ###
log4j.appender.E = org.apache.log4j.DailyRollingFileAppender
log4j.appender.E.File =/Users/KG/Documents/logs/elk/error.log
log4j.appender.E.Append = true
log4j.appender.E.Threshold = ERROR
log4j.appender.E.layout = org.apache.log4j.PatternLayout
log4j.appender.E.layout.ConversionPattern = %-d{yyyy-MM-dd HH:mm:ss}  [ %t:%r ] - [ %p ]  %m%n

### send logs to Logstash via a SocketAppender (serialized LoggingEvents over TCP) ###
log4j.appender.logstash=org.apache.log4j.net.SocketAppender
log4j.appender.logstash.RemoteHost=127.0.0.1
log4j.appender.logstash.port=4560
log4j.appender.logstash.ReconnectionDelay=60000
# include class/method/line information in each event
log4j.appender.logstash.LocationInfo=true

4-3) Java code (ElkLog4jTest.java):

package org.genesis.arena.elk;

import org.apache.log4j.Logger;

/**
 * Created by KG on 17/3/27.
 */
public class ElkLog4jTest {
    private static final Logger logger = Logger.getLogger(ElkLog4jTest.class);
    public static void main(String[] args) throws Exception {
        logger.debug("This is a debug message!");
        logger.info("This is info message!");
        logger.warn("This is a warn message!");
        logger.error("This is error message!");

        try{
            // deliberately divide by zero so an exception is logged at ERROR level
            System.out.println(5/0);
        }catch(Exception e){
            logger.error(e);
        }
    }
}

4-4) Run results

[DEBUG] 2017-03-29 12:56:00,454 method:org.genesis.arena.elk.ElkLog4jTest.main(ElkLog4jTest.java:11)
This is a debug message!
[INFO ] 2017-03-29 12:56:00,529 method:org.genesis.arena.elk.ElkLog4jTest.main(ElkLog4jTest.java:12)
This is info message!
[WARN ] 2017-03-29 12:56:00,531 method:org.genesis.arena.elk.ElkLog4jTest.main(ElkLog4jTest.java:13)
This is a warn message!
[ERROR] 2017-03-29 12:56:00,533 method:org.genesis.arena.elk.ElkLog4jTest.main(ElkLog4jTest.java:14)
This is error message!
[ERROR] 2017-03-29 12:56:00,538 method:org.genesis.arena.elk.ElkLog4jTest.main(ElkLog4jTest.java:19)
java.lang.ArithmeticException: / by zero

You will then see the corresponding events printed in the Logstash console, rendered by the rubydebug codec.

 

5. Hooking Up Kibana

5-1) Open http://localhost:5601/

5-2) Create an index pattern

   Remember the index we configured in the Logstash configuration file?

   log4j-%{+YYYY.MM.dd}

   So the index pattern to create is: log4j-*

5-3) Verify

From here you can see the log messages you just logged from the Java code.
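If you want to double-check outside of Kibana, the same data can be queried straight from Elasticsearch's REST API. Below is a small sketch, assuming Elasticsearch is still on localhost:9200 and that the Logstash output created daily indexes matching log4j-*; the query term "error" is just an example:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

/** Searches the daily log4j-* indexes for events whose message contains "error". */
public class EsSearchExample {
    public static void main(String[] args) throws Exception {
        // _search with a query-string query; "pretty" makes the returned JSON readable
        URL url = new URL("http://localhost:9200/log4j-*/_search?q=message:error&pretty");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");

        // print the raw JSON search response (hits, scores, and the original documents)
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}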

And with that, everything is up and running!

Quite a sense of accomplishment, isn't it?
