slf4j-logback: shipping logs to ELK in JSON format

A colleague put this together; sharing it here. Any SLF4J implementation (logback, log4j2, etc.) can emit logs in JSON format; logback is used here. You could also emit plain text lines and parse them with grok in logstash, but emitting JSON directly is more efficient on the logstash side.

Emitting JSON log files from logback

To make logback write JSON-formatted log files, add the following dependency to pom.xml:

<dependency>
   <groupId>net.logstash.logback</groupId>
   <artifactId>logstash-logback-encoder</artifactId>
   <version>4.8</version>
   <scope>runtime</scope>
</dependency>

Example logback configuration:

<appender name="errorFile" class="ch.qos.logback.core.rolling.RollingFileAppender">

   <filter class="ch.qos.logback.classic.filter.LevelFilter">

      <level>ERROR</level>

      <onMatch>ACCEPT</onMatch>

      <onMismatch>DENY</onMismatch>

   </filter>

   <file>${log.dir}/elk/error.log</file> <!-- the current log file lives under the elk directory; filebeat ships its contents to ES -->

   <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">  <!-- rolled-over logs go under the bak directory; keep at most 7 days of history and at most 1 GB in total -->

      <fileNamePattern>${log.dir}/bak/error.%d{yyyy-MM-dd}.log</fileNamePattern>

      <maxHistory>7</maxHistory>

      <totalSizeCap>1GB</totalSizeCap>

   </rollingPolicy>

   <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">

       

      <providers>

         <pattern>

            <pattern>

              {

              "tags": ["errorlog"],

              "project": "myproject",

              "timestamp": "%date{\"yyyy-MM-dd'T'HH:mm:ss,SSSZ\"}",

              "log_level": "%level",

              "thread": "%thread",

              "class_name": "%class",

              "line_number": "%line",

              "message": "%message",

              "stack_trace": "%exception{5}",

              "req_id": "%X{reqId}",

              "elapsed_time": "#asLong{%X{elapsedTime}}"

              }

            </pattern>

         </pattern>

      </providers>

   </encoder>

</appender>
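For reference, the encoder writes each log event as a single JSON object on one line. Pretty-printed here, a hypothetical ERROR event (all values invented for illustration) would look roughly like:

```json
{
  "tags": ["errorlog"],
  "project": "myproject",
  "timestamp": "2017-03-01T12:00:00,123+0800",
  "log_level": "ERROR",
  "thread": "http-nio-8080-exec-1",
  "class_name": "com.example.OrderService",
  "line_number": "42",
  "message": "order lookup failed",
  "stack_trace": "java.lang.NullPointerException: null",
  "req_id": "8f14e45fceea167a5a36dedd4bea2543",
  "elapsed_time": 13
}
```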

 

 

JSON field reference:

Name            Description                                                   Notes
tags            Which category of log this entry belongs to
timestamp       Time the log event was recorded
project         System name: which system the log came from
log_level       Level of the log entry
thread          Name of the thread that produced the log event
class_name      Fully qualified name of the caller issuing the log request
line_number     Line number of the logging statement
message         The message supplied by the application
stack_trace     Exception stack trace
req_id          Request ID, used to trace a request                           requires aop-logging
elapsed_time    Execution time of the method, in milliseconds                 requires aop-logging

%X{key}: the value is read from the SLF4J MDC; this requires aop-logging:

<dependency>
    <groupId>com.cloud</groupId>
    <artifactId>xspring-aop-logging</artifactId>
    <version>0.7.1</version>
</dependency>

 

For web applications, add ReqIdFilter to web.xml; this filter puts a reqId into the MDC:

<filter>

    <filter-name>aopLogReqIdFilter</filter-name>

    <filter-class>com.github.nickvl.xspring.core.log.aop.ReqIdFilter</filter-class>

</filter>

 

<filter-mapping>

    <filter-name>aopLogReqIdFilter</filter-name>

    <url-pattern>/*</url-pattern>

</filter-mapping>

 

 

Or register it in Spring Boot like this:

 

 

@Bean
public FilterRegistrationBean getDemoFilter() {
    ReqIdFilter reqIdFilter = new ReqIdFilter();
    FilterRegistrationBean registrationBean = new FilterRegistrationBean();
    registrationBean.setFilter(reqIdFilter);
    List<String> urlPatterns = new ArrayList<String>();
    urlPatterns.add("/*");
    registrationBean.setUrlPatterns(urlPatterns);
    registrationBean.setOrder(100);
    return registrationBean;
}
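The exact behavior of ReqIdFilter is specific to the aop-logging library, but conceptually it mints one unique id per request and stores it in the MDC under "reqId". A minimal, self-contained sketch of that idea (the class and method names here are invented for illustration):

```java
import java.util.UUID;

public class ReqIdDemo {
    // Hypothetical stand-in for what ReqIdFilter does: generate a unique
    // id per request; the real filter would put it into the SLF4J MDC
    // under the key "reqId" so %X{reqId} resolves in the log pattern.
    static String newReqId() {
        return UUID.randomUUID().toString().replace("-", "");
    }

    public static void main(String[] args) {
        System.out.println("reqId=" + newReqId());
    }
}
```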

 

If you also want to record the method execution time (elapsed_time), add the following annotations to the class or method:

 

import com.github.nickvl.xspring.core.log.aop.annotation.LogDebug;

import com.github.nickvl.xspring.core.log.aop.annotation.LogInfo;

 

@LogInfo  // logged when the logger level is set to INFO

@LogException(value = {@Exc(value = Exception.class, stacktrace = false)}, warn = {@Exc({IllegalArgumentException.class})})  // logged when the logger level is set to ERROR

 

For logging on the dubbo consumer side: dubbo consumers are dynamic classes generated via javassist, so to capture a dubbo interface's input parameters, return value, and call time you need aop-logging, and the annotations above must be added to the corresponding classes or methods of the interfaces in the eye-rpc package.

Dubbo consumer logs are captured with the following logger configuration:

 

 <logger name="com.alibaba.dubbo.common.bytecode" level="INFO" additivity="false">

   <appender-ref ref="dubboApiFile"/>

</logger>

 

ElasticSearch template setup

curl -XPUT http://localhost:9200/_template/log -d '{
  "mappings": {
    "_default_": {
      "_all": {
        "enabled": false
      },
      "_meta": {
        "version": "5.1.1"
      },
      "dynamic_templates": [
        {
          "strings_as_keyword": {
            "mapping": {
              "ignore_above": 1024,
              "type": "keyword"
            },
            "match_mapping_type": "string"
          }
        }
      ],
      "properties": {
        "@timestamp": {
          "type": "date"
        },
        "beat": {
          "properties": {
            "hostname": {
              "ignore_above": 1024,
              "type": "keyword"
            },
            "name": {
              "ignore_above": 1024,
              "type": "keyword"
            },
            "version": {
              "ignore_above": 1024,
              "type": "keyword"
            }
          }
        },
        "input_type": {
          "ignore_above": 1024,
          "type": "keyword"
        },
        "message": {
          "norms": false,
          "type": "text"
        },
        "offset": {
          "type": "long"
        },
        "source": {
          "ignore_above": 1024,
          "type": "keyword"
        },
        "tags": {
          "ignore_above": 1024,
          "type": "keyword"
        },
        "type": {
          "ignore_above": 1024,
          "type": "keyword"
        }
      }
    }
  },
  "order": 0,
  "settings": {
    "index.refresh_interval": "5s"
  },
  "template": "log-*"
}'

 

curl -XPUT http://localhost:9200/_template/log-java -d '
{
  "mappings": {
    "_default_": {
      "properties": {
        "log_level": {
          "ignore_above": 1024,
          "type": "keyword"
        },
        "project": {
          "ignore_above": 1024,
          "type": "keyword"
        },
        "thread": {
          "ignore_above": 1024,
          "type": "keyword"
        },
        "req_id": {
          "ignore_above": 1024,
          "type": "keyword"
        },
        "class_name": {
          "ignore_above": 1024,
          "type": "keyword"
        },
        "line_number": {
          "type": "long"
        },
        "exception_class": {
          "ignore_above": 1024,
          "type": "keyword"
        },
        "elapsed_time": {
          "type": "long"
        },
        "stack_trace": {
          "type": "keyword"
        }
      }
    }
  },
  "order": 1,
  "settings": {
    "index.refresh_interval": "5s"
  },
  "template": "log-java-*"
}'

Logstash setup

logstash filter for Java logs (logstash-java-log):

if [fields][logType] == "java" {

    json {

        source => "message"

        remove_field => ["offset"]

    }

    date {

        match => ["timestamp","yyyy-MM-dd'T'HH:mm:ss,SSSZ"]

        remove_field => ["timestamp"]

    }

    if [stack_trace] {

         mutate {

            add_field => { "exception_class" => "%{stack_trace}" }

        }

    }

    if [exception_class] {

         mutate {

            gsub => [

                "exception_class", "\n", "",

                "exception_class", ":.*", ""

            ]

        }

    }

}
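The two gsub steps copy the stack trace into exception_class, strip newlines, then drop everything from the first colon onward, leaving only the exception class name of the first trace line. That transformation can be mirrored in plain Java to see what it produces (the sample trace below is invented; this is an illustration, not part of the pipeline):

```java
public class ExceptionClassExtract {
    // Mirror of the two logstash gsub steps: remove newlines, then strip
    // everything from the first ':' onward.
    static String extractExceptionClass(String stackTrace) {
        return stackTrace.replace("\n", "").replaceAll(":.*", "");
    }

    public static void main(String[] args) {
        String trace = "java.lang.IllegalStateException: boom\n\tat com.example.Foo.bar(Foo.java:10)";
        System.out.println(extractExceptionClass(trace)); // java.lang.IllegalStateException
    }
}
```

Note this relies on the exception's first line containing a ": message" part; a trace without a message would not be cleaned as neatly.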

Filebeat setup

filebeat.yml

filebeat.prospectors:

- input_type: log

  paths:

    - /eyebiz/logs/eyebiz-service/elk/*.log   # eyebiz-service logs

    - /eyebiz/logs/eyebiz-web/elk/*.log       # eyebiz-web logs

  fields:

    logType: "java"

    docType: "log-java-dev"
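These custom fields are attached to every event filebeat ships, and they are what the `[fields][logType] == "java"` conditional in the logstash filter keys on. A rough sketch of a shipped event (field values invented; shape approximates filebeat 5.x output):

```json
{
  "@timestamp": "2017-03-01T04:00:01.000Z",
  "beat": { "hostname": "app-01", "name": "app-01", "version": "5.1.1" },
  "source": "/eyebiz/logs/eyebiz-service/elk/error.log",
  "fields": { "logType": "java", "docType": "log-java-dev" },
  "message": "{\"tags\":[\"errorlog\"],\"project\":\"myproject\",\"log_level\":\"ERROR\"}"
}
```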
