Setting Up an ELK Environment (Part 1)

Introduction

ELK is an acronym for three pieces of open-source software: Elasticsearch, Logstash, and Kibana. A newer addition to the family is Filebeat, a lightweight log collection agent. Filebeat uses very few resources, which makes it well suited to gathering logs on individual servers and shipping them to Logstash; the official documentation also recommends it for this role.

Elasticsearch is an open-source distributed search engine that provides three core capabilities: collecting, analyzing, and storing data. Its features include distributed operation, zero configuration, automatic discovery, automatic index sharding, an index replica mechanism, a RESTful interface, multiple data sources, and automatic search load balancing.

Logstash is a tool for collecting, parsing, and filtering logs, and it supports a large number of data input methods. It usually works in a client/server architecture: the client is installed on each host whose logs need to be collected, while the server filters and transforms the logs received from all nodes before forwarding them to Elasticsearch.

Kibana is likewise open-source and free. It provides a log-analysis-friendly web interface for Logstash and Elasticsearch, helping you aggregate, analyze, and search important log data.

Installation

Elasticsearch

Download: https://www.elastic.co/cn/downloads/elasticsearch

After downloading, extract the archive, go into the bin directory, and run elasticsearch.bat to start it.
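Once Elasticsearch is up, a quick sanity check is to hit its HTTP port, which defaults to 9200; a plain GET returns a small JSON document describing the node. For example, with curl (assuming it is installed; otherwise just open the URL in a browser):

curl http://localhost:9200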

Official configuration documentation: https://www.elastic.co/guide/en/elasticsearch/reference/current/modules-http.html
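That page describes the HTTP module settings in config/elasticsearch.yml. For a local setup the defaults already work; as a minimal sketch, the two settings you are most likely to touch are the bind address and the HTTP port (the values shown here simply mirror a local-only default):

# config/elasticsearch.yml
network.host: 127.0.0.1
http.port: 9200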

Kibana

Download: https://www.elastic.co/cn/downloads/kibana

Logstash

Download: https://www.elastic.co/cn/downloads/logstash

Configuration

Create a configuration file in the config directory of the extracted Logstash folder; the name is arbitrary (the example below uses log.conf, which the startup command later refers to).

log.conf

input {  
  tcp {
      port => 4560
      codec => json_lines  
  }  
}
  
output {  
  elasticsearch { 
     # Elasticsearch address
     hosts => ["localhost:9200"]  
     index => "applog"  
  }  
}
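Before starting everything, you can have Logstash validate the file without running the pipeline; in the 6.x line the --config.test_and_exit flag parses the configuration and then exits:

.\logstash -f C:\work\logstash-6.2.3\config\log.conf --config.test_and_exit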

Start Elasticsearch

Start Logstash:

.\logstash -f C:\work\logstash-6.2.3\config\log.conf --debug

Start Kibana
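Kibana is started the same way, by running kibana.bat in its bin directory. If Elasticsearch is not running on the default address, point Kibana at it in config/kibana.yml; in the 6.x line the relevant setting is elasticsearch.url (only needed when you deviate from the defaults):

# config/kibana.yml
elasticsearch.url: "http://localhost:9200"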

Add the Logback configuration to the application: logback-spring.xml

<?xml version="1.0" encoding="UTF-8"?>
<configuration scan="true" scanPeriod="10 seconds">

    <springProperty scope="context" name="springAppName"
                    source="spring.application.name" />

    <property name="CONSOLE_LOG_PATTERN"
              value="%date [%thread] %-5level %logger{36} - %msg%n" />

    <appender name="stdout" class="ch.qos.logback.core.ConsoleAppender">
        <withJansi>true</withJansi>
        <encoder>
            <pattern>${CONSOLE_LOG_PATTERN}</pattern>
            <charset>utf8</charset>
        </encoder>
    </appender>

    <appender name="logstash"
              class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>127.0.0.1:4560</destination>
        <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
            <providers>
                <timestamp>
                    <timeZone>UTC</timeZone>
                </timestamp>
                <pattern>
                    <pattern>
                        {
                        "severity":"%level",
                        "service": "${springAppName:-}",
                        "trace": "%X{X-B3-TraceId:-}",
                        "span": "%X{X-B3-SpanId:-}",
                        "exportable": "%X{X-Span-Export:-}",
                        "pid": "${PID:-}",
                        "thread": "%thread",
                        "class": "%logger{40}",
                        "rest": "%message"
                        }
                    </pattern>
                </pattern>
            </providers>
        </encoder>
    </appender>

    <appender name="dailyRollingFileAppender" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <File>main.log</File>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <FileNamePattern>main.%d{yyyy-MM-dd}.log</FileNamePattern>
            <maxHistory>30</maxHistory>
        </rollingPolicy>
        <encoder>
            <Pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{35} - %msg %n</Pattern>
        </encoder>
        <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
            <level>DEBUG</level>
        </filter>
    </appender>

    <springProfile name="!production">
        <logger name="com.example" level="DEBUG" />
        <logger name="org.springframework.web" level="INFO"/>
        <root level="info">
            <appender-ref ref="stdout" />
            <appender-ref ref="dailyRollingFileAppender" />
            <appender-ref ref="logstash" />
        </root>
    </springProfile>

    <springProfile name="production">
        <logger name="com.example" level="DEBUG" />
        <logger name="org.springframework.web" level="INFO"/>
        <root level="info">
            <appender-ref ref="stdout" />
            <appender-ref ref="dailyRollingFileAppender" />
            <appender-ref ref="logstash" />
        </root>
    </springProfile>
</configuration>
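Note that the destination of the logstash appender (127.0.0.1:4560) must match the host and port of the tcp input in log.conf above. LogstashTcpSocketAppender writes each log event to that socket as one JSON object per line, which is exactly what the json_lines codec on the Logstash side expects.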

Add the logstash-logback-encoder dependency to the pom:

<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>4.9</version>
</dependency>
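logstash-logback-encoder provides only the encoder and appender classes and expects Logback itself to be on the classpath already. In a Spring Boot project that is normally the case, since the standard starters pull in spring-boot-starter-logging and with it logback-classic, so no additional dependency should be required.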

增長測試代碼

package com.example.sharding.controller;


import com.dangdang.ddframe.rdb.sharding.id.generator.IdGenerator;
import com.example.sharding.entity.Order;
import com.example.sharding.service.OrderService;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/order")
public class OrderController {

    private static final Logger LOGGER = LoggerFactory.getLogger(OrderController.class);

    @Autowired
    private IdGenerator idGenerator;
    @Autowired
    private OrderService orderService;

    @RequestMapping("/add")
    public Object add() {
        for (int i = 0; i < 50; i++) {
            Order order = new Order();
            order.setUserId(idGenerator.generateId().longValue());
            order.setOrderId(idGenerator.generateId().longValue());
            orderService.save(order);
            LOGGER.info(order.toString());
        }
        return "success";
    }

    @RequestMapping("query")
    private Object queryAll() {
        LOGGER.info("queryAll");
        return orderService.findAll();
    }

    @RequestMapping("deleteAll")
    private void deleteAll() {
        LOGGER.info("deleteAll");
        orderService.deleteAll();
    }
}

After the endpoints have been hit a few times and logs have been recorded, refresh Elasticsearch and you will see log entries arriving.
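One way to confirm this from the command line is to list the indices, or query the applog index directly (a sketch, assuming curl and the default port):

curl http://localhost:9200/_cat/indices?v
curl http://localhost:9200/applog/_search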

Open Kibana:

http://localhost:5601/app/kibana
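Before the log entries appear in the Discover view, Kibana needs an index pattern; in the 6.x UI this is created under Management > Index Patterns, using a pattern that matches the applog index.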
