Elasticsearch is an open-source, distributed search engine. Its features include: distributed architecture, zero configuration, automatic discovery, automatic index sharding, an index replica mechanism, a RESTful interface, multiple data sources, and automatic search load balancing.
Logstash is a fully open-source tool that collects and filters your logs and stores them for later use (for example, searching).
Kibana is also a free, open-source tool. It provides a friendly web interface for analyzing the logs collected by Logstash and stored in Elasticsearch, helping you aggregate, analyze, and search important log data.
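Because Elasticsearch exposes everything through its RESTful interface, you can try it with nothing but curl. A minimal sketch, assuming a cluster reachable on localhost:9200 and Elasticsearch 6.x or later for the _doc endpoint; the demo index name and document are made up for illustration:

# Index a sample document
curl -X POST 'http://localhost:9200/demo/_doc' -H 'Content-Type: application/json' -d '{"message": "hello elk"}'

# Search it back
curl 'http://localhost:9200/demo/_search?q=message:hello&pretty'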
We will install ELK from a Docker image.
# Pull the image
docker pull sebp/elk

# Start the container, limiting the Elasticsearch heap size
# (2fbf0a30426d is the local image ID; the image name sebp/elk works just as well)
docker run -e ES_JAVA_OPTS="-Xms256m -Xmx256m" -p 5601:5601 -p 5044:5044 -p 9200:9200 -p 9300:9300 -it --name elk 2fbf0a30426d
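It can take a minute or two for all three services to come up. As an optional sanity check, assuming the port mappings above (note that on Linux hosts Elasticsearch needs vm.max_map_count of at least 262144, so raise it on the host if the container keeps exiting):

# Confirm the container is running
docker ps --filter name=elk

# Elasticsearch answers with cluster information on port 9200
curl http://127.0.0.1:9200

# Only if Elasticsearch refuses to start: raise the mmap limit on the host
# sudo sysctl -w vm.max_map_count=262144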
We need to modify the Logstash configuration. Open a new terminal window and run the following Docker command.
# Enter the container with exec
docker exec -it elk /bin/bash
Once inside the container, edit /etc/logstash/conf.d/02-beats-input.conf:
input {
  tcp {
    port => 5044
    codec => json_lines
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
After saving, detach from the container with Ctrl + P + Q, then restart the container so the new configuration takes effect.
docker restart elk
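Before wiring up the application, you can optionally smoke-test the new TCP input by hand. A sketch assuming nc (netcat) is installed on the host and that the elasticsearch output falls back to its default logstash-* index name:

# Send one JSON line to the Logstash TCP input
echo '{"message": "hello from netcat"}' | nc 127.0.0.1 5044

# A logstash-* index should appear shortly afterwards
curl 'http://127.0.0.1:9200/_cat/indices?v'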
Now visit http://127.0.0.1:5601 to open Kibana.
Create a project called springboot-elk and use Logback to record its logs.
<?xml version="1.0" encoding="UTF-8"?> <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"> <modelVersion>4.0.0</modelVersion> <groupId>com.gf</groupId> <artifactId>springboot-elk</artifactId> <version>0.0.1-SNAPSHOT</version> <packaging>jar</packaging> <name>springboot-elk</name> <description>Demo project for Spring Boot</description> <parent> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-starter-parent</artifactId> <version>2.1.1.RELEASE</version> <relativePath/> <!-- lookup parent from repository --> </parent> <properties> <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding> <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding> <java.version>1.8</java.version> </properties> <dependencies> <dependency> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-starter-web</artifactId> </dependency> <dependency> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-starter-test</artifactId> <scope>test</scope> </dependency> <!-- logback --> <dependency> <groupId>ch.qos.logback</groupId> <artifactId>logback-classic</artifactId> </dependency> <dependency> <groupId>net.logstash.logback</groupId> <artifactId>logstash-logback-encoder</artifactId> <version>5.2</version> </dependency> </dependencies> <build> <plugins> <plugin> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-maven-plugin</artifactId> <configuration> <fork>true</fork> </configuration> </plugin> </plugins> </build> </project>
SpringbootElkApplication.java:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@RestController
@SpringBootApplication
public class SpringbootElkApplication {

    private final static Logger logger = LoggerFactory.getLogger(SpringbootElkApplication.class);

    public static void main(String[] args) {
        SpringApplication.run(SpringbootElkApplication.class, args);
    }

    @GetMapping("/{name}")
    public String hi(@PathVariable(value = "name") String name) {
        logger.info("name = {}", name);
        return "hi , " + name;
    }

}
<?xml version="1.0" encoding="UTF-8"?> <!--該日誌將日誌級別不一樣的log信息保存到不一樣的文件中 --> <configuration> <include resource="org/springframework/boot/logging/logback/defaults.xml" /> <springProperty scope="context" name="springAppName" source="spring.application.name" /> <!-- 日誌在工程中的輸出位置 --> <property name="LOG_FILE" value="${BUILD_FOLDER:-build}/${springAppName}" /> <!-- 控制檯的日誌輸出樣式 --> <property name="CONSOLE_LOG_PATTERN" value="%clr(%d{yyyy-MM-dd HH:mm:ss.SSS}){faint} %clr(${LOG_LEVEL_PATTERN:-%5p}) %clr(${PID:- }){magenta} %clr(---){faint} %clr([%15.15t]){faint} %m%n${LOG_EXCEPTION_CONVERSION_WORD:-%wEx}}" /> <!-- 控制檯輸出 --> <appender name="console" class="ch.qos.logback.core.ConsoleAppender"> <filter class="ch.qos.logback.classic.filter.ThresholdFilter"> <level>INFO</level> </filter> <!-- 日誌輸出編碼 --> <encoder> <pattern>${CONSOLE_LOG_PATTERN}</pattern> <charset>utf8</charset> </encoder> </appender> <!-- 爲logstash輸出的JSON格式的Appender --> <appender name="logstash" class="net.logstash.logback.appender.LogstashTcpSocketAppender"> <destination>127.0.0.1:5044</destination> <!-- 日誌輸出編碼 --> <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder"> <providers> <timestamp> <timeZone>UTC</timeZone> </timestamp> <pattern> <pattern> { "severity": "%level", "service": "${springAppName:-}", "trace": "%X{X-B3-TraceId:-}", "span": "%X{X-B3-SpanId:-}", "exportable": "%X{X-Span-Export:-}", "pid": "${PID:-}", "thread": "%thread", "class": "%logger{40}", "rest": "%message" } </pattern> </pattern> </providers> </encoder> </appender> <!-- 日誌輸出級別 --> <root level="INFO"> <appender-ref ref="console" /> <appender-ref ref="logstash" /> </root> </configuration>
Start the project; the logs are shipped to Elasticsearch, and after a little configuration in Kibana's web UI we can view them. First generate a few log entries (a quick sketch follows), then I will briefly walk through the Kibana configuration.
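With the ELK container still running, hit the endpoint a few times to give us some data; a quick sketch, assuming Spring Boot's default port 8080:

# Each request writes an INFO log line that is shipped to Logstash
curl http://localhost:8080/world
curl http://localhost:8080/elk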
Configure the index pattern: enter * to match all data.
Select @timestamp as the time field so the data is displayed sorted by time.
That's it. Click Discover and you can see the log messages from our springboot-elk project.
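In the Discover search bar you can also filter on the fields emitted by the JSON encoder, for example (Lucene query syntax; the service value assumes spring.application.name is set as suggested earlier):

severity:INFO AND service:springboot-elk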
Source code: https://github.com/gf-huanchupk/SpringBootLearning