Spring Boot connects to Logstash in the ELK stack and writes logs to a specified index; Elasticsearch stores the logs, and Kibana is used to query and analyze them. First, add the logstash-logback-encoder dependency to build.gradle:
implementation 'net.logstash.logback:logstash-logback-encoder:5.3'
<?xml version="1.0" encoding="UTF-8"?>
<configuration debug="false">
    <!-- Log file storage path; avoid relative paths in Logback configuration -->
    <property name="LOG_HOME" value="./logs" />

    <!-- Console output -->
    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <encoder class="ch.qos.logback.classic.encoder.PatternLayoutEncoder">
            <!-- %d: date, %thread: thread name, %-5level: level padded to 5 chars, %msg: log message, %n: newline -->
            <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{50} - %msg %n</pattern>
        </encoder>
    </appender>

    <!-- Ship logs to Logstash over TCP -->
    <appender name="logstash" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>127.0.0.1:5000</destination>
        <encoder charset="UTF-8" class="net.logstash.logback.encoder.LogstashEncoder">
            <!-- Extra JSON field; Logstash uses it below as the index name -->
            <customFields>{"appname":"elkDemo"}</customFields>
        </encoder>
    </appender>

    <!-- Root log level -->
    <root level="INFO">
        <appender-ref ref="STDOUT" />
        <appender-ref ref="logstash" />
    </root>
</configuration>
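To see what the STDOUT `pattern` above renders, here is a minimal pure-JDK sketch that reproduces the same layout by hand (the class name and message are made-up examples; in the real app Logback does this formatting for you):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

// Illustrative only: mimics the console pattern
// %d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{50} - %msg
public class PatternDemo {
    static String format(String level, String logger, String msg) {
        String ts = LocalDateTime.now()
                .format(DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss.SSS"));
        // %-5s pads the level to 5 characters, left-aligned, like %-5level
        return String.format("%s [%s] %-5s %s - %s",
                ts, Thread.currentThread().getName(), level, logger, msg);
    }

    public static void main(String[] args) {
        System.out.println(format("INFO", "com.example.DemoController", "hello elk"));
    }
}
```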
server.port=81
logging.config=classpath:logback-spring.xml
input {
  tcp {
    port => 5000
    codec => "json"
  }
}

## Add your filters / logstash plugins configuration here

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    user => "elastic"
    password => "changeme"
    index => "%{[appname]}"
  }
}
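The `json` codec on the TCP input parses each newline-delimited JSON event that LogstashEncoder sends, and `index => "%{[appname]}"` reads the custom field added in logback-spring.xml. A rough pure-JDK sketch of such an event (field values are illustrative; the real encoder also adds fields like @version and level_value):

```java
// Builds a sample newline-delimited JSON event of the shape
// LogstashEncoder writes to the TCP input above.
public class LogstashEventSketch {
    static String event(String message, String appname) {
        return String.format(
            "{\"@timestamp\":\"2024-01-01T00:00:00.000Z\","
          + "\"message\":\"%s\","
          + "\"logger_name\":\"com.example.DemoController\","
          + "\"level\":\"INFO\","
          // the "appname" custom field is what index => "%{[appname]}" resolves
          + "\"appname\":\"%s\"}%n",
            message, appname);
    }

    public static void main(String[] args) {
        System.out.print(event("hello elk", "elkDemo"));
    }
}
```

With `appname` set to `elkDemo`, the event above would be written to an Elasticsearch index named `elkDemo`.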
Create the index pattern from the Kibana menu: Management > Index patterns > Create index pattern. The available index names are listed there.