The previous post covered shipping logs straight from Filebeat into Elasticsearch. Most of the time, though, we want to split each log line into separate fields before indexing it, which makes searching and aggregation much easier; that is where Logstash comes in.
Installing Logstash (CentOS)

This section installs Logstash via yum:
- Import the Elastic GPG key:

```shell
sudo rpm --import https://artifacts.elastic.co/GPG-KEY-elasticsearch
```

- Add an `elastic.repo` file under the `/etc/yum.repos.d/` directory with the following content:

```
[logstash-7.x]
name=Elastic repository for 7.x packages
baseurl=https://artifacts.elastic.co/packages/7.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md
```

- Install the package:

```shell
sudo yum install logstash
```
Configuring Logstash
- Create a pipeline file `xxx.conf` under `/etc/logstash/conf.d` (an online grok pattern tester is handy for validating patterns).
- In `xxx.conf`, write:

```
input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:logtime} \[%{NOTSPACE:threadname}\] %{LOGLEVEL:loglevel} %{DATA:javamethod} - %{JAVALOGMESSAGE:logcontent}" }
  }
  date {
    match => ["logtime", "ISO8601"]
    target => "@timestamp"
  }
  mutate {
    add_field => { "[@metadata][target_index]" => "fb-%{filetype}-%{+YYYY.MM.dd}" }
    remove_field => ["logtime", "message", "tags"]
  }
}

output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "%{[@metadata][target_index]}"
    user => "elastic"
    password => "search"
  }
}
```
The grok expression applied to `message` corresponds to log lines produced by the following layout pattern:

```
%d [%t] %-5level %logger{36}.%M\(%file:%line\) - %msg%n
```
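As a rough illustration of what the grok pattern extracts, the same parse can be sketched in Python with simplified regex equivalents of the grok building blocks (these are not the exact grok definitions, and the sample log line, class names, and values are invented):

```python
import re

# Simplified stand-ins for the grok patterns used in the filter above.
pattern = re.compile(
    r"(?P<logtime>\d{4}-\d{2}-\d{2}[T ]\d{2}:\d{2}:\d{2}[.,]\d+)"  # ~ TIMESTAMP_ISO8601
    r" \[(?P<threadname>\S+)\]"                                     # ~ NOTSPACE in brackets
    r" (?P<loglevel>[A-Z]+)"                                        # ~ LOGLEVEL (simplified)
    r" (?P<javamethod>.*?)"                                         # ~ DATA (non-greedy)
    r" - (?P<logcontent>.*)"                                        # ~ JAVALOGMESSAGE
)

# Invented sample line matching the %d [%t] %-5level %logger.%M(%file:%line) - %msg layout.
line = "2021-05-04 10:15:30.123 [main] ERROR com.example.Foo.bar(Foo.java:42) - something failed"
m = pattern.match(line)

print(m.group("logtime"))     # 2021-05-04 10:15:30.123
print(m.group("loglevel"))    # ERROR
print(m.group("javamethod"))  # com.example.Foo.bar(Foo.java:42)
print(m.group("logcontent"))  # something failed
```

This is only a sketch for understanding the field split; in production, Logstash's own grok patterns handle the edge cases (padding from `%-5level`, multiline messages, etc.).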
- Restart the Logstash service:

```shell
service logstash restart
```
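As a side note, the daily index name that the `mutate` filter builds from `fb-%{filetype}-%{+YYYY.MM.dd}` can be mimicked in plain Python to see what ends up in Elasticsearch; here the `filetype` field is assumed to be attached by Filebeat, and the sample values are invented:

```python
from datetime import datetime, timezone

# Hypothetical event, standing in for a Logstash event after the date filter;
# "filetype" is assumed to be set by Filebeat, and the timestamp is invented.
event = {"filetype": "app", "@timestamp": datetime(2021, 5, 4, tzinfo=timezone.utc)}

# Mimic the sprintf reference "fb-%{filetype}-%{+YYYY.MM.dd}" from the mutate filter.
index_name = "fb-{}-{}".format(event["filetype"], event["@timestamp"].strftime("%Y.%m.%d"))
print(index_name)  # fb-app-2021.05.04
```

Writing to one index per filetype per day keeps indices small and makes it easy to drop old logs wholesale.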
Reference: the official Logstash documentation.