At work I needed to display the HAProxy logs in ELK so that we can analyze client IPs, URIs, PV (page views), and so on.
Log samples: I only pulled out two types of log lines; Logstash parses both formats at the same time (they correspond to the %{HAPROXYHTTP} and %{HAPROXYTCP} grok patterns used in the Logstash filter below).
Sep 5 10:36:55 localhost haproxy[567197]: 221.238.230.162:49225 [05/Sep/2018:10:36:27.130] SBD-Security SBD-Reglog/shvnginx02 0/0/896 5489 -- 1401/2/2/0/0 0/0
Sep 5 10:36:54 localhost haproxy[567197]: 58.220.76.39:59768 [05/Sep/2018:10:36:54.026] SBD-Nginx imageservers/image01 0/0/0/2/2 200 5126 - - ---- 1353/1351/0/1/0 0/0 {|s06.abc001.cn} "GET /ftp_product_img/cn1100017322EA_1_thb.jpg?t=201709101850 HTTP/1.1"
Filebeat first reads the haproxy.log file and writes the entries into Redis:
filebeat.inputs:
- type: log
  paths:
    - /var/log/haproxy/haproxy.log
  tags: ["sbd_haproxy"]
  fields:
    type: sbd_haproxy
  fields_under_root: true

output.redis:
  hosts: ["10.78.1.181"]
  key: "sbd_haproxy"
  type: list
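Before wiring up Logstash, it can help to confirm that Filebeat is actually pushing events into the Redis list. Below is a minimal sketch using the redis-py client; the host, port, and key come from the configs in this post, and it assumes Redis has no password set.

```python
# Quick check that Filebeat is pushing HAProxy events into the Redis list.
# Assumes the redis-py package is installed and Redis requires no auth.
import json
import redis

r = redis.Redis(host="10.78.1.181", port=6379, db=0)

# Filebeat pushes one JSON document per log line onto the "sbd_haproxy" list.
print("queued events:", r.llen("sbd_haproxy"))

# Peek at the newest entry without removing it, so Logstash can still consume it.
raw = r.lindex("sbd_haproxy", -1)
if raw:
    event = json.loads(raw)
    print(event.get("message"))
```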
Logstash reads the data from Redis, parses and filters it, and then writes it into Elasticsearch:
input {
  redis {
    host      => "10.78.1.181"
    port      => 6379
    data_type => list
    key       => "sbd_haproxy"
  }
}

filter {
  grok {
    match => ["message", "%{HAPROXYHTTP}", "message", "%{HAPROXYTCP}"]
  }
  mutate {
    remove_field => ["host","captured_response_cookie","haproxy_time","haproxy_month","haproxy_hour","srvconn","backend_queue","retries","termination_state"]
    remove_field => ["haproxy_monthday","syslog_server","time_backend_connect","pid","srv_queue","beat","beconn","client_port","haproxy_milliseconds","@version"]
    remove_field => ["haproxy_minute","offset","haproxy_second","actconn","source","program","haproxy_year","feconn","http_version"]
    remove_field => ["message","prospector","time_duration","time_queue","syslog_timestamp","captured_request_cookie"]
  }
  date {
    match => ["accept_date","dd/MMM/yyyy:HH:mm:ss.SSS"]
  }
  mutate {
    remove_field => ["accept_date"]
  }
}

output {
  if [type] == "sbd_haproxy" {
    if [tags][0] == "sbd_haproxy" {
      elasticsearch {
        hosts => ["10.78.1.184:9200","10.78.1.185:9200","10.78.1.188:9200"]
        index => "%{type}-%{+YYYY.MM.dd}"
      }
      #stdout { codec => rubydebug }
    }
  }
}
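To make it clearer which fields end up in Elasticsearch for the IP/URI analysis, here is a rough Python illustration of the kind of fields the %{HAPROXYHTTP} pattern extracts from the second sample line. The regex below is a deliberate simplification, not the real grok definition; the field names just mirror the ones referenced in the filter above.

```python
# Rough approximation of what the HAPROXYHTTP grok pattern extracts from the
# second sample log line -- illustration only, not the actual grok definition.
import re

line = ('Sep 5 10:36:54 localhost haproxy[567197]: 58.220.76.39:59768 '
        '[05/Sep/2018:10:36:54.026] SBD-Nginx imageservers/image01 0/0/0/2/2 '
        '200 5126 - - ---- 1353/1351/0/1/0 0/0 {|s06.abc001.cn} '
        '"GET /ftp_product_img/cn1100017322EA_1_thb.jpg?t=201709101850 HTTP/1.1"')

pattern = re.compile(
    r'haproxy\[\d+\]: '
    r'(?P<client_ip>\d{1,3}(?:\.\d{1,3}){3}):\d+ '
    r'\[(?P<accept_date>[^\]]+)\] '
    r'(?P<frontend_name>\S+) (?P<backend_name>\S+)/(?P<server_name>\S+) '
    r'\S+ (?P<http_status_code>\d+) (?P<bytes_read>\d+) .* '
    r'"(?P<http_verb>\S+) (?P<http_request>\S+)'
)

m = pattern.search(line)
if m:
    for name, value in m.groupdict().items():
        print(f'{name:>17}: {value}')
```

Fields like client_port, message, and the connection counters are also parsed by grok, but the mutate/remove_field blocks above strip them out before the event is indexed.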
kibana展現:bash