1. Problem description
We need to ship log4j logs from several servers (e.g. web servers) to a single ELK server, given that company server resources are tight (^_^)
2. We will use Filebeat
What is Filebeat?
Filebeat is used to ship events: it forwards a server's file-based logs over a socket connection to the remote ELK stack.
Events can be sent to Logstash, or directly to Elasticsearch.
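Conceptually, Filebeat reads log lines from files and forwards each line as an event over a TCP connection to a collector. The sketch below illustrates only that idea in plain Java; it is not Filebeat's real transport (Filebeat speaks the Beats/Lumberjack protocol), and the class and method names here are made up for illustration:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of "shipping events": send log lines over a TCP socket
// to a collector (a stand-in for the ELK side). NOT Filebeat's actual
// wire protocol -- just the log-line-over-socket concept.
public class LogShipSketch {
    public static List<String> shipAndCollect(List<String> logLines) throws Exception {
        List<String> received = new ArrayList<>();
        try (ServerSocket server = new ServerSocket(0)) { // collector side, random free port
            int port = server.getLocalPort();
            Thread shipper = new Thread(() -> {
                try (Socket s = new Socket("localhost", port);
                     PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
                    for (String line : logLines) {
                        out.println(line); // one event per line
                    }
                } catch (Exception e) {
                    throw new RuntimeException(e);
                }
            });
            shipper.start();
            try (Socket client = server.accept();
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(client.getInputStream()))) {
                String line;
                while ((line = in.readLine()) != null) {
                    received.add(line); // collect until the shipper closes the socket
                }
            }
            shipper.join();
        }
        return received;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(shipAndCollect(List.of("[DEBUG] hello", "[INFO] world")));
    }
}
```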
3. Here we cover shipping to a remote Logstash, which then passes the data on to Elasticsearch for display in Kibana
3-1) First, install Filebeat on your local test machine
The download is available here:
https://www.elastic.co/downloads/beats/filebeat
3-2) Next, configure your filebeat.yml
filebeat.prospectors:
- input_type: log
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /Users/KG/Documents/logs/t-server/*.log

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["xx.xx.xx.xx:5000"]
3-3) Start Filebeat
You first need to set the correct permissions on the config file
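For example (the config path is an assumption; Filebeat refuses to load a config file that is writable by users other than its owner):

```shell
# Remove group/other write permission so Filebeat accepts the config file:
chmod go-w filebeat.yml
# Run Filebeat in the foreground, logging to stderr (-e), with this config (-c):
./filebeat -e -c filebeat.yml
```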
3-4) Configure and start Logstash on the ELK server
input {
  beats {
    port => 5000
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => "localhost:9200"
    index => "t-server-%{+YYYY.MM.dd}"
    document_type => "log4j_type"
    user => "your-username"
    password => "your-password"
  }
}
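The `%{+YYYY.MM.dd}` in the index name expands to the event's date (taken from `@timestamp`, in UTC by default), so each day's logs land in their own index. A small Java sketch of the resulting naming (note java.time uses `yyyy` where Logstash's Joda-style pattern uses `YYYY`):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

// Mimics how Logstash expands the %{+YYYY.MM.dd} date in an index name,
// producing one index per day.
public class IndexNameDemo {
    static String indexFor(String prefix, LocalDate date) {
        return prefix + "-" + date.format(DateTimeFormatter.ofPattern("yyyy.MM.dd"));
    }

    public static void main(String[] args) {
        System.out.println(indexFor("t-server", LocalDate.of(2017, 3, 27)));
        // t-server-2017.03.27
    }
}
```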
Start it:
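For example (the config file name here is an assumption; use whatever you saved the config above as):

```shell
# Start Logstash with the pipeline config above:
bin/logstash -f log4j_filebeat.conf
```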
3-5) Java client log configuration and code
log4j.properties
### Settings ###
log4j.rootLogger = debug,stdout,D
### Print logs to the console ###
log4j.appender.stdout = org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target = System.out
log4j.appender.stdout.layout = org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern = [%-5p] %d{yyyy-MM-dd HH:mm:ss,SSS} method:%l%n%m%n
### Write DEBUG-level and above logs to /Users/KG/Documents/logs/t-server/app.log ###
log4j.appender.D = org.apache.log4j.DailyRollingFileAppender
log4j.appender.D.File = /Users/KG/Documents/logs/t-server/app.log
log4j.appender.D.Append = true
log4j.appender.D.Threshold = DEBUG
log4j.appender.D.layout = org.apache.log4j.PatternLayout
log4j.appender.D.layout.ConversionPattern = %-d{yyyy-MM-dd HH:mm:ss} [ %t:%r ] - [ %p ] %m%n
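As a rough illustration of what the file appender's ConversionPattern produces (`%t` = thread name, `%r` = milliseconds since startup, `%p` = level, `%m` = message), here is the line shape simulated with plain `String.format` rather than log4j itself:

```java
// Simulates the shape of lines produced by the pattern
//   %-d{yyyy-MM-dd HH:mm:ss} [ %t:%r ] - [ %p ] %m%n
// using String.format instead of log4j, purely for illustration.
public class PatternDemo {
    static String format(String timestamp, String thread, long uptimeMs,
                         String level, String message) {
        return String.format("%s [ %s:%d ] - [ %s ] %s",
                timestamp, thread, uptimeMs, level, message);
    }

    public static void main(String[] args) {
        System.out.println(format("2017-03-27 10:02:30", "main", 31, "DEBUG", "The latest log!!"));
        // 2017-03-27 10:02:30 [ main:31 ] - [ DEBUG ] The latest log!!
    }
}
```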
Java API
package org.genesis.arena.elk;

import org.apache.log4j.Logger;

/**
 * Created by KG on 17/3/27.
 */
public class ElkLog4jTest {
    private static final Logger logger = Logger.getLogger(ElkLog4jTest.class);

    public static void main(String[] args) throws Exception {
        logger.debug("The latest log!!");
    }
}
In Logstash you will see results like this:
And in Kibana:
Similarly, we can start another Logstash background process on a different port
Its config file is as follows:
log4j_fliebeat2.conf
input {
  beats {
    port => 5001
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => "localhost:9200"
    index => "t-yxc-finance-%{+YYYY.MM.dd}"
    document_type => "log4j_type"
    user => "your-username"
    password => "your-password"
  }
}
Start it:
filebeat.yml
filebeat.prospectors:
# Each - is a prospector. Most options can be set at the prospector level, so
# you can use different prospectors for various configurations.
# Below are the prospector specific configurations.
- input_type: log
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /Users/KG/Documents/logs/t-yxc-finance/*.log

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["xx.xx.xx.xx:5001"]
Client-side config file and code:
### Settings ###
log4j.rootLogger = debug,stdout,D
### Print logs to the console ###
log4j.appender.stdout = org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target = System.out
log4j.appender.stdout.layout = org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern = [%-5p] %d{yyyy-MM-dd HH:mm:ss,SSS} method:%l%n%m%n
### Write DEBUG-level and above logs to /Users/KG/Documents/logs/t-yxc-finance/app.log ###
log4j.appender.D = org.apache.log4j.DailyRollingFileAppender
log4j.appender.D.File = /Users/KG/Documents/logs/t-yxc-finance/app.log
log4j.appender.D.Append = true
log4j.appender.D.Threshold = DEBUG
log4j.appender.D.layout = org.apache.log4j.PatternLayout
log4j.appender.D.layout.ConversionPattern = %-d{yyyy-MM-dd HH:mm:ss} [ %t:%r ] - [ %p ] %m%n
package org.genesis.arena.elk;

import org.apache.log4j.Logger;

/**
 * Created by KG on 17/3/27.
 */
public class ElkLog4jTest {
    private static final Logger logger = Logger.getLogger(ElkLog4jTest.class);

    public static void main(String[] args) throws Exception {
        logger.debug("Another server, with the latest log!!");
    }
}
The output looks like this: