Setting Up Simple ELK Log Collection on Windows
Preface
This article describes how to deploy ELK log collection on Windows. Most guides online cover Linux, but the services here happen to run on Windows; the configuration differs slightly, though the overall approach is the same.
Deployment environment: Windows Server 2008 R2
Third-party dependencies: JDK 8+
elasticsearch-5.6.14
logstash-5.6.14
kibana-5.6.14-windows-x86
redis-3.2
Official download site: https://www.elastic.co
A simple flow diagram: the application writes logs to Redis, Logstash pulls them from Redis into Elasticsearch, and Kibana reads from Elasticsearch.
Both Redis and Elasticsearch support clustered deployment; this article only describes standalone deployment.
Deployment Steps
The three ELK components should be on matching versions; the Elasticsearch minor version may be higher than the others, but must not be lower.
For how to set up and configure Elasticsearch, see my earlier article on building a DUBBO monitoring environment.
redis
By default Redis only accepts local connections. To allow remote access, edit redis.windows.conf: comment out `bind 127.0.0.1`. Redis 3.2 and later also add a protected mode, so you additionally need to set `protected-mode no`.
```
#bind 127.0.0.1

# Protected mode is a layer of security protection, in order to avoid that
# Redis instances left open on the internet are accessed and exploited.
#
# When protected mode is on and if:
#
# 1) The server is not binding explicitly to a set of addresses using the
#    "bind" directive.
# 2) No password is configured.
#
# By default protected mode is enabled. You should disable it only if
# you are sure you want clients from other hosts to connect to Redis
# even if no authentication is configured, nor a specific set of interfaces
# are explicitly listed using the "bind" directive.
protected-mode no
```
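After restarting Redis, it is worth verifying that the instance answers remotely. A minimal sketch in Python that sends a RESP `PING` over a raw socket (host and port are assumptions for your environment; in practice `redis-cli -h <host> ping` does the same thing):

```python
import socket

def resp_command(*parts):
    """Encode a Redis command in RESP, the wire protocol redis-cli speaks."""
    out = f"*{len(parts)}\r\n"
    for p in parts:
        out += f"${len(p)}\r\n{p}\r\n"
    return out.encode()

def ping(host="127.0.0.1", port=6379, timeout=2.0):
    """Return True if the Redis at host:port answers PING with +PONG."""
    try:
        with socket.create_connection((host, port), timeout=timeout) as s:
            s.sendall(resp_command("PING"))
            return s.recv(64).startswith(b"+PONG")
    except OSError:
        return False
```

If `ping()` returns False from a remote machine but True locally, the `bind`/`protected-mode` changes above have not taken effect.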
logstash
Logstash's job here is to collect log entries from Redis and store them in Elasticsearch. Redis serves only as a buffer holding the data; once Logstash has consumed an entry, it removes it from Redis.
1. Specify the JDK
If multiple JDKs are installed, specify which one to use; if the system default is already 1.8, you can skip this step.
Add the following line to setup.bat under logstash-5.6.14\bin:
```
set JAVA_HOME=C:\Program Files\Java\jdk1.8.0_181

rem ### 1: determine logstash home
rem to do this, we strip from the path until we
rem find bin, and then strip bin (there is an assumption here that there is no
rem nested directory under bin also named bin)
```
2. Configure input and output
Create a configuration file named logback.config (any name will do) under logstash-5.6.14\bin to define the input and output.
input receives the incoming data; here it reads from Redis.
output writes the data to its destination, Elasticsearch here; the index defined in this section is used later when configuring Kibana.
Both input and output support many more types; if you want to dig deeper, see the official configuration docs:
Logstash input plugin configuration
Logstash output plugin configuration
Logstash also supports filter plugins.
```
input {
  redis {
    data_type => "list"
    key => "logstash"
    host => "127.0.0.1"
    port => 6379
    threads => 5
    codec => "json"
  }
}
filter {
}
output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "logstash-%{type}-%{+YYYY.MM.dd}"
    document_type => "%{type}"
    workers => 1
    flush_size => 20
    idle_flush_time => 1
    template_overwrite => true
  }
  stdout{}
}
```
Start Logstash with: `logstash.bat -f logback.config`
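The `logstash-%{type}-%{+YYYY.MM.dd}` index pattern in the output section expands per event, using the event's `type` field and its timestamp. A small Python sketch of that expansion (the event shape shown is an assumption, loosely mirroring what a Redis log appender pushes):

```python
import datetime

def index_for(event):
    """Expand logstash-%{type}-%{+YYYY.MM.dd} for one event (illustrative sketch)."""
    day = datetime.datetime.strptime(event["@timestamp"][:10], "%Y-%m-%d")
    return f"logstash-{event['type']}-{day:%Y.%m.%d}"

event = {"@timestamp": "2019-01-15T10:30:00.000Z",
         "type": "dubbo",
         "message": "demo log line"}
print(index_for(event))  # logstash-dubbo-2019.01.15
```

Since every day gets its own index, old days can later be deleted wholesale, which is exactly what the cleanup script at the end of this article relies on.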
kibana
Open kibana.yml under kibana-5.6.14-windows-x86\config and: 1. set server.host to 0.0.0.0 to allow external access (by default only local access is allowed); 2. set the Elasticsearch address.
```
server.host: "0.0.0.0"

# Enables you to specify a path to mount Kibana at if you are running behind a proxy. This only affects
# the URLs generated by Kibana, your proxy is expected to remove the basePath value before forwarding requests
# to Kibana. This setting cannot end in a slash.
#server.basePath: ""

# The maximum payload size in bytes for incoming server requests.
#server.maxPayloadBytes: 1048576

# The Kibana server's name. This is used for display purposes.
#server.name: "your-hostname"

# The URL of the Elasticsearch instance to use for all your queries.
elasticsearch.url: "http://localhost:9200"
```
Start kibana.bat.
Integration with Logback
```xml
<dependency>
    <groupId>com.cwbase</groupId>
    <artifactId>logback-redis-appender</artifactId>
    <version>1.1.3</version>
    <exclusions>
        <exclusion>
            <groupId>redis.clients</groupId>
            <artifactId>jedis</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```
Configure logback.xml; the key here must match the key set in the logback.config file above.
```xml
<appender name="LOGSTASH" class="com.cwbase.logback.RedisAppender">
    <source>demo</source>
    <type>dubbo</type>
    <host>127.0.0.1</host>
    <key>logstash</key>
    <tags>dev</tags>
    <mdc>true</mdc>
    <location>true</location>
    <callerStackIndex>0</callerStackIndex>
</appender>
<!-- statement monitoring end -->
<root level="INFO">
    <appender-ref ref="stdout" />
    <appender-ref ref="file-stdout" />
    <appender-ref ref="file-error" />
    <appender-ref ref="LOGSTASH" />
</root>
```
Start the application service; Logback will send its logs to Redis. Open the Kibana page: on first launch it prompts you to create an index pattern, which is the index configured for Elasticsearch in logback.config, i.e. logstash-*. Once created, you can see that the logs have been collected into Elasticsearch and browse them from the page.
Kibana's default address is http://127.0.0.1:5601.
Finally, a note on log cleanup: Elasticsearch 5.x removed the expiring _ttl property, so we need to clean up ES indices on a schedule. Create a .bat in the service and run it periodically; the following deletes all indices from 30 days ago.
To list all Elasticsearch indices: http://{Elasticsearch IP}:9200/_cat/indices?v
```
set DATA=`date -d "-30 days" +%Y.%m.%d`
curl -XDELETE http://127.0.0.1:9200/*-${DATA}
```
Note that `date -d` and `${DATA}` are Unix shell syntax, not cmd.exe batch syntax; on Windows this script needs a Unix-like shell (e.g. Git Bash or Cygwin) with curl available, or an equivalent rewrite in PowerShell.
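The same date arithmetic can be sketched in Python; this builds the DELETE URL for the 30-day-old indices (the function name and defaults are illustrative; the resulting URL would then be issued with curl or urllib):

```python
import datetime

def expired_index_url(days=30, base="http://127.0.0.1:9200", today=None):
    """Build the DELETE URL matching all indices dated `days` days ago."""
    day = (today or datetime.date.today()) - datetime.timedelta(days=days)
    return f"{base}/*-{day:%Y.%m.%d}"

print(expired_index_url(today=datetime.date(2019, 2, 1)))
# http://127.0.0.1:9200/*-2019.01.02
```

Because the Logstash output names indices by day, deleting `*-YYYY.MM.dd` removes exactly one day's worth of logs across all `type` values.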
That's all.
References
http://www.cnblogs.com/ASPNET...
https://blog.csdn.net/xuezhan...