Collecting Logs into a MySQL Database with ELK

Scenario

When collecting logs with ELK, if the data needs to be archived you can consider writing it to a database. To keep it easy to search, a copy of the data can be written to Elasticsearch at the same time.

Environment

CentOS 7 hosts:

  • 192.168.20.60 node1 Kibana ES Logstash Nginx
  • 192.168.20.61 node2 ES MariaDB
This walkthrough uses collecting Nginx logs into the database and ES as the example.

Configure the Database

After the database is installed, configure it and grant privileges:

MariaDB [(none)]> create database elk character set utf8 collate utf8_bin;
Query OK, 1 row affected (0.00 sec)

MariaDB [(none)]> grant all privileges on elk.* to elk@'192.168.20.%' identified by '123456';
Query OK, 0 rows affected (0.00 sec)

MariaDB [(none)]> flush privileges;
Query OK, 0 rows affected (0.00 sec)

Test the database connection from node1:

[root@node1 ~]# yum install mariadb -y
[root@node1 ~]# mysql -uelk -p123456 -h 192.168.20.61

Install Logstash on node1; the rpm package can be downloaded directly from the official site. The installation of Elasticsearch and Kibana is skipped here; see the earlier posts "Kibana使用Nginx代理驗證" and the first half of "ELK日誌管理平臺部署簡介", so it is not repeated here.

[root@node1 ~]# yum install logstash-5.6.5.rpm -y
[root@node1 ~]# systemctl start logstash

Configure the JDBC Database Driver

Installing the database driver plugin for Logstash requires gem and a reachable gem source first:

[root@node1 ~]# yum install gem -y
[root@node1 ~]# gem -v
2.0.14.1
[root@node1 ~]# gem sources --add https://gems.ruby-china.org/ --remove https://rubygems.org/
https://gems.ruby-china.org/ added to sources
https://rubygems.org/ removed from sources
[root@node1 ~]# gem source list
*** CURRENT SOURCES ***

https://gems.ruby-china.org/

List the plugins that are currently installed:

[root@node1 ~]# /usr/share/logstash/bin/logstash-plugin list

Install the JDBC output plugin:

[root@node1 ~]# /usr/share/logstash/bin/logstash-plugin install logstash-output-jdbc
Validating logstash-output-jdbc
Installing logstash-output-jdbc
Installation successful

The installation takes a while; verify that it succeeded:

[root@node1 ~]# /usr/share/logstash/bin/logstash-plugin list|grep jdbc
logstash-input-jdbc
logstash-output-jdbc

Download the MySQL JDBC driver from https://dev.mysql.com/downloads/connector/j/ and upload it to the server. The driver jar must be placed at exactly the path below, otherwise connecting to the database will fail.

[root@node1 ~]# tar xf mysql-connector-java-5.1.45.tar.gz 
[root@node1 ~]# cd mysql-connector-java-5.1.45
[root@node1 mysql-connector-java-5.1.45]# mkdir -p /usr/share/logstash/vendor/jar/jdbc
[root@node1 mysql-connector-java-5.1.45]# cp mysql-connector-java-5.1.45-bin.jar  /usr/share/logstash/vendor/jar/jdbc/
[root@node1 ~]# chown -R logstash.logstash /usr/share/logstash/vendor/jar/jdbc/
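
To confirm the jar landed where the jdbc output plugin expects it, a quick check (assuming the default vendor path shown above):

[root@node1 ~]# ls -l /usr/share/logstash/vendor/jar/jdbc/mysql-connector-java-5.1.45-bin.jar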

Configure the Nginx Log Format

To store the log entries as the specified table fields, the Nginx log format has to be rewritten as JSON. Edit the nginx.conf file and replace the log format section with:

log_format   access_log_json '{"host":"$http_x_real_ip","client_ip":"$remote_addr","log_time":"$time_iso8601","request":"$request","status":"$status","body_bytes_sent":"$body_bytes_sent","req_time":"$request_time","AgentVersion":"$http_user_agent"}';

access_log  /var/log/nginx/access.log  access_log_json;

Check the syntax and reload nginx:

nginx -t
nginx -s reload

Check that the new entries written to the log file are in JSON format.
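
For example, a quick check (this assumes Python 2 is available on CentOS 7; jq works just as well if installed):

[root@node1 ~]# tail -n 1 /var/log/nginx/access.log | python -m json.tool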

Create the Data Table

There is no need to store the entire log line in the database; storing only the important fields we need is enough, and the selection can be adjusted to your own requirements.

Note: the table needs a time column whose default value is set to CURRENT_TIMESTAMP.

Table creation statement (only part of the data is kept):

MariaDB [elk]> create table nginx_log(host varchar(128),client_ip varchar(128),status int(4),req_time float(8,3),AgentVersion varchar(512), time TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_bin;

Check the table structure:

MariaDB [elk]> desc nginx_log;
+--------------+--------------+------+-----+-------------------+-------+
| Field        | Type         | Null | Key | Default           | Extra |
+--------------+--------------+------+-----+-------------------+-------+
| host         | varchar(128) | YES  |     | NULL              |       |
| client_ip    | varchar(128) | YES  |     | NULL              |       |
| status       | int(4)       | YES  |     | NULL              |       |
| req_time     | float(8,3)   | YES  |     | NULL              |       |
| AgentVersion | varchar(512) | YES  |     | NULL              |       |
| time         | timestamp    | NO   |     | CURRENT_TIMESTAMP |       |
+--------------+--------------+------+-----+-------------------+-------+
6 rows in set (0.00 sec)

Configure Logstash to Write Logs into the Database

Create the Logstash configuration file:
[root@node1 ~]# vim /etc/logstash/conf.d/nginx_log.conf

[root@node1 ~]# cat /etc/logstash/conf.d/nginx_log.conf 
input{
  file{
    path => "/var/log/nginx/access.log"   # file to collect
    start_position => "beginning"         # read the file from the beginning
    stat_interval => "2"                  # check the file every 2 seconds
    codec => "json"                       # parse each line as JSON
  }
}

output{
  elasticsearch {
    hosts => ["192.168.20.60:9200"]
    index => "nginx-log-%{+YYYY.MM.dd}"
  }

  jdbc{
    connection_string => "jdbc:mysql://192.168.20.61/elk?user=elk&password=123456&useUnicode=true&characterEncoding=UTF8"
    statement => ["insert into nginx_log(host,client_ip,status,req_time,AgentVersion) VALUES(?,?,?,?,?)", "host","client_ip","status","req_time","AgentVersion"]
  }
}

Test the file to confirm the configuration is valid:

[root@node1 ~]# /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/nginx_log.conf -t
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
Configuration OK
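
The warning about logstash.yml appears because --path.settings was not given; pointing Logstash at the packaged settings directory avoids it (assuming the rpm install paths used above):

[root@node1 ~]# /usr/share/logstash/bin/logstash --path.settings /etc/logstash -f /etc/logstash/conf.d/nginx_log.conf -t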

If the configuration file is correct but no logs are being collected, start Logstash in the foreground to inspect its log output:

[root@node1 ~]# /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/nginx_log.conf

Tip: if you run the foreground command as root, Logstash runs as the root user, whereas starting it through systemd runs it as the logstash user. If the foreground run collects logs but the systemd service does not, it is usually a directory or file permission problem.
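
A quick way to verify this (a sketch; the paths assume the default Nginx log location used above):

[root@node1 ~]# ls -l /var/log/nginx/ /var/log/nginx/access.log
[root@node1 ~]# sudo -u logstash head -n 1 /var/log/nginx/access.log    # prints a log line only if the logstash user can read the file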

Restart Logstash, access Nginx to generate some log entries, and check whether Elasticsearch has collected them:

[root@node1 ~]# systemctl restart logstash

The index and data have been created automatically in ES:
(screenshot: the automatically created nginx-log index in Elasticsearch)
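
You can also confirm this from the command line (a sketch; it assumes the ES HTTP API on 192.168.20.60:9200 is reachable):

[root@node1 ~]# curl -s 'http://192.168.20.60:9200/_cat/indices?v' | grep nginx-log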

Add the index pattern in Kibana for display; after entering the name, choose the timestamp option and the matching fields are detected automatically:
(screenshot: adding the nginx-log index pattern in Kibana)

Once data shows up in Kibana, check the database; the log entries have been stored there as well:

[root@node2 elasticsearch-head]# mysql -uroot -p123456 -e "select * from elk.nginx_log;"|head -10
host    client_ip   status  req_time    AgentVersion    time
-   192.168.20.191  304 0.023   Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36 2017-12-18 11:33:39
-   192.168.20.191  200 0.042   Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36 2017-12-18 11:33:39
-   192.168.20.191  200 0.030   Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36 2017-12-18 11:33:39
-   192.168.20.191  200 0.042   Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36 2017-12-18 11:34:33
-   192.168.20.191  200 0.380   Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36 2017-12-18 11:34:33
-   192.168.20.191  200 0.195   Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36 2017-12-18 11:34:37
-   192.168.20.191  200 0.034   Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36 2017-12-18 11:34:45
-   192.168.20.191  200 0.016   Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36 2017-12-18 11:34:59
-   192.168.20.191  200 0.570   Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36 2017-12-18 11:35:00

Collecting the Nginx logs into the database is now complete. If no data appears in the database, or some fields cannot be populated, troubleshoot along these lines:
1. Check that the Nginx log file is in JSON format and that log output is normal.
2. Check that the logs in ES or Kibana are complete and displayed correctly.
3. Check the Logstash conf file: do the number of fields and the field names match the table?
4. Test the configuration file and refresh the logs; if Kibana displays them correctly, the problem is usually mismatched fields in the Logstash configuration or database permissions.
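
As a final sanity check, compare what reached each sink (a sketch; the index name and credentials are the ones used above):

[root@node1 ~]# curl -s 'http://192.168.20.60:9200/nginx-log-*/_count?pretty'                      # document count in ES
[root@node1 ~]# mysql -uelk -p123456 -h 192.168.20.61 -e 'select count(*) from elk.nginx_log;'     # row count in MySQL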
