1 ELK is a real-time, distributed log analysis platform
ELK is a complete solution made up of three components:
(E)lasticsearch -- the database (search and storage engine)
(L)ogstash -- the program that collects and normalizes logs
(K)ibana -- the graphical visualization tool
2 Bulk data import
-X the HTTP method used for the import (POST)
--data-binary the format of the imported data
@urfile the file to import the data from
_bulk the bulk-import keyword
curl -X "POST" "http://192.168.1.13:9200/_bulk" --data-binary @shakespeare.json
If the data does not carry an index and type, we need to specify them ourselves:
curl -X "POST" "http://192.168.1.13:9200/haha/xixi/_bulk" --data-binary @accounts.json
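The _bulk endpoint expects newline-delimited JSON: an action line, then the document itself, for each document. A minimal sketch of building such a body in Python (the haha/xixi index and type names follow the example above; the sample documents are made up):

```python
import json

def build_bulk_body(docs, index="haha", doc_type="xixi"):
    """Build an NDJSON body for the Elasticsearch _bulk API:
    one action line followed by one document line per document."""
    lines = []
    for i, doc in enumerate(docs, start=1):
        # action line tells ES where to index the following document
        lines.append(json.dumps({"index": {"_index": index, "_type": doc_type, "_id": i}}))
        lines.append(json.dumps(doc))
    return "\n".join(lines) + "\n"   # _bulk requires a trailing newline

body = build_bulk_body([{"name": "alice"}, {"name": "bob"}])
print(body)
```

The resulting string is what `--data-binary @file` sends in the curl commands above.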
3 Querying data in bulk
Query a single document:
curl -X "GET" "http://192.168.1.12:9200/haha/xixi/1"
Query multiple documents with _mget:
curl -XGET 'http://192.168.1.11:9200/_mget?pretty' -d '{
    "docs":[
        {
            "_index": "haha",
            "_type": "xixi",
            "_id": 1
        },
        {
            "_index": "haha",
            "_type": "xixi",
            "_id": 2
        },
        {
            "_index": "shakespeare",
            "_type": "act",
            "_id": 91400
        }
    ]
}'
Installing logstash
Install the dependency package openjdk:
yum install java-1.8.0-openjdk -y
yum install logstash-2.3.4-1.noarch.rpm -y
ELK working-structure model
                 +-----------------logstash-----------------+
+-------------+  |  +-------+     +--------+     +--------+ |   +------------+     +--------+
| data source | --> | INPUT | --> | FILTER | --> | OUTPUT | --> | ES cluster | --> | KIBANA |
+-------------+  |  +-------+     +--------+     +--------+ |   +------------+     +--------+
                 +------------------------------------------+
Initial logstash.conf configuration
input{
    stdin{}
}
filter{
}
output{
    stdout{}
}
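With this configuration in place, logstash can be started against it; the binary path below assumes the default install location of the 2.3.4 RPM package installed above:

```
/opt/logstash/bin/logstash -f logstash.conf
```

Anything typed on stdin is then echoed back as a structured event on stdout.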
Location of the plugin documentation
https://github.com/logstash-plugins
The codec plugin
stdout{ codec => "rubydebug" }
The file plugin
file{
    sincedb_path => "/var/lib/logstash/since.db"
    start_position => "beginning"
    path => ["/var/tmp/a.log", "/tmp/b.log"]
    type => "filelog"
}
Python test clients for the tcp and udp input plugins:
import socket

def tcpmsg(msg):
    # send one line to the logstash tcp input
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM, socket.IPPROTO_TCP)
    s.connect(("192.168.1.10", 8888))
    s.sendall((msg + '\n').encode())
    s.close()

def udpmsg(msg):
    # send one datagram to the logstash udp input
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    s.sendto((msg + '\n').encode(), ("192.168.1.10", 9999))
    s.close()
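The clients above assume matching tcp and udp input plugins on the logstash side. A minimal sketch of that input section (the host and ports are taken from the Python code above; the type names are assumptions):

```
tcp{
    host => "192.168.1.10"
    port => 8888
    type => "tcplog"
}
udp{
    host => "192.168.1.10"
    port => 9999
    type => "udplog"
}
```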
logstash.conf configuration for the syslog plugin:
syslog{
    host => "192.168.1.10"
    port => 514
    type => "syslog"
}
Command to write a syslog entry:
logger -p local0.info -t mylog "hello world"
Configure /etc/rsyslog.conf (a single @ forwards over UDP, @@ over TCP):
local0.info @192.168.1.10:514
authpriv.info @@192.168.1.10:514
Path to the grok regex pattern macros
/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-2.0.5/patterns/grok-patterns
filter configuration to parse standard apache logs:
grok{
    match => ["message","%{COMBINEDAPACHELOG}"]
}
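To illustrate the kind of fields %{COMBINEDAPACHELOG} pulls out of a log line, here is a rough Python sketch with a hand-written regex (a simplification of the real grok pattern; the field names clientip, verb, request, response follow grok's naming, and the sample log line is made up):

```python
import re

# simplified stand-in for grok's COMBINEDAPACHELOG pattern
COMBINED = re.compile(
    r'(?P<clientip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<verb>\S+) (?P<request>\S+) \S+" '
    r'(?P<response>\d{3}) (?P<bytes>\S+) "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

line = ('192.168.1.20 - - [12/Sep/2016:10:00:00 +0800] '
        '"GET /index.html HTTP/1.1" 200 4523 "-" "curl/7.29.0"')

# grok produces named fields just like groupdict() does here
fields = COMBINED.match(line).groupdict()
print(fields["clientip"], fields["verb"], fields["response"])
```

Each named group becomes a field on the event, which is what makes the parsed log searchable in Elasticsearch.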
output configuration that writes into the ES cluster:
if [type] == "filelog"{
    elasticsearch {
        hosts => ["192.168.1.14:9200"]
        index => "weblog"
        flush_size => 2000
        idle_flush_time => 10
    }
}
Complete logstash.conf configuration
input{
    file{
        sincedb_path => "/var/lib/logstash/since.db"
        start_position => "beginning"
        path => ["/var/tmp/a.log"]
        type => "filelog"
        codec => "json"
    }
}
filter{
}
output{
    if [type] == "filelog"{
        elasticsearch {
            hosts => ["192.168.1.14:9200"]
            index => "weblog"
            flush_size => 2000
            idle_flush_time => 10
        }
    }
}