Environment: CentOS 7.3
Directory: /usr/local/elk/esnode
1. Download the Elasticsearch package:
wget https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-6.1.2.tar.gz
2. Extract it and try to run it:
tar xf elasticsearch-6.1.2.tar.gz
sh elasticsearch-6.1.2/bin/elasticsearch
It fails with the following error:
[2018-01-24T13:59:16,633][WARN ][o.e.b.ElasticsearchUncaughtExceptionHandler] [] uncaught exception in thread [main]
org.elasticsearch.bootstrap.StartupException: java.lang.RuntimeException: can not run elasticsearch as root
    at org.elasticsearch.bootstrap.Elasticsearch.init(Elasticsearch.java:125) ~[elasticsearch-6.1.2.jar:6.1.2]
    at org.elasticsearch.bootstrap.Elasticsearch.execute(Elasticsearch.java:112) ~[elasticsearch-6.1.2.jar:6.1.2]
    at org.elasticsearch.cli.EnvironmentAwareCommand.execute(EnvironmentAwareCommand.java:86) ~[elasticsearch-6.1.2.jar:6.1.2]
    at org.elasticsearch.cli.Command.mainWithoutErrorHandling(Command.java:124) ~[elasticsearch-cli-6.1.2.jar:6.1.2]
    at org.elasticsearch.cli.Command.main(Command.java:90) ~[elasticsearch-cli-6.1.2.jar:6.1.2]
    at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:92) ~[elasticsearch-6.1.2.jar:6.1.2]
    at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:85) ~[elasticsearch-6.1.2.jar:6.1.2]
Caused by: java.lang.RuntimeException: can not run elasticsearch as root
The problem is that Elasticsearch refuses to run as root, so create a dedicated user:
useradd es
passwd es
Change the ownership of the files to the es user:
chown -R es:es /usr/local/es
Edit elasticsearch.yml:
network.host: 192.168.15.38
http.port: 9200
Start it again; it now reports the following problems:
[1]: max file descriptors [4096] for elasticsearch process is too low, increase to at least [65536]
Fix: edit /etc/security/limits.conf (vim /etc/security/limits.conf) and append:
es hard nofile 65536
es soft nofile 65536
[2]: max virtual memory areas vm.max_map_count [65530] is too low, increase to at least [262144]
Fix: switch to the root user, edit /etc/sysctl.conf (vi /etc/sysctl.conf) and add:
vm.max_map_count=655360
Then apply it:
sysctl -p
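After logging back in as the es user, the two fixes can be sanity-checked from the shell (a quick sketch; the expected values are the ones the bootstrap checks demand above):

```shell
# Open-file limit for the current shell; after the limits.conf change and a
# fresh login as es, this should print 65536.
ulimit -n

# Kernel max_map_count; after sysctl -p it should print 655360.
sysctl -n vm.max_map_count 2>/dev/null || cat /proc/sys/vm/max_map_count
```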
Run ./elasticsearch again:
[2018-01-24T15:36:35,412][INFO ][o.e.n.Node ] [] initializing ...
[2018-01-24T15:36:35,508][INFO ][o.e.e.NodeEnvironment ] [KMyyO-3] using [1] data paths, mounts [[/ (rootfs)]], net usable_space [46.8gb], net total_space [49.9gb], types [rootfs]
[2018-01-24T15:36:35,509][INFO ][o.e.e.NodeEnvironment ] [KMyyO-3] heap size [990.7mb], compressed ordinary object pointers [true]
[2018-01-24T15:36:35,510][INFO ][o.e.n.Node ] node name [KMyyO-3] derived from node ID [KMyyO-3KRPy_Q3Eb0mYDaw]; set [node.name] to override
[2018-01-24T15:36:35,511][INFO ][o.e.n.Node ] version[6.1.2], pid[3404], build[5b1fea5/2018-01-10T02:35:59.208Z], OS[Linux/3.10.0-514.el7.x86_64/amd64], JVM[Oracle Corporation/Java HotSpot(TM) 64-Bit Server VM/1.8.0_144/25.144-b01]
[2018-01-24T15:36:35,511][INFO ][o.e.n.Node ] JVM arguments [-Xms1g, -Xmx1g, -XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=75, -XX:+UseCMSInitiatingOccupancyOnly, -XX:+AlwaysPreTouch, -Xss1m, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djna.nosys=true, -XX:-OmitStackTraceInFastThrow, -Dio.netty.noUnsafe=true, -Dio.netty.noKeySetOptimization=true, -Dio.netty.recycler.maxCapacityPerThread=0, -Dlog4j.shutdownHookEnabled=false, -Dlog4j2.disable.jmx=true, -XX:+HeapDumpOnOutOfMemoryError, -Des.path.home=/usr/local/es/elasticsearch-6.1.2, -Des.path.conf=/usr/local/es/elasticsearch-6.1.2/config]
[2018-01-24T15:36:36,449][INFO ][o.e.p.PluginsService ] [KMyyO-3] loaded module [aggs-matrix-stats]
[2018-01-24T15:36:36,450][INFO ][o.e.p.PluginsService ] [KMyyO-3] loaded module [analysis-common]
[2018-01-24T15:36:36,450][INFO ][o.e.p.PluginsService ] [KMyyO-3] loaded module [ingest-common]
[2018-01-24T15:36:36,450][INFO ][o.e.p.PluginsService ] [KMyyO-3] loaded module [lang-expression]
[2018-01-24T15:36:36,450][INFO ][o.e.p.PluginsService ] [KMyyO-3] loaded module [lang-mustache]
[2018-01-24T15:36:36,450][INFO ][o.e.p.PluginsService ] [KMyyO-3] loaded module [lang-painless]
[2018-01-24T15:36:36,450][INFO ][o.e.p.PluginsService ] [KMyyO-3] loaded module [mapper-extras]
[2018-01-24T15:36:36,450][INFO ][o.e.p.PluginsService ] [KMyyO-3] loaded module [parent-join]
[2018-01-24T15:36:36,450][INFO ][o.e.p.PluginsService ] [KMyyO-3] loaded module [percolator]
[2018-01-24T15:36:36,451][INFO ][o.e.p.PluginsService ] [KMyyO-3] loaded module [reindex]
[2018-01-24T15:36:36,451][INFO ][o.e.p.PluginsService ] [KMyyO-3] loaded module [repository-url]
[2018-01-24T15:36:36,451][INFO ][o.e.p.PluginsService ] [KMyyO-3] loaded module [transport-netty4]
[2018-01-24T15:36:36,451][INFO ][o.e.p.PluginsService ] [KMyyO-3] loaded module [tribe]
[2018-01-24T15:36:36,451][INFO ][o.e.p.PluginsService ] [KMyyO-3] no plugins loaded
[2018-01-24T15:36:37,956][INFO ][o.e.d.DiscoveryModule ] [KMyyO-3] using discovery type [zen]
[2018-01-24T15:36:38,643][INFO ][o.e.n.Node ] initialized
[2018-01-24T15:36:38,643][INFO ][o.e.n.Node ] [KMyyO-3] starting ...
[2018-01-24T15:36:38,880][INFO ][o.e.t.TransportService ] [KMyyO-3] publish_address {192.168.15.38:9300}, bound_addresses {192.168.15.38:9300}
[2018-01-24T15:36:38,890][INFO ][o.e.b.BootstrapChecks ] [KMyyO-3] bound or publishing to a non-loopback address, enforcing bootstrap checks
[2018-01-24T15:36:41,955][INFO ][o.e.c.s.MasterService ] [KMyyO-3] zen-disco-elected-as-master ([0] nodes joined), reason: new_master {KMyyO-3}{KMyyO-3KRPy_Q3Eb0mYDaw}{RY8JlkNjT3iTPoO_VT1isw}{192.168.15.38}{192.168.15.38:9300}
[2018-01-24T15:36:41,961][INFO ][o.e.c.s.ClusterApplierService] [KMyyO-3] new_master {KMyyO-3}{KMyyO-3KRPy_Q3Eb0mYDaw}{RY8JlkNjT3iTPoO_VT1isw}{192.168.15.38}{192.168.15.38:9300}, reason: apply cluster state (from master [master {KMyyO-3}{KMyyO-3KRPy_Q3Eb0mYDaw}{RY8JlkNjT3iTPoO_VT1isw}{192.168.15.38}{192.168.15.38:9300} committed version [1] source [zen-disco-elected-as-master ([0] nodes joined)]])
[2018-01-24T15:36:41,990][INFO ][o.e.h.n.Netty4HttpServerTransport] [KMyyO-3] publish_address {192.168.15.38:9200}, bound_addresses {192.168.15.38:9200}
[2018-01-24T15:36:41,990][INFO ][o.e.n.Node ] [KMyyO-3] started
[2018-01-24T15:36:41,997][INFO ][o.e.g.GatewayService ] [KMyyO-3] recovered [0] indices into cluster_state
Startup succeeded.
Open http://192.168.15.38:9200/ in a browser:
{
  "name" : "KMyyO-3",
  "cluster_name" : "elasticsearch",
  "cluster_uuid" : "Z2ReGjxgTx28uA3wHT-gZg",
  "version" : {
    "number" : "6.1.2",
    "build_hash" : "5b1fea5",
    "build_date" : "2018-01-10T02:35:59.208Z",
    "build_snapshot" : false,
    "lucene_version" : "7.1.0",
    "minimum_wire_compatibility_version" : "5.6.0",
    "minimum_index_compatibility_version" : "5.0.0"
  },
  "tagline" : "You Know, for Search"
}
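The same check can be scripted (a sketch; the host and port are the values configured earlier, and the snippet falls back to the sample response above when the host is unreachable):

```shell
# Fetch the root endpoint; fall back to the sample response if unreachable.
RESP=$(curl -s --connect-timeout 2 http://192.168.15.38:9200/ 2>/dev/null)
[ -n "$RESP" ] || RESP='{ "version" : { "number" : "6.1.2" } }'

# Pull out the version number without needing jq.
echo "$RESP" | grep -o '"number"[[:space:]]*:[[:space:]]*"[^"]*"'
```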
Next, install the head plugin for Elasticsearch.
1. Install Node.js. Download the package:
wget https://nodejs.org/dist/v8.9.1/node-v8.9.1-linux-x64.tar.xz
Extract it:
tar -xJf node-v8.9.1-linux-x64.tar.xz
Configure the environment variables:
vi /etc/profile
Append:
export JAVA_BIN=$JAVA_HOME/bin
export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$NODE_HOME/bin:$PATH
source /etc/profile
so the changes take effect immediately.
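Note that $NODE_HOME is referenced above but never defined. It should point at the extracted Node.js directory; assuming the tarball was unpacked under /usr/local/node (the shell prompts below suggest a node directory; adjust the path to wherever you extracted it), the profile additions would look like:

```shell
# NODE_HOME must be set before it is used in PATH; this path is an assumption.
export NODE_HOME=/usr/local/node/node-v8.9.1-linux-x64
export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$NODE_HOME/bin:$PATH
```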
Check the versions:
[root@localhost node]# node -v
v8.9.1
[root@localhost node]# npm -v
5.5.1
2. Install git:
yum install git -y
Check the git version:
[root@localhost node]# git --version
git version 1.8.3.1
(To uninstall git later, if needed:)
yum remove git
3. Fetch the head plugin via git:
git clone git://github.com/mobz/elasticsearch-head.git
Go into the head root directory and run the install as the root user:
[root@localhost elasticsearch-head]# npm install
Start the head plugin:
[es@localhost elasticsearch-head]$ npm run start
4. Edit the config/elasticsearch.yml file and append at the end:
http.cors.enabled: true
http.cors.allow-origin: "*"
5. Start the ES service and the head plugin. Switch to the es user:
[es@localhost elasticsearch-head]$ sh ../../elasticsearch-6.1.2/bin/elasticsearch -d
[es@localhost elasticsearch-head]$ npm run start
Open http://192.168.15.38:9100/ in the browser.
Directory: /usr/local/elk/logstash. Download the package:
wget https://artifacts.elastic.co/downloads/logstash/logstash-6.1.2.tar.gz
tar zxvf logstash-6.1.2.tar.gz
After extracting, go into the config directory and create a log_to_es.conf configuration file
with the following content:
The input section opens a TCP listener on port 4560 to receive log events;
the output section points at Elasticsearch and forwards everything received to ES.
input {
  tcp {
    host => "192.168.15.38"
    port => 4560
    mode => "server"
    tags => ["tags"]
    codec => json_lines
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => "192.168.15.38"
  }
}
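Because the tcp input uses the json_lines codec, anything that writes one JSON object per line to port 4560 will be ingested. A minimal sketch of the framing (the actual send, commented out, assumes the Logstash host above is reachable and that nc is installed):

```shell
# One event = one JSON object terminated by a newline (json_lines framing).
EVENT='{"message":"hello from shell","level":"INFO"}'
printf '%s\n' "$EVENT"

# To actually deliver it to the pipeline configured above:
# printf '%s\n' "$EVENT" | nc 192.168.15.38 4560
```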
Start Logstash:
./logstash -f ../config/log_to_es.conf
[root@localhost bin]# ./logstash -f ../config/log_to_es.conf
Sending Logstash's logs to /usr/local/logstash/logstash-6.1.2/logs which is now configured via log4j2.properties
[2018-01-30T15:54:55,100][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/local/logstash/logstash-6.1.2/modules/fb_apache/configuration"}
[2018-01-30T15:54:55,118][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/local/logstash/logstash-6.1.2/modules/netflow/configuration"}
[2018-01-30T15:54:55,724][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-01-30T15:54:56,433][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.1.2"}
[2018-01-30T15:54:56,856][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-01-30T15:55:02,031][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://192.168.15.38:9200/]}}
[2018-01-30T15:55:02,044][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://192.168.15.38:9200/, :path=>"/"}
[2018-01-30T15:55:02,263][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://192.168.15.38:9200/"}
[2018-01-30T15:55:02,342][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>nil}
[2018-01-30T15:55:02,346][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-01-30T15:55:02,365][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-01-30T15:55:02,384][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-01-30T15:55:02,436][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//192.168.15.38"]}
[2018-01-30T15:55:02,458][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x2731383b run>"}
[2018-01-30T15:55:02,554][INFO ][logstash.inputs.tcp ] Starting tcp input listener {:address=>"192.168.15.38:4560", :ssl_enable=>"false"}
[2018-01-30T15:55:02,765][INFO ][logstash.pipeline ] Pipeline started {"pipeline.id"=>"main"}
[2018-01-30T15:55:02,882][INFO ][logstash.agent ] Pipelines running {:count=>1, :pipelines=>["main"]}
When the output above appears, Logstash is up and running.
Directory: /usr/local/elk/kibana. Download the package:
wget https://artifacts.elastic.co/downloads/kibana/kibana-6.1.2-linux-x86_64.tar.gz
Extract it:
tar zxvf kibana-6.1.2-linux-x86_64.tar.gz
Run the command:
[root@localhost bin]# ./kibana
  log   [07:03:29.712] [info][status][plugin:kibana@6.1.2] Status changed from uninitialized to green - Ready
  log   [07:03:29.775] [info][status][plugin:elasticsearch@6.1.2] Status changed from uninitialized to yellow - Waiting for Elasticsearch
  log   [07:03:29.817] [info][status][plugin:console@6.1.2] Status changed from uninitialized to green - Ready
  log   [07:03:29.840] [info][status][plugin:elasticsearch@6.1.2] Status changed from yellow to green - Ready
  log   [07:03:29.856] [info][status][plugin:metrics@6.1.2] Status changed from uninitialized to green - Ready
  log   [07:03:30.088] [info][status][plugin:timelion@6.1.2] Status changed from uninitialized to green - Ready
  log   [07:03:30.094] [info][listening] Server running at http://192.168.15.38:5601
Startup succeeded.
1. Add the dependency:
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>4.11</version>
</dependency>
2. Add the configuration file logback.xml:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>192.168.15.38:4560</destination>
        <encoder charset="UTF-8" class="net.logstash.logback.encoder.LogstashEncoder" />
    </appender>
    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <encoder charset="UTF-8">
            <!-- the encoder can specify a charset, which matters for Chinese output -->
            <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger - %msg%n</pattern>
        </encoder>
    </appender>
    <root level="INFO">
        <appender-ref ref="LOGSTASH" />
        <appender-ref ref="STDOUT" />
    </root>
</configuration>
3. application.yml configuration:
logging:
  config: classpath:logback.xml
4. Integration test:
// logger declaration shown for completeness; the enclosing class name is illustrative
private static final Logger logger = LoggerFactory.getLogger(StudentController.class);

@Autowired
private StudentService studentService;

@RequestMapping("/findAll")
public List<Student> findAll() {
    logger.info("log info ...");
    logger.error("log error ...");
    logger.debug("log debug ...");
    return studentService.findAll();
}
Check the Logstash output:
{
    "logger_name" => "com.spark.Application",
    "level_value" => 20000,
    "thread_name" => "main",
    "level" => "INFO",
    "host" => "10.10.30.98",
    "@version" => 1,
    "message" => "Starting Application on DESKTOP-DBPFNEL with PID 6980 (E:\\workplace\\es-demo\\target\\classes started by admin in E:\\workplace\\es-demo)",
    "port" => 63561,
    "tags" => [
        [0] "tags"
    ],
    "@timestamp" => 2018-01-30T07:04:27.523Z
}
{
    "logger_name" => "com.spark.Application",
    "level_value" => 20000,
    "thread_name" => "main",
    "level" => "INFO",
    "host" => "10.10.30.98",
    "@version" => 1,
    "message" => "No active profile set, falling back to default profiles: default",
    "port" => 63561,
    "tags" => [
        [0] "tags"
    ],
    "@timestamp" => 2018-01-30T07:04:27.525Z
}
{
    "logger_name" => "org.springframework.boot.context.embedded.AnnotationConfigEmbeddedWebApplicationContext",
    "level_value" => 20000,
    "thread_name" => "main",
    "level" => "INFO",
    "host" => "10.10.30.98",
    "@version" => 1,
    "message" => "Refreshing org.springframework.boot.context.embedded.AnnotationConfigEmbeddedWebApplicationContext@7e3181aa: startup date [Tue Jan 30 15:04:27 CST 2018]; root of context hierarchy",
    "port" => 63561,
    "tags" => [
        [0] "tags"
    ],
    "@timestamp" => 2018-01-30T07:04:27.604Z
}
{
    "logger_name" => "org.hibernate.validator.internal.util.Version",
    "level_value" => 20000,
    "thread_name" => "background-preinit",
    "level" => "INFO",
    "host" => "10.10.30.98",
    "@version" => 1,
    "message" => "HV000001: Hibernate Validator 5.3.6.Final",
    "port" => 63561,
    "tags" => [
        [0] "tags"
    ],
    "@timestamp" => 2018-01-30T07:04:27.677Z
}
If output like the above appears, the ELK environment is working end to end. View it in the browser: http://192.168.15.38:5601/
Spring Boot test project source code: https://github.com/YunDongTeng/springboot-es.git