ASP.NET Core + log4net + ELK: Building a Logging Center

Installing ELK in Docker with One Command

For tooling like this, the first step is to search Docker Hub directly. Fortunately, there are not only standalone images for Elasticsearch, Kibana, and Logstash, but also a ready-made all-in-one ELK image.

sudo docker run -p 5601:5601 -p 9200:9200 -p 5044:5044 -d --name log-platform --restart always  sebp/elk

This saves a lot of configuration, so I went with the ELK image without hesitation. Run it! If nothing goes wrong, it should come up easily. The most likely problem is Elasticsearch failing to start because the host's virtual memory limit is too low; raising it with sudo sysctl -w vm.max_map_count=262144 on the host usually fixes it.

Adding a log4net-to-Elasticsearch Appender to the Project

Having built a logging center before .NET Core, I remembered an appender DLL called Log4net.Elasticsearch, but after checking I found it had not been updated in a long time and does not support .NET Standard. Before deciding to implement my own appender, I took one more hopeful look around and actually found log4stash, an open-source project that supports .NET Core. Lucky again: no need to reinvent the wheel. log4stash is easy to use, but it has quite a few configuration options and the author does not document them well, so let's get it running first and sort out the rest later.

  • Add log4stash to the project
Install-Package log4stash -Version 2.2.1
  • Modify log4net.config
    Add the appender to log4net.config:
<appender name="ElasticSearchAppender" type="log4stash.ElasticSearchAppender, log4stash">
    <Server>localhost</Server>
    <Port>9200</Port>
    <IndexName>log_test_%{+yyyy-MM-dd}</IndexName>
    <IndexType>LogEvent</IndexType>
    <Bulksize>2000</Bulksize>
    <BulkIdleTimeout>10000</BulkIdleTimeout>
    <IndexAsync>True</IndexAsync>
</appender>

For reference, here is the full set of configuration options:

<appender name="ElasticSearchAppender" type="log4stash.ElasticSearchAppender, log4stash">
    <Server>localhost</Server>
    <Port>9200</Port>
    <!-- optional: in case elasticsearch is located behind a reverse proxy the URL is like http://Server:Port/Path, default = empty string -->
    <Path>/es5</Path>
    <IndexName>log_test_%{+yyyy-MM-dd}</IndexName>
    <IndexType>LogEvent</IndexType>
    <Bulksize>2000</Bulksize>
    <BulkIdleTimeout>10000</BulkIdleTimeout>
    <IndexAsync>False</IndexAsync>
    <DocumentIdSource>IdSource</DocumentIdSource> <!-- obsolete! use IndexOperationParams -->
    
    <!-- Serialize the log object as JSON (default is true).
         This applies when you log an object directly: `logger.Debug(obj);` rather than a string: `logger.Debug("string");` -->
    <SerializeObjects>True</SerializeObjects>

    <!-- optional: elasticsearch timeout for the request, default = 10000 -->
    <ElasticSearchTimeout>10000</ElasticSearchTimeout>

    <!--You can add parameters to the request to control what is sent to ElasticSearch.
    For example, as shown here, you can add a routing specification to the appender.
    The Key is the key to be added to the request, and the Value is the parameter's name in the log event properties.-->
    <IndexOperationParams>
      <Parameter>
        <Key>_routing</Key>
        <Value>%{RoutingSource}</Value>
      </Parameter>
      <Parameter>
        <Key>_id</Key>
        <Value>%{IdSource}</Value>
      </Parameter>
      <Parameter>
        <Key>key</Key>
        <Value>value</Value>
      </Parameter>
    </IndexOperationParams>

    <!-- for more information read about log4net.Core.FixFlags -->
    <FixedFields>Partial</FixedFields>
    
    <Template>
      <Name>templateName</Name>
      <FileName>path2template.json</FileName>
    </Template>

    <!--Only one credential type can be used at a time-->
    <!--Here we list all possible types-->
    <AuthenticationMethod>
      <!--For basic authentication purposes-->
      <Basic>
          <Username>Username</Username>
          <Password>Password</Password>
      </Basic>
      <!--For AWS ElasticSearch service-->
      <Aws>
          <Aws4SignerSecretKey>Secret</Aws4SignerSecretKey>
          <Aws4SignerAccessKey>AccessKey</Aws4SignerAccessKey>
          <Aws4SignerRegion>Region</Aws4SignerRegion>
      </Aws>
    </AuthenticationMethod>
    
    <!-- all filters go in the ElasticFilters tag -->
    <ElasticFilters>
      <Add>
        <Key>@type</Key>
        <Value>Special</Value>
      </Add>

      <!-- using the @type value from the previous filter -->
      <Add>
        <Key>SmartValue</Key>
        <Value>the type is %{@type}</Value>
      </Add>

      <Remove>
        <Key>@type</Key>
      </Remove>

      <!-- you can load custom filters like I do here -->
      <Filter type="log4stash.Filters.RenameKeyFilter, log4stash">
        <Key>SmartValue</Key>
        <RenameTo>SmartValue2</RenameTo>
      </Filter>
    
      <!-- converts a json object to fields in the document -->
      <Json>
        <SourceKey>JsonRaw</SourceKey>
        <FlattenJson>false</FlattenJson>
        <!-- the separator property is only relevant when setting the FlattenJson property to 'true' -->
        <Separator>_</Separator> 
      </Json>

      <!-- converts an xml object to fields in the document -->
      <Xml>
        <SourceKey>XmlRaw</SourceKey>
        <FlattenXml>false</FlattenXml>
      </Xml>
      
      <!-- kv and grok filters similar to logstash's filters -->
      <Kv>
        <SourceKey>Message</SourceKey>
        <ValueSplit>:=</ValueSplit>
        <FieldSplit> ,</FieldSplit>
      </Kv>

      <Grok>
        <SourceKey>Message</SourceKey>
        <Pattern>the message is %{WORD:Message} and guid %{UUID:the_guid}</Pattern>
        <Overwrite>true</Overwrite>
      </Grok>

      <!-- Convert string like: "1,2, 45 9" into array of numbers [1,2,45,9] -->
      <ConvertToArray>
        <SourceKey>someIds</SourceKey>
        <!-- The separators (space and comma) -->
        <Seperators>, </Seperators> 
      </ConvertToArray>

      <Convert>
        <!-- convert given key to string -->
        <ToString>shouldBeString</ToString>

        <!-- same as ConvertToArray. Just for convenience -->
        <ToArray>
           <SourceKey>anotherIds</SourceKey>
        </ToArray>
      </Convert>
    </ElasticFilters>
</appender>
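As a rough sketch of how those %{...} placeholders are filled in: log4stash resolves them from the log event's properties, so you can set values through log4net's ThreadContext before logging. The property names below (RoutingSource, IdSource, JsonRaw, someIds) simply mirror the keys referenced in the sample config above; they are illustrative, not required names.

using System;
using log4net;

public static class LoggingExample
{
    private static readonly ILog Log = LogManager.GetLogger(typeof(LoggingExample));

    public static void WriteOrderLog()
    {
        // Feed the %{RoutingSource} and %{IdSource} placeholders used by
        // IndexOperationParams in the config above.
        ThreadContext.Properties["RoutingSource"] = "customer-42";
        ThreadContext.Properties["IdSource"] = Guid.NewGuid().ToString();

        // A raw JSON string for the <Json> filter to expand into document fields,
        // and a comma/space separated string for <ConvertToArray>.
        ThreadContext.Properties["JsonRaw"] = "{\"orderId\": 1001, \"amount\": 19.9}";
        ThreadContext.Properties["someIds"] = "1,2, 45 9";

        // The root logger level in this article is WARN, so log at Warn or above.
        Log.Warn("order processed user=alice status=ok");
    }
}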

Finally, don't forget to reference the appender in the root element:

<root>
    <level value="WARN" />
    <appender-ref ref="ElasticSearchAppender" />
</root>

OK, that's it for the project configuration. You can now run the project and write some test logs, as sketched below.
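Here is a minimal sketch of wiring this up, assuming the file is named log4net.config and is copied to the output directory. In an ASP.NET Core app you would run this once at startup, for example at the top of Program.Main, before building the host:

using System;
using System.IO;
using System.Reflection;
using log4net;
using log4net.Config;

public class Program
{
    private static readonly ILog Log = LogManager.GetLogger(typeof(Program));

    public static void Main(string[] args)
    {
        // .NET Core does not pick up log4net.config automatically,
        // so load it explicitly into the entry assembly's repository.
        var repository = LogManager.GetRepository(Assembly.GetEntryAssembly());
        XmlConfigurator.Configure(repository, new FileInfo("log4net.config"));

        // The root level is WARN, so Warn/Error reach Elasticsearch
        // while Debug/Info are filtered out.
        Log.Warn("hello ELK - first test log");
        Log.Error("something went wrong", new InvalidOperationException("demo"));
    }
}

After a few entries like these are written, an index named log_test_<date> should appear in Elasticsearch.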

Creating an Index Pattern in Kibana

An Elasticsearch Index is quite similar to a Database in a relational database. Although Elasticsearch already contains an index once our project has written some test data, we still need to create an Index Pattern before we can query the data in the visualization tool.
Go to Management - Create Index Pattern and enter the index name from our project's logging configuration, log_test_* (if data exists, it should be suggested automatically). After creating it, we can browse and query our log entries in Kibana.
