This post continues the earlier line of thinking. In the previous article we shared an approach: download the storage account access logs with a PowerShell script and then upload them to Log Analytics, so the logs can be analyzed and queried directly in LA. That method is simple and convenient, but it is hardly automated. Extending the idea a bit, we can try combining Event Grid with Functions to turn it into a fully automated solution.
First, a quick primer on what Event Grid and Functions are.
Event Grid
Azure Event Grid allows you to easily build applications with event-based architectures. First, select the Azure resource you would like to subscribe to, and then give the event handler or WebHook endpoint to send the event to. Event Grid has built-in support for events coming from Azure services, like storage blobs and resource groups. Event Grid also supports your own events, using custom topics.
Put simply, it works like a trigger: it fires on all kinds of events and lets you respond to them in a targeted way. That sounds a bit like the triggers in Logic Apps, but Event Grid is purely an event-routing service; the downstream processing is handled entirely by other products. The architecture diagram in the documentation linked below also shows where Event Grid sits, acting much like a message queue used to transport events.
https://docs.microsoft.com/en-us/azure/event-grid/?WT.mc_id=AZ-MVP-5001235
Function
Azure Functions is a serverless solution that allows you to write less code, maintain less infrastructure, and save on costs. Instead of worrying about deploying and maintaining servers, the cloud infrastructure provides all the up-to-date resources needed to keep your applications running.
Functions are easy to understand; every major cloud has a comparable product by now. The closest equivalent is AWS Lambda: a fully managed platform for running your code.
https://docs.microsoft.com/zh-cn/azure/azure-functions/functions-overview?WT.mc_id=AZ-MVP-5001235
Putting the two services together, the idea becomes clear: Event Grid has built-in support for blob events, so whenever a new blob appears an event is fired automatically, and the downstream handler can be implemented with a Function. The code is essentially ready-made; we can reuse the script from the previous post with only minor changes, so the overall effort is small.
There is a hidden catch here, though: the storage account logs live in the $logs container, and on the Azure backend that container does not emit Event Grid events, which is a bit of a trap. The workaround we use is to create another function that periodically syncs the logs to a different container with azcopy; it is essentially a single command, so it is not covered in detail here.
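As a rough sketch of that sync step (the destination container name logcopy and the SAS tokens are placeholders, not something from the original setup), the command could look like this:

azcopy sync 'https://<account>.blob.core.windows.net/$logs?<source-sas>' 'https://<account>.blob.core.windows.net/logcopy?<dest-sas>' --recursive

The Event Grid subscription further down is then pointed at logcopy instead of $logs.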
Implementation Steps
Now for the concrete implementation steps. First, create the function app. The relationship between a function app and a function is simple: the function app is the platform the functions run on, the code executes on that platform, and one function app can contain many functions.
Create the Function App
Creating the function app is very straightforward; just pick PowerShell Core as the runtime.
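If you prefer to script it instead of clicking through the portal, a minimal sketch with the Az.Functions module might look like the following (all names and the region are placeholders):

# Minimal sketch: a consumption-plan function app with the PowerShell runtime
New-AzFunctionApp -Name 'log-forwarder-func' `
    -ResourceGroupName 'rg-logdemo' `
    -StorageAccountName 'stlogdemo' `
    -Location 'eastasia' `
    -Runtime PowerShell `
    -OSType Windows `
    -FunctionsVersion 4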
Create the Function
Once the function app is created, you can create functions inside it. Azure ships a built-in Event Grid trigger template, so simply pick it when creating the function.
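For reference, the Event Grid trigger template wires the function up through a function.json binding roughly like the one below; the incoming event is then available to the script as $eventGridEvent:

{
  "bindings": [
    {
      "type": "eventGridTrigger",
      "name": "eventGridEvent",
      "direction": "in"
    }
  ]
}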
Create the Event Grid Subscription
With the function ready, the next step is Event Grid. The event grid subscription can be created directly from within the function.
When creating the event grid subscription there are many topic types to choose from; make sure to pick the storage account type here.
Configure the Storage Account Event Type
Then note that a filter needs to be configured, because we want to limit the trigger to a specific path rather than having every blob fire an event; see the sketch below.
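The same subscription can also be created from PowerShell. A sketch with Az.EventGrid follows (the resource IDs, names and the logcopy prefix are placeholders); limiting the subject prefix to one container is what keeps unrelated blobs from triggering the function:

# Sketch: subscribe the function to BlobCreated events from one container only
$storageId  = '/subscriptions/<sub-id>/resourceGroups/rg-logdemo/providers/Microsoft.Storage/storageAccounts/stlogdemo'
$functionId = '/subscriptions/<sub-id>/resourceGroups/rg-logdemo/providers/Microsoft.Web/sites/log-forwarder-func/functions/EventGridTrigger1'

New-AzEventGridSubscription -ResourceId $storageId `
    -EventSubscriptionName 'blob-log-to-la' `
    -EndpointType azurefunction `
    -Endpoint $functionId `
    -IncludedEventType 'Microsoft.Storage.BlobCreated' `
    -SubjectBeginsWith '/blobServices/default/containers/logcopy/'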
With the trigger in place, the next step is the code that runs inside the function.
Write the Code
Because the previous script processed the logs in a loop over many blobs, while Event Grid pushes events one at a time, the logic needs some minor adjustments.
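For context, the subject of a BlobCreated event has the form /blobServices/default/containers/<container>/blobs/<blob path>, which is why the script below pulls the container and blob name out of it by splitting the string. A tiny illustration (the blob path is made up):

$subject   = '/blobServices/default/containers/logcopy/blobs/blob/2021/01/01/0000/000000.log'
$parts     = $subject.Split('/')
$container = $parts[$parts.IndexOf('containers') + 1]            # 'logcopy'
$blob      = $subject.Substring($subject.IndexOf('blobs/') + 6)  # 'blob/2021/01/01/0000/000000.log'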
# Event Grid-triggered PowerShell function: the event payload is bound to $eventGridEvent
param($eventGridEvent, $TriggerMetadata)

# $ResourceGroup, $StorageAccountName, $customerId, $sharedKey, $logType and $TimeStampField
# are expected to be provided, for example via app settings / environment variables.

Function Build-Signature ($customerId, $sharedKey, $date, $contentLength, $method, $contentType, $resource)
{
    $xHeaders = "x-ms-date:" + $date
    $stringToHash = $method + "`n" + $contentLength + "`n" + $contentType + "`n" + $xHeaders + "`n" + $resource
    $bytesToHash = [Text.Encoding]::UTF8.GetBytes($stringToHash)
    $keyBytes = [Convert]::FromBase64String($sharedKey)
    $sha256 = New-Object System.Security.Cryptography.HMACSHA256
    $sha256.Key = $keyBytes
    $calculatedHash = $sha256.ComputeHash($bytesToHash)
    $encodedHash = [Convert]::ToBase64String($calculatedHash)
    $authorization = 'SharedKey {0}:{1}' -f $customerId, $encodedHash
    return $authorization
}

Function Post-LogAnalyticsData($customerId, $sharedKey, $body, $logType)
{
    $method = "POST"
    $contentType = "application/json"
    $resource = "/api/logs"
    $rfc1123date = [DateTime]::UtcNow.ToString("r")
    $contentLength = $body.Length
    $signature = Build-Signature `
        -customerId $customerId `
        -sharedKey $sharedKey `
        -date $rfc1123date `
        -contentLength $contentLength `
        -method $method `
        -contentType $contentType `
        -resource $resource
    $uri = "https://" + $customerId + ".ods.opinsights.azure.com" + $resource + "?api-version=2016-04-01"

    $headers = @{
        "Authorization"        = $signature;
        "Log-Type"             = $logType;
        "x-ms-date"            = $rfc1123date;
        "time-generated-field" = $TimeStampField;
    }

    $response = Invoke-WebRequest -Uri $uri -Method $method -ContentType $contentType -Headers $headers -Body $body -UseBasicParsing
    return $response.StatusCode
}

# Semicolons inside quoted fields would break the split below, so encode them first
Function ConvertSemicolonToURLEncoding([String] $InputText)
{
    $ReturnText = ""
    $chars = $InputText.ToCharArray()
    $StartConvert = $false

    foreach($c in $chars)
    {
        if($c -eq '"')
        {
            $StartConvert = ! $StartConvert
        }

        if($StartConvert -eq $true -and $c -eq ';')
        {
            $ReturnText += "%3B"
        }
        else
        {
            $ReturnText += $c
        }
    }

    return $ReturnText
}

Function FormalizeJsonValue($Text)
{
    $Text1 = ""
    if($Text.IndexOf("`"") -eq 0) { $Text1 = $Text }
    else { $Text1 = "`"" + $Text + "`"" }

    if($Text1.IndexOf("%3B") -ge 0)
    {
        $ReturnText = $Text1.Replace("%3B", ";")
    }
    else
    {
        $ReturnText = $Text1
    }
    return $ReturnText
}

Function ConvertLogLineToJson([String] $logLine)
{
    $logLineEncoded = ConvertSemicolonToURLEncoding($logLine)
    $elements = $logLineEncoded.split(';')

    $FormattedElements = New-Object System.Collections.ArrayList
    foreach($element in $elements)
    {
        $NewText = FormalizeJsonValue($element)
        $FormattedElements.Add($NewText) > $null
    }

    $Columns = (
        "version-number", "request-start-time", "operation-type", "request-status",
        "http-status-code", "end-to-end-latency-in-ms", "server-latency-in-ms",
        "authentication-type", "requester-account-name", "owner-account-name",
        "service-type", "request-url", "requested-object-key", "request-id-header",
        "operation-count", "requester-ip-address", "request-version-header",
        "request-header-size", "request-packet-size", "response-header-size",
        "response-packet-size", "request-content-length", "request-md5", "server-md5",
        "etag-identifier", "last-modified-time", "conditions-used", "user-agent-header",
        "referrer-header", "client-request-id"
    )

    # Build a one-record JSON array from the column names and the parsed values
    $logJson = "[{"
    For($i = 0; $i -lt $Columns.Length; $i++)
    {
        $logJson += "`"" + $Columns[$i] + "`":" + $FormattedElements[$i]
        if($i -lt $Columns.Length - 1)
        {
            $logJson += ","
        }
    }
    $logJson += "}]"
    return $logJson
}

$storageAccount = Get-AzStorageAccount -ResourceGroupName $ResourceGroup -Name $StorageAccountName -ErrorAction SilentlyContinue
if($null -eq $storageAccount)
{
    throw "The storage account specified does not exist in this subscription."
}

$storageContext = $storageAccount.Context

# Leftovers from the original batch script; not used in the per-event version
$token = $Null
$maxReturn = 5000

$successPost = 0
$failedPost = 0

# The event subject looks like /blobServices/default/containers/<container>/blobs/<blob path>
$subject = $eventGridEvent.subject.ToString()
$BlobArray = $subject.Split('/')
$container = $BlobArray[$BlobArray.indexof('containers') + 1]
$BlobIndex = $subject.indexof('blobs/') + 6
$Blob = $subject.substring($BlobIndex, $subject.length - $BlobIndex)

Write-Output("> Downloading blob: {0}" -f $blob)
$filename = ".\log.txt"
Get-AzStorageBlobContent -Context $storageContext -Container $container -Blob $blob -Destination $filename -Force > $null

Write-Output("> Posting logs to log analytic workspace: {0}" -f $blob)
$lines = Get-Content $filename
foreach($line in $lines)
{
    $json = ConvertLogLineToJson($line)
    $response = Post-LogAnalyticsData -customerId $customerId -sharedKey $sharedKey -body ([System.Text.Encoding]::UTF8.GetBytes($json)) -logType $logType
    if($response -eq "200")
    {
        $successPost++
    }
    else
    {
        $failedPost++
        Write-Output "> Failed to post one log to Log Analytics workspace"
    }
}

Remove-Item $filename -Force
Write-Output "> Log lines posted to Log Analytics workspace: success = $successPost, failure = $failedPost"
One final step is to grant the function app access to the storage account. I won't go into detail here; you could also use the storage account key, although that approach is not really recommended.
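One way to avoid keys altogether is a system-assigned managed identity plus an RBAC role on the storage account. A sketch is below (names, IDs and the chosen role are placeholders; the role you actually need depends on how the script authenticates to the data plane):

# Sketch: give the function app's managed identity access to the storage account
Update-AzFunctionApp -Name 'log-forwarder-func' -ResourceGroupName 'rg-logdemo' -IdentityType SystemAssigned

# <principal-object-id> is the identity's object ID, shown on the function app's Identity blade
New-AzRoleAssignment -ObjectId '<principal-object-id>' `
    -RoleDefinitionName 'Storage Blob Data Contributor' `
    -Scope '/subscriptions/<sub-id>/resourceGroups/rg-logdemo/providers/Microsoft.Storage/storageAccounts/stlogdemo'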
Finally, the end-to-end result can be seen in the function's monitoring blade.
Summary
Overall, compared with the plain PowerShell script approach, not much has actually changed, but adding Event Grid and Functions makes the whole solution far more flexible. The same idea can be extended to many other tasks, approaching and handling problems in a more cloud-native way.