The log format defined in nginx.conf is as follows:
http {
    ...
    log_format main '$remote_addr - $remote_user [$time_local] "$request" '
                    '$status [$request_body] $body_bytes_sent "$http_referer" '
                    '"$http_user_agent" "$http_x_forwarded_for"';
    ...
}
Sample log entries:
116.2.52.247 - - [26/Oct/2017:15:04:00 +0000] "POST /api/v1/f1_static/ HTTP/1.1" 200 [{\x22user_id\x22:\x229b999d46dd6149f49\x22}] 323 "http://www.abc.com/ProductPerspective/detail/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/61.0.3163.100 Safari/537.36" "-"
116.2.52.247 - - [26/Oct/2017:15:04:00 +0000] "OPTIONS /api/v1/fund_info/ HTTP/1.1" 200 [-] 31 "http://www.abc.com/ProductPerspective/detail/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/61.0.3163.100 Safari/537.36" "-"
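Note that nginx escapes double quotes inside `$request_body` as `\x22`. A minimal sketch of recovering the JSON from the first sample entry (the variable names here are ours, for illustration):

```python
import json

# $request_body from the sample line above, with nginx's \x22 escaping intact
raw_body = r'{\x22user_id\x22:\x229b999d46dd6149f49\x22}'

# undo the \x22 escaping so the body is valid JSON again
body = json.loads(raw_body.replace(r'\x22', '"'))
print(body["user_id"])  # 9b999d46dd6149f49
```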
nginx has no built-in command to split its logs by day, so we wrote a shell script that runs at 0:00 every day. Note that after the rename, nginx keeps writing to the old (renamed) file until it is told to reopen its logs, e.g. by sending the master process a USR1 signal (`nginx -s reopen`).
#!/bin/bash
logs_path="/mydata/nginx/logs/"
mv ${logs_path}access-web.log ${logs_path}access-web-$(date -d "yesterday" +"%Y%m%d").log
mv ${logs_path}access-api.log ${logs_path}access-api-$(date -d "yesterday" +"%Y%m%d").log
The cron entry:
0 0 * * * /mydata/nginx/nginx.log.sh
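The script names each rotated file after the previous day. The same naming logic can be sketched in Python (`rotated_name` is a hypothetical helper of ours, not part of the original setup):

```python
from datetime import date, timedelta

def rotated_name(prefix, today=None):
    """Build the rotated file name the shell script produces,
    e.g. access-web-20171016.log for the day before `today`."""
    today = today or date.today()
    yesterday = today - timedelta(days=1)
    return "{}-{}.log".format(prefix, yesterday.strftime("%Y%m%d"))

print(rotated_name("access-web", date(2017, 10, 17)))  # access-web-20171016.log
```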
Transferring the log files from the nginx server to the log server:
[root@VM_231_116_centos ~]
root@10.105.83.34's password:
access-power-20170929.log    100%  126KB 125.8KB/s   00:00
access-web-20171016.log      100% 2616KB   2.6MB/s   00:00
access-power-20170907.log    100% 1687KB   1.7MB/s   00:00
access-api-20170911.log      100% 1209KB   1.2MB/s   00:00
access-power-20170930.log    100% 1354KB   1.3MB/s   00:00
access.log                   100%   45MB  45.2MB/s   00:00
access-api-20170907.log      100% 2960KB   2.9MB/s   00:00
access-power-20170906.log    100%  669KB 669.1KB/s   00:01
access-api-20170904.log      100% 9186KB   9.0MB/s   00:00
scp syntax for copying files (and folders) between servers:
scp local_file remote_username@remote_ip:remote_folder
or
scp local_file remote_username@remote_ip:remote_file
scp -r local_folder remote_username@remote_ip:remote_folder
The parsing script comes down to a few main points:

Parsing line by line
Regular-expression matching
Date handling
Batch writes to the database
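The regex matching and date handling can be sketched on the first sample entry. The pattern below is the one the full script uses; the `strptime` call with `%b`/`%z` is our suggested alternative to the script's month-abbreviation table, not part of the original code:

```python
import re
from datetime import datetime, timedelta, timezone

line = ('116.2.52.247 - - [26/Oct/2017:15:04:00 +0000] '
        '"POST /api/v1/f1_static/ HTTP/1.1" 200 [-] 31 '
        '"http://www.abc.com/ProductPerspective/detail/" "Mozilla/5.0" "-"')

# Groups: IP, [time], method, URL, status, [request_body],
# bytes sent, referer, user agent, x-forwarded-for
pattern = (r'(\d+\.\d+\.\d+\.\d+).*?\[(.*?)\].*?(\w+) (/.*?) .*?" (\d+) '
           r'\[(.*?)\] (\d+) "(.*?)" "(.*?)" "(.*?)"')
m = re.search(pattern, line)
print(m.group(1))  # 116.2.52.247
print(m.group(3))  # POST
print(m.group(5))  # 200

# %b parses the English month abbreviation and %z the "+0000" offset,
# so no manual Jan->01 substitution is needed; convert to UTC+8 as the
# full script's shift(hours=8) intends
ts = datetime.strptime(m.group(2), "%d/%b/%Y:%H:%M:%S %z")
local = ts.astimezone(timezone(timedelta(hours=8)))
print(local.isoformat())  # 2017-10-26T23:04:00+08:00
```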
import re
import time
import json
import arrow
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine(
    "mysql+pymysql://{}:{}@{}:{}/{}".format('usr', 'pwd', 'host', 'port', 'db'),
    connect_args={"charset": "utf8"})


def parse(filename):
    month_abr = {"Jan": "01", "Feb": "02", "Mar": "03", "Apr": "04",
                 "May": "05", "Jun": "06", "Jul": "07", "Aug": "08",
                 "Sep": "09", "Oct": "10", "Nov": "11", "Dec": "12"}
    # Groups: IP, [time], method, URL, status, [request_body],
    # bytes sent, referer, user agent, x-forwarded-for
    pattern = (r'(\d+\.\d+\.\d+\.\d+).*?\[(.*?)\].*?(\w+) (/.*?) .*?" (\d+) '
               r'\[(.*?)\] (\d+) "(.*?)" "(.*?)" "(.*?)"')
    dfs = []
    try:
        with open(filename) as file:
            for line in file:
                s = re.search(pattern, line)
                if not s:
                    continue
                remote_addr = s.group(1)
                local_time = s.group(2)
                request_method = s.group(3)
                request_url = s.group(4)
                status = s.group(5)
                request_body = s.group(6)
                body_bytes_sent = s.group(7)
                http_referer = s.group(8)
                http_user_agent = s.group(9)
                http_x_forwarded_for = s.group(10)

                if request_method not in ("GET", "POST"):
                    continue

                # "26/Oct/2017" -> "26/10/2017" so arrow can parse it
                for mon, num in month_abr.items():
                    if mon in local_time:
                        local_time = local_time.replace(mon, num)
                        break
                lt = arrow.get(local_time, "DD/MM/YYYY:HH:mm:ss")
                lt = lt.shift(hours=8)  # UTC -> UTC+8
                local_time = str(lt.datetime)

                fund_id = None
                user_id = None
                if request_body != '-':
                    try:
                        # nginx escapes '"' as \x22 inside $request_body
                        request_body = request_body.replace(r'\x22', '"').replace("null", '""')
                        request_body_dict = json.loads(request_body)
                        fund_id = request_body_dict.get('fund_id', None)
                        user_id = request_body_dict.get('user_id', None)
                        if user_id is None:
                            user_id = request_body_dict.get('userId', None)
                    except Exception as e:
                        print("request_body:{}".format(request_body))
                        print(e)

                df = pd.DataFrame({"remote_addr": [remote_addr],
                                   "request_method": [request_method],
                                   "local_time": [local_time],
                                   "request_url": [request_url],
                                   "status": [status],
                                   "request_body": [request_body],
                                   "body_bytes_sent": [body_bytes_sent],
                                   "http_referer": [http_referer],
                                   "http_user_agent": [http_user_agent],
                                   "http_x_forwarded_for": [http_x_forwarded_for],
                                   "fund_id": [fund_id],
                                   "user_id": [user_id]})
                df['create_at'] = time.strftime("%Y-%m-%d %H:%M:%S", time.localtime())
                dfs.append(df)

                # flush to the database in batches of 100 rows
                if len(dfs) >= 100:
                    df_all = pd.concat(dfs)
                    df_all = df_all.drop_duplicates(subset=['remote_addr', 'request_url', 'local_time'])
                    df_all.to_sql("log_table", engine, if_exists="append", index=False)
                    print("Rows written: " + str(len(df_all)))
                    dfs = []

        # flush whatever is left (guard against an empty batch,
        # which would make pd.concat raise)
        if dfs:
            df_all = pd.concat(dfs)
            df_all = df_all.drop_duplicates(subset=['remote_addr', 'request_url', 'local_time'])
            df_all.to_sql("log_table", engine, if_exists="append", index=False)
    except Exception as e:
        print(e)
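The script batches inserts by hand every 100 rows; `to_sql` can also batch on its own via its `chunksize` parameter. A minimal sketch against an in-memory SQLite database (the SQLite engine and the two-row frame are illustrative only, not the production MySQL setup):

```python
import pandas as pd
from sqlalchemy import create_engine

# illustrative in-memory database; production uses the MySQL engine above
engine = create_engine("sqlite://")

df = pd.DataFrame({
    "remote_addr": ["116.2.52.247", "116.2.52.247"],
    "request_url": ["/api/v1/f1_static/", "/api/v1/fund_info/"],
    "status": ["200", "200"],
})
# chunksize makes pandas issue the INSERTs in batches of the given size
df.to_sql("log_table", engine, if_exists="append", index=False, chunksize=100)

print(pd.read_sql("SELECT COUNT(*) AS n FROM log_table", engine)["n"][0])  # 2
```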
Once the logs are structured and written to the database, the front-end page can present them along multiple dimensions. Example dashboard views:
Daily active IP count
Daily API request count
Category breakdown analysis