When I debug a Scrapy spider from the console, the command window prints so much that it is hard to follow the log, so I need a log file to record the information; that makes it much easier to track down problems later.
There are two ways to do it.
1. Quick and dirty: set LOG_FILE directly on the command line
scrapy crawl hupu -s LOG_FILE=scrapy_hupu_log.log
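Since -s can override any setting from the command line, the log level can be set the same way; a minimal sketch reusing the hupu spider from the command above:
scrapy crawl hupu -s LOG_FILE=scrapy_hupu_log.log -s LOG_LEVEL=INFO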
2. Use logging
Add this configuration to settings.py:
LOG_FILE = "hupuSpider.log"
LOG_LEVEL = 'INFO'
# LOG_ENABLED   default: True — enable logging
# LOG_ENCODING  default: 'utf-8' — encoding used for logging
# LOG_FILE      default: None — file name for the logging output, created in the current directory
# LOG_LEVEL     default: 'DEBUG' — minimum level to log
# LOG_STDOUT    default: False — if True, all standard output (and errors) of the process is redirected to the log; e.g. print("hello") will show up in the Scrapy log
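A minimal sketch of what LOG_STDOUT does, assuming the hupuSpider.log file configured above: with it turned on, anything the spider prints is captured into that log file rather than only appearing in the console.
# settings.py
LOG_STDOUT = True
# anywhere in the spider code
print("hello")  # recorded in hupuSpider.log as a log line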
Usage:
import logging
logging.log(logging.INFO, 'log content')
The logging module is Python's own built-in facility for recording program logs.
In large software systems, errors can be hard to reproduce, so you have to analyse the logs to pin down where they happened; that is the most important reason to use logging when writing a program.
Scrapy uses Python's built-in logging module, which defines five levels:
1. logging.CRITICAL - for critical errors (highest severity)
2. logging.ERROR - for regular errors
3. logging.WARNING - for warning messages
4. logging.INFO - for informational messages
5. logging.DEBUG - for debugging messages (lowest severity)
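The level attached to a message decides whether it gets written, given the configured minimum level; a standalone sketch of that filtering (the file name and messages here are made up):
import logging

# only records at INFO or above are written; DEBUG is filtered out
logging.basicConfig(filename="test.log", level=logging.INFO)
logging.debug("this is filtered out")   # below INFO, not written
logging.info("this is written")
logging.critical("highest severity, always written")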
1. Basic usage
import logging
logging.warning("this is a test")
Output:
WARNING:root:this is a test
2. The general-purpose call, which takes the log level as an argument
import logging
logging.log(logging.WARNING, "this is a warning")
3. Logging through a named logger
import logging
logger = logging.getLogger(__name__)
logger.warning("this is a warning")
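The same getLogger pattern can send everything to a file, which is the whole point of this post; a minimal sketch, assuming a made-up file name and a format string that mirror the Scrapy settings used earlier:
import logging

# send all records at INFO or above to a file, roughly what LOG_FILE/LOG_LEVEL do in Scrapy
logging.basicConfig(filename="hupuSpider.log", level=logging.INFO,
                    format="%(asctime)s [%(name)s] %(levelname)s: %(message)s")
logger = logging.getLogger(__name__)
logger.warning("this is a warning")  # written to hupuSpider.log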
Scrapy provides a logger within each Spider instance, that can be accessed and used like this:
import scrapy
class MySpider(scrapy.Spider):
    name = 'myspider'
    start_urls = ['http://scrapinghub.com']

    def parse(self, response):
        self.logger.info('Parse function called on %s', response.url)
That logger is created using the Spider’s name, but you can use any custom Python logger you want. For example:
import logging
import scrapy

logger = logging.getLogger('mycustomlogger')

class MySpider(scrapy.Spider):
    name = 'myspider'
    start_urls = ['http://scrapinghub.com']

    def parse(self, response):
        logger.info('Parse function called on %s', response.url)
These settings can be used to configure the logging:
• LOG_FILE
• LOG_ENABLED
• LOG_ENCODING
• LOG_LEVEL
• LOG_FORMAT
• LOG_DATEFORMAT
• LOG_STDOUT
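A settings.py sketch pulling these together; the format and date-format strings below are only an illustration in the usual '%(asctime)s [%(name)s] %(levelname)s: %(message)s' style, not something this post requires:
# settings.py
LOG_ENABLED = True
LOG_FILE = "hupuSpider.log"
LOG_ENCODING = "utf-8"
LOG_LEVEL = "INFO"
LOG_FORMAT = "%(asctime)s [%(name)s] %(levelname)s: %(message)s"
LOG_DATEFORMAT = "%Y-%m-%d %H:%M:%S"
LOG_STDOUT = False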
For more detail, see https://www.cnblogs.com/sufei-duoduo/p/5880988.html, https://doc.scrapy.org/en/0.12/topics/logging.html, and https://www.cnblogs.com/similarface/p/5179193.html.