Extensions provide a mechanism for adding custom functionality to Scrapy.
For example, a spider-stats extension can collect runtime information about a spider as it runs.
A single instance of each extension is created when Scrapy starts. Extensions are configured in the settings.py file:
# The dictionary below defines which extensions to load: each key is the
# extension's import path, and each value is its load order.
EXTENSIONS = {
    'scrapy.contrib.corestats.CoreStats': 500,
    'scrapy.webservice.WebService': 500,
    'scrapy.telnet.TelnetConsole': 500,
}

# Enable the custom extension
MYEXT_ENABLED = True
from_crawler is the method the framework calls when it creates a plugin instance (a downloader middleware, an extension, and so on). It can validate the configuration and read configuration values: the settings defined in settings.py are available through the crawler.settings object.
Call crawler.signals.connect to register event callbacks; the framework invokes them when the corresponding event fires.
from scrapy import signals
from scrapy.exceptions import NotConfigured


class SpiderOpenCloseLogging(object):

    def __init__(self, item_count):
        self.item_count = item_count
        self.items_scraped = 0

    @classmethod
    def from_crawler(cls, crawler):
        # Check whether the extension is enabled; if not, raise NotConfigured
        if not crawler.settings.getbool('MYEXT_ENABLED'):
            raise NotConfigured

        # Read the item-count threshold from settings.py
        item_count = crawler.settings.getint('MYEXT_ITEMCOUNT', 1000)

        # Create the extension instance
        ext = cls(item_count)

        # Listen for the spider-opened event
        crawler.signals.connect(ext.spider_opened, signal=signals.spider_opened)
        # Listen for the spider-closed event
        crawler.signals.connect(ext.spider_closed, signal=signals.spider_closed)
        # Listen for the item-scraped event (fired for each scraped item)
        crawler.signals.connect(ext.item_scraped, signal=signals.item_scraped)

        # Return the extension instance
        return ext

    def spider_opened(self, spider):
        spider.log("opened spider %s" % spider.name)

    def spider_closed(self, spider):
        spider.log("closed spider %s" % spider.name)

    def item_scraped(self, item, spider):
        self.items_scraped += 1
        if self.items_scraped == self.item_count:
            spider.log("scraped %d items, resetting counter" % self.items_scraped)
            self.items_scraped = 0
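To activate an extension like this in your own project, add it to the EXTENSIONS dictionary alongside the built-in ones and define its settings. A minimal settings.py sketch follows; the module path myproject.extensions.SpiderOpenCloseLogging is an assumption — replace it with wherever the class actually lives in your project.

```python
# settings.py (sketch — adjust the import path to your project layout)

EXTENSIONS = {
    # Hypothetical path: the SpiderOpenCloseLogging class defined above,
    # assumed to live in myproject/extensions.py
    'myproject.extensions.SpiderOpenCloseLogging': 500,
}

# Settings read by from_crawler: without MYEXT_ENABLED the extension
# raises NotConfigured and is skipped.
MYEXT_ENABLED = True
MYEXT_ITEMCOUNT = 1000
```

Setting a value of None instead of a number in EXTENSIONS would disable the entry; the numeric order only matters when extensions depend on each other.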