(1): The first method is to pass the user agent on the command line by appending -s USER_AGENT='Mozilla/5.0' to the scrapy command.
(2): The second method is to change Scrapy's default user-agent value.
Find the default_settings.py file under the Python installation directory, for example:
C:\Program Files (x86)\Anaconda2\envs\scrapy\Lib\site-packages\scrapy\settings\default_settings.py
Edit the USER_AGENT setting (line 269 in this version):
USER_AGENT = 'Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36'
(3): The third method is to set a User-Agent in the request headers, as shown below:
def start_requests(self):
    yield Request("http://www.baidu.com/",
                  headers={'User-Agent': "Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36"})