UA Pools and Proxy Pools


1. Downloader Middleware

Downloader middlewares are a layer of components that sit between the Scrapy engine and the downloader.

  • Purpose:

(1) While the engine passes a request to the downloader, a downloader middleware can apply a series of processing steps to the request, such as setting the request's User-Agent or setting a proxy.

(2) While the downloader passes the Response back to the engine, a downloader middleware can apply a series of processing steps to the response, such as gzip decompression.

We mainly use downloader middlewares to process requests, typically setting a random User-Agent and a random proxy on each one. The goal is to defeat the anti-crawling measures of the site being scraped.
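The two hooks described above map onto the `process_request` and `process_response` methods of a middleware class. A minimal sketch of that shape (the class name and header value here are illustrative, not from the original):

```python
# Minimal downloader-middleware skeleton: Scrapy calls process_request
# on the way to the downloader, and process_response on the way back.
class SketchDownloaderMiddleware(object):

    def process_request(self, request, spider):
        # Mutate the outgoing request here (headers, proxy, ...).
        # Returning None lets it continue through the middleware chain.
        request.headers.setdefault('User-Agent', 'example-agent')
        return None

    def process_response(self, request, response, spider):
        # Inspect or transform the incoming response here.
        return response
```

Returning `None` from `process_request` is what allows the request to keep flowing toward the downloader; returning a `Response` or `Request` object instead would short-circuit the chain.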

UA Pool: a pool of User-Agent strings

  • Purpose: disguise the requests of a Scrapy project as coming from as many different browser identities as possible.

  • Workflow:

    1. Intercept requests in the downloader middleware

    2. Rewrite the UA in the intercepted request's headers to a disguised value

    3. Enable the downloader middleware in the settings file

    # Imports
      # Note: in Scrapy 1.x+ this middleware lives under
      # scrapy.downloadermiddlewares (the old scrapy.contrib path was removed)
      from scrapy.downloadermiddlewares.useragent import UserAgentMiddleware
      import random
      # UA pool code (a dedicated downloader-middleware class for the UA pool)
      class RandomUserAgent(UserAgentMiddleware):
    
          def process_request(self, request, spider):
              # randomly pick a UA string from the list
              ua = random.choice(user_agent_list)
              # write the chosen UA into the intercepted request's headers
              request.headers.setdefault('User-Agent', ua)
    
    
      user_agent_list = [
              "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.1 "
              "(KHTML, like Gecko) Chrome/22.0.1207.1 Safari/537.1",
              "Mozilla/5.0 (X11; CrOS i686 2268.111.0) AppleWebKit/536.11 "
              "(KHTML, like Gecko) Chrome/20.0.1132.57 Safari/536.11",
              "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.6 "
              "(KHTML, like Gecko) Chrome/20.0.1092.0 Safari/536.6",
              "Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.6 "
              "(KHTML, like Gecko) Chrome/20.0.1090.0 Safari/536.6",
              "Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.1 "
              "(KHTML, like Gecko) Chrome/19.77.34.5 Safari/537.1",
              "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/536.5 "
              "(KHTML, like Gecko) Chrome/19.0.1084.9 Safari/536.5",
              "Mozilla/5.0 (Windows NT 6.0) AppleWebKit/536.5 "
              "(KHTML, like Gecko) Chrome/19.0.1084.36 Safari/536.5",
              "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.3 "
              "(KHTML, like Gecko) Chrome/19.0.1063.0 Safari/536.3",
              "Mozilla/5.0 (Windows NT 5.1) AppleWebKit/536.3 "
              "(KHTML, like Gecko) Chrome/19.0.1063.0 Safari/536.3",
              "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_0) AppleWebKit/536.3 "
              "(KHTML, like Gecko) Chrome/19.0.1063.0 Safari/536.3",
              "Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.3 "
              "(KHTML, like Gecko) Chrome/19.0.1062.0 Safari/536.3",
              "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.3 "
              "(KHTML, like Gecko) Chrome/19.0.1062.0 Safari/536.3",
              "Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.3 "
              "(KHTML, like Gecko) Chrome/19.0.1061.1 Safari/536.3",
              "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.3 "
              "(KHTML, like Gecko) Chrome/19.0.1061.1 Safari/536.3",
              "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/536.3 "
              "(KHTML, like Gecko) Chrome/19.0.1061.1 Safari/536.3",
              "Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.3 "
              "(KHTML, like Gecko) Chrome/19.0.1061.0 Safari/536.3",
              "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/535.24 "
              "(KHTML, like Gecko) Chrome/19.0.1055.1 Safari/535.24",
              "Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/535.24 "
              "(KHTML, like Gecko) Chrome/19.0.1055.1 Safari/535.24"
      ]
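Step 3 of the workflow, enabling the middleware, happens in the project's `settings.py`. A sketch, assuming the middleware classes live in `myproject/middlewares.py` (the project name, module path, and priority numbers are hypothetical):

```python
# settings.py -- register the custom downloader middlewares.
# The integer is the priority: lower values run closer to the engine,
# so they see requests earlier and responses later.
DOWNLOADER_MIDDLEWARES = {
    'myproject.middlewares.RandomUserAgent': 542,
    'myproject.middlewares.Proxy': 543,
}
```

Setting a class's value to `None` instead of an integer disables it without removing the entry.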

Proxy Pool

  • Purpose: give the requests of a Scrapy project as many different source IPs as possible.

  • Workflow:

    1. Intercept requests in the downloader middleware

    2. Change the intercepted request's IP to one of the proxy IPs

    3. Enable the downloader middleware in the settings file

    # Swap in a different proxy IP for each intercepted request
      # a dedicated downloader-middleware class
      class Proxy(object):
          def process_request(self, request, spider):
              # check the intercepted request's URL scheme (http or https)
              # request.url returns e.g.: http://www.xxx.com
              h = request.url.split(':')[0]  # the request's scheme
              if h == 'https':
                  ip = random.choice(PROXY_https)
                  request.meta['proxy'] = 'https://' + ip
              else:
                  ip = random.choice(PROXY_http)
                  request.meta['proxy'] = 'http://' + ip
    
      # proxy IPs available for selection
      PROXY_http = [
          '153.180.102.104:80',
          '195.208.131.189:56055',
      ]
      PROXY_https = [
          '120.83.49.90:9000',
          '95.189.112.214:35508',
      ]
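Splitting the URL on `':'` works for well-formed URLs, but the standard library's `urllib.parse.urlsplit` is a more robust way to read the scheme. A sketch of the same selection logic as a standalone helper (the function name is illustrative):

```python
import random
from urllib.parse import urlsplit

# candidate proxies, keyed by scheme (same sample values as above)
PROXY_http = [
    '153.180.102.104:80',
    '195.208.131.189:56055',
]
PROXY_https = [
    '120.83.49.90:9000',
    '95.189.112.214:35508',
]

def choose_proxy(url):
    """Return a proxy URL whose scheme matches the request URL's scheme."""
    scheme = urlsplit(url).scheme  # 'http' or 'https'
    pool = PROXY_https if scheme == 'https' else PROXY_http
    return scheme + '://' + random.choice(pool)
```

Inside the middleware this would replace the `split(':')` line, assigning the result to `request.meta['proxy']`.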