Abuyun (阿布雲) proxy IP

Abuyun is fairly friendly to scraping beginners. After purchasing a plan you can generate a tunnel licence and tunnel key, choose between an HTTP tunnel and a SOCKS tunnel, and pick the Professional, Classic, or Dynamic edition. The integration documentation lists the proxy-pool connection details you need for whichever option you select.

I went with two approaches: a standalone Python script and the Scrapy framework:

Standalone script (requests):

import requests

# Target page to access
targetUrl = "http://test.abuyun.com"

# Proxy server
proxyHost = "http-dyn.abuyun.com"
proxyPort = "9020"

# Proxy tunnel authentication: the tunnel licence and key
proxyUser = "************"
proxyPass = "************"

proxyMeta = "http://%(user)s:%(pass)s@%(host)s:%(port)s" % {
    "host": proxyHost,
    "port": proxyPort,
    "user": proxyUser,
    "pass": proxyPass,
}

proxies = {
    "http": proxyMeta,
    "https": proxyMeta,
}

resp = requests.get(targetUrl, proxies=proxies)

print(resp.status_code)
print(resp.text)
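If you generated a SOCKS tunnel instead of the HTTP tunnel, requests can route traffic through it in much the same way. The sketch below is not Abuyun's official sample: it assumes requests[socks] (PySocks) is installed, and the SOCKS host and port are placeholders, so substitute the endpoint shown in your own Abuyun dashboard.

# Minimal SOCKS-tunnel sketch; the host/port below are placeholders, not real Abuyun endpoints.
# Requires: pip install requests[socks]
import requests

targetUrl = "http://test.abuyun.com"

proxyHost = "your-socks-endpoint-from-the-dashboard"   # placeholder
proxyPort = "1080"                                     # placeholder
proxyUser = "************"
proxyPass = "************"

# socks5h:// also resolves DNS through the proxy
proxyMeta = "socks5h://%(user)s:%(pass)s@%(host)s:%(port)s" % {
    "host": proxyHost,
    "port": proxyPort,
    "user": proxyUser,
    "pass": proxyPass,
}

proxies = {"http": proxyMeta, "https": proxyMeta}

resp = requests.get(targetUrl, proxies=proxies)
print(resp.status_code)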
Scrapy framework:

import base64

# Proxy server
proxyServer = "http://http-dyn.abuyun.com:9020"

# Proxy tunnel authentication
proxyUser = "H01234567890123D"
proxyPass = "0123456789012345"

# for Python 2
proxyAuth = "Basic " + base64.b64encode(proxyUser + ":" + proxyPass)

# for Python 3
#proxyAuth = "Basic " + base64.urlsafe_b64encode(bytes((proxyUser + ":" + proxyPass), "ascii")).decode("utf8")

class ProxyMiddleware(object):
    def process_request(self, request, spider):
        request.meta["proxy"] = proxyServer
        request.headers["Proxy-Authorization"] = proxyAuth

-- The requests and Scrapy snippets above are adapted from the Abuyun proxy documentation and are not for any commercial use. I find it a fairly convenient and easy-to-use proxy service; I hope you find it useful too, thanks (qq:858703032).
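On its own the middleware above is never invoked; Scrapy only loads it once it is registered in settings.py. A minimal sketch, assuming the class is saved in myproject/middlewares.py (the module path and the priority value 543 are illustrative, not from the Abuyun docs):

# settings.py -- enable the proxy middleware so its process_request runs for every request
DOWNLOADER_MIDDLEWARES = {
    "myproject.middlewares.ProxyMiddleware": 543,   # assumed module path; adjust to your project
}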