Time to scrape another website today. Decision paralysis left me unsure which one to pick, so after browsing around I settled on CSDN Academy again. The site is https://edu.csdn.net/courses. Looking it over, the number of courses isn't huge, roughly 6000+, so the data volume is small: a single thread could crawl it all quickly. But to finish the crawl in seconds, I went with asynchronous requests anyway.
First, let's take a careful look at the page-number pattern:
https://edu.csdn.net/courses/p2
https://edu.csdn.net/courses/p3
https://edu.csdn.net/courses/p4
...
https://edu.csdn.net/courses/p271
The page numbers are perfectly regular, so the crawler is quick to write. Out of basic courtesy, I capped the number of coroutines at 3: firing off 271 requests at once starts to look like an attack, which isn't good and isn't in our spirit.
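The throttling relies on `asyncio.Semaphore`. A minimal standalone sketch (not the crawler itself) showing that no more than 3 tasks ever run inside the semaphore at once:

```python
import asyncio

async def worker(sema, active, peak):
    # Each worker must acquire the semaphore before "working".
    async with sema:
        active[0] += 1
        peak[0] = max(peak[0], active[0])
        await asyncio.sleep(0.01)  # stand-in for a network request
        active[0] -= 1

async def main():
    sema = asyncio.Semaphore(3)  # same cap the crawler uses
    active, peak = [0], [0]
    await asyncio.gather(*(worker(sema, active, peak) for _ in range(10)))
    return peak[0]

print(asyncio.run(main()))  # → 3
```

Ten tasks are launched, but the peak concurrency never exceeds the semaphore's limit of 3.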
```python
import asyncio

import aiohttp
from lxml import etree

sema = asyncio.Semaphore(3)


async def get_html(url):
    headers = {
        "user-agent": "put your own UA here"
    }
    '''
    This article comes from the blog of 夢想橡皮擦,
    at https://blog.csdn.net/hihell.
    Feel free to repost, but please keep the attribution.
    '''
    print("Fetching {}".format(url))
    async with aiohttp.ClientSession() as s:
        try:
            async with s.get(url, headers=headers, timeout=3) as res:
                if res.status == 200:
                    html = await res.text()
                    html = etree.HTML(html)
                    get_content(html)  # parse the page
                    print("Data from {} inserted".format(url))
        except Exception as e:
            print(e)
            await asyncio.sleep(1)  # time.sleep() would block the event loop
            print("Taking a short break")
            await get_html(url)  # retry


async def x_get_html(url):
    async with sema:  # at most 3 requests in flight
        await get_html(url)


if __name__ == '__main__':
    url_format = "https://edu.csdn.net/courses/p{}"
    urls = [url_format.format(index) for index in range(1, 272)]
    loop = asyncio.get_event_loop()
    tasks = [x_get_html(url) for url in urls]
    loop.run_until_complete(asyncio.wait(tasks))
```
Once a page is downloaded, it needs a second processing pass before it can go into MongoDB; the lxml library is all we need for that.
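Before wiring the parser into the crawler, it helps to see lxml's XPath calls on a toy fragment. The markup below is hypothetical, made up to mimic the class names the course list apparently uses:

```python
from lxml import etree

# Hypothetical HTML mimicking the course-list markup; class names
# match the XPath expressions used in the real parser below.
sample = """
<div class="course_item">
  <a href="/course/1">
    <div class="titleInfor">
      <span class="tags">Python</span>
      <span class="title">Crawler basics</span>
    </div>
  </a>
</div>
"""

html = etree.HTML(sample)
item = html.xpath("//div[@class='course_item']")[0]
print(item.xpath(".//span[@class='title']/text()")[0])  # Crawler basics
print(item.xpath("./a/@href")[0])                       # /course/1
```

The same pattern scales to the real page: grab each `course_item` node, then run relative XPath queries against it.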
```python
def get_content(html):
    # `collection` is assumed to be a pymongo collection defined elsewhere
    course_item = html.xpath("//div[@class='course_item']")
    data = []
    for item in course_item:
        link = item.xpath("./a/@href")[0]  # course detail link, for later crawling
        tags = item.xpath(".//div[@class='titleInfor']/span[@class='tags']/text()")  # tags
        title = item.xpath(".//div[@class='titleInfor']/span[@class='title']/text()")[0]  # title
        num = item.xpath(".//p[@class='subinfo']/span/text()")[0]  # number of learners
        subinfo = item.xpath(".//p[@class='subinfo']/text()")[1].strip()  # author
        price = item.xpath(".//p[contains(@class,'priceinfo')]/i/text()")[0].strip()  # price
        data.append({
            "title": title,
            "link": link,
            "tags": tags,
            "num": num,
            "subinfo": subinfo,
            "price": price
        })
    collection.insert_many(data)
```
Save the data into MongoDB, and we're done.
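One loose end: `get_content()` uses a global `collection` that is never shown being created. A plausible setup with pymongo, where the connection string, database name, and collection name are all assumptions, not from the article:

```python
from pymongo import MongoClient

# Connection string, database and collection names are assumptions;
# the article never shows this setup.
client = MongoClient("mongodb://localhost:27017")
collection = client["csdn"]["courses"]

# Optional: a unique index on the course link prevents duplicate rows
# if the crawler is re-run.
collection.create_index("link", unique=True)
```

With a unique index in place, `insert_many` would raise on duplicates; passing `ordered=False` lets the remaining documents still be inserted.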