WebMagic is a simple and flexible crawler framework. With WebMagic you can quickly build an efficient, easy-to-maintain crawler.
The official documentation is quite clear, so I recommend reading it directly; you can also read the content below. The addresses are:
Official site: http://webmagic.io
Chinese documentation: http://webmagic.io/docs/zh/
English documentation: http://webmagic.io/docs/en
Integrating Spring Boot with WebMagic mainly involves three modules: the crawling module Processor, the persistence module Pipeline, which writes the crawled data to the database, and the scheduled-task module Scheduled, which is responsible for crawling the site on a timed schedule. The WebMagic dependencies are:
<dependency>
    <groupId>us.codecraft</groupId>
    <artifactId>webmagic-core</artifactId>
    <version>0.5.3</version>
</dependency>
<dependency>
    <groupId>us.codecraft</groupId>
    <artifactId>webmagic-extension</artifactId>
    <version>0.5.3</version>
</dependency>
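The persistence module further down saves through a Spring Data JPA repository (NewsRepository), which the two WebMagic artifacts above do not bring in. If your project does not already declare it, you would also need something along these lines — a sketch that assumes a MySQL database and a Spring Boot parent POM managing the versions:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
<dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
</dependency>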
Processor
The JianShuProcessor crawls the Jianshu home page: it parses the page data, extracts the corresponding article links and titles, and puts them into WebMagic's Page, from which the persistence module later retrieves them and saves them to the database. The code is as follows:
package com.shang.spray.common.processor;

import com.shang.spray.entity.News;
import com.shang.spray.entity.Sources;
import com.shang.spray.pipeline.NewsPipeline;
import us.codecraft.webmagic.Page;
import us.codecraft.webmagic.Site;
import us.codecraft.webmagic.Spider;
import us.codecraft.webmagic.processor.PageProcessor;
import us.codecraft.webmagic.selector.Selectable;

import java.util.List;

/**
 * info: Jianshu home page crawler
 * Created by shang on 16/9/9.
 */
public class JianShuProcessor implements PageProcessor {

    private Site site = Site.me()
            .setDomain("jianshu.com")
            .setSleepTime(100)
            .setUserAgent("Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.116 Safari/537.36");

    public static final String list = "http://www.jianshu.com";

    @Override
    public void process(Page page) {
        if (page.getUrl().regex(list).match()) {
            List<Selectable> list = page.getHtml().xpath("//ul[@class='article-list thumbnails']/li").nodes();
            for (Selectable s : list) {
                String title = s.xpath("//div/h4/a/text()").toString();
                String link = s.xpath("//div/h4").links().toString();
                News news = new News();
                news.setTitle(title);
                news.setInfo(title);
                news.setLink(link);
                news.setSources(new Sources(5));
                page.putField("news" + title, news);
            }
        }
    }

    @Override
    public Site getSite() {
        return site;
    }

    public static void main(String[] args) {
        Spider spider = Spider.create(new JianShuProcessor());
        spider.addUrl("http://www.jianshu.com");
        spider.addPipeline(new NewsPipeline());
        spider.thread(5);
        spider.setExitWhenComplete(true);
        spider.start();
    }
}
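The News and Sources entities referenced above are not included in the post. For orientation only, here is a minimal, hypothetical sketch of what the News entity could look like, with its fields inferred from the setters used in the processor and pipeline; the real entity in the project may differ, and the getters/setters are omitted for brevity:

package com.shang.spray.entity;

import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.ManyToOne;
import java.util.Date;

// Hypothetical reconstruction: only the fields that the processor and pipeline actually set.
@Entity
public class News {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    private String title;
    private String info;
    private String link;
    private String author;
    private Integer typeId;
    private Integer sort;
    private Integer status;
    private Boolean explicitLink;
    private Date createDate;
    private Date modifyDate;

    @ManyToOne
    private Sources sources;

    // getters and setters omitted for brevity
}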
Pipeline
The persistence module combines with a Spring Boot (Spring Data) Repository to form the save logic: implement WebMagic's Pipeline interface, retrieve the data produced by the crawling module inside the process method, and then call the repository's save method. The code is as follows:
package com.shang.spray.pipeline;

import com.shang.spray.entity.News;
import com.shang.spray.entity.Sources;
import com.shang.spray.repository.NewsRepository;
import org.apache.commons.lang3.StringUtils;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.jpa.domain.Specification;
import org.springframework.stereotype.Repository;
import us.codecraft.webmagic.ResultItems;
import us.codecraft.webmagic.Task;
import us.codecraft.webmagic.pipeline.Pipeline;

import javax.persistence.criteria.CriteriaBuilder;
import javax.persistence.criteria.CriteriaQuery;
import javax.persistence.criteria.Predicate;
import javax.persistence.criteria.Root;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;
import java.util.Map;

/**
 * info: news pipeline
 * Created by shang on 16/8/22.
 */
@Repository
public class NewsPipeline implements Pipeline {

    @Autowired
    protected NewsRepository newsRepository;

    @Override
    public void process(ResultItems resultItems, Task task) {
        for (Map.Entry<String, Object> entry : resultItems.getAll().entrySet()) {
            if (entry.getKey().contains("news")) {
                News news = (News) entry.getValue();
                Specification<News> specification = new Specification<News>() {
                    @Override
                    public Predicate toPredicate(Root<News> root, CriteriaQuery<?> criteriaQuery, CriteriaBuilder criteriaBuilder) {
                        return criteriaBuilder.and(criteriaBuilder.equal(root.get("link"), news.getLink()));
                    }
                };
                if (newsRepository.findOne(specification) == null) { // check whether the link already exists
                    news.setAuthor("水花");
                    news.setTypeId(1);
                    news.setSort(1);
                    news.setStatus(1);
                    news.setExplicitLink(true);
                    news.setCreateDate(new Date());
                    news.setModifyDate(new Date());
                    newsRepository.save(news);
                }
            }
        }
    }
}
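The NewsRepository interface is also not shown in the post. Since the pipeline calls findOne with a Specification, it presumably extends both JpaRepository and JpaSpecificationExecutor; a minimal sketch under that assumption:

package com.shang.spray.repository;

import com.shang.spray.entity.News;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.JpaSpecificationExecutor;

// Hypothetical sketch: JpaSpecificationExecutor provides findOne(Specification) used in NewsPipeline.
public interface NewsRepository extends JpaRepository<News, Long>, JpaSpecificationExecutor<News> {
}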
Scheduled
Use Spring Boot's built-in scheduling annotation @Scheduled(cron = "0 0 0/2 * * ? "): starting from 00:00 each day, the crawl task runs once every two hours, and inside the scheduled method it invokes WebMagic's crawling module Processor. The code is as follows:
package com.shang.spray.common.scheduled;

import com.shang.spray.common.processor.DevelopersProcessor;
import com.shang.spray.common.processor.JianShuProcessor;
import com.shang.spray.common.processor.ZhiHuProcessor;
import com.shang.spray.entity.Config;
import com.shang.spray.pipeline.NewsPipeline;
import com.shang.spray.service.ConfigService;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;
import us.codecraft.webmagic.Spider;

/**
 * info: news scheduled tasks
 * Created by shang on 16/8/22.
 */
@Component
public class NewsScheduled {

    @Autowired
    private NewsPipeline newsPipeline;

    /**
     * Jianshu
     */
    @Scheduled(cron = "0 0 0/2 * * ? ") // starting at 00:00, run every 2 hours
    public void jianShuScheduled() {
        System.out.println("---- starting the Jianshu scheduled crawl");
        Spider spider = Spider.create(new JianShuProcessor());
        spider.addUrl("http://www.jianshu.com");
        spider.addPipeline(newsPipeline);
        spider.thread(5);
        spider.setExitWhenComplete(true);
        spider.run(); // run synchronously so the crawl finishes before the scheduled method returns
    }
}
Enable scheduled tasks in the Spring Boot Application class with the @EnableScheduling annotation. The code is as follows:
package com.shang.spray;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.builder.SpringApplicationBuilder;
import org.springframework.boot.context.web.SpringBootServletInitializer;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.annotation.EnableScheduling;

/**
 * info:
 * Created by shang on 16/7/8.
 */
@Configuration
@EnableAutoConfiguration
@ComponentScan
@SpringBootApplication
@EnableScheduling
public class SprayApplication extends SpringBootServletInitializer {

    @Override
    protected SpringApplicationBuilder configure(SpringApplicationBuilder application) {
        return application.sources(SprayApplication.class);
    }

    public static void main(String[] args) throws Exception {
        SpringApplication.run(SprayApplication.class, args);
    }
}
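Since the pipeline persists through Spring Data JPA, the application also needs a datasource configured. A minimal application.properties sketch, assuming a local MySQL database; the URL, credentials, and ddl-auto setting are placeholders rather than values from the original project:

spring.datasource.url=jdbc:mysql://localhost:3306/spray?useUnicode=true&characterEncoding=utf8
spring.datasource.username=root
spring.datasource.password=yourpassword
spring.jpa.hibernate.ddl-auto=update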
WebMagic is the crawler framework I used to crawl site data for my 水花一現 project. After comparing several other crawler frameworks, I chose this one: it is easy to learn yet powerful. I have only used its basic features here; there are many more powerful features I haven't touched. If you're interested, go read the official documentation!
If you need the code, you can pull it from my GitHub; this code has been used in the 水花一現 project. You're welcome to follow the 水花一現 project.
水花一現 APP download: https://www.pgyer.com/0qj6
WeChat official account: 水花一現 (shuihuayixian)
Github:https://github.com/shangjing105
QQ:787019494