The scrapyd configuration file: if no configuration file is present, scrapyd falls back on its built-in defaults, for example running at most 4 scrapy processes per CPU. Environment: CentOS 6.5 64-bit, scrapy 1.3.3, scrapyd 1.1.1
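For reference, the relevant built-in defaults come from the default_scrapyd.conf bundled with scrapyd. The values below are what I recall for scrapyd 1.1.x, so treat them as an assumption rather than a guarantee:

[scrapyd]
max_proc         = 0        # 0 means: derive the limit from the CPU count
max_proc_per_cpu = 4        # at most 4 scrapy processes per CPU
bind_address     = 0.0.0.0
http_port        = 6800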
If a scrapyd configuration file is set up, scrapyd searches the following paths:
• /etc/scrapyd/scrapyd.conf (Unix)
• c:\scrapyd\scrapyd.conf (Windows)
• /etc/scrapyd/conf.d/* (in alphabetical order, Unix)
• scrapyd.conf
• ~/.scrapyd.conf (users home directory)
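To check which of these files actually contribute settings on a given machine, a quick stdlib sketch (my own illustration, not part of scrapyd) can replay the same search order on Unix:

import glob
import os
from configparser import ConfigParser

# Candidate paths in scrapyd's documented search order (Unix);
# with ConfigParser.read(), later files override earlier ones.
paths = (
    ["/etc/scrapyd/scrapyd.conf"]
    + sorted(glob.glob("/etc/scrapyd/conf.d/*"))
    + ["scrapyd.conf", os.path.expanduser("~/.scrapyd.conf")]
)

parser = ConfigParser()
found = parser.read(paths)  # silently skips files that do not exist
print("files read:", found)
if parser.has_section("scrapyd"):
    print("http_port =", parser.get("scrapyd", "http_port"))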
My configuration file is placed at /etc/scrapyd/scrapyd.conf
[scrapyd]
eggs_dir          = /usr/scrapyd/eggs
logs_dir          = /usr/scrapyd/logs
jobs_to_keep      = 100
dbs_dir           = /usr/scrapyd/dbs
max_proc          = 0
max_proc_per_cpu  = 800
finished_to_keep  = 100
poll_interval     = 5.0
bind_address      = 192.168.17.30
http_port         = 6800
debug             = off
runner            = scrapyd.runner
application       = scrapyd.app.application
launcher          = scrapyd.launcher.Launcher
webroot           = scrapyd.website.Root

[services]
schedule.json     = scrapyd.webservice.Schedule
cancel.json       = scrapyd.webservice.Cancel
addversion.json   = scrapyd.webservice.AddVersion
listprojects.json = scrapyd.webservice.ListProjects
listversions.json = scrapyd.webservice.ListVersions
listspiders.json  = scrapyd.webservice.ListSpiders
delproject.json   = scrapyd.webservice.DeleteProject
delversion.json   = scrapyd.webservice.DeleteVersion
listjobs.json     = scrapyd.webservice.ListJobs
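With bind_address and http_port set as above, the endpoints under [services] are plain HTTP JSON APIs. A minimal sketch of scheduling a spider and then listing its jobs; the project name myproject and spider name myspider are placeholders, and the requests library is assumed to be installed:

import requests

BASE = "http://192.168.17.30:6800"  # bind_address:http_port from the config above

# schedule.json starts a crawl; scrapyd returns a job id on success
resp = requests.post(
    BASE + "/schedule.json",
    data={"project": "myproject", "spider": "myspider"},  # hypothetical names
)
print(resp.json())  # e.g. {"status": "ok", "jobid": "..."}

# listjobs.json reports pending/running/finished jobs for the project
jobs = requests.get(BASE + "/listjobs.json", params={"project": "myproject"})
print(jobs.json())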
Note that if the web interface is left open without activity for a long time, the backend logs a "Timing out..." message; this appears to be Twisted's web server closing an idle keep-alive connection, and it is harmless.