There are two ways to run multiple Scrapy spiders sequentially:
第一種:bat方式運行
????????新建bat文件?
cd C:\python_web\spiders\tiktokSelenium & C: & scrapy crawl spider1 & scrapy crawl spider2 & scrapy crawl spider3 & scrapy crawl spider4
Method 2: use subprocess

Run the spiders in order with subprocess. Create a start.py file with the following content (each .wait() blocks until that spider finishes before the next one starts):

import subprocess

def crawl_work():
    subprocess.Popen('scrapy crawl spider1', shell=True).wait()
    subprocess.Popen('scrapy crawl spider2', shell=True).wait()
    subprocess.Popen('scrapy crawl spider3', shell=True).wait()
    subprocess.Popen('scrapy crawl spider4', shell=True).wait()

if __name__ == '__main__':
    crawl_work()
Then run:

cd C:\python_web\spiders\tiktokSelenium & C: & python ./start.py
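One limitation of chaining Popen(...).wait() calls as above is that a spider that crashes does not stop the ones after it. A minimal sketch of a variant that checks each exit code and aborts on the first failure — the spider names are placeholders, and run_one is a hypothetical hook added here so the sequencing logic can be exercised without Scrapy installed:

```python
import subprocess

def crawl_sequentially(spider_names, run_one=None):
    """Run spiders one at a time, stopping at the first failure.

    run_one: callable taking a spider name and returning its exit code.
    By default it invokes `scrapy crawl <name>` and waits for it.
    Returns the list of spiders that finished with exit code 0.
    """
    if run_one is None:
        # subprocess.run waits for the command to finish and
        # exposes its exit status as .returncode
        run_one = lambda name: subprocess.run(
            ['scrapy', 'crawl', name]).returncode

    completed = []
    for name in spider_names:
        code = run_one(name)
        if code != 0:
            print(f'{name} exited with code {code}; skipping the rest')
            break
        completed.append(name)
    return completed

if __name__ == '__main__':
    crawl_sequentially(['spider1', 'spider2', 'spider3', 'spider4'])
```

Using a list argument to subprocess.run instead of shell=True also avoids spawning an extra shell for each spider.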