python – Scrapy newbie: tutorial error when running "scrapy crawl dmoz"




I have set up my PATH variable and I think I have everything configured correctly, but when I run "scrapy crawl dmoz" inside the folder created by startproject, I get the following error message:

c:\matt\testing\dmoz>scrapy crawl dmoz
2012-04-24 18:12:56-0400 [scrapy] INFO: Scrapy 0.14.0.2841 started (bot: dmoz)
2012-04-24 18:12:56-0400 [scrapy] DEBUG: Enabled extensions: LogStats, TelnetConsole, CloseSpider, WebService, CoreStats, SpiderState
2012-04-24 18:12:56-0400 [scrapy] DEBUG: Enabled downloader middlewares: HttpAuthMiddleware, DownloadTimeoutMiddleware, UserAgentMiddleware, RetryMiddleware, DefaultHeadersMiddleware, RedirectMiddleware, CookiesMiddleware, HttpCompressionMiddleware, ChunkedTransferMiddleware, DownloaderStats
2012-04-24 18:12:56-0400 [scrapy] DEBUG: Enabled spider middlewares: HttpErrorMiddleware, OffsiteMiddleware, RefererMiddleware, UrlLengthMiddleware, DepthMiddleware
2012-04-24 18:12:56-0400 [scrapy] DEBUG: Enabled item pipelines:
Traceback (most recent call last):
  File "c:\Python27\Scripts\scrapy", line 4, in <module>
    execute()
  File "c:\Python27\lib\site-packages\scrapy-0.14.0.2841-py2.7-win32.egg\scrapy\cmdline.py", line 132, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "c:\Python27\lib\site-packages\scrapy-0.14.0.2841-py2.7-win32.egg\scrapy\cmdline.py", line 97, in _run_print_help
    func(*a, **kw)
  File "c:\Python27\lib\site-packages\scrapy-0.14.0.2841-py2.7-win32.egg\scrapy\cmdline.py", line 139, in _run_command
    cmd.run(args, opts)
  File "c:\Python27\lib\site-packages\scrapy-0.14.0.2841-py2.7-win32.egg\scrapy\commands\crawl.py", line 43, in run
    spider = self.crawler.spiders.create(spname, **opts.spargs)
  File "c:\Python27\lib\site-packages\scrapy-0.14.0.2841-py2.7-win32.egg\scrapy\spidermanager.py", line 43, in create
    raise KeyError("Spider not found: %s" % spider_name)
KeyError: 'Spider not found: dmoz'

Does anyone know what might be going on?

Solution:

I had this problem too.

This is because the Scrapy tutorial tells you to put the spider you create in /dmoz/spiders/, but Scrapy is actually looking in tutorial/tutorial/spiders (the expected project layout is sketched below).
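For orientation, here is roughly what a project created with "scrapy startproject tutorial" looks like in Scrapy 0.14; the exact file list can differ between versions, and dmoz_spider.py is the file you write yourself:

tutorial/
    scrapy.cfg
    tutorial/
        __init__.py
        items.py
        pipelines.py
        settings.py
        spiders/
            __init__.py
            dmoz_spider.py    <-- your spider belongs here, not in a top-level dmoz/spiders/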

Save dmoz_spider.py in tutorial/tutorial/spiders and the crawl should work; a sketch of that file follows.
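For reference, here is a minimal sketch of the tutorial's dmoz_spider.py, using the Scrapy 0.14-era BaseSpider API that matches the version in the traceback (newer Scrapy releases use scrapy.Spider instead). The important detail is the name attribute: "scrapy crawl dmoz" looks the spider up by name = "dmoz", not by the filename.

from scrapy.spider import BaseSpider

class DmozSpider(BaseSpider):
    # "scrapy crawl dmoz" resolves the spider by this name, not by the filename
    name = "dmoz"
    allowed_domains = ["dmoz.org"]
    start_urls = [
        "http://www.dmoz.org/Computers/Programming/Languages/Python/Books/",
        "http://www.dmoz.org/Computers/Programming/Languages/Python/Resources/",
    ]

    def parse(self, response):
        # the tutorial version simply writes each page body to a local file
        filename = response.url.split("/")[-2]
        open(filename, "wb").write(response.body)

With that file in place, run "scrapy crawl dmoz" again from the project root (the directory that contains scrapy.cfg).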

Tags: python, scrapy
Source: https://codeday.me/bug/20190715/1469957.html
