The moment I found the problem, I cursed my own pig brain N times over!!!

The problem: the crawler script had been running perfectly fine, then something else came up and I set it aside for a while. When I came back to run it for real, it kept failing with [twisted] CRITICAL: Unhandled error in Deferred. The error looked like the following (it's exactly the same error, so I've pasted a copy straight from the web):

2016-03-13 08:50:50 [scrapy] INFO: Enabled extensions: CloseSpider, TelnetConsole, LogStats, CoreStats, SpiderState
Unhandled error in Deferred:
2016-03-13 08:50:50 [twisted] CRITICAL: Unhandled error in Deferred:


Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 150, in _run_command
    cmd.run(args, opts)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/commands/crawl.py", line 57, in run
    self.crawler_process.crawl(spname, **opts.spargs)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 153, in crawl
    d = crawler.crawl(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1274, in unwindGenerator
    return _inlineCallbacks(None, gen, Deferred())
--- <exception caught here> ---
  File "/usr/local/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1128, in _inlineCallbacks
    result = g.send(result)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 70, in crawl
    self.spider = self._create_spider(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 80, in _create_spider
    return self.spidercls.from_crawler(self, *args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/spiders/crawl.py", line 91, in from_crawler
    spider = super(CrawlSpider, cls).from_crawler(crawler, *args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/spiders/__init__.py", line 50, in from_crawler
    spider = cls(*args, **kwargs)
exceptions.TypeError: __init__() takes at least 3 arguments (1 given)
2016-03-13 08:50:50 [twisted] CRITICAL:
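As background, the TypeError at the bottom of a traceback like this one usually means the spider's __init__ demands positional arguments that were never supplied when the crawl was launched: from_crawler ends up calling cls(*args, **kwargs) with only whatever was passed via -a. A minimal sketch of that pattern (the spider name and its arguments are made up purely for illustration):

import scrapy

class DemoSpider(scrapy.Spider):
    # hypothetical spider, only to show the error pattern
    name = 'demo'

    def __init__(self, category, page, *args, **kwargs):
        # two required arguments besides self, so a plain "scrapy crawl demo"
        # (no -a options) raises:
        #   TypeError: __init__() takes at least 3 arguments (1 given)
        super(DemoSpider, self).__init__(*args, **kwargs)
        self.start_urls = ['http://example.com/%s/%s' % (category, page)]

    def parse(self, response):
        pass

Passing the arguments on the command line avoids it: scrapy crawl demo -a category=books -a page=1. That turned out not to be what bit me here, but it is the most common cause of this particular message.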

Troubleshooting steps:

1. Some posts online suggested the Twisted version might be too high. My Scrapy is 1.3.3 and my Twisted is 13.1.0, which is already the minimum Twisted version that Scrapy 1.3.3 requires, yet the error persisted. (The command for double-checking the installed versions is shown after step 3.)

2. I also wondered whether the pip package source was the problem, so I switched to a different index; still no luck. (That said, when I installed the cryptography library a while back, the official PyPI mirror really was the culprit, and switching to the unofficial download site fixed it. The pip commands for switching indexes are also shown after step 3.)

Official package index: https://pypi.org/

Unofficial package downloads (prebuilt Windows binaries): https://www.lfd.uci.edu/~gohlke/pythonlibs/#numpy

3. Ran the pywin32 post-install script from the Scripts folder under the Python installation directory:

python C:\Python36\Scripts\pywin32_postinstall.py -install

Still no luck!!!
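For reference, the concrete commands behind steps 1 and 2 above. The index URL is simply the official PyPI simple index and the wheel path is a placeholder; substitute whichever mirror or downloaded file you actually want to try:

# show the versions Scrapy actually sees (step 1)
scrapy version -v

# reinstall a package from an explicitly chosen index (step 2)
pip install --upgrade Twisted -i https://pypi.org/simple

# or install a wheel downloaded by hand from the unofficial site above (placeholder path)
pip install path\to\Twisted-13.1.0-cp27-none-win_amd64.whl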

The fix:

I was just about ready to lose it when I came across someone on Stack Overflow (https://stackoverflow.com/questions/35970518/scrapy-twisted-critical-unhandled-error-in-deferred) working through this same error. That gave me a flash of inspiration: I went back and compared my own code and configuration files, and discovered that during my last file clean-up I had clumsily deleted the scrapy.cfg configuration file!!!
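For anyone hitting the same wall: scrapy.cfg sits in the project root and is how Scrapy locates your project's settings module, so with it gone the crawl command no longer has a project to work with. The file that scrapy startproject generates is tiny, roughly the following (the project name here is a placeholder):

# scrapy.cfg -- lives in the project root, next to the project package
[settings]
default = myproject.settings

[deploy]
project = myproject

Restoring it (or re-running scrapy startproject in a scratch directory and copying the file back with the right project name) puts everything back in order.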

So when a bizarre error shows up, it really does pay to go through the configuration carefully.
