
ImportError: No module named statscollectors in Scrapinghub #245

Open
Sammeeey opened this issue Mar 8, 2020 · 0 comments

Comments

@Sammeeey
Contributor

Sammeeey commented Mar 8, 2020

I configured a monitor as described here and tried to run it in Scrapinghub manually.
But when I run it, I get the following error:

[root] Job runtime exception
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/sh_scrapy/crawl.py", line 148, in _run_usercode
    _run(args, settings)
  File "/usr/local/lib/python2.7/site-packages/sh_scrapy/crawl.py", line 103, in _run
    _run_scrapy(args, settings)
  File "/usr/local/lib/python2.7/site-packages/sh_scrapy/crawl.py", line 111, in _run_scrapy
    execute(settings=settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/cmdline.py", line 150, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "/usr/local/lib/python2.7/site-packages/scrapy/cmdline.py", line 90, in _run_print_help
    func(*a, **kw)
  File "/usr/local/lib/python2.7/site-packages/scrapy/cmdline.py", line 157, in _run_command
    cmd.run(args, opts)
  File "/usr/local/lib/python2.7/site-packages/scrapy/commands/crawl.py", line 57, in run
    self.crawler_process.crawl(spname, **opts.spargs)
  File "/usr/local/lib/python2.7/site-packages/scrapy/crawler.py", line 171, in crawl
    crawler = self.create_crawler(crawler_or_spidercls)
  File "/usr/local/lib/python2.7/site-packages/scrapy/crawler.py", line 200, in create_crawler
    return self._create_crawler(crawler_or_spidercls)
  File "/usr/local/lib/python2.7/site-packages/scrapy/crawler.py", line 205, in _create_crawler
    return Crawler(spidercls, self.settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/crawler.py", line 41, in __init__
    self.stats = load_object(self.settings['STATS_CLASS'])(self)
  File "/usr/local/lib/python2.7/site-packages/scrapy/utils/misc.py", line 44, in load_object
    mod = import_module(module)
  File "/usr/local/lib/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
ImportError: No module named statscollectors

The DotScrapy Persistence add-on is enabled, and in Spiders > Settings I have:
STATS_CLASS = spidermon.contrib.stats.statscollectors.LocalStorageStatsHistoryCollector
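For context on where the traceback comes from: Scrapy resolves the dotted STATS_CLASS string at startup by importing the module half of the path and then looking up the class. Below is a minimal sketch of that resolution using only `importlib` (a simplified stand-in for `scrapy.utils.misc.load_object`, not Scrapy's actual code); it shows that if the package providing the module (here, spidermon) is not installed in the environment running the job, the import itself raises the ImportError seen above:

```python
import importlib

def load_object(path):
    """Resolve a dotted path like 'pkg.module.ClassName' to the named object.

    Simplified illustration of what scrapy.utils.misc.load_object does:
    split off the final attribute name, import the module, fetch the attribute.
    """
    module_path, _, name = path.rpartition(".")
    # This import is where "ImportError: No module named ..." is raised
    # when the package is missing from the job's environment.
    module = importlib.import_module(module_path)
    return getattr(module, name)

# The STATS_CLASS value from this issue; resolving it only works if
# spidermon is installed where the spider actually runs.
path = "spidermon.contrib.stats.statscollectors.LocalStorageStatsHistoryCollector"
try:
    load_object(path)
    print("resolved OK")
except ImportError as exc:
    print("ImportError:", exc)
```

If this fails on Scrapinghub but works locally, a plausible explanation is that spidermon is not listed in the project's requirements, so it never gets installed into the cloud job's image.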
