ImportError: No module named pipelines - Scrapy/PyInstaller

I'm trying to build an exe from my Scrapy project with PyInstaller. To avoid errors I had to add dozens of hidden imports, but now I get ImportError: No module named pipelines and I don't know how to fix it. My project structure:

bot/ 
    engine_bot/ 
     engine_bot/ 
      spiders/ 
       __init__.py 
       main_spider.py 
      __init__.py 
      items.py 
      middlewares.py 
      pipelines.py 
      settings.py 
      utils.py 
     __init__.py 
     helper.py 
    main.py 
    __init__.py 

The traceback:

2017-11-03 14:01:47 [twisted] CRITICAL: Unhandled error in Deferred: 
2017-11-03 14:01:47 [twisted] CRITICAL: 
Traceback (most recent call last): 
    File "site-packages\twisted\internet\defer.py", line 1386, in _inlineCallbacks 
    File "site-packages\scrapy\crawler.py", line 95, in crawl 
    File "site-packages\scrapy\crawler.py", line 77, in crawl 
    File "site-packages\scrapy\crawler.py", line 102, in _create_engine 
    File "site-packages\scrapy\core\engine.py", line 70, in __init__ 
    File "site-packages\scrapy\core\scraper.py", line 71, in __init__ 
    File "site-packages\scrapy\middleware.py", line 58, in from_crawler 
    File "site-packages\scrapy\middleware.py", line 34, in from_settings 
    File "site-packages\scrapy\utils\misc.py", line 44, in load_object 
    File "importlib\__init__.py", line 37, in import_module 
ImportError: No module named pipelines 

The PyInstaller command (I tried adding scrapy.pipelines among the many hidden imports):

pyinstaller main.py --hidden-import scrapy.spiderloader --hidden-import scrapy.statscollectors --hidden-import scrapy.logformatter --hidden-import scrapy.extensions.closespider
--hidden-import scrapy.extensions.feedexport --hidden-import scrapy.extensions.memdebug --hidden-import scrapy.statscollectors --hidden-import scrapy.logformatter --hidden-import scrapy.extensions.closespider --hidden-import scrapy.extensions.feedexport --hidden-import scrapy.extensions.logstats --hidden-import scrapy.spiderloader --hidden-import scrapy.statscollectors --hidden-import scrapy.logformatter --hidden-import scrapy.extensions.closespider --hidden-import scrapy.extensions.feedexport --hidden-import scrapy.spiderloader --hidden-import scrapy.statscollectors --hidden-import scrapy.logformatter --hidden-import scrapy.extensions.closespider --hidden-import scrapy.extensions.feedexport --hidden-import scrapy.extensions.telnet --hidden-import scrapy.spiderloader --hidden-import scrapy.statscollectors --hidden-import scrapy.logformatter --hidden-import scrapy.extensions.closespider --hidden-import scrapy.extensions.feedexport --hidden-import scrapy.extensions.memusage --hidden-import scrapy.spiderloader --hidden-import scrapy.statscollectors --hidden-import scrapy.logformatter --hidden-import scrapy.extensions.closespider --hidden-import scrapy.extensions.feedexport --hidden-import scrapy.extensions.logstats --hidden-import scrapy.spiderloader --hidden-import scrapy.statscollectors --hidden-import scrapy.logformatter --hidden-import scrapy.extensions.closespider --hidden-import scrapy.extensions.feedexport --hidden-import scrapy.spiderloader --hidden-import scrapy.statscollectors --hidden-import scrapy.logformatter --hidden-import scrapy.extensions.closespider --hidden-import scrapy.extensions.corestats --hidden-import scrapy.extensions.spiderstate --hidden-import scrapy.extensions.throttle --hidden-import scrapy.core.scheduler --hidden-import scrapy.core.downloader --hidden-import scrapy.downloadermiddlewares.robotstxt --hidden-import scrapy.downloadermiddlewares.httpauth --hidden-import scrapy.downloadermiddlewares.downloadtimeout --hidden-import scrapy.downloadermiddlewares.useragent --hidden-import scrapy.downloadermiddlewares.defaultheaders --hidden-import scrapy.downloadermiddlewares.ajaxcrawl --hidden-import scrapy.downloadermiddlewares.retry --hidden-import scrapy.downloadermiddlewares.redirect --hidden-import scrapy.downloadermiddlewares.cookies --hidden-import scrapy.downloadermiddlewares.httpcompression --hidden-import scrapy.downloadermiddlewares.httpproxy --hidden-import scrapy.downloadermiddlewares.httpcache --hidden-import scrapy.downloadermiddlewares.stats --hidden-import scrapy.downloadermiddlewares.chunked --hidden-import scrapy.downloadermiddlewares.decompression --hidden-import scrapy.downloadermiddlewares.httperror --hidden-import scrapy.downloadermiddlewares.stats --hidden-import scrapy.downloadermiddlewares.stats --hidden-import scrapy.spidermiddlewares.depth --hidden-import scrapy.spidermiddlewares.httperror --hidden-import scrapy.spidermiddlewares.offsite --hidden-import scrapy.spidermiddlewares.referer --hidden-import scrapy.spidermiddlewares.urllength --hidden-import scrapy.pipelines --hidden-import engine_bot.pipelines
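For what it's worth, the long chain of --hidden-import flags can usually be collapsed into a .spec file. Below is a minimal sketch, assuming a spec generated with pyi-makespec main.py; only the hiddenimports handling is shown, and the rest of the generated Analysis/PYZ/EXE sections should be kept as they are.

# main.spec (sketch): collect every scrapy submodule instead of listing
# each one as --hidden-import, plus the project package that Scrapy
# imports by dotted path at runtime.
from PyInstaller.utils.hooks import collect_submodules

hidden = collect_submodules('scrapy') + collect_submodules('engine_bot')

a = Analysis(
    ['main.py'],
    hiddenimports=hidden,
    # ... keep the remaining options from the generated spec here
)

The exe is then built with pyinstaller main.spec instead of passing all the flags on the command line.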

Main.py:
from scrapy.crawler import CrawlerProcess
from engine_bot.engine_bot.spiders.main_spider import MainSpider


if __name__ == '__main__':
    process = CrawlerProcess({
        'BOT_NAME': 'engine_bot',
        'SPIDER_MODULES': ['engine_bot.engine_bot.spiders'],
        'NEWSPIDER_MODULE': 'engine_bot.engine_bot.spiders',
        'ROBOTSTXT_OBEY': False,
        'DOWNLOAD_DELAY': 0.20,
        'LOG_FILE': 'scrapy.log',
        'LOG_LEVEL': 'DEBUG',
        'ITEM_PIPELINES': {
            'engine_bot.engine_bot.pipelines.XmlExportPipeline': 300,
        },
    })
    process.crawl(MainSpider)
    process.start()

I don't know what to do.

Answers

According to the given project structure, I think the problem is this part of Main.py:

'ITEM_PIPELINES':{ 
    'engine_bot.engine_bot.pipelines.XmlExportPipeline': 300, 
} 

It should instead read:

'ITEM_PIPELINES':{ 
    'engine_bot.pipelines.XmlExportPipeline': 300, 
} 

That is, with only one engine_bot in the path.

Change the project structure to:

bot/
    some_new_name/ 
     engine_bot/ 
      spiders/ 
       __init__.py 
       main_spider.py 
      __init__.py 
      items.py 
      middlewares.py 
      pipelines.py 
      settings.py 
      utils.py 
     __init__.py 
     helper.py 
    main.py 
    __init__.py 

instead of the current layout:

bot/ 
    engine_bot/ 
     engine_bot/ 
      spiders/ 
       __init__.py 
       main_spider.py 
      __init__.py 
      items.py 
      middlewares.py 
      pipelines.py 
      settings.py 
      utils.py 
     __init__.py 
     helper.py 
    main.py 
    __init__.py 

See the engine_bot rename; main.py is in engine_bot, not bot_dir. –

Yeah, but I thought this module is loaded by Scrapy, which reads it from the settings, so the path would be relative to the project directory. Apparently that's not the case. –
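For context, the traceback shows the failure happening in scrapy.utils.misc.load_object via importlib: Scrapy treats each ITEM_PIPELINES key as an ordinary dotted import path, not as something relative to a project directory. A simplified sketch of that lookup (not Scrapy's exact code):

from importlib import import_module

path = 'engine_bot.engine_bot.pipelines.XmlExportPipeline'
module_path, class_name = path.rsplit('.', 1)
# This import is what raises "ImportError: No module named pipelines"
# when the module isn't importable from where the exe runs.
pipeline_cls = getattr(import_module(module_path), class_name)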

I tried it with the some_new_name layout, but without success.

Unfortunately this didn't help. I get the same error. –

@MilanoSlesarik did you also change 'engine_bot.engine_bot.pipelines.XmlExportPipeline': 300 to 'some_new_name.engine_bot.pipelines.XmlExportPipeline': 300? – Umair
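To illustrate that last comment: if the outer engine_bot/ directory really is renamed to some_new_name/, every dotted path in Main.py has to follow, including the spider import. A sketch of Main.py under that assumption (some_new_name is the hypothetical directory name from the answer):

from scrapy.crawler import CrawlerProcess
# The spider import must also go through the renamed outer package.
from some_new_name.engine_bot.spiders.main_spider import MainSpider


if __name__ == '__main__':
    process = CrawlerProcess({
        'BOT_NAME': 'engine_bot',
        'SPIDER_MODULES': ['some_new_name.engine_bot.spiders'],
        'NEWSPIDER_MODULE': 'some_new_name.engine_bot.spiders',
        'ROBOTSTXT_OBEY': False,
        'DOWNLOAD_DELAY': 0.20,
        'LOG_FILE': 'scrapy.log',
        'LOG_LEVEL': 'DEBUG',
        'ITEM_PIPELINES': {
            'some_new_name.engine_bot.pipelines.XmlExportPipeline': 300,
        },
    })
    process.crawl(MainSpider)
    process.start()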