Integrating the Django ORM with Scrapy
Published: 2021-07-01 05:19:53 · Category: Technical Articles


Place the Scrapy project inside a Django project and use Django's ORM to persist Scrapy Items.

1. Modify Scrapy's settings.py to bootstrap the Django environment so the ORM can be imported:

```python
import os
import django

# Point Django at the project's settings module; the second argument is
# "<django_project_name>.settings".
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'Soufan_crawl.settings')
django.setup()  # initialize the Django environment
```
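One caveat worth noting: the scrapy CLI imports this settings.py from inside the Scrapy project directory, and nothing guarantees the Django project root is on sys.path at that point (this is what triggers the ModuleNotFoundError shown further down). Below is a hedged sketch of a guard that could precede django.setup(); the relative path is an assumption based on the Soufan_crawl/data_acquisition layout used in this article:

```python
# Sketch only: make the Django project root importable before django.setup().
# Assumed layout: <root>/data_acquisition/data_acquisition/settings.py, with
# the 'Soufan_crawl' package living directly under <root>.
import os
import sys

scrapy_settings_dir = os.path.dirname(os.path.abspath(__file__))
django_root = os.path.abspath(os.path.join(scrapy_settings_dir, '..', '..'))

if django_root not in sys.path:
    sys.path.insert(0, django_root)  # makes 'Soufan_crawl.settings' importable

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'Soufan_crawl.settings')
# django.setup() would follow here, as in the snippet above.
```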

2. Modify Scrapy's pipelines.py

Create a new pipeline:

```python
from apps.spider_manage.models import Data

class DjangoORMPipeline(object):
    def process_item(self, item, spider):
        for data in item['data']:
            title = data['title']
            number = data['number']
            Data.objects.create(title=title, number=number)
        return item  # hand the item on to the next pipeline
```
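The pipeline assumes each item carries a 'data' list of dicts with 'title' and 'number' keys, and creates one database row per entry. Here is a minimal stdlib-only sketch of that flow, with a hypothetical FakeData standing in for the Django model so it runs without Django:

```python
# Stand-in illustration of the flow in DjangoORMPipeline.process_item.
# FakeData replaces the Django model (an assumption for demonstration only);
# with Django, each create() call issues one INSERT.
created = []

class FakeData:
    @staticmethod
    def create(**fields):
        created.append(fields)  # Data.objects.create would insert a row here

def process_item(item, spider=None):
    for data in item['data']:
        FakeData.create(title=data['title'], number=data['number'])
    return item  # returning the item lets the next pipeline keep processing

item = {'data': [{'title': 'a', 'number': 1}, {'title': 'b', 'number': 2}]}
result = process_item(item)
```

After the call, `created` holds one dict per entry in item['data'], and the item itself is returned unchanged.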

Register it in settings.py:

```python
ITEM_PIPELINES = {
    'data_acquisition.pipelines.DjangoORMPipeline': 2,
}
```

For reference, the Data model:

```python
from django.db import models
from django.utils.timezone import now

class Data(models.Model):
    title = models.CharField(max_length=128)
    number = models.IntegerField()
    create_time = models.DateTimeField(default=now)
```

Although this approach makes saving data a little more convenient, it in fact introduces drawbacks: normal Scrapy debugging stops working. For example:

```
(Py3_spider) Soufan_crawl\data_acquisition> scrapy
Traceback (most recent call last):
  File "c:\users\onefine\appdata\local\programs\python\python37\Lib\runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "c:\users\onefine\appdata\local\programs\python\python37\Lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "C:\Users\ONEFINE\Envs\Py3_spider\Scripts\scrapy.exe\__main__.py", line 9, in <module>
  File "c:\users\onefine\envs\py3_spider\lib\site-packages\scrapy\cmdline.py", line 110, in execute
    settings = get_project_settings()
  File "c:\users\onefine\envs\py3_spider\lib\site-packages\scrapy\utils\project.py", line 68, in get_project_settings
    settings.setmodule(settings_module_path, priority='project')
  File "c:\users\onefine\envs\py3_spider\lib\site-packages\scrapy\settings\__init__.py", line 292, in setmodule
    module = import_module(module)
  File "c:\users\onefine\envs\py3_spider\lib\importlib\__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
  File "<frozen importlib._bootstrap>", line 983, in _find_and_load
  File "<frozen importlib._bootstrap>", line 967, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "D:\PythonProject\DjangoProject\Soufan_crawl\data_acquisition\data_acquisition\settings.py", line 15, in <module>
    django.setup()  # initialize the Django environment
  File "c:\users\onefine\envs\py3_spider\lib\site-packages\django\__init__.py", line 19, in setup
    configure_logging(settings.LOGGING_CONFIG, settings.LOGGING)
  File "c:\users\onefine\envs\py3_spider\lib\site-packages\django\conf\__init__.py", line 57, in __getattr__
    self._setup(name)
  File "c:\users\onefine\envs\py3_spider\lib\site-packages\django\conf\__init__.py", line 44, in _setup
    self._wrapped = Settings(settings_module)
  File "c:\users\onefine\envs\py3_spider\lib\site-packages\django\conf\__init__.py", line 107, in __init__
    mod = importlib.import_module(self.SETTINGS_MODULE)
  File "c:\users\onefine\envs\py3_spider\lib\importlib\__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
  File "<frozen importlib._bootstrap>", line 983, in _find_and_load
  File "<frozen importlib._bootstrap>", line 953, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
  File "<frozen importlib._bootstrap>", line 983, in _find_and_load
  File "<frozen importlib._bootstrap>", line 965, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'Soufan_crawl'
(Py3_spider) Soufan_crawl\data_acquisition>
```

The failure happens because the scrapy CLI imports settings.py (and with it django.setup()) while the Django project root is not on sys.path, so the Soufan_crawl module cannot be found. This approach is therefore not recommended; if you want to use Django's ORM from Scrapy, use scrapy-djangoitem instead, as described below.

Using scrapy-djangoitem

The project's README on GitHub covers usage in enough detail, so only the GitHub link is given here.


Appendix:

GitHub link
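For orientation, here is a minimal sketch of the typical scrapy-djangoitem pattern as documented in its README (it assumes the package is installed via pip install scrapy-djangoitem, and reuses the Data model from above; the DataItem name is illustrative):

```python
from scrapy_djangoitem import DjangoItem
from apps.spider_manage.models import Data

class DataItem(DjangoItem):
    django_model = Data  # item fields are derived from the Django model

# In a spider:
#     yield DataItem(title='example', number=1)
#
# In a pipeline, item.save() persists the row through the Django ORM:
#     def process_item(self, item, spider):
#         item.save()
#         return item
```

The advantage over the hand-rolled pipeline above is that field definitions stay in one place (the Django model), and validation errors surface when the item is saved.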

Reposted from: https://onefine.blog.csdn.net/article/details/86849865
