1. Set the log level in settings: add one line to settings.py:
LOG_LEVEL = 'WARNING'
Scrapy provides 5 logging levels:
- CRITICAL - critical errors
- ERROR - regular errors
- WARNING - warning messages
- INFO - informational messages
- DEBUG - debugging messages
By default, Scrapy displays log messages at the DEBUG level.
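To see how the threshold works, here is a minimal sketch using Python's standard logging module, which Scrapy's logging is built on; with the level set to WARNING, the DEBUG and INFO calls produce no output:

import logging

logging.basicConfig(level=logging.WARNING)
logger = logging.getLogger('demo')

logger.debug('hidden: below the WARNING threshold')
logger.info('hidden: below the WARNING threshold')
logger.warning('shown: at or above the WARNING threshold')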
2. Save the output to a log file: add a file path in settings.py:
LOG_FILE = './log.log'
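For reference, the same two settings can also be passed when running a spider from a script; this is a minimal sketch using Scrapy's CrawlerProcess, and the import path for the DcdappSpider class from step 4 is an assumption:

from scrapy.crawler import CrawlerProcess
from autospider.spiders.dcdapp import DcdappSpider  # assumed module path

process = CrawlerProcess(settings={
    'LOG_LEVEL': 'WARNING',
    'LOG_FILE': './log.log',
})
process.crawl(DcdappSpider)
process.start()  # blocks until the crawl finishes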
3. Show where a log message comes from: in pipelines.py, create a module-level logger:
import logging

logger = logging.getLogger(__name__)

class DcdAppPipeline:
    def process_item(self, item, spider):
        # Log at WARNING so the item appears even with LOG_LEVEL = 'WARNING'
        logger.warning(item)
        return item
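Because the logger is created with __name__, Scrapy's default log format prints that module path in brackets (e.g. [autospider.pipelines]), which is what reveals where the message came from. You can also add context to the message yourself; a small variation on the line above:

logger.warning('item from spider %s: %r', spider.name, item)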
4. Configure logging in the spider file via custom_settings:
import scrapy

class DcdappSpider(scrapy.Spider):
    name = 'dcdapp'
    allowed_domains = ['m.dcdapp.com']
    custom_settings = {
        # Enable the item pipeline
        'ITEM_PIPELINES': {
            'autospider.pipelines.DcdAppPipeline': 300,
        },
        # Per-spider log settings (override settings.py for this spider only)
        'LOG_LEVEL': 'DEBUG',
        'LOG_FILE': './././Log/dcdapp_log.log',
    }
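Note that inside spider callbacks you do not need to create a logger manually: scrapy.Spider already provides a self.logger named after the spider. A minimal parse method, placed inside the class above, as a sketch (the callback body is illustrative, not from the original post):

    def parse(self, response):
        # self.logger is provided by scrapy.Spider and uses the spider's name
        self.logger.info('parsed %s', response.url)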
Reposted from CSDN: https://blog.csdn.net/iswangrl/article/details/78286467