How to save Scrapy crawl command output

You can add these lines to your settings.py:

LOG_STDOUT = True                      # also redirect the process's stdout (e.g. print()) into the log
LOG_FILE = '/tmp/scrapy_output.txt'    # write all log output to this file

And then start your crawl normally:

scrapy crawl someSpider
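
If you prefer not to edit settings.py, the same two settings can also be set per run with -s, Scrapy's generic settings override flag; the spider name and path here are just placeholders:

scrapy crawl someSpider -s LOG_FILE=/tmp/scrapy_output.txt -s LOG_STDOUT=True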

Scrapy writes its log to stderr, so redirecting only stdout is not enough; you need to redirect stderr too, like this:

scrapy crawl someSpider -o some.json -t json 2> some.text

The key is the number 2, the file descriptor that selects stderr as the source of the redirection.
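
For contrast (plain shell, nothing Scrapy-specific): stdout is file descriptor 1, so the two streams can also be split into separate files; the file names are just examples:

scrapy crawl someSpider -o some.json > stdout.text 2> stderr.text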

If you would like to redirect both stderr and stdout into one file, you can use (the &> form is Bash-specific):

scrapy crawl someSpider -o some.json -t json &> some.text
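
If you also want to keep watching the output on screen while it is saved, piping through tee works as well; 2>&1 merges stderr into stdout first (again plain shell, not Scrapy-specific):

scrapy crawl someSpider -o some.json -t json 2>&1 | tee some.text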

For more about output redirection: http://tldp.org/HOWTO/Bash-Prog-Intro-HOWTO-3.html


For all Scrapy commands you can add --logfile NAME_OF_FILE to send the log to a file, e.g.:

scrapy crawl someSpider -o some.json --logfile some.text

There are two other useful command-line options for logging (a combined example follows the list):

  • -L or --loglevel to control the logging level e.g. -L INFO (the default is DEBUG)

  • --nolog to disable logging completely
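
Putting the options together (spider name and file names are placeholders):

scrapy crawl someSpider -o some.json --logfile some.text -L INFO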

These options are documented in Scrapy's command-line tool documentation.
