1. A bit of logrotate config:
```
lastaction
    # $* holds the log file names (or pattern) passed by logrotate.
    # Map them to their rotated names (.log -> .log.1), find
    # python/ipython/python3 processes that still hold the rotated
    # files open (lsof column 1 is the command, column 2 the PID),
    # de-duplicate the PIDs, and signal each process once so it
    # reopens its log files. Note that lsof does not read file names
    # from stdin, so they are passed as arguments via xargs.
    echo $* | sed -r 's/\.log/\.log\.1/g' |
        xargs lsof 2>/dev/null |
        awk '$1 ~ /python/ {print $2}' |
        sort -u |
        xargs -r kill -SIGUSR1
endscript
```
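For context, the `lastaction` block sits inside an ordinary logrotate stanza. A minimal sketch (the path and rotation options here are hypothetical, adjust to your setup; `sharedscripts` makes the script run once for the whole pattern):

```
/var/log/myapp/*.log {
    daily
    rotate 7
    missingok
    sharedscripts
    lastaction
        echo $* | sed -r 's/\.log/\.log\.1/g' | xargs lsof 2>/dev/null |
            awk '$1 ~ /python/ {print $2}' | sort -u | xargs -r kill -SIGUSR1
    endscript
}
```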
2. A function that installs a Python SIGUSR1 handler which reopens all FileHandler streams:
```python
import logging
import signal


def sigusr1_react():
    """
    Call this function to make your process react to SIGUSR1 by
    reloading (closing and reopening) all logging FileHandlers.
    Call it from the main thread (see
    https://docs.python.org/2/library/signal.html#module-signal).

    It is a convenience function for cases where you are obliged to use
    FileHandler with rotating files. In our case it's scrapy + logrotate.
    """
    def reload_filehandlers(signum, frame):
        # logging._handlerList is a private list of weak references to
        # all live handlers; dereference each and keep the FileHandlers
        # (a dead weakref yields None, which isinstance filters out).
        handlers = [ref() for ref in logging._handlerList]
        file_handlers = [
            h for h in handlers if isinstance(h, logging.FileHandler)
        ]
        for fh in file_handlers:
            fh.acquire()  # serialize against concurrent emit() calls
            try:
                if fh.stream:
                    fh.stream.flush()
                    fh.stream.close()
                # reopen the freshly created log file
                fh.stream = fh._open()
            finally:
                fh.release()

    signal.signal(signal.SIGUSR1, reload_filehandlers)
```
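To see the two pieces working together without logrotate, you can simulate a rotation in-process: rename the log file as logrotate would, then send SIGUSR1 to yourself. The sketch below inlines a condensed version of the handler above so it is self-contained; file names and the logger name are arbitrary.

```python
import logging
import os
import signal
import tempfile

def _reload_filehandlers(signum, frame):
    # Condensed reopen logic: flush/close each FileHandler's stream
    # and reopen it (relies on the private logging._handlerList).
    for ref in logging._handlerList:
        handler = ref()
        if isinstance(handler, logging.FileHandler):
            handler.acquire()
            try:
                if handler.stream:
                    handler.stream.flush()
                    handler.stream.close()
                handler.stream = handler._open()
            finally:
                handler.release()

signal.signal(signal.SIGUSR1, _reload_filehandlers)

workdir = tempfile.mkdtemp()
log_path = os.path.join(workdir, "app.log")

logger = logging.getLogger("demo")
logger.addHandler(logging.FileHandler(log_path))
logger.warning("before rotation")   # written to app.log

# Simulate what logrotate does: move the file aside, then signal the
# process so it reopens app.log instead of writing into app.log.1.
os.rename(log_path, log_path + ".1")
os.kill(os.getpid(), signal.SIGUSR1)

logger.warning("after rotation")    # written to the fresh app.log
```

Without the signal, the second message would land in `app.log.1` because the handler still holds the old file descriptor after the rename.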