Right now I have a central module in a framework that spawns multiple processes using the Python 2.6 multiprocessing module. Because it uses multiprocessing, there is a module-level multiprocessing-aware logger, LOG = multiprocessing.get_logger(). Per the docs, this logger has process-shared locks so that multiple processes writing to sys.stderr (or whatever file handle) simultaneously don't garble each other's output.
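For concreteness, a minimal sketch of the kind of setup described (the worker function, level, and process count are illustrative, not from the original):

```python
import logging
import multiprocessing

# Module-level, multiprocessing-aware logger shared by spawned processes.
LOG = multiprocessing.get_logger()

# log_to_stderr() attaches a stderr StreamHandler to this same logger.
multiprocessing.log_to_stderr()
LOG.setLevel(logging.INFO)

def worker():
    LOG.info("running in %s", multiprocessing.current_process().name)

if __name__ == "__main__":
    procs = [multiprocessing.Process(target=worker) for _ in range(3)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```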
The issue I have now is that the other modules in the framework are not multiprocessing-aware. The way I see it, I need to make all dependencies on this central module use multiprocessing-aware logging. That's annoying within the framework, let alone for all clients of the framework. Are there alternatives I'm not thinking of?
Best Answer
I just wrote a log handler of my own that feeds everything to the parent process via a pipe. I've only been testing it for ten minutes, but it seems to work pretty well.
(Note: this is hardcoded to RotatingFileHandler, which is my own use case.)

Update: @javier now maintains this approach as a package available on PyPI: see multiprocessing-logging on PyPI, with the source on GitHub at https://github.com/jruere/multiprocessing-logging.
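A typical way to use that package, assuming its install_mp_handler() helper (the log filename here is illustrative):

```python
# pip install multiprocessing-logging
import logging
import multiprocessing_logging

logging.basicConfig(level=logging.INFO, filename="framework.log")

# Wraps each existing handler so that records from child processes
# are funneled through a single writer in the parent process.
multiprocessing_logging.install_mp_handler()
```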
Update: Implementation!
This now uses a queue for correct handling of concurrency, and also recovers from errors correctly. I've now been using this in production for several months, and the current version below works without issue.
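A minimal sketch of a handler along those lines (the class name MultiProcessingLog and the constructor parameters are illustrative): each process's emit() pushes a pre-formatted record onto a shared multiprocessing.Queue, and a background thread in the parent drains the queue into the real RotatingFileHandler.

```python
import logging
import multiprocessing
import sys
import threading
import traceback
from logging.handlers import RotatingFileHandler


class MultiProcessingLog(logging.Handler):
    """Collects records from all processes on a multiprocessing.Queue
    and writes them from a single thread in the parent process."""

    def __init__(self, name, mode, maxsize, rotate):
        logging.Handler.__init__(self)
        # The real file handler lives only in the parent process.
        self._handler = RotatingFileHandler(name, mode, maxsize, rotate)
        self.queue = multiprocessing.Queue(-1)
        # Background thread that drains the queue and writes records.
        t = threading.Thread(target=self._receive)
        t.daemon = True
        t.start()

    def setFormatter(self, fmt):
        logging.Handler.setFormatter(self, fmt)
        self._handler.setFormatter(fmt)

    def _receive(self):
        while True:
            try:
                record = self.queue.get()
                self._handler.emit(record)
            except (KeyboardInterrupt, SystemExit):
                raise
            except EOFError:
                break
            except Exception:
                # Recover from bad records instead of killing the thread.
                traceback.print_exc(file=sys.stderr)

    def _format_record(self, record):
        # Stringify args and exc_info up front so the record is picklable
        # and as small as possible before crossing process boundaries.
        if record.args:
            record.msg = record.msg % record.args
            record.args = None
        if record.exc_info:
            self.format(record)  # caches traceback text on the record
            record.exc_info = None
        return record

    def emit(self, record):
        try:
            self.queue.put_nowait(self._format_record(record))
        except (KeyboardInterrupt, SystemExit):
            raise
        except Exception:
            self.handleError(record)

    def close(self):
        self._handler.close()
        logging.Handler.close(self)
```

Attach it once in the parent before spawning workers, e.g. logging.getLogger().addHandler(MultiProcessingLog('framework.log', 'a', 10 * 1024 * 1024, 5)); child processes inherit the queue on fork, so their records all end up in the single rotated file.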