Python – Logging in Multiprocess Applications

Tags: logging, multiprocessing, python

We have written an application that spawns at least 9 parallel processes. All processes generate a lot of logging information.

Currently we are using Python's QueueHandler to consolidate all logs into one file. Unfortunately, this sometimes results in a very messy file that is hard to read, making it difficult to track exactly what is going on in any one process or thread.
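
For context, here is a minimal sketch of the kind of QueueHandler/QueueListener setup described above. The question does not show its code, so the file name, worker function, and process count here are illustrative:

```python
import logging
import logging.handlers
import multiprocessing


def worker(log_queue: multiprocessing.Queue) -> None:
    # Each worker process sends its records to the shared queue
    # instead of writing to the file directly.
    logger = logging.getLogger()
    logger.addHandler(logging.handlers.QueueHandler(log_queue))
    logger.setLevel(logging.INFO)
    logger.info("work done")


def main() -> None:
    log_queue = multiprocessing.Queue()

    # The listener runs in the main process and drains the queue,
    # writing every record to a single consolidated file.
    file_handler = logging.FileHandler("app.log")
    listener = logging.handlers.QueueListener(log_queue, file_handler)
    listener.start()

    processes = [
        multiprocessing.Process(target=worker, args=(log_queue,))
        for _ in range(9)
    ]
    for p in processes:
        p.start()
    for p in processes:
        p.join()

    listener.stop()


if __name__ == "__main__":
    main()
```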

Do you think it is a viable option to write each process's messages to its own dedicated file? Or will this make things even messier, given the resulting number of files?

What are your general experiences when writing log files for multiprocess/multithreaded applications?

Best Answer

Tag each log line with the process id (or some other per-process identifier), keep everything in one file, and use grep when looking at the logs to pull out the lines for the process you care about.
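
A minimal sketch of this, assuming the standard logging module: the built-in %(process)d and %(processName)s format fields stamp a per-process identifier on every line, so a single consolidated file can be filtered afterwards (e.g. `grep 'Process-3' app.log`). The file name and message are illustrative:

```python
import logging

# Include the process name and pid in every log line so the
# consolidated file can be filtered per process with grep.
formatter = logging.Formatter(
    "%(asctime)s %(processName)s[%(process)d] %(levelname)s %(message)s"
)
handler = logging.FileHandler("app.log")
handler.setFormatter(formatter)

logger = logging.getLogger()
logger.addHandler(handler)
logger.setLevel(logging.INFO)
logger.info("hello from this process")
```

The same formatter can be attached to the handler used by a QueueListener, so the existing single-file setup keeps working and only the line format changes.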
