
logging with multiprocessing

Multiprocessing, not multithreading.  Different processes.   This is pretty easy to do.

I have done this from a Python script to run an analysis program on many sets of data at once.  To do it: 1) if there is going to be an output file, each output file must have a distinct name.  2) To use logging, log to a file, and each log file must have a distinct name.  This is not hard to do.  I assume the input data is different for each run, so we don't have to do anything about that.  Then there won't be any conflict: input files are distinct, output files are distinct, and log files are distinct.
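A minimal sketch of that idea, assuming a Pool of worker processes where each task writes its own log file keyed by task id (the file names and the analyze() function here are made up for illustration):

```python
import logging
import multiprocessing

def analyze(task_id):
    # Each worker process logs to its own distinct file, so the
    # processes never contend for the same log file.
    logger = logging.getLogger("worker-%d" % task_id)
    handler = logging.FileHandler("run_%d.log" % task_id)
    handler.setFormatter(
        logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)

    logger.info("processing task %d", task_id)
    result = task_id * task_id  # stand-in for the real analysis
    logger.info("done")

    handler.close()
    logger.removeHandler(handler)
    return result

if __name__ == "__main__":
    # Run the tasks in separate processes; distinct inputs,
    # distinct outputs, distinct log files -- no conflicts.
    with multiprocessing.Pool(processes=4) as pool:
        results = pool.map(analyze, range(4))
    print(results)
```

Each run leaves behind run_0.log through run_3.log, one per task, which can be inspected independently afterwards.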

When I did this, we had the pleasure of running on a 20-core dual-Xeon system; I don't remember whether we ran 20 processes at a time or slightly fewer.  Anyway, we really did achieve nearly linear speedup.  Windows did assign these processes to separate cores.

--- Joe S. 

-----Original Message-----
From: jenil.desai25 at gmail.com <jenil.desai25 at gmail.com> 
Sent: Thursday, June 7, 2018 2:46 PM
To: python-list at python.org
Subject: logging with multiprocessing


I am new to the logging module. I want to use the logging module with multiprocessing. Can anyone help me understand how I can do it? Any help would be appreciated.

Thank you.