python - How can I handle the output in multiple processes without passing parameters?


How can I share a queue between a number of processes when the code of these processes lives in multiple files and I don't want to pass the queue as a parameter?

I tried to solve the problem but failed. I have 3 files:

main.py

from p_1 import test
from p_2 import queue_run
import multiprocessing

if __name__ == '__main__':
    process_1 = multiprocessing.Process(target=test)
    process_2 = queue_run()
    process_1.start()
    process_2.start()
    process_1.join()
    process_2.join()

p_1.py

import time
from p_2 import queue_put

def test():
    var = ['a', 'b', 'c', 'd']
    for v in var:
        queue_put('something : ' + v)
        time.sleep(0.8)

p_2.py

import multiprocessing

queue = multiprocessing.Queue()

def queue_put(something):
    queue.put(something)

class queue_run(multiprocessing.Process):
    def __init__(self):
        multiprocessing.Process.__init__(self)

    def run(self):
        while True:
            try:
                data = queue.get(timeout=1)
                print(data)
            except:
                break

Then I run main.py, but there is no output.

The docs on exchanging data between processes explain that you should hand the queue to the process as an argument. Your code worked on Mac (I'd call that "by chance", since it relies on undocumented behaviour that might be a side effect of one specific Python version), but not on Windows.
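As a side note (this check is my addition, assuming Python 3.4+ where multiprocessing.get_start_method() exists), you can see which start method your platform uses; Unix-like systems have traditionally defaulted to "fork", where the child inherits the parent's memory, while Windows always uses "spawn":

import multiprocessing

if __name__ == '__main__':
    # typically 'fork' on Linux (and older macOS defaults), 'spawn' on Windows;
    # with 'fork' a module-level queue is inherited by the child, with 'spawn' it is not
    print(multiprocessing.get_start_method())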

An important fact: processes started with multiprocessing do not have a shared memory space (unlike threads, which do share memory). The mechanisms to share objects between them are pipes, queues, and objects created via shared memory or server processes.
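As a small illustration of one of those mechanisms (this sketch is my addition, not part of the original answer): a Pipe, like a Queue, is created in the parent and handed to the child explicitly.

import multiprocessing

def child(conn):
    # the connection object was created in the parent and passed in as an argument
    conn.send('hello from the child')
    conn.close()

if __name__ == '__main__':
    parent_conn, child_conn = multiprocessing.Pipe()
    p = multiprocessing.Process(target=child, args=(child_conn,))
    p.start()
    print(parent_conn.recv())
    p.join()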

That said, there are a few things to improve in your code:

  • The split into 3 modules does not make much sense to me, in particular p_1.py and p_2.py.
  • It's not clear why you made queue_run a class when a normal function, similar to test, would do the trick just as well.

Here's your example, condensed into one file. I left the class in so you can see what changed:

import multiprocessing
import time

def test(queue):
    var = ['a', 'b', 'c', 'd']
    for v in var:
        queue_put(queue, 'something : ' + v)
        time.sleep(0.8)

def queue_put(queue, something):
    queue.put(something)

class queue_run(multiprocessing.Process):
    def __init__(self, q):
        multiprocessing.Process.__init__(self)
        self.queue = q

    def run(self):
        print("started")
        while True:
            try:
                data = self.queue.get(timeout=1)
                print("got queue:", data)
            except:
                print("timed out")
                break

if __name__ == '__main__':
    # the queue is created in the parent and handed to both processes explicitly
    queue = multiprocessing.Queue()
    process_1 = multiprocessing.Process(target=test, args=(queue,))
    process_2 = queue_run(queue)
    process_1.start()
    process_2.start()
    process_1.join()
    process_2.join()

P.S.: You may want to call test a producer and queue_run a consumer, as that's what they are called in programming jargon.

Update:

If I have to pass the queue as a parameter down into the submodules to solve the problem above, I will need to modify many files, and this is a complex project.

As you're working on Windows (as noted in a comment), there's an important fact to know: if you create a new process there, it is spawned: a new Python interpreter is started, the Python modules are loaded anew, and global variables (like your queue) are initialized afresh as new instances. That means there is no way to share a global variable between processes.
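A quick way to see this (a sketch of my own, with a hypothetical counter variable standing in for your queue):

import multiprocessing
import os

counter = 0  # module-level global, re-created in every spawned child

def child():
    # under 'spawn' this prints 0: the child re-imported the module and got a
    # fresh copy of the global, not the value set by the parent below
    print('child', os.getpid(), 'sees counter =', counter)

if __name__ == '__main__':
    counter = 42  # only the parent's copy is changed
    p = multiprocessing.Process(target=child)
    p.start()
    p.join()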

If you want to stay with multiprocessing, the only way is to pass the queue down to your submodules. I cannot imagine you have more than 50 submodules, so an hour of work with your editor should do the trick.

The alternative, though, is to use threading: the downside is that you can only use one CPU core, the upside is that threads share the same memory space. The only thing you need to take care of is to use thread-safe data structures, such as Queue.
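For completeness, here is a rough sketch of that alternative (my addition, using the stdlib queue.Queue): because threads share the process memory, a module-level queue can be used without passing it around.

import queue
import threading
import time

q = queue.Queue()  # thread-safe, visible to every thread in this process

def producer():
    for v in ['a', 'b', 'c', 'd']:
        q.put('something : ' + v)
        time.sleep(0.8)

def consumer():
    while True:
        try:
            print('got queue:', q.get(timeout=1))
        except queue.Empty:
            print('timed out')
            break

if __name__ == '__main__':
    t1 = threading.Thread(target=producer)
    t2 = threading.Thread(target=consumer)
    t1.start()
    t2.start()
    t1.join()
    t2.join()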

