Question
I'm working on a fairly large project in Python that requires one of the compute-intensive background tasks to be offloaded to another core, so that the main service isn't slowed down. I've come across some apparently strange behaviour when using multiprocessing.Queue to communicate results from the worker process. Using the same queue for both a threading.Thread and a multiprocessing.Process for comparison purposes, the thread works just fine, but the process fails to join after putting a large item in the queue. Observe:
import threading
import multiprocessing

class WorkerThread(threading.Thread):
    def __init__(self, queue, size):
        threading.Thread.__init__(self)
        self.queue = queue
        self.size = size

    def run(self):
        self.queue.put(range(self.size))

class WorkerProcess(multiprocessing.Process):
    def __init__(self, queue, size):
        multiprocessing.Process.__init__(self)
        self.queue = queue
        self.size = size

    def run(self):
        self.queue.put(range(self.size))

if __name__ == "__main__":
    size = 100000
    queue = multiprocessing.Queue()
    worker_t = WorkerThread(queue, size)
    worker_p = WorkerProcess(queue, size)

    worker_t.start()
    worker_t.join()
    print 'thread results length:', len(queue.get())

    worker_p.start()
    worker_p.join()
    print 'process results length:', len(queue.get())
I've seen that this works fine for size = 10000, but hangs at worker_p.join() for size = 100000. Is there some inherent size limit to what multiprocessing.Process instances can put in a multiprocessing.Queue? Or am I making some obvious, fundamental mistake here?
For reference, I am using Python 2.6.5 on Ubuntu 10.04.
Answer
It seems the underlying pipe is full, so the feeder thread blocks on the write to the pipe (actually, while trying to acquire the lock that protects the pipe from concurrent access). The child process cannot exit until its feeder thread has flushed everything into the pipe, and the parent never drains the pipe because it is stuck in join() — hence the deadlock.
See this issue: http://bugs.python.org/issue8237