
Multiprocessing - Pipe vs Queue
This article looks at pipes versus queues in Python's multiprocessing package; it should be a useful reference when you are deciding between them.

Problem Description



                  What are the fundamental differences between queues and pipes in Python's multiprocessing package?

                  In what scenarios should one choose one over the other? When is it advantageous to use Pipe()? When is it advantageous to use Queue()?

Solution

                  • A Pipe() can only have two endpoints.

                  • A Queue() can have multiple producers and consumers.

                  When to use them

                  If you need more than two points to communicate, use a Queue().

                  If you need absolute performance, a Pipe() is much faster because Queue() is built on top of Pipe().

                  Performance Benchmarking

Let's assume you want to spawn two processes and send messages between them as quickly as possible. These are the timing results of a drag race between similar tests using Pipe() and Queue()... This is on a Thinkpad T61 running Ubuntu 11.10 and Python 2.7.2.

                  FYI, I threw in results for JoinableQueue() as a bonus; JoinableQueue() accounts for tasks when queue.task_done() is called (it doesn't even know about the specific task, it just counts unfinished tasks in the queue), so that queue.join() knows the work is finished.

The code for each is at the bottom of this answer...

                  mpenning@mpenning-T61:~$ python multi_pipe.py 
                  Sending 10000 numbers to Pipe() took 0.0369849205017 seconds
                  Sending 100000 numbers to Pipe() took 0.328398942947 seconds
                  Sending 1000000 numbers to Pipe() took 3.17266988754 seconds
                  mpenning@mpenning-T61:~$ python multi_queue.py 
                  Sending 10000 numbers to Queue() took 0.105256080627 seconds
                  Sending 100000 numbers to Queue() took 0.980564117432 seconds
                  Sending 1000000 numbers to Queue() took 10.1611330509 seconds
mpenning@mpenning-T61:~$ python multi_joinablequeue.py 
                  Sending 10000 numbers to JoinableQueue() took 0.172781944275 seconds
                  Sending 100000 numbers to JoinableQueue() took 1.5714070797 seconds
                  Sending 1000000 numbers to JoinableQueue() took 15.8527247906 seconds
                  mpenning@mpenning-T61:~$
                  

In summary, a Pipe() is about three times faster than a Queue(). Don't even think about the JoinableQueue() unless you really must have the benefits.

                  BONUS MATERIAL 2

Multiprocessing introduces subtle changes in information flow that make debugging hard unless you know some shortcuts. For instance, you might have a script that works fine when indexing through a dictionary under many conditions, but infrequently fails with certain inputs.

Normally we get clues to the failure when the entire Python process crashes; however, if a multiprocessing function crashes, you don't get an unsolicited crash traceback printed to the console. Tracking down unknown multiprocessing crashes is hard without a clue about what crashed the process.

The simplest way I have found to track down multiprocessing crash information is to wrap the entire multiprocessing function in a try / except and use traceback.print_exc():

                  import traceback
                  def run(self, args):
                      try:
                          # Insert stuff to be multiprocessed here
                          return args[0]['that']
                      except:
                          print "FATAL: reader({0}) exited while multiprocessing".format(args) 
                          traceback.print_exc()
                  

                  Now, when you find a crash you see something like:

                  FATAL: reader([{'crash': 'this'}]) exited while multiprocessing
                  Traceback (most recent call last):
                    File "foo.py", line 19, in __init__
                      self.run(args)
                    File "foo.py", line 46, in run
                      KeyError: 'that'
                  

                  Source Code:


                  """
                  multi_pipe.py
                  """
                  from multiprocessing import Process, Pipe
                  import time
                  
                  def reader_proc(pipe):
                      ## Read from the pipe; this will be spawned as a separate Process
                      p_output, p_input = pipe
                      p_input.close()    # We are only reading
                      while True:
                          msg = p_output.recv()    # Read from the output pipe and do nothing
                          if msg=='DONE':
                              break
                  
                  def writer(count, p_input):
                      for ii in xrange(0, count):
                          p_input.send(ii)             # Write 'count' numbers into the input pipe
                      p_input.send('DONE')
                  
                  if __name__=='__main__':
                      for count in [10**4, 10**5, 10**6]:
                          # Pipes are unidirectional with two endpoints:  p_input ------> p_output
                          p_output, p_input = Pipe()  # writer() writes to p_input from _this_ process
                          reader_p = Process(target=reader_proc, args=((p_output, p_input),))
                          reader_p.daemon = True
                          reader_p.start()     # Launch the reader process
                  
                          p_output.close()       # We no longer need this part of the Pipe()
                          _start = time.time()
                          writer(count, p_input) # Send a lot of stuff to reader_proc()
                          p_input.close()
                          reader_p.join()
                          print("Sending {0} numbers to Pipe() took {1} seconds".format(count,
                              (time.time() - _start)))
                  


                  """
                  multi_queue.py
                  """
                  
                  from multiprocessing import Process, Queue
                  import time
                  import sys
                  
                  def reader_proc(queue):
                      ## Read from the queue; this will be spawned as a separate Process
                      while True:
                          msg = queue.get()         # Read from the queue and do nothing
                          if (msg == 'DONE'):
                              break
                  
                  def writer(count, queue):
                      ## Write to the queue
                      for ii in range(0, count):
                          queue.put(ii)             # Write 'count' numbers into the queue
                      queue.put('DONE')
                  
                  if __name__=='__main__':
                      pqueue = Queue() # writer() writes to pqueue from _this_ process
                      for count in [10**4, 10**5, 10**6]:             
                          ### reader_proc() reads from pqueue as a separate process
                          reader_p = Process(target=reader_proc, args=((pqueue),))
                          reader_p.daemon = True
                          reader_p.start()        # Launch reader_proc() as a separate python process
                  
                          _start = time.time()
                          writer(count, pqueue)    # Send a lot of stuff to reader()
                          reader_p.join()         # Wait for the reader to finish
                          print("Sending {0} numbers to Queue() took {1} seconds".format(count, 
                              (time.time() - _start)))
                  


                  """
                  multi_joinablequeue.py
                  """
                  from multiprocessing import Process, JoinableQueue
                  import time
                  
                  def reader_proc(queue):
                      ## Read from the queue; this will be spawned as a separate Process
                      while True:
                          msg = queue.get()         # Read from the queue and do nothing
                          queue.task_done()
                  
                  def writer(count, queue):
                      for ii in xrange(0, count):
                          queue.put(ii)             # Write 'count' numbers into the queue
                  
                  if __name__=='__main__':
                      for count in [10**4, 10**5, 10**6]:
                          jqueue = JoinableQueue() # writer() writes to jqueue from _this_ process
                          # reader_proc() reads from jqueue as a different process...
                          reader_p = Process(target=reader_proc, args=((jqueue),))
                          reader_p.daemon = True
                          reader_p.start()     # Launch the reader process
                          _start = time.time()
                          writer(count, jqueue) # Send a lot of stuff to reader_proc() (in different process)
                          jqueue.join()         # Wait for the reader to finish
                          print("Sending {0} numbers to JoinableQueue() took {1} seconds".format(count, 
                              (time.time() - _start)))
                  

