

Multiprocessing - Pipe vs Queue

This article covers Multiprocessing - Pipe vs Queue and how to choose between them. It should be a useful reference for anyone facing the same question; read on to learn more.

Problem Description



                  What are the fundamental differences between queues and pipes in Python's multiprocessing package?

                  In what scenarios should one choose one over the other? When is it advantageous to use Pipe()? When is it advantageous to use Queue()?

Solution

                  • A Pipe() can only have two endpoints.

                  • A Queue() can have multiple producers and consumers.

                  When to use them

                  If you need more than two points to communicate, use a Queue().

                  If you need absolute performance, a Pipe() is much faster because Queue() is built on top of Pipe().
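
To make the contrast concrete, here is a minimal Python 3 sketch (not part of the original answer; the process names, counts, and sentinel values are illustrative assumptions) showing several producers feeding a single Queue() consumed by one reader. This fan-in pattern is something a single Pipe() cannot provide, because a Pipe() only has two endpoints.

"""
queue_fan_in.py - illustrative sketch only, not one of the benchmark scripts below
"""
from multiprocessing import Process, Queue

def producer_proc(queue, producer_id, count):
    # Each producer writes its own numbers into the one shared queue
    for ii in range(count):
        queue.put((producer_id, ii))
    queue.put(('DONE', producer_id))      # per-producer sentinel

def reader_proc(queue, n_producers):
    # A single reader drains messages from all producers
    finished = 0
    while finished < n_producers:
        msg = queue.get()
        if msg[0] == 'DONE':
            finished += 1

if __name__ == '__main__':
    q = Queue()
    producers = [Process(target=producer_proc, args=(q, pid, 10**4))
                 for pid in range(3)]
    reader_p = Process(target=reader_proc, args=(q, len(producers)))
    reader_p.start()
    for p in producers:
        p.start()
    for p in producers:
        p.join()
    reader_p.join()
    print("3 producers fed 1 reader over a single Queue()")

Doing the same with Pipe() would require one separate pipe per producer and a reader that services all of them.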

                  Performance Benchmarking

Let's assume you want to spawn two processes and send messages between them as quickly as possible. These are the timing results of a drag race between similar tests using Pipe() and Queue()... This is on a Thinkpad T61 running Ubuntu 11.10 and Python 2.7.2.

                  FYI, I threw in results for JoinableQueue() as a bonus; JoinableQueue() accounts for tasks when queue.task_done() is called (it doesn't even know about the specific task, it just counts unfinished tasks in the queue), so that queue.join() knows the work is finished.
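
As a quick illustration of that accounting (a minimal Python 3 sketch, separate from the full multi_joinablequeue.py listed at the bottom; the worker function and item count here are made up), the consumer calls task_done() once per get(), and join() on the producing side blocks until every queued item has been acknowledged:

"""
joinablequeue_sketch.py - illustrative sketch only
"""
from multiprocessing import Process, JoinableQueue

def worker_proc(queue):
    # Read items forever; acknowledging each one decrements the queue's
    # unfinished-task counter
    while True:
        item = queue.get()
        queue.task_done()

if __name__ == '__main__':
    jqueue = JoinableQueue()
    worker_p = Process(target=worker_proc, args=(jqueue,))
    worker_p.daemon = True        # worker never returns; let it die with the parent
    worker_p.start()
    for ii in range(100):
        jqueue.put(ii)
    jqueue.join()                 # returns only after 100 task_done() calls
    print("queue.join() returned: all 100 items were acknowledged")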

                  The code for each at bottom of this answer...

                  mpenning@mpenning-T61:~$ python multi_pipe.py 
                  Sending 10000 numbers to Pipe() took 0.0369849205017 seconds
                  Sending 100000 numbers to Pipe() took 0.328398942947 seconds
                  Sending 1000000 numbers to Pipe() took 3.17266988754 seconds
                  mpenning@mpenning-T61:~$ python multi_queue.py 
                  Sending 10000 numbers to Queue() took 0.105256080627 seconds
                  Sending 100000 numbers to Queue() took 0.980564117432 seconds
                  Sending 1000000 numbers to Queue() took 10.1611330509 seconds
                   mpenning@mpenning-T61:~$ python multi_joinablequeue.py 
                  Sending 10000 numbers to JoinableQueue() took 0.172781944275 seconds
                  Sending 100000 numbers to JoinableQueue() took 1.5714070797 seconds
                  Sending 1000000 numbers to JoinableQueue() took 15.8527247906 seconds
                  mpenning@mpenning-T61:~$
                  

In summary, a Pipe() is about three times faster than a Queue(). Don't even think about JoinableQueue() unless you really need its benefits.

                  BONUS MATERIAL 2

Multiprocessing introduces subtle changes in information flow that make debugging hard unless you know some shortcuts. For instance, you might have a script that works fine when indexing through a dictionary under many conditions, but infrequently fails with certain inputs.

Normally we get clues to the failure when the entire Python process crashes; however, if a function running under multiprocessing crashes, no unsolicited crash traceback is printed to the console. Tracking down unknown multiprocessing crashes is hard without a clue to what crashed the process.

The simplest way I have found to track down multiprocessing crash information is to wrap the entire multiprocessing function in a try / except and use traceback.print_exc():

                  import traceback
                  def run(self, args):
                      try:
                          # Insert stuff to be multiprocessed here
                          return args[0]['that']
                      except:
                          print "FATAL: reader({0}) exited while multiprocessing".format(args) 
                          traceback.print_exc()
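
For context, here is a hypothetical Python 3 sketch of how such a wrapped method might end up running in a child process; the Reader class and its arguments are invented for illustration, and the traceback shown next comes from the author's original foo.py, so the line numbers will not match this sketch:

"""
foo_sketch.py - hypothetical harness for the wrapper pattern above
"""
import traceback
from multiprocessing import Process

class Reader(object):
    def __init__(self, args):
        self.run(args)

    def run(self, args):
        try:
            # Insert stuff to be multiprocessed here
            return args[0]['that']
        except Exception:
            # Without this, the KeyError in the child process would vanish silently
            print("FATAL: reader({0}) exited while multiprocessing".format(args))
            traceback.print_exc()

if __name__ == '__main__':
    reader_p = Process(target=Reader, args=([{'crash': 'this'}],))
    reader_p.start()
    reader_p.join()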
                  

                  Now, when you find a crash you see something like:

                  FATAL: reader([{'crash': 'this'}]) exited while multiprocessing
                  Traceback (most recent call last):
                    File "foo.py", line 19, in __init__
                      self.run(args)
                    File "foo.py", line 46, in run
                      KeyError: 'that'
                  

                  Source Code:


                  """
                  multi_pipe.py
                  """
                  from multiprocessing import Process, Pipe
                  import time
                  
                  def reader_proc(pipe):
                      ## Read from the pipe; this will be spawned as a separate Process
                      p_output, p_input = pipe
                      p_input.close()    # We are only reading
                      while True:
                          msg = p_output.recv()    # Read from the output pipe and do nothing
                          if msg=='DONE':
                              break
                  
                  def writer(count, p_input):
                      for ii in xrange(0, count):
                          p_input.send(ii)             # Write 'count' numbers into the input pipe
                      p_input.send('DONE')
                  
                  if __name__=='__main__':
                      for count in [10**4, 10**5, 10**6]:
                           # Pipe() is bidirectional (duplex) by default; here it is used one way:  p_input ------> p_output
                          p_output, p_input = Pipe()  # writer() writes to p_input from _this_ process
                          reader_p = Process(target=reader_proc, args=((p_output, p_input),))
                          reader_p.daemon = True
                          reader_p.start()     # Launch the reader process
                  
                          p_output.close()       # We no longer need this part of the Pipe()
                          _start = time.time()
                          writer(count, p_input) # Send a lot of stuff to reader_proc()
                          p_input.close()
                          reader_p.join()
                          print("Sending {0} numbers to Pipe() took {1} seconds".format(count,
                              (time.time() - _start)))
                  


                  """
                  multi_queue.py
                  """
                  
                  from multiprocessing import Process, Queue
                  import time
                  import sys
                  
                  def reader_proc(queue):
                      ## Read from the queue; this will be spawned as a separate Process
                      while True:
                          msg = queue.get()         # Read from the queue and do nothing
                          if (msg == 'DONE'):
                              break
                  
                  def writer(count, queue):
                      ## Write to the queue
                      for ii in range(0, count):
                          queue.put(ii)             # Write 'count' numbers into the queue
                      queue.put('DONE')
                  
                  if __name__=='__main__':
                      pqueue = Queue() # writer() writes to pqueue from _this_ process
                      for count in [10**4, 10**5, 10**6]:             
                          ### reader_proc() reads from pqueue as a separate process
                          reader_p = Process(target=reader_proc, args=((pqueue),))
                          reader_p.daemon = True
                          reader_p.start()        # Launch reader_proc() as a separate python process
                  
                          _start = time.time()
                          writer(count, pqueue)    # Send a lot of stuff to reader()
                          reader_p.join()         # Wait for the reader to finish
                          print("Sending {0} numbers to Queue() took {1} seconds".format(count, 
                              (time.time() - _start)))
                  


                  """
                  multi_joinablequeue.py
                  """
                  from multiprocessing import Process, JoinableQueue
                  import time
                  
                  def reader_proc(queue):
                      ## Read from the queue; this will be spawned as a separate Process
                      while True:
                          msg = queue.get()         # Read from the queue and do nothing
                          queue.task_done()
                  
                  def writer(count, queue):
                      for ii in xrange(0, count):
                          queue.put(ii)             # Write 'count' numbers into the queue
                  
                  if __name__=='__main__':
                      for count in [10**4, 10**5, 10**6]:
                          jqueue = JoinableQueue() # writer() writes to jqueue from _this_ process
                          # reader_proc() reads from jqueue as a different process...
                          reader_p = Process(target=reader_proc, args=((jqueue),))
                          reader_p.daemon = True
                          reader_p.start()     # Launch the reader process
                          _start = time.time()
                          writer(count, jqueue) # Send a lot of stuff to reader_proc() (in different process)
                          jqueue.join()         # Wait for the reader to finish
                          print("Sending {0} numbers to JoinableQueue() took {1} seconds".format(count, 
                              (time.time() - _start)))
                  

That concludes this article on Multiprocessing - Pipe vs Queue. We hope the answer above is helpful.

