
Python Multiprocessing appending list

This article looks at appending to a list from multiple processes with Python's multiprocessing module. The question and recommended answer below should be a useful reference if you run into the same problem.

Problem description


I have a quick question about sharing a variable between multiple processes when using multiprocessing.Pool().

Will I run into any issues if I update a global list from within multiple processes, i.e. if two of the processes try to update the list at the same time?

I have seen documentation about using a Lock for this sort of thing, but I was wondering whether it is necessary here.

The way I am sharing this variable is through a global list used in my callback function, to which I append all of the successful actions after the target function has completed:

import multiprocessing as mp

TOTAL_SUCCESSES = []

def func(inputs):
    successes = []

    for input in inputs:
        result = 0  # placeholder: something that yields a return code
        if result == 0:
            successes.append(input)
    return successes

def callback(successes):
    global TOTAL_SUCCESSES

    for entry in successes:
        TOTAL_SUCCESSES.append(entry)

def main():
    pool = mp.Pool()
    for entry in myInputs:
        pool.apply_async(func, args=(entry,), callback=callback)
    pool.close()
    pool.join()

Apologies for any syntax errors; I wrote this up quickly. The program is working, I just want to know whether sharing the variable this way will cause issues.

Thanks in advance!

Recommended answer

With your current code, you're not actually sharing TOTAL_SUCCESSES between processes. callback is executed in the main process, in a result-handling thread. There is only one result-handling thread, so each callback runs one at a time, not concurrently. So your code as written is process/thread safe.
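To make that concrete, below is a minimal, self-contained sketch of the apply_async/callback pattern discussed here; the work function, its inputs, and the "even numbers succeed" rule are made up purely for illustration. The callback only ever runs on the pool's result-handling thread in the main process, so extending a plain list there needs no Lock:

import multiprocessing as mp

TOTAL_SUCCESSES = []

def work(batch):
    # Stand-in for real work: pretend even numbers return code 0 (success).
    return [n for n in batch if n % 2 == 0]

def collect(successes):
    # Executed in the main process on the pool's single result-handling thread,
    # so mutating this module-level list without a Lock is safe.
    TOTAL_SUCCESSES.extend(successes)

if __name__ == '__main__':
    pool = mp.Pool()
    for batch in [[1, 2, 3], [4, 5, 6], [7, 8, 9]]:
        pool.apply_async(work, args=(batch,), callback=collect)
    pool.close()
    pool.join()  # wait for all tasks and their callbacks to finish
    print(TOTAL_SUCCESSES)  # e.g. [2, 4, 6, 8]; batch completion order may vary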

One thing to double-check: func must actually return successes; if the return is missing, the callback receives None instead of the list.

Also, this could be written much more succinctly using map:

import multiprocessing as mp

def func(inputs):
    successes = []

    for input in inputs:
        result = 0  # placeholder: something that yields a return code
        if result == 0:
            successes.append(input)
    return successes

def main():
    pool = mp.Pool()
    total_successes = pool.map(func, myInputs)  # returns a list of lists
    # Flatten the list of lists
    total_successes = [ent for sublist in total_successes for ent in sublist]

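As a small usage note (not part of the original answer), the flattening step can also be done with itertools.chain.from_iterable, and the pool can be used as a context manager so it is cleaned up automatically. This sketch assumes the func and myInputs from the snippet above:

import itertools
import multiprocessing as mp

def main():
    with mp.Pool() as pool:
        nested = pool.map(func, myInputs)  # one list of successes per input batch
    # Flatten the list of lists into a single list
    total_successes = list(itertools.chain.from_iterable(nested))
    return total_successes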
That wraps up this article on appending to a list with Python multiprocessing. We hope the recommended answer above is helpful.


Related articles

What exactly is Python multiprocessing Module's .join() Method Doing?
Passing multiple parameters to pool.map() function in Python
multiprocessing.pool.MaybeEncodingError: 'TypeError("cannot serialize '_io.BufferedReader' object",)'
Python Multiprocess Pool. How to exit the script when one of the worker processes determines no more work needs to be done?
How do you pass a Queue reference to a function managed by pool.map_async()?
yet another confusion with multiprocessing error, 'module' object has no attribute 'f'
