
Trouble using a lock with multiprocessing.Pool: pickling error

This post covers how to handle the pickling error that appears when using a lock with multiprocessing.Pool; the recommended answer below should be a useful reference for anyone hitting the same problem.

Problem description



                  I'm building a python module to extract tags from a large corpus of text, and while its results are high quality it executes very slowly. I'm trying to speed the process up by using multiprocessing, and that was working too, until I tried to introduce a lock so that only one process was connecting to our database at a time. I can't figure out for the life of me how to make this work - despite much searching and tweaking I am still getting a PicklingError: Can't pickle <type 'thread.lock'>: attribute lookup thread.lock failed. Here's the offending code - it worked fine until I tried to pass a lock object as an argument for f.

from multiprocessing import Manager
from functools import partial


def make_network(initial_tag, max_tags=2, max_iter=3):
    manager = Manager()
    lock = manager.Lock()
    pool = manager.Pool(8)

    # this is a very expensive function that I would like to parallelize
    # over a list of tags. It involves a (relatively cheap) call to an external
    # database, which needs a lock to avoid simultaneous queries. It takes a list
    # of strings (tags) as its sole argument, and returns a list of sets with entries
    # corresponding to the input list.
    f = partial(get_more_tags, max_tags=max_tags, lock=lock)  # passing the lock here raises the PicklingError

    def _recursively_find_more_tags(tags, level):
        if level >= max_iter:
            raise StopIteration
        new_tags = pool.map(f, tags)
        to_search = []
        for i, s in zip(tags, new_tags):
            for t in s:
                joined = ' '.join(t)
                print(i + "|" + joined)
                to_search.append(joined)
        try:
            return _recursively_find_more_tags(to_search, level + 1)
        except StopIteration:
            return None

    _recursively_find_more_tags([initial_tag], 0)

Recommended answer


                  Your problem is that lock objects are not picklable. I can see two possible solutions for you in that case.
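
You can reproduce the root cause in isolation: trying to pickle a lock directly fails the same way pool.map does internally when the lock rides along as an argument. A minimal sketch (the exact exception text varies across Python versions):

import pickle
import multiprocessing

# Locks are meant to be shared by inheritance, not serialization, so
# pickling one raises an error (a RuntimeError on Python 3 explaining
# that locks should only be shared through inheritance; the
# PicklingError quoted in the question on Python 2).
pickle.dumps(multiprocessing.Lock())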


                  • To avoid this, you can make your lock variable a global variable. Then you will be able to reference it within your pool process function directly as a global variable, and will not have to pass it as an argument to the pool process function. This works because Python uses the OS fork mechanism when creating the pool processes and hence copies the entire contents of the process that creates the pool processes to them. This is the only way of passing a lock to a Python process created with the multiprocessing package. Incidentally, it is not necessary to use the Manager class just for this lock. With this change your code would look like this:

import multiprocessing
from functools import partial

lock = None  # Global definition of lock
pool = None  # Global definition of pool


def get_more_tags(tag, max_tags=2):
    # The worker references the lock as a global instead of receiving
    # it as an argument, so nothing unpicklable crosses the process
    # boundary. (Body elided in the original answer.)
    global lock
    pass


def make_network(initial_tag, max_tags=2, max_iter=3):
    global lock
    global pool
    lock = multiprocessing.Lock()
    pool = multiprocessing.Pool(8)

    # this is a very expensive function that I would like to parallelize
    # over a list of tags. It involves a (relatively cheap) call to an external
    # database, which needs a lock to avoid simultaneous queries. It takes a
    # list of strings (tags) as its sole argument, and returns a list of sets
    # with entries corresponding to the input list.
    f = partial(get_more_tags, max_tags=max_tags)

    def _recursively_find_more_tags(tags, level):
        if level >= max_iter:
            raise StopIteration
        new_tags = pool.map(f, tags)
        to_search = []
        for i, s in zip(tags, new_tags):
            for t in s:
                joined = ' '.join(t)
                print(i + "|" + joined)
                to_search.append(joined)
        try:
            return _recursively_find_more_tags(to_search, level + 1)
        except StopIteration:
            return None

    _recursively_find_more_tags([initial_tag], 0)


                  In your real code, it is possible that the lock and pool variables might be class instance variables.
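
A minimal sketch of that variant (the class and method names here are hypothetical, not from the original answer): the lock is still published as a module-level global before the pool is created, so forked workers inherit it, while the instance merely keeps references. Like the solution above, this relies on fork semantics.

import multiprocessing
from functools import partial

lock = None  # module-level so that forked pool workers inherit it


def _get_more_tags(tag, max_tags=2):
    # Hypothetical stand-in for the real worker: uses the inherited
    # global lock around the database call.
    with lock:
        return {(tag, "related")}  # placeholder result


class TagNetworkBuilder:
    """Keeps the lock and pool as instance attributes; the lock is
    also stored in the module global before the pool is created so
    the forked workers can see it."""

    def __init__(self, processes=8):
        global lock
        lock = multiprocessing.Lock()
        self.lock = lock  # instance-level reference
        self.pool = multiprocessing.Pool(processes)

    def expand(self, tags, max_tags=2):
        # Only the picklable partial and the tags travel to the
        # workers; the instance (and its lock) stays in the parent.
        f = partial(_get_more_tags, max_tags=max_tags)
        return self.pool.map(f, tags)

Usage would then be something like TagNetworkBuilder().expand(["python"]).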

• A second solution, which avoids locks altogether but may carry slightly higher overhead, is to create another process with multiprocessing.Process and connect it via a multiprocessing.Queue to each of your pool processes. This dedicated process is responsible for running your database queries. The pool processes use the queue to send query parameters to it, and since they all share the same queue, access to the database is automatically serialized. The additional overhead comes from pickling and unpickling the query arguments and the query response. Note that a queue can be passed to a pool process as an argument if it is a Manager().Queue() proxy; a plain multiprocessing.Queue can only be shared with pool workers through inheritance (for example via the pool's initializer). Note also that the multiprocessing.Lock based solution would not work on Windows, where processes are not created with fork semantics. A sketch of this design follows.
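
A minimal sketch of this queue-based design, assuming Python 3 (the names db_server and worker, and the placeholder query result, are illustrative, not from the original answer):

import multiprocessing


def db_server(requests):
    # Dedicated process: the only one that touches the database.
    # Because every query passes through this single loop, access is
    # serialized without any lock.
    while True:
        item = requests.get()
        if item is None:                # sentinel: shut down
            break
        query, reply = item
        reply.put("rows for " + query)  # stand-in for the real DB call


def worker(tag, requests, reply):
    # Pool worker: ships its query (plus a queue to answer on) to the
    # server instead of opening its own database connection.
    requests.put((tag, reply))
    return reply.get()


if __name__ == "__main__":
    manager = multiprocessing.Manager()
    requests = manager.Queue()          # manager proxies are picklable
    server = multiprocessing.Process(target=db_server, args=(requests,))
    server.start()

    tags = ["python", "multiprocessing"]
    replies = [manager.Queue() for _ in tags]  # one reply channel per task
    with multiprocessing.Pool(4) as pool:
        results = pool.starmap(
            worker, [(t, requests, r) for t, r in zip(tags, replies)])
    print(results)

    requests.put(None)                  # stop the server
    server.join()

Because the sketch uses manager queue proxies, which survive pickling, it also works with spawn-based process creation on Windows, unlike the global-lock approach.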

That concludes this look at the pickling error that comes up when using a lock with multiprocessing.Pool; we hope the recommended answer above helps.


