        What can multiprocessing and dill do together?


                  Problem Description


                  I would like to use the multiprocessing library in Python. Sadly, multiprocessing uses pickle, which doesn't support functions with closures, lambdas, or functions in __main__. All three of these are important to me:

                  In [1]: import pickle
                  
                  In [2]: pickle.dumps(lambda x: x)
                  PicklingError: Can't pickle <function <lambda> at 0x23c0e60>: it's not found as __main__.<lambda>
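                  The session above is Python 2.7, but the same three restrictions hold in Python 3. A minimal stdlib-only sketch (the names module_level and make_adder are illustrative, not from the question) that shows why module-level functions survive while closures and lambdas do not:

```python
import pickle

def module_level(x):
    return x * x

def make_adder(n):
    def add(x):  # a closure over n
        return x + n
    return add

# Module-level functions pickle by reference (just their qualified name),
# so they round-trip fine.
assert pickle.loads(pickle.dumps(module_level)) is module_level

# Closures and lambdas have no importable qualified name, so stock pickle
# rejects them (PicklingError or AttributeError, depending on the case).
for bad in (make_adder(1), lambda x: x):
    try:
        pickle.dumps(bad)
        raised = False
    except (pickle.PicklingError, AttributeError):
        raised = True
    assert raised
```

This is the crux: pickle serializes functions as references to an importable name, which lambdas, closures, and interactively defined functions simply don't have.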
                  

                  Fortunately there is dill, a more robust pickle. Apparently dill performs magic on import to make pickle work:

                  In [3]: import dill
                  
                  In [4]: pickle.dumps(lambda x: x)
                  Out[4]: "cdill.dill
                  _load_type
                  p0
                  (S'FunctionType'
                  p1 ...
                  

                  This is very encouraging, particularly because I don't have access to the multiprocessing source code. Sadly, I still can't get this very basic example to work:

                  import multiprocessing as mp
                  import dill
                  
                  p = mp.Pool(4)
                  print p.map(lambda x: x**2, range(10))
                  

                  Why is this? What am I missing? Exactly what are the limitations of the multiprocessing+dill combination?

                  Temporary Edit for J.F Sebastian

                  mrockli@mrockli-notebook:~/workspace/toolz$ python testmp.py 
                  Exception in thread Thread-2:
                  Traceback (most recent call last):
                    File "/home/mrockli/Software/anaconda/lib/python2.7/threading.py", line 808, in __bootstrap_inner
                      self.run()
                    File "/home/mrockli/Software/anaconda/lib/python2.7/threading.py", line 761, in run
                      self.__target(*self.__args, **self.__kwargs)
                    File "/home/mrockli/Software/anaconda/lib/python2.7/multiprocessing/pool.py", line 342, in _handle_tasks
                      put(task)
                  PicklingError: Can't pickle <type 'function'>: attribute lookup __builtin__.function failed

                  ^C
                  ...lots of junk...

                  [DEBUG/MainProcess] cleaning up worker 3
                  [DEBUG/MainProcess] cleaning up worker 2
                  [DEBUG/MainProcess] cleaning up worker 1
                  [DEBUG/MainProcess] cleaning up worker 0
                  [DEBUG/MainProcess] added worker
                  [DEBUG/MainProcess] added worker
                  [INFO/PoolWorker-5] child process calling self.run()
                  [INFO/PoolWorker-6] child process calling self.run()
                  [DEBUG/MainProcess] added worker
                  [INFO/PoolWorker-7] child process calling self.run()
                  [DEBUG/MainProcess] added worker
                  [INFO/PoolWorker-8] child process calling self.run()
                  

                  Answer

                  multiprocessing makes some bad choices about pickling. Don't get me wrong, it makes some good choices that enable it to pickle certain types so they can be used in a pool's map function. However, since we have dill that can do the pickling, multiprocessing's own pickling becomes a bit limiting. Actually, if multiprocessing were to use pickle instead of cPickle... and also drop some of its own pickling overrides, then dill could take over and give a much fuller serialization for multiprocessing.
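                  A common interim workaround (a sketch, not part of this answer's pathos approach) is to ship the callable as an encoded byte payload and decode it inside the worker, so the pool itself only ever pickles a bytes object plus one module-level helper. In real use you would build the payload with dill.dumps so lambdas and closures work; stdlib pickle is used below only so the sketch runs without third-party packages, and the helper names run_encoded and map_encoded are illustrative:

```python
import pickle
import multiprocessing as mp

def run_encoded(payload):
    # Decode (function, args) inside the worker. Swap pickle for dill
    # here and in map_encoded to support lambdas and closures.
    fun, args = pickle.loads(payload)
    return fun(*args)

def square(x):
    return x * x

def map_encoded(pool, fun, arglist):
    # The pool only sees bytes plus the module-level run_encoded helper,
    # both of which stock pickle handles happily.
    payloads = [pickle.dumps((fun, (a,))) for a in arglist]
    return pool.map(run_encoded, payloads)

if __name__ == '__main__':
    with mp.Pool(4) as pool:
        print(map_encoded(pool, square, range(10)))
        # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```

With dill building the payload, map_encoded(pool, lambda x: x**2, range(10)) would work too, which is exactly the failing example from the question.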

                  Until that happens, there's a fork of multiprocessing called pathos (the release version is a bit stale, unfortunately) that removes the above limitations. Pathos also adds some nice features that multiprocessing doesn't have, like support for multiple arguments in the map function. Pathos is due for a release, after some mild updating -- mostly conversion to Python 3.x.

                  Python 2.7.5 (default, Sep 30 2013, 20:15:49) 
                  [GCC 4.2.1 (Apple Inc. build 5566)] on darwin
                  Type "help", "copyright", "credits" or "license" for more information.
                  >>> import dill
                  >>> from pathos.multiprocessing import ProcessingPool    
                  >>> pool = ProcessingPool(nodes=4)
                  >>> result = pool.map(lambda x: x**2, range(10))
                  >>> result
                  [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
                  


                  and just to show off a little of what pathos.multiprocessing can do...

                  >>> def busy_add(x,y, delay=0.01):
                  ...     for n in range(x):
                  ...        x += n
                  ...     for n in range(y):
                  ...        y -= n
                  ...     import time
                  ...     time.sleep(delay)
                  ...     return x + y
                  ... 
                  >>> def busy_squared(x):
                  ...     import time, random
                  ...     time.sleep(2*random.random())
                  ...     return x*x
                  ... 
                  >>> def squared(x):
                  ...     return x*x
                  ... 
                  >>> def quad_factory(a=1, b=1, c=0):
                  ...     def quad(x):
                  ...         return a*x**2 + b*x + c
                  ...     return quad
                  ... 
                  >>> square_plus_one = quad_factory(2,0,1)
                  >>> 
                  >>> def test1(pool):
                  ...     print pool
                  ...     print "x: %s\n" % str(x)
                  ...     print pool.map.__name__
                  ...     start = time.time()
                  ...     res = pool.map(squared, x)
                  ...     print "time to results:", time.time() - start
                  ...     print "y: %s\n" % str(res)
                  ...     print pool.imap.__name__
                  ...     start = time.time()
                  ...     res = pool.imap(squared, x)
                  ...     print "time to queue:", time.time() - start
                  ...     start = time.time()
                  ...     res = list(res)
                  ...     print "time to results:", time.time() - start
                  ...     print "y: %s\n" % str(res)
                  ...     print pool.amap.__name__
                  ...     start = time.time()
                  ...     res = pool.amap(squared, x)
                  ...     print "time to queue:", time.time() - start
                  ...     start = time.time()
                  ...     res = res.get()
                  ...     print "time to results:", time.time() - start
                  ...     print "y: %s\n" % str(res)
                  ... 
                  >>> def test2(pool, items=4, delay=0):
                  ...     _x = range(-items/2,items/2,2)
                  ...     _y = range(len(_x))
                  ...     _d = [delay]*len(_x)
                  ...     print map
                  ...     res1 = map(busy_squared, _x)
                  ...     res2 = map(busy_add, _x, _y, _d)
                  ...     print pool.map
                  ...     _res1 = pool.map(busy_squared, _x)
                  ...     _res2 = pool.map(busy_add, _x, _y, _d)
                  ...     assert _res1 == res1
                  ...     assert _res2 == res2
                  ...     print pool.imap
                  ...     _res1 = pool.imap(busy_squared, _x)
                  ...     _res2 = pool.imap(busy_add, _x, _y, _d)
                  ...     assert list(_res1) == res1
                  ...     assert list(_res2) == res2
                  ...     print pool.amap
                  ...     _res1 = pool.amap(busy_squared, _x)
                  ...     _res2 = pool.amap(busy_add, _x, _y, _d)
                  ...     assert _res1.get() == res1
                  ...     assert _res2.get() == res2
                  ...     print ""
                  ... 
                  >>> def test3(pool): # test against a function that should fail in pickle
                  ...     print pool
                  ...     print "x: %s\n" % str(x)
                  ...     print pool.map.__name__
                  ...     start = time.time()
                  ...     res = pool.map(square_plus_one, x)
                  ...     print "time to results:", time.time() - start
                  ...     print "y: %s\n" % str(res)
                  ... 
                  >>> def test4(pool, maxtries, delay):
                  ...     print pool
                  ...     m = pool.amap(busy_add, x, x)
                  ...     tries = 0
                  ...     while not m.ready():
                  ...         time.sleep(delay)
                  ...         tries += 1
                  ...         print "TRY: %s" % tries
                  ...         if tries >= maxtries:
                  ...             print "TIMEOUT"
                  ...             break
                  ...     print m.get()
                  ... 
                  >>> import time
                  >>> x = range(18)
                  >>> delay = 0.01
                  >>> items = 20
                  >>> maxtries = 20
                  >>> from pathos.multiprocessing import ProcessingPool as Pool
                  >>> pool = Pool(nodes=4)
                  >>> test1(pool)
                  <pool ProcessingPool(ncpus=4)>
                  x: [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17]
                  
                  map
                  time to results: 0.0553691387177
                  y: [0, 1, 4, 9, 16, 25, 36, 49, 64, 81, 100, 121, 144, 169, 196, 225, 256, 289]
                  
                  imap
                  time to queue: 7.91549682617e-05
                  time to results: 0.102381229401
                  y: [0, 1, 4, 9, 16, 25, 36, 49, 64, 81, 100, 121, 144, 169, 196, 225, 256, 289]
                  
                  amap
                  time to queue: 7.08103179932e-05
                  time to results: 0.0489699840546
                  y: [0, 1, 4, 9, 16, 25, 36, 49, 64, 81, 100, 121, 144, 169, 196, 225, 256, 289]
                  
                  >>> test2(pool, items, delay)
                  <built-in function map>
                  <bound method ProcessingPool.map of <pool ProcessingPool(ncpus=4)>>
                  <bound method ProcessingPool.imap of <pool ProcessingPool(ncpus=4)>>
                  <bound method ProcessingPool.amap of <pool ProcessingPool(ncpus=4)>>
                  
                  >>> test3(pool)
                  <pool ProcessingPool(ncpus=4)>
                  x: [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17]
                  
                  map
                  time to results: 0.0523059368134
                  y: [1, 3, 9, 19, 33, 51, 73, 99, 129, 163, 201, 243, 289, 339, 393, 451, 513, 579]
                  
                  >>> test4(pool, maxtries, delay)
                  <pool ProcessingPool(ncpus=4)>
                  TRY: 1
                  TRY: 2
                  TRY: 3
                  TRY: 4
                  TRY: 5
                  TRY: 6
                  TRY: 7
                  [0, 2, 4, 6, 8, 10, 12, 14, 16, 18, 20, 22, 24, 26, 28, 30, 32, 34]
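                  The amap/ready() polling pattern in test4 has a stdlib analogue via Pool.starmap_async (Python 3.3+), shown in the sketch below with illustrative helper names. Note the callable must be a plain module-level function for the stdlib version, which is exactly the restriction pathos lifts:

```python
import time
import multiprocessing as mp

def busy_add(x, y, delay=0.01):
    # Module-level, so stock pickle can ship it to the workers.
    time.sleep(delay)
    return x + y

def poll(result, maxtries=100, delay=0.05):
    # Same shape as test4: poll ready() instead of blocking on get().
    tries = 0
    while not result.ready():
        time.sleep(delay)
        tries += 1
        if tries >= maxtries:
            break
    return result.get(timeout=5)

if __name__ == '__main__':
    with mp.Pool(4) as pool:
        # starmap_async is the stdlib route to multi-argument async maps.
        res = pool.starmap_async(busy_add, [(x, x) for x in range(18)])
        print(poll(res))
        # [0, 2, 4, 6, 8, 10, 12, 14, 16, 18, 20, 22, 24, 26, 28, 30, 32, 34]
```

The design trade-off is visible here: multiprocessing needs the zipped-tuple dance and a picklable named function, while pathos' amap takes the argument lists directly and accepts closures and lambdas too.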
                  

