
        producer/consumer problem with python multiprocessing



                   Problem description


                   I am writing a server program with one producer and multiple consumers. What confuses me is that only the first task the producer puts into the queue gets consumed; tasks enqueued after that are never consumed and remain in the queue forever.

                   from multiprocessing import Process, Queue, cpu_count
                   from http import httpserv  # the asker's own module (producer code below)
                   import time
                   
                   def work(queue):
                       while True:
                           task = queue.get()
                           if task is None:
                               break
                           time.sleep(5)
                           print("task done:", task)
                       queue.put(None)  # relay the sentinel so the next worker also stops
                   
                   class Manager:
                       def __init__(self):
                           self.queue = Queue()
                           self.NUMBER_OF_PROCESSES = cpu_count()
                   
                       def start(self):
                           self.workers = [Process(target=work, args=(self.queue,))
                                           for i in range(self.NUMBER_OF_PROCESSES)]
                           for w in self.workers:
                               w.start()
                   
                           httpserv(self.queue)
                   
                       def stop(self):
                           self.queue.put(None)
                           for i in range(self.NUMBER_OF_PROCESSES):
                               self.workers[i].join()
                           self.queue.close()  # was queue.close(): a NameError
                   
                   Manager().start()
                  

                   The producer is an HTTP server which puts a task into the queue whenever it receives a request from a user. It seems the consumer processes remain blocked even when there are new tasks in the queue, which is weird.

                   P.S. Two further questions, not related to the above: I am not sure whether it is better to put the HTTP server in its own process rather than in the main process; if so, how can I keep the main process running until all child processes end? Second, what is the best way to stop the HTTP server gracefully?

                   Edit: adding the producer code; it is just a simple Python WSGI server:

                  import fapws._evwsgi as evwsgi
                  from fapws import base
                  
                  def httpserv(queue):
                      evwsgi.start("0.0.0.0", 8080)
                      evwsgi.set_base_module(base)
                  
                      def request_1(environ, start_response):
                          start_response('200 OK', [('Content-Type','text/html')])
                          queue.put('task_1')
                          return ["request 1!"]
                  
                      def request_2(environ, start_response):
                          start_response('200 OK', [('Content-Type','text/html')])
                          queue.put('task_2')
                          return ["request 2!!"]
                  
                      evwsgi.wsgi_cb(("/request_1", request_1))
                      evwsgi.wsgi_cb(("/request_2", request_2))
                  
                      evwsgi.run()
                  

                   Recommended answer

                   I think there must be something wrong with the web-server part, as this works perfectly:

                   from multiprocessing import Process, Queue, cpu_count
                   import random
                   import time
                   
                   
                   def serve(queue):
                       works = ["task_1", "task_2"]
                       while True:
                           time.sleep(0.01)
                           queue.put(random.choice(works))
                   
                   
                   def work(worker_id, queue):  # renamed from `id`, which shadows the builtin
                       while True:
                           task = queue.get()
                           if task is None:
                               break
                           time.sleep(0.05)
                           print("%d task:" % worker_id, task)
                       queue.put(None)
                   
                   
                   class Manager:
                       def __init__(self):
                           self.queue = Queue()
                           self.NUMBER_OF_PROCESSES = cpu_count()
                   
                       def start(self):
                           print("starting %d workers" % self.NUMBER_OF_PROCESSES)
                           self.workers = [Process(target=work, args=(i, self.queue,))
                                           for i in range(self.NUMBER_OF_PROCESSES)]
                           for w in self.workers:
                               w.start()
                   
                           serve(self.queue)
                   
                       def stop(self):
                           self.queue.put(None)
                           for i in range(self.NUMBER_OF_PROCESSES):
                               self.workers[i].join()
                           self.queue.close()
                   
                   
                   Manager().start()
                  

                   Sample output:

                  starting 2 workers
                  0 task: task_1
                  1 task: task_2
                  0 task: task_2
                  1 task: task_1
                  0 task: task_1
                  
