pbootcms网站模板|日韩1区2区|织梦模板||网站源码|日韩1区2区|jquery建站特效-html5模板网


python multiprocessing: some functions do not return when they are complete (queue material too big)

This article explains how to handle "python multiprocessing: some functions do not return when they are complete (queue material too big)"; the discussion should be a useful reference for anyone facing the same problem.

Problem description


I am using multiprocessing's Process and Queue. I start several functions in parallel and most behave nicely: they finish, their output goes to their Queue, and they show up as .is_alive() == False. But for some reason a couple of functions are not behaving. They always show .is_alive() == True, even after the last line in the function (a print statement saying "Finished") is complete. This happens regardless of the set of functions I launch, even if there's only one. If not run in parallel, the functions behave fine and return normally. What might the problem be?
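The symptom can be reproduced with a minimal sketch (the worker name and sizes below are made up for illustration): a child process that puts a large object on a Queue stays alive after its function returns, because the queue's internal feeder thread blocks trying to flush the data through the pipe until the parent drains the queue.

```python
from multiprocessing import Process, Queue

def big_worker(q):
    q.put('x' * 10_000_000)  # ~10 MB: far larger than the pipe buffer
    print('Finished')        # this line runs, yet the process stays alive

def demo():
    q = Queue()
    p = Process(target=big_worker, args=(q,))
    p.start()
    p.join(timeout=2)            # times out: the feeder thread is still blocked
    alive_before = p.is_alive()  # True, even though big_worker has "finished"
    data = q.get()               # draining the queue unblocks the feeder thread
    p.join()                     # now the join succeeds
    return alive_before, p.is_alive(), len(data)

if __name__ == '__main__':
    print(demo())
```

With a small payload instead (say a few bytes), the same code joins immediately, which is why only the functions with big outputs misbehave.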

Here's the generic function I'm using to manage the jobs. All I'm not showing is the functions I'm passing to it. They're long, often use matplotlib, and sometimes launch shell commands, but I cannot figure out what the failing ones have in common.

def runFunctionsInParallel(listOf_FuncAndArgLists):
    """
    Take a list of lists like [function, arg1, arg2, ...]. Run those
    functions in parallel, wait for them all to finish, and return the
    list of their return values, in order.
    """
    from multiprocessing import Process, Queue

    def storeOutputFFF(fff, theArgs, que):  # add an argument to the function for assigning a queue
        print('MULTIPROCESSING: Launching %s in parallel ' % fff.__name__)
        que.put(fff(*theArgs))  # we're putting the return value into the queue
        print('MULTIPROCESSING: Finished %s in parallel! ' % fff.__name__)
        # We get this far even for "bad" functions
        return

    queues = [Queue() for fff in listOf_FuncAndArgLists]  # create a queue object for each function
    jobs = [Process(target=storeOutputFFF, args=[funcArgs[0], funcArgs[1:], queues[iii]])
            for iii, funcArgs in enumerate(listOf_FuncAndArgLists)]
    for job in jobs:
        job.start()  # launch them all

    import time
    from math import sqrt
    n = 1
    while any(jj.is_alive() for jj in jobs):  # debugging section shows progress updates
        n += 1
        time.sleep(5 + sqrt(n))  # wait a while before the next update; slow down updates for really long runs
        print('\n---------------------------------------------------\n' +
              '\t'.join(['alive?', 'Job', 'exitcode', 'Func']) +
              '\n---------------------------------------------------')
        print('\n'.join('%s:\t%s:\t%s:\t%s' % (job.is_alive() * 'Yes', job.name, job.exitcode,
                                               listOf_FuncAndArgLists[ii][0].__name__)
                        for ii, job in enumerate(jobs)))
        print('---------------------------------------------------\n')
    # I never get to the following line when one of the "bad" functions is running.
    for job in jobs:
        job.join()  # wait for them all to finish... Hm, is this needed to get at the Queues?
    # And now, collect all the outputs:
    return [queue.get() for queue in queues]

Recommended answer

Alright, it seems that the pipe used to fill the Queue gets plugged when the output of a function is too big (my crude understanding? This is an unresolved/closed bug: http://bugs.python.org/issue8237). I have modified the code in my question so that there is some buffering (queues are regularly emptied while the processes are running), which solves all my problems. So now this takes a collection of tasks (functions and their arguments), launches them, and collects the outputs. I wish it looked simpler and cleaner.
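The fix described above (empty each queue while its process is still running, and only call join() afterwards) can be sketched as follows. This is a minimal illustration, not the author's actual code; the worker and helper names are invented, and each worker is assumed to put exactly one result on its queue:

```python
from multiprocessing import Process, Queue
from queue import Empty

def worker(x, q):
    # a result big enough that join() could deadlock if the queue were left full
    q.put([x] * 500_000)

def run_and_drain(xs):
    """Launch one process per input and drain each queue BEFORE joining."""
    queues = [Queue() for _ in xs]
    jobs = [Process(target=worker, args=(x, q)) for x, q in zip(xs, queues)]
    for j in jobs:
        j.start()
    results = [None] * len(jobs)
    done = [False] * len(jobs)
    while not all(done):  # poll until every result has been collected
        for i, q in enumerate(queues):
            if not done[i]:
                try:
                    results[i] = q.get(timeout=0.1)
                    done[i] = True
                except Empty:
                    pass
    for j in jobs:
        j.join()  # safe now: the pipes behind the queues are empty
    return results

if __name__ == '__main__':
    out = run_and_drain([1, 2, 3])
    print([r[0] for r in out])  # [1, 2, 3]
```

Because get() runs concurrently with the workers, the feeder threads can always flush their pipes, so join() never blocks forever.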

Edit (Sep 2014; updated Nov 2017: rewritten for readability): I'm updating the code with the enhancements I've made since. The new code (same function, but better features) is here: https://gitlab.com/cpbl/cpblUtilities/blob/master/parallel.py

The calling documentation is also given below.

                  def runFunctionsInParallel(*args, **kwargs):
                      """ This is the main/only interface to class cRunFunctionsInParallel. See its documentation for arguments.
                      """
                      return cRunFunctionsInParallel(*args, **kwargs).launch_jobs()
                  
                  ###########################################################################################
                  ###
                  class cRunFunctionsInParallel():
                      ###
                      #######################################################################################
                      """Run any list of functions, each with any arguments and keyword-arguments, in parallel.
                  The functions/jobs should return (if anything) pickleable results. In order to avoid processes getting stuck due to the output queues overflowing, the queues are regularly collected and emptied.
You can now pass os.system or other built-in functions to this as the function, in order to parallelize at the OS level, with no need for a wrapper: I made use of hasattr(builtinfunction,'func_name') to check for a name.
                  Parameters
                  ----------
                  listOf_FuncAndArgLists : a list of lists 
                      List of up-to-three-element-lists, like [function, args, kwargs],
                      specifying the set of functions to be launched in parallel.  If an
                      element is just a function, rather than a list, then it is assumed
                      to have no arguments or keyword arguments. Thus, possible formats
                      for elements of the outer list are:
                        function
                        [function, list]
                        [function, list, dict]
                  kwargs: dict
                      One can also supply the kwargs once, for all jobs (or for those
                      without their own non-empty kwargs specified in the list)
                  names: an optional list of names to identify the processes.
    If omitted, the function name is used, so if all the functions are
    the same (i.e. merely called with different arguments), they would be
    indistinguishable by name.
                  offsetsSeconds: int or list of ints
                      delay some functions' start times
                  expectNonzeroExit: True/False
                      Normal behaviour is to not proceed if any function exits with a
                      failed exit code. This can be used to override this behaviour.
                  parallel: True/False
                      Whenever the list of functions is longer than one, functions will
                      be run in parallel unless this parameter is passed as False
                  maxAtOnce: int
                      If nonzero, this limits how many jobs will be allowed to run at
                      once.  By default, this is set according to how many processors
                      the hardware has available.
                  showFinished : int
                      Specifies the maximum number of successfully finished jobs to show
                      in the text interface (before the last report, which should always
                      show them all).
                  Returns
                  -------
                  Returns a tuple of (return codes, return values), each a list in order of the jobs provided.
                  Issues
                  -------
                  Only tested on POSIX OSes.
                  Examples
                  --------
                  See the testParallel() method in this module
                      """
                  



Related articles

What exactly is Python multiprocessing Module's .join() Method Doing?
Passing multiple parameters to pool.map() function in Python
multiprocessing.pool.MaybeEncodingError: 'TypeError("cannot serialize '_io.BufferedReader' object",)'
Python Multiprocess Pool. How to exit the script when one of the worker processes determines no more work needs to be done?
How do you pass a Queue reference to a function managed by pool.map_async()?
yet another confusion with multiprocessing error, 'module' object has no attribute 'f'