
Multiprocessing Queue.get() hangs

Question


I'm trying to implement basic multiprocessing and I've run into an issue. The python script is attached below.

                import time, sys, random, threading
                from multiprocessing import Process
                from Queue import Queue
                from FrequencyAnalysis import FrequencyStore, AnalyzeFrequency
                
                append_queue = Queue(10)
                database = FrequencyStore()
                
                def add_to_append_queue(_list):
                    append_queue.put(_list)
                
                def process_append_queue():
                    while True:
                        item = append_queue.get()
                        database.append(item)
                        print("Appended to database in %.4f seconds" % database.append_time)
                        append_queue.task_done()
                    return
                
                def main():
                    database.load_db()
                    print("Database loaded in %.4f seconds" % database.load_time)
                    append_queue_process = Process(target=process_append_queue)
                    append_queue_process.daemon = True
                    append_queue_process.start()
                    #t = threading.Thread(target=process_append_queue)
                    #t.daemon = True
                    #t.start()
                
                    while True:
                        path = raw_input("file: ")
                        if path == "exit":
                            break
                        a = AnalyzeFrequency(path)
                        a.analyze()
                        print("Analyzed file in %.4f seconds" % a._time)
                        add_to_append_queue(a.get_results())
                
                    append_queue.join()
                    #append_queue_process.join()
                    database.save_db()
                    print("Database saved in %.4f seconds" % database.save_time)
                    sys.exit(0)
                
                if __name__=="__main__":
                    main()
                

The AnalyzeFrequency analyzes the frequencies of words in a file and get_results() returns a sorted list of said words and frequencies. The list is very large, perhaps 10000 items.

This list is then passed to the add_to_append_queue method, which adds it to a queue. process_append_queue takes the items one by one and adds the frequencies to a "database". This operation takes a bit longer than the actual analysis in main(), so I am trying to use a separate process for this method. When I try to do this with the threading module, everything works perfectly fine, no errors. When I try to use Process, the script hangs at item = append_queue.get().
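The threading version works because all threads live in one process and share its memory, so the module-level queue really is a single object visible to both producer and consumer. A minimal Python 3 sketch of that working pattern (the `processed` list is a hypothetical stand-in for the database; the question's code is Python 2, so `queue.Queue` here corresponds to its `Queue.Queue`):

```python
import threading
import queue

processed = []  # stand-in for the database; shared because threads share memory

def process_append_queue(q):
    """Consume items until the None sentinel arrives."""
    while True:
        item = q.get()
        if item is None:          # sentinel: producer is done
            q.task_done()
            break
        processed.append(item)    # stand-in for database.append(item)
        q.task_done()

append_queue = queue.Queue(10)
worker = threading.Thread(target=process_append_queue, args=(append_queue,))
worker.start()

for word_list in (["foo", 3], ["bar", 1]):
    append_queue.put(word_list)
append_queue.put(None)            # tell the worker we're done

append_queue.join()               # returns once every put() has been task_done()'d
worker.join()
```

The same module-level-queue layout silently stops being shared once `Process` replaces `Thread` on a platform without fork(), which is the crux of the question below.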

Could someone please explain what is happening here, and perhaps direct me toward a fix?

All answers appreciated!

UPDATE

The pickle error was my fault, just a typo. I am now using the Queue class from multiprocessing, but the append_queue.get() method still hangs. NEW CODE:

                import time, sys, random
                from multiprocessing import Process, Queue
                from FrequencyAnalysis import FrequencyStore, AnalyzeFrequency
                
                append_queue = Queue()
                database = FrequencyStore()
                
                def add_to_append_queue(_list):
                    append_queue.put(_list)
                
                def process_append_queue():
                    while True:
                        database.append(append_queue.get())
                        print("Appended to database in %.4f seconds" % database.append_time)
                    return
                
                def main():
                    database.load_db()
                    print("Database loaded in %.4f seconds" % database.load_time)
                    append_queue_process = Process(target=process_append_queue)
                    append_queue_process.daemon = True
                    append_queue_process.start()
                    #t = threading.Thread(target=process_append_queue)
                    #t.daemon = True
                    #t.start()
                
                    while True:
                        path = raw_input("file: ")
                        if path == "exit":
                            break
                        a = AnalyzeFrequency(path)
                        a.analyze()
                        print("Analyzed file in %.4f seconds" % a._time)
                        add_to_append_queue(a.get_results())
                
                    #append_queue.join()
                    #append_queue_process.join()
                    print str(append_queue.qsize())
                    database.save_db()
                    print("Database saved in %.4f seconds" % database.save_time)
                    sys.exit(0)
                
                if __name__=="__main__":
                    main()
                

UPDATE 2

Here is the database code:

                class FrequencyStore:
                
                    def __init__(self):
                        self.sorter = Sorter()
                        self.db = {}
                        self.load_time = -1
                        self.save_time = -1
                        self.append_time = -1
                        self.sort_time = -1
                
                    def load_db(self):
                        start_time = time.time()
                
                        try:
                            file = open("results.txt", 'r')
                        except:
                            raise IOError
                
                        self.db = {}
                        for line in file:
                            word, count = line.strip("\n").split("=")
                            self.db[word] = int(count)
                        file.close()
                
                        self.load_time = time.time() - start_time
                
                    def save_db(self):
                        start_time = time.time()
                
                        _db = []
                        for key in self.db:
                            _db.append([key, self.db[key]])
                        _db = self.sort(_db)
                
                        try:
                            file = open("results.txt", 'w')
                        except:
                            raise IOError
                
                        file.truncate(0)
                        for x in _db:
                            file.write(x[0] + "=" + str(x[1]) + "\n")
                        file.close()
                
                        self.save_time = time.time() - start_time
                
                    def create_sorted_db(self):
                        _temp_db = []
                        for key in self.db:
                            _temp_db.append([key, self.db[key]])
                        _temp_db = self.sort(_temp_db)
                        _temp_db.reverse()
                        return _temp_db
                
                    def get_db(self):
                        return self.db
                
                    def sort(self, _list):
                        start_time = time.time()
                
                        _list = self.sorter.mergesort(_list)
                        _list.reverse()
                
                        self.sort_time = time.time() - start_time
                        return _list
                
                    def append(self, _list):
                        start_time = time.time()
                
                        for x in _list:
                            if x[0] not in self.db:
                                self.db[x[0]] = x[1]
                            else:
                                self.db[x[0]] += x[1]
                
                        self.append_time = time.time() - start_time
                

Answer

Comments suggest you're trying to run this on Windows. As I said in a comment,

If you're running this on Windows, it can't work - Windows doesn't have fork(), so each process gets its own Queue and they have nothing to do with each other. The entire module is imported "from scratch" by each process on Windows. You'll need to create the Queue in main(), and pass it as an argument to the worker function.

Here's fleshing out what you need to do to make it portable, although I removed all the database stuff because it's irrelevant to the problems you've described so far. I also removed the daemon fiddling, because that's usually just a lazy way to avoid shutting down things cleanly, and often as not will come back to bite you later:

                def process_append_queue(append_queue):
                    while True:
                        x = append_queue.get()
                        if x is None:
                            break
                        print("processed %d" % x)
                    print("worker done")
                
                def main():
                    import multiprocessing as mp
                
                    append_queue = mp.Queue(10)
                    append_queue_process = mp.Process(target=process_append_queue, args=(append_queue,))
                    append_queue_process.start()
                    for i in range(100):
                        append_queue.put(i)
                    append_queue.put(None)  # tell worker we're done
                    append_queue_process.join()
                
                if __name__=="__main__":
                    main()
                

The output is the "obvious" stuff:

                processed 0
                processed 1
                processed 2
                processed 3
                processed 4
                ...
                processed 96
                processed 97
                processed 98
                processed 99
                worker done
                

Note: because Windows doesn't (can't) fork(), it's impossible for worker processes to inherit any Python object on Windows. Each process runs the entire program from its start. That's why your original program couldn't work: each process created its own Queue, wholly unrelated to the Queue in the other process. In the approach shown above, only the main process creates a Queue, and the main process passes it (as an argument) to the worker process.

