
        Generating a JavaScript file hash value from part of a file

                  Problem description


                  I am working with JavaScript to generate a file HASH VALUE that uniquely identifies a file. Please check the code below for the hash-generation mechanism, which works well.

                  <script type="text/javascript">
                  // Reference: https://code.google.com/p/crypto-js/#MD5
                  function handleFileSelect(evt) 
                  {   
                      var files = evt.target.files; // FileList object
                      // Loop through the FileList and render image files as thumbnails.
                      for (var i = 0, f; f = files[i]; i++) 
                      {
                          var reader = new FileReader();
                          // Closure to capture the file information.
                          reader.onload = (function(theFile) 
                          {
                              return function(e) 
                              {
                                  var span = document.createElement('span');
                                  var test = e.target.result;                 
                                  //var hash = hex_md5(test);
                                  var hash = CryptoJS.MD5(test);
                                  var elem = document.getElementById("hashValue");
                                  elem.value = hash;
                              };
                          })(f);
                          // Read in the image file as a data URL.
                          reader.readAsBinaryString(f);
                      }
                  }
                  document.getElementById('videoupload').addEventListener('change', handleFileSelect, false);
                  </script>
                  

                  However, I am facing a problem when generating the HASH VALUE for large files, as the browser crashes on the client side.

                  Up to 30 MB the hashing works well, but if I try to upload anything larger than that the browser crashes.

                  My questions are:

                  1. Can I generate the HASH value for part of a file, rather than reading the whole large file and crashing? If yes, how can I do that with 'FileReader'?

                  2. Can I specify an arbitrary number of bytes (such as the first 2000 characters) of a file to generate the HASH value, instead of hashing the whole large file?

                  I hope one of the two solutions above will work for both large and small files. Are there any other options?

                  My Fiddle Demo

                  Solution

                  1. Can I generate the HASH value for part of a file, rather than reading the whole large file and crashing? If yes, how can I do that with 'FileReader'?

                  Yes, you can do that and it is called Progressive Hashing.

                  var md5 = CryptoJS.algo.MD5.create();
                  
                  md5.update("file part 1");
                  md5.update("file part 2");
                  md5.update("file part 3");
                  
                  var hash = md5.finalize();
                  

                  2. Can I specify an arbitrary number of bytes (such as the first 2000 characters) of a file to generate the HASH value, instead of hashing the whole large file?

                  There's an HTML5Rocks article on how one can use File.slice to pass a sliced file to the FileReader:

                  var blob = file.slice(startingByte, endingByte);
                  reader.readAsArrayBuffer(blob);
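Combining slicing with a loop means working out the byte range of each chunk. A minimal sketch of that arithmetic (`chunkRanges` is a hypothetical helper name, mirroring the slice bounds used in the full solution below):

```javascript
// Compute [start, end) byte ranges for slicing a file of the given size,
// mirroring f.slice(j * chunksize, Math.min((j+1) * chunksize, f.size)).
// The last chunk may be shorter than chunkSize.
function chunkRanges(fileSize, chunkSize) {
  const ranges = [];
  for (let start = 0; start < fileSize; start += chunkSize) {
    ranges.push([start, Math.min(start + chunkSize, fileSize)]);
  }
  return ranges;
}

console.log(chunkRanges(2500000, 1000000));
// → [[0, 1000000], [1000000, 2000000], [2000000, 2500000]]
```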
                  

                  Full solution

                  I have combined both. The tricky part was synchronizing the file reading, because FileReader.readAsArrayBuffer() is asynchronous. I've written a small series function modeled after the series function of async.js. The chunks have to be processed one after the other, because there is no way to access the internal state of CryptoJS's hashing function.

                  Additionally, CryptoJS doesn't understand what an ArrayBuffer is, so it has to be converted to its native data representation, which is the so-called WordArray:

                  function arrayBufferToWordArray(ab) {
                    var i8a = new Uint8Array(ab);
                    var a = [];
                    for (var i = 0; i < i8a.length; i += 4) {
                      a.push(i8a[i] << 24 | i8a[i + 1] << 16 | i8a[i + 2] << 8 | i8a[i + 3]);
                    }
                    return CryptoJS.lib.WordArray.create(a, i8a.length);
                  }
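The bit-shifting packs four bytes into one big-endian 32-bit word. It can be checked in isolation; this sketch returns the plain word array (a simplification: the CryptoJS.lib.WordArray.create call is replaced by returning the raw words) so it runs without the library:

```javascript
// Pack bytes into big-endian 32-bit words, exactly as arrayBufferToWordArray
// does before handing them to CryptoJS. Returns the raw words here so the
// logic can be inspected without the library.
function packWords(bytes) {
  const i8a = Uint8Array.from(bytes);
  const words = [];
  for (let i = 0; i < i8a.length; i += 4) {
    // Out-of-range reads yield undefined, and `undefined << n` is 0,
    // so a trailing partial word is zero-padded just like the original.
    words.push(i8a[i] << 24 | i8a[i + 1] << 16 | i8a[i + 2] << 8 | i8a[i + 3]);
  }
  return words;
}

console.log(packWords([0x01, 0x02, 0x03, 0x04])); // [16909060], i.e. [0x01020304]
```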
                  

                  The other thing is that hashing is a synchronous operation with no yield point where execution could continue elsewhere. Because of this, the browser would freeze, since JavaScript is single-threaded. The solution is to use Web Workers to offload the hashing to a different thread so that the UI thread stays responsive.
                  Web Workers expect a script file URL in their constructor, so I used this solution by Rob W to create one from an inline script.

                  function series(tasks, done){
                      if(!tasks || tasks.length === 0) {
                          done();
                      } else {
                          tasks[0](function(){
                              series(tasks.slice(1), done);
                          });
                      }
                  }
                  
                  function webWorkerOnMessage(e){
                      if (e.data.type === "create") {
                          md5 = CryptoJS.algo.MD5.create();
                          postMessage({type: "create"});
                      } else if (e.data.type === "update") {
                          function arrayBufferToWordArray(ab) {
                              var i8a = new Uint8Array(ab);
                              var a = [];
                              for (var i = 0; i < i8a.length; i += 4) {
                                  a.push(i8a[i] << 24 | i8a[i + 1] << 16 | i8a[i + 2] << 8 | i8a[i + 3]);
                              }
                              return CryptoJS.lib.WordArray.create(a, i8a.length);
                          }
                          md5.update(arrayBufferToWordArray(e.data.chunk));
                          postMessage({type: "update"});
                      } else if (e.data.type === "finish") {
                          postMessage({type: "finish", hash: ""+md5.finalize()});
                      }
                  }
                  
                  // URL.createObjectURL
                  window.URL = window.URL || window.webkitURL;
                  
                  // "Server response", used in all examples
                  var response = 
                      "importScripts('https://cdn.rawgit.com/CryptoStore/crypto-js/3.1.2/build/rollups/md5.js');"+
                      "var md5;"+
                      "self.onmessage = "+webWorkerOnMessage.toString();
                  
                  var blob;
                  try {
                      blob = new Blob([response], {type: 'application/javascript'});
                  } catch (e) { // Backwards-compatibility
                      window.BlobBuilder = window.BlobBuilder || window.WebKitBlobBuilder || window.MozBlobBuilder;
                      blob = new BlobBuilder();
                      blob.append(response);
                      blob = blob.getBlob();
                  }
                  var worker = new Worker(URL.createObjectURL(blob));
                  
                  
                  var files = evt.target.files; // FileList object (evt comes from the surrounding 'change' event handler)
                  var chunksize = 1000000; // the chunk size doesn't make a difference
                  var i = 0, 
                      f = files[i],
                      chunks = Math.ceil(f.size / chunksize),
                      chunkTasks = [],
                      startTime = (new Date()).getTime();
                  worker.onmessage = function(e) {
                      // create callback
                  
                      for(var j = 0; j < chunks; j++){
                          (function(j, f){
                              chunkTasks.push(function(next){
                                  var blob = f.slice(j * chunksize, Math.min((j+1) * chunksize, f.size));
                                  var reader = new FileReader();
                  
                                  reader.onload = function(e) {
                                      var chunk = e.target.result;
                                      worker.onmessage = function(e) {
                                          // update callback
                                          document.getElementById('num').innerHTML = ""+(j+1)+"/"+chunks;
                                          next();
                                      };
                                      worker.postMessage({type: "update", chunk: chunk});
                                  };
                                  reader.readAsArrayBuffer(blob);
                              });
                          })(j, f);
                      }
                      series(chunkTasks, function(){
                          var elem = document.getElementById("hashValueSplit");
                          var telem = document.getElementById("time");
                          worker.onmessage = function(e) {
                              // finish callback
                              elem.value = e.data.hash;
                              telem.innerHTML = "in " + Math.ceil(((new Date()).getTime() - startTime) / 1000) + " seconds";
                          };
                          worker.postMessage({type: "finish"});
                      });
                  
                      // blocking way ahead...
                      if (document.getElementById("singleHash").checked) {
                          var reader = new FileReader();
                  
                          // Closure to capture the file information.
                          reader.onloadend = (function(theFile) {
                              function arrayBufferToWordArray(ab) {
                                  var i8a = new Uint8Array(ab);
                                  var a = [];
                                  for (var i = 0; i < i8a.length; i += 4) {
                                      a.push(i8a[i] << 24 | i8a[i + 1] << 16 | i8a[i + 2] << 8 | i8a[i + 3]);
                                  }
                                  return CryptoJS.lib.WordArray.create(a, i8a.length);
                              }
                              return function(e) {
                                  var test = e.target.result;
                                  var hash = CryptoJS.MD5(arrayBufferToWordArray(test));
                                  //var hash = "none";
                                  var elem = document.getElementById("hashValue");
                                  elem.value = hash;
                              };
                          })(f);
                  
                          // Read in the image file as a data URL.
                          reader.readAsArrayBuffer(f);
                      }
                  };
                  worker.postMessage({type: "create"});
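For reference, the series helper behaves like async.js's series: each task runs only after the previous one calls next, and the final callback fires last. A standalone check, independent of the worker code (series is repeated here so the snippet is self-contained):

```javascript
// Run async tasks strictly one after another; each task receives a `next`
// callback, and `done` fires once every task has completed.
function series(tasks, done) {
    if (!tasks || tasks.length === 0) {
        done();
    } else {
        tasks[0](function () {
            series(tasks.slice(1), done);
        });
    }
}

// Completion order is preserved even when tasks finish at different speeds.
const order = [];
series([
    next => setTimeout(() => { order.push(1); next(); }, 20),
    next => setTimeout(() => { order.push(2); next(); }, 5),
    next => { order.push(3); next(); }
], () => {
    console.log(order); // [1, 2, 3] — sequential despite differing delays
});
```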
                  

                  The DEMO seems to work for big files, but it takes quite a lot of time; perhaps this could be improved with a faster MD5 implementation. Hashing a 3 GB file took around 23 minutes.
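To put that figure in perspective, a rough back-of-the-envelope throughput calculation (assuming exactly 3 GB and 23 minutes):

```javascript
// Rough throughput of the demo: 3 GB hashed in about 23 minutes.
const bytes = 3 * 1024 * 1024 * 1024;
const seconds = 23 * 60;
const mbPerSec = bytes / seconds / (1024 * 1024);
console.log(mbPerSec.toFixed(2) + ' MB/s'); // 2.23 MB/s
```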

                  This answer of mine shows an example without Web Workers for SHA-256.
