Question
I am downloading a CSV file from another server as a data feed from a vendor.
I am using curl to get the contents of the file and saving that into a variable called $contents.
I can get to that part just fine, but when I tried exploding the string by the line-break characters to get an array of lines, it failed with an 'out of memory' error.
I echoed strlen($contents) and it's about 30.5 million characters.
我需要操作這些值并將它們插入到數據庫中.我需要做什么來避免內存分配錯誤?
I need to manipulate the values and insert them into a database. What do I need to do to avoid memory allocation errors?
Recommended Answer
PHP is choking because it's running out of memory. Instead of having curl populate a PHP variable with the contents of the file, use the CURLOPT_FILE option to save the file to disk instead.
// Pseudo, untested code to give you the idea
$ch = curl_init('http://example.com/feed.csv'); // hypothetical feed URL
$fp = fopen('path/to/save/file', 'w');
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_exec($ch);
curl_close($ch);
fclose($fp);
然后,一旦文件被保存,而不是使用 file
或 file_get_contents
函數(這會將整個文件加載到內存中,再次殺死 PHP),使用 fopen
和 fgets 讀取文件一行一次.
Then, once the file is saved, instead of using the file
or file_get_contents
functions (which would load the entire file into memory, killing PHP again), use fopen
and fgets to read the file one line at a time.
這篇關于處理一個長度為 3000 萬個字符的字符串的文章就介紹到這了,希望我們推薦的答案對大家有所幫助,也希望大家多多支持html5模板網!