Problem Description
I've looked around for this and I can't seem to find anyone who is trying to do exactly what I am.
I have information that is passed in to my function via a _POST request. Based on that data, I run an exec command to run a TCL script a certain number of times (with different parameters, based on the post variable). Right now, I have the exec in a foreach so this takes forever to run (the TCL script takes 15 or so seconds to come back, so if I need to run it 100 times, I have a bit of an issue). Here is my code:
public function executeAction(){
    // code to parse the _POST variable into an array called $devices
    foreach($devices as $devID => $device){
        exec("../path/to/script.tcl -parameter1 ".$device['param1']." -parameter2 ".$device['param2'], $execout[$devID]);
    }
    print_r($execout);
}
Obviously this code is just an excerpt with big chunks removed, but hopefully it's enough to demonstrate what I'm trying to do.
I need to run all of the execs at once and I need to wait for them all to complete before returning. I also need the output of all of the scripts stored in the array called $execout.
Any ideas?
Thanks!!!
Recommended Answer
If you put your exec() call in a separate script, you can call that external script multiple times in parallel using curl_multi_exec(). That way, you'd make all the calls in separate requests, so they could execute simultaneously. Poll &$still_running to see when all requests have finished, after which you can collect the results from each.
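For the first half of that idea, the separate script that wraps the exec() call might look roughly like the sketch below. The file name runScript.php and the GET parameter names are made up for illustration (they are not from the question or the answer); the TCL script path is the one from the question, and escapeshellarg() is used because the values arrive from the request.

<?php
// runScript.php -- hypothetical worker endpoint; the name and parameter names are illustrative
// Runs the TCL script once with the parameters passed in the request.
$param1 = escapeshellarg($_GET['param1']);
$param2 = escapeshellarg($_GET['param2']);
exec("../path/to/script.tcl -parameter1 $param1 -parameter2 $param2", $output);
// send the script's output back as the response body
echo implode("\n", $output);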
Update: Here's a blog post detailing exactly what I'm describing.
Based on the blog post linked above, I put together the following example.
The script to be run in parallel:
<?php
// waitAndDate.php
sleep((int)$_GET['time']);
printf('%d secs; %s', $_GET['time'], shell_exec('date'));
The script that calls it in parallel:
<?php
// multiExec.php
$start = microtime(true);

$mh = curl_multi_init();
$handles = array();

// create several requests
for ($i = 0; $i < 5; $i++) {
    $ch = curl_init();
    $rand = rand(5,25); // just making up data to pass to script
    curl_setopt($ch, CURLOPT_URL, "http://domain/waitAndDate.php?time=$rand");
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// execute requests and poll periodically until all have completed
$isRunning = null;
do {
    curl_multi_exec($mh, $isRunning);
    usleep(250000);
} while ($isRunning > 0);

// fetch output of each request
$outputs = array();
for ($i = 0; $i < count($handles); $i++) {
    $outputs[$i] = trim(curl_multi_getcontent($handles[$i]));
    curl_multi_remove_handle($mh, $handles[$i]);
}
curl_multi_close($mh);

print_r($outputs);
printf("Elapsed time: %.2f seconds\n", microtime(true) - $start);
Here is some output I received when running it a few times:
Array
(
[0] => 8 secs; Mon Apr 2 19:01:33 UTC 2012
[1] => 8 secs; Mon Apr 2 19:01:33 UTC 2012
[2] => 18 secs; Mon Apr 2 19:01:43 UTC 2012
[3] => 11 secs; Mon Apr 2 19:01:36 UTC 2012
[4] => 8 secs; Mon Apr 2 19:01:33 UTC 2012
)
Elapsed time: 18.36 seconds
Array
(
[0] => 22 secs; Mon Apr 2 19:02:33 UTC 2012
[1] => 9 secs; Mon Apr 2 19:02:20 UTC 2012
[2] => 8 secs; Mon Apr 2 19:02:19 UTC 2012
[3] => 11 secs; Mon Apr 2 19:02:22 UTC 2012
[4] => 7 secs; Mon Apr 2 19:02:18 UTC 2012
)
Elapsed time: 22.37 seconds
Array
(
[0] => 5 secs; Mon Apr 2 19:02:40 UTC 2012
[1] => 18 secs; Mon Apr 2 19:02:53 UTC 2012
[2] => 7 secs; Mon Apr 2 19:02:42 UTC 2012
[3] => 9 secs; Mon Apr 2 19:02:44 UTC 2012
[4] => 9 secs; Mon Apr 2 19:02:44 UTC 2012
)
Elapsed time: 18.35 seconds
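To tie this back to the original question, the request-building loop could be adapted along these lines. $devices, the parameter names, and the runScript.php URL are assumptions carried over from the earlier sketch, not code from the answer or the blog post.

// Hypothetical adaptation: one curl handle per device, keyed by device ID
$mh = curl_multi_init();
$handles = array();
foreach ($devices as $devID => $device) {
    $ch = curl_init();
    $url = "http://domain/runScript.php?" . http_build_query(array(
        'param1' => $device['param1'],
        'param2' => $device['param2'],
    ));
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 60); // the TCL script takes ~15s, so leave headroom
    curl_multi_add_handle($mh, $ch);
    $handles[$devID] = $ch;
}
// ...same polling loop as in multiExec.php...
$execout = array();
foreach ($handles as $devID => $ch) {
    $execout[$devID] = trim(curl_multi_getcontent($ch));
    curl_multi_remove_handle($mh, $ch);
}
curl_multi_close($mh);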
Hope that helps!
One side note: make sure your web server can process this many parallel requests. If it serves them sequentially or can only serve very few simultaneously, this approach gains you little or nothing. :-)
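One possible cause of requests being served one at a time, offered here as an assumption about the caller's setup rather than something from the original answer: if the worker script starts a PHP session (with the default file-based handler), each request holds the session lock until it finishes, so requests sharing a session run sequentially. Releasing the lock early avoids that:

<?php
// Hypothetical worker snippet: read whatever session data is needed,
// then release the session lock so parallel requests don't queue behind it.
session_start();
$user = isset($_SESSION['user']) ? $_SESSION['user'] : null; // 'user' is an example key
session_write_close(); // lock released; the long-running exec() can now overlap with other requests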