…or I’m doing something stupid, in which case I hope someone will enlighten me.
We grab a number of rows from two different MySQL servers, get them back as arrays ($ar1 and $ar2), and then concatenate the two arrays. $ar1 consists of 30 to 200 elements, sometimes more; $ar2 typically contains 30 elements.
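Roughly, the fetch step looks like this (a minimal sketch; the mysqli handles $db1 and $db2 and the queries are hypothetical stand-ins for our real ones):

// One mysqli connection per server; the queries are placeholders.
$ar1 = array();
$res = $db1->query("SELECT id, value FROM items");
while ($row = $res->fetch_assoc()) {
    $ar1[] = $row;
}

$ar2 = array();
$res = $db2->query("SELECT id, value FROM items");
while ($row = $res->fetch_assoc()) {
    $ar2[] = $row;
}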
The PHP way of doing this is:
$ar1 = array_merge($ar1, $ar2);
and the home-grown version is
foreach ($ar2 as $i) {
    $ar1[] = $i;
}
While I do realize that “the PHP way” involves creating a new copy of $ar1 along the way, my assumption before testing was that array_merge, being an internal function with no further parsing or interpretation to do, would be much faster.
Doing some microtime() estimations while keeping $ar2 constant at 30 elements (a sketch of the timing harness appears after these results), I found:
- At 1-10 elements in $ar1, array_merge is about 33% faster.
- At 20-40 …
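For anyone who wants to reproduce this, here is a minimal sketch of the kind of harness I used; the element counts and iteration count below are illustrative, not the exact values from my runs, and microtime(true) assumes PHP 5 or later:

<?php
// Illustrative test data: $ar1 at 100 elements, $ar2 fixed at 30.
$base = range(1, 100);
$ar2  = range(1, 30);
$runs = 100000;

// Time array_merge.
$t0 = microtime(true);
for ($n = 0; $n < $runs; $n++) {
    $ar1 = $base;                    // fresh copy each run
    $ar1 = array_merge($ar1, $ar2);
}
$merge = microtime(true) - $t0;

// Time the home-grown foreach append.
$t0 = microtime(true);
for ($n = 0; $n < $runs; $n++) {
    $ar1 = $base;                    // fresh copy each run
    foreach ($ar2 as $i) {
        $ar1[] = $i;
    }
}
$loop = microtime(true) - $t0;

printf("array_merge: %.4fs\nforeach:     %.4fs\n", $merge, $loop);

Both branches reset $ar1 from the same base array each run, so the copy-on-write overhead of that assignment is identical in the two timings and the comparison stays fair.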