
PHP/MySQL Efficient Paged Data


There was some talk recently about the inefficiency of running two queries versus a single query to retrieve paged data, so I decided to do some testing of my own. Here are my results.

The first way to retrieve paged data is to run two queries. The first query is identical to the second, just without the LIMIT clause; you use mysql_num_rows on its result to retrieve the total count, and the second query returns your page of data.
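The snippets below don't show the queries themselves, so for illustration they might look something like this (the table name and the use of $_GET here are just placeholder assumptions):

// Hypothetical queries for illustration only - your table and columns will differ.
$start = (int) $_GET['start'];
$limit = (int) $_GET['limit'];

$sql_count = "SELECT * FROM items"; // same query, minus the LIMIT clause
$sql = "SELECT * FROM items LIMIT $start, $limit"; // one page of data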

Testing average for this configuration was 1.9 seconds for every 200 repetitions.

$rs_count = mysql_query($sql_count); // same query as $sql, minus the LIMIT clause
$rows = mysql_num_rows($rs_count);   // total count = number of rows in the un-limited result

$rs = mysql_query($sql);

$arr = array();
while ($obj = mysql_fetch_object($rs))
{
    $arr[] = $obj;
}

$bla = '{"total":"'.$rows.'","results":'.json_encode($arr).'}';

The second way is almost the same as the first, but instead of using mysql_num_rows to get a total count, we use SQL to provide the count.
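With this approach the count query would typically be a COUNT(*), for example (the table name is again just a placeholder):

// Hypothetical count query - MySQL does the counting instead of PHP.
$sql_count = "SELECT COUNT(*) FROM items";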

Testing average for this configuration was 1.8 seconds for every 200 repetitions.

$rs_count = mysql_query($sql_count);
$data = mysql_fetch_array($rs_count);
$rows = $data[0]; // the count comes back as the first column of the first row

$rs = mysql_query($sql);

$arr = array();
while ($obj = mysql_fetch_object($rs))
{
    $arr[] = $obj;
}

$bla = '{"total":"'.$rows.'","results":'.json_encode($arr).'}';

The third way is to use a single query and let PHP split the data into pages. It was suggested that this would be the fastest approach because it makes only one call to the database, but testing showed that assumption was not correct. The overhead of two database calls was not enough to outweigh the time needed to return every row and have PHP process them.

Testing average for this configuration was 2.2 seconds for every 200 repetitions.

$rs = mysql_query($sql);     // single query, no LIMIT clause - every row comes back
$rows = mysql_num_rows($rs); // total count taken from the full, un-paged result set

$arr = array();
$r = 0;
while ($obj = mysql_fetch_object($rs))
{
    if ($r >= $_GET['start'] && $r < ($_GET['start'] + $_GET['limit'])) {
        $arr[] = $obj;
    }
    $r++;
}

$bla = '{"total":"'.$rows.'","results":'.json_encode($arr).'}';

It would appear that the second method is the most efficient, followed closely by the first. This could change based on your dataset; for instance, the third method might turn out to be the best when your table doesn't have many rows. There are also situations where the first method would run much slower than the second. My opinion would be to stick with the second method.

