I figure it's worth noting that this isn't going on a site that gets a ton of traffic, nor will the CSV ever be particularly big. I'd guess 20-30 rows at most.
Thanks!
// Download the remote CSV to a local copy
$ch = curl_init("http://pathtomycsv.txt");
$fp = fopen("localcopy.txt", "w");
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
fclose($fp);

// Read back the same file we just wrote and parse each line as CSV
$filename = "localcopy.txt";
$table = array();
$id = fopen($filename, "r");
while (($data = fgetcsv($id, 0)) !== false) {   // length 0 = no line-length limit
    $table[] = $data;
}
fclose($id);
echo "<table>\n";
foreach($table as $row) {
$c = 1 - $c;
if ($c % 2 == 0) {
echo "\t<tr class='alt'>\n";
} else {
echo "\t<tr>\n";
}
foreach($row as $data)
echo "\t\t<td>$data</td>\n";
echo "\t</tr>\n";
}
echo "</table>\n";
Loading the CSV with Ajax would take some of the overhead off the server and put it on the client.
This will affect how the content is indexed, though, so if you want it indexed you will have to employ some method of making Ajax-loaded data available to search engines.
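For illustration only, here is a rough client-side sketch of that idea (TypeScript/browser JavaScript, not tested). The URL "/localcopy.txt" and the placeholder element id "csv-table" are assumptions, not anything from the original post, and the comma split is deliberately naive:

// Minimal sketch: fetch the CSV in the browser and render it as a table.
// Assumes the CSV is served at "/localcopy.txt" and the page contains
// an empty <div id="csv-table"></div> -- both hypothetical names.
async function loadCsvTable(): Promise<void> {
  const response = await fetch("/localcopy.txt");
  const text = await response.text();

  // Naive CSV split; fine for a small file with no quoted commas.
  const rows = text.trim().split("\n").map(line => line.split(","));

  const html = ["<table>"];
  rows.forEach((row, i) => {
    html.push(i % 2 === 0 ? "<tr class='alt'>" : "<tr>");
    // Cells are inserted as-is; escape them if the CSV is untrusted.
    row.forEach(cell => html.push("<td>" + cell + "</td>"));
    html.push("</tr>");
  });
  html.push("</table>");

  const target = document.getElementById("csv-table");
  if (target) {
    target.innerHTML = html.join("\n");
  }
}

loadCsvTable();

For a 20-30 row file the naive comma split is probably fine; you would want a real CSV parser if fields can contain quoted commas.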