I have a script that receives a user's coordinates from URL parameters. As a fail-safe, if no valid parameters are passed, I use an API service to convert the user's IP address to latitude and longitude.
if (!$coords) {
    // geoPlugin's php.gp endpoint returns a serialized PHP array
    $geo = unserialize(file_get_contents("http://www.geoplugin.net/php.gp?ip=" . $_SERVER["REMOTE_ADDR"]));
    $coords = array($geo["geoplugin_latitude"], $geo["geoplugin_longitude"]);
}
The free API service works great, and I love that I don't have to maintain a large IP-to-geo database. However, I'm a little uncomfortable with it: it makes the script slower, and I feel like I'm abusing the service if I send it a lot of traffic.
To address this, I'm looking at caching the results. If the IP address is in the cache, the script uses the cached latitude and longitude. If it isn't, the script calls the free API service to convert the IP address to coordinates, adds the result to the cache, and continues with the newly found latitude and longitude. A cron job could purge the cache at set intervals to keep the data fresh.
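To make the idea concrete, here is a rough sketch of the lookup-then-cache flow I have in mind, using a flat JSON file as the store. The file name `ip_cache.json` is just an assumption for illustration:

```php
<?php
// Hypothetical flat-file cache keyed by IP address (file name is illustrative).
$cacheFile = __DIR__ . "/ip_cache.json";
$cache = file_exists($cacheFile)
    ? json_decode(file_get_contents($cacheFile), true)
    : array();

$ip = $_SERVER["REMOTE_ADDR"];

if (isset($cache[$ip])) {
    // Cache hit: reuse the stored coordinates, no API call needed.
    $coords = $cache[$ip];
} else {
    // Cache miss: ask the API, then remember the answer for next time.
    $geo = unserialize(file_get_contents("http://www.geoplugin.net/php.gp?ip=" . $ip));
    $coords = array($geo["geoplugin_latitude"], $geo["geoplugin_longitude"]);
    $cache[$ip] = $coords;
    file_put_contents($cacheFile, json_encode($cache));
}
```

With this approach the cron job would only need to delete (or truncate) `ip_cache.json` to refresh the data, though rewriting the whole file on every miss clearly won't scale to many distinct IPs.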
I would like recommendations on how to build this cache, starting with: should it be MySQL or a flat-file database? My website's users come from all over the world, so the cache could end up holding a great many IP addresses. Speed is also very important for the script.
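For the MySQL option, this is roughly what I imagine, using PDO with the IP as the primary key; the table and column names here are just assumptions, not an existing schema:

```php
<?php
// Hypothetical MySQL-backed cache via PDO; connection details and the
// ip_geo_cache(ip, lat, lng, fetched_at) schema are illustrative only.
$pdo = new PDO("mysql:host=localhost;dbname=mydb;charset=utf8mb4", "user", "pass");
$ip = $_SERVER["REMOTE_ADDR"];

// Primary-key lookup, so a hit should be a fast indexed read.
$stmt = $pdo->prepare("SELECT lat, lng FROM ip_geo_cache WHERE ip = ?");
$stmt->execute(array($ip));
$row = $stmt->fetch(PDO::FETCH_NUM);

if ($row) {
    $coords = $row;
} else {
    // Miss: fetch from the API and insert the new row.
    $geo = unserialize(file_get_contents("http://www.geoplugin.net/php.gp?ip=" . $ip));
    $coords = array($geo["geoplugin_latitude"], $geo["geoplugin_longitude"]);
    $pdo->prepare("INSERT INTO ip_geo_cache (ip, lat, lng) VALUES (?, ?, ?)")
        ->execute(array($ip, $coords[0], $coords[1]));
}
```

A `fetched_at` timestamp column would let the cron job purge only rows older than some threshold (`DELETE FROM ip_geo_cache WHERE fetched_at < ...`) instead of wiping the whole cache.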