Forum Moderators: DixonJones
So - when a request hits your server, you check the IP against the database. Then you can deliver content based on the location of the user.
There's a downside. Whilst geo-targeting per se is seen as a legitimate use of cloaking (for that is what it is), you may find that you inadvertently block search engine spiders, as these may be coming to your site from all sorts of countries. You could try to get around this by finding a list of the IPs that Googlebot, Slurp or other bots use, but be prepared for the possibility of unexpected results. Keep an eye on Google's sitemaps reports and react if your site suddenly becomes blocked or unspiderable.
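Rather than maintaining a static IP list, a more robust approach (the one Google itself documents for verifying Googlebot) is a reverse-then-forward DNS check. A sketch, assuming Python on the server; the bot domain list here is illustrative, not exhaustive:

```python
import socket

# To verify a claimed crawler: 1) reverse-resolve the visiting IP,
# 2) confirm the hostname belongs to a known crawler domain,
# 3) forward-resolve that hostname and confirm it maps back to the same IP.
BOT_DOMAINS = (".googlebot.com", ".google.com", ".search.msn.com")

def looks_like_bot_host(hostname: str) -> bool:
    """Step 2 only: does the reverse-DNS name end in a crawler domain?"""
    return hostname.lower().endswith(BOT_DOMAINS)

def verify_bot(ip: str) -> bool:
    """The full check; needs live DNS, so treat this as a sketch."""
    try:
        hostname = socket.gethostbyaddr(ip)[0]             # reverse lookup
        if not looks_like_bot_host(hostname):
            return False
        return ip in socket.gethostbyname_ex(hostname)[2]  # forward confirms
    except OSError:
        return False
```

Requests that pass `verify_bot` can then be exempted from geo-based blocking, whatever country the spider happens to crawl from.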
Are there any recommendations for this solution?
Sorry - that would amount to an advert. You'll not notice many of those on WebmasterWorld. Your favourite search engine should give you plenty of ideas :)