You can use mod_rewrite in .htaccess to steer search engines to 'special' pages intended for them, but be careful; any attempt to do so can be seen as cloaking. If search engines are fed content that differs from the content seen by human visitors, then there must be a very good reason for doing so, and there can be no attempt to deceive the search engine.
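As a sketch of the mechanics only — the user-agent pattern, file names, and bot list below are assumptions for illustration, not a recommendation to cloak — such a rule in .htaccess might look like this:

```apache
# Sketch: serve a text alternative of one page to known spiders.
# The bot names, paths, and file names here are illustrative assumptions;
# whatever is served to spiders must closely match the human-visible content.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (Googlebot|Slurp|msnbot) [NC]
RewriteRule ^flash-home\.html$ /text-home.html [L]
```

The [NC] flag makes the user-agent match case-insensitive, and [L] stops further rewrite processing for the matched request.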
About the only acceptable reason for cloaking is to make primarily non-text sites (such as Flash or image-based sites) spiderable. But the content for the spider must match the non-text content closely to survive a manual review of the site. And if you are in a competitive market segment and cloaking to deceive, you can be sure that your competitors will report your site, and that this manual review will happen.
It is not impossible to cloak successfully -- at least for a while. But doing so requires that you keep track of all IP addresses used by search engines, search engine company employees, and your competition -- including their home IP addresses.
Unless a compelling reason exists to serve user-agent-dependent content, you are better off using robots.txt and the on-page HTML robots meta tag to control search spiders. Cloaking and any other user-agent-dependent content should be 100% relevant and should be used sparingly.
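For reference, the standard approach looks like this — the paths below are placeholders, not recommendations for any particular site:

```
# robots.txt (served from the site root) -- keep spiders out of
# areas not meant for indexing; these Disallow paths are examples.
User-agent: *
Disallow: /cgi-bin/
Disallow: /print/
```

To control indexing on a per-page basis, the robots meta tag goes in the page's head section, e.g. `<meta name="robots" content="noindex, follow">`, which asks spiders not to index that page while still following its links.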
You can test your site with any of several widely available user-agent spoofing tools. Online tools such as Wannabrowser and browser extensions such as Firefox's User Agent Switcher are two examples.
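You can also do this check from the command line with curl. A minimal sketch — the URL is a placeholder for your own site, and the user-agent string is the one Google publishes for Googlebot:

```shell
# Fetch the same page twice: once spoofing a spider's user-agent,
# once as a plain client, then compare the two responses.
SPIDER_UA="Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
URL="http://www.example.com/"   # placeholder -- use your own site

curl -s -A "$SPIDER_UA" "$URL" -o spider-view.html || true
curl -s "$URL" -o browser-view.html || true

# Any difference is content served to the spider but not to humans.
if diff -q spider-view.html browser-view.html >/dev/null 2>&1; then
  echo "identical"
else
  echo "differs (or fetch failed)"
fi
```

If the two files differ, you are serving user-agent-dependent content and should be able to justify every difference.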
Jim