Would Google hurt your PageRank if you parsed the $_SERVER['HTTP_USER_AGENT'] value for "googlebot" or "yahoo" or "msnbot" or whatever, and didn't display links when those strings are present?
I am sure this question has been answered many times before, but I couldn't find a specific thread.
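To be concrete, something like this is what I mean (just a sketch; is_spider and the token list are my own, and I believe the correct key is HTTP_USER_AGENT rather than USER_AGENT):

```php
<?php
// Sketch of the idea: suppress outbound links when the user-agent
// string contains a known bot token. Token list is illustrative only.
function is_spider(string $ua): bool {
    foreach (['googlebot', 'yahoo', 'msnbot'] as $token) {
        if (stripos($ua, $token) !== false) { // case-insensitive substring match
            return true;
        }
    }
    return false;
}

$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
if (!is_spider($ua)) {
    echo '<a href="http://example.com/">outbound link</a>';
}
```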
What you have described is a form of cloaking, and it may in fact help your PR, because Google sees fewer outbound links when it crawls your site. However, I do not recommend this practice on a domain that is important to you: Google has stated publicly that it will ban cloaked domains. You may never get caught, but you are on thin ice. If you want to test such a strategy, a throwaway domain is the best place to do it.
Another recommendation: do not rely on user agent strings alone to detect spiders. Why? They are incredibly easy to spoof, so it would be relatively easy for Google to "break" your cloaking algorithm and find out that you are serving different pages to its spiders. Instead, find yourself a good IP list, throw those addresses into a MySQL table, and use PHP to check whether the IP hitting you is in that list. Volatilegx actually runs one of the best IP lists on the web. I am not allowed to give out a URL here, but go to Google and search for "ip lists" (without the quotes); his is the first link that shows up.
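For what it's worth, the lookup itself is a one-query affair. A bare-bones sketch (the spider_ips table and ip column are names I made up, and I'm using PDO here so the same code works whether the DSN points at MySQL or anything else):

```php
<?php
// Sketch of the IP-list approach: known crawler IPs live in a
// `spider_ips` table, and each request's REMOTE_ADDR is checked
// against it with a prepared statement.
function ip_is_spider(PDO $db, string $ip): bool {
    $stmt = $db->prepare('SELECT 1 FROM spider_ips WHERE ip = ? LIMIT 1');
    $stmt->execute([$ip]);
    return (bool) $stmt->fetchColumn();
}

// Usage (connection details are placeholders):
// $db = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');
// if (ip_is_spider($db, $_SERVER['REMOTE_ADDR'])) {
//     // serve the spider-facing version of the page
// }
```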
Hope this answers your questions!