Cloaking Forum

    
Hiding links with PHP
Hiding links from googlebot with PHP
ryan_b83
msg:3034600, 6:09 pm on Aug 4, 2006 (gmt 0)

Hey, kind of a newbie question, but...

Would Google hurt your PageRank if you parsed the $_SERVER['HTTP_USER_AGENT'] value for "googlebot", "yahoo", "msnbot", or whatever, and didn't display your links when one of those strings is present?
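
Roughly what I have in mind, as an untested sketch (the crawler substrings are just examples, and this only looks at the User-Agent header):

<?php
// Crude check of the User-Agent header for known crawler names.
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? strtolower($_SERVER['HTTP_USER_AGENT']) : '';
$spiders = array('googlebot', 'yahoo', 'msnbot');

$is_spider = false;
foreach ($spiders as $needle) {
    if (strpos($ua, $needle) !== false) {
        $is_spider = true;
        break;
    }
}

// Only show the links to visitors that didn't match a crawler string.
if (!$is_spider) {
    echo '<a href="http://example.com/">my link</a>';
}
?>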

I'm sure this question has been answered many times before, but I couldn't find a specific thread.

Thanks,

 

volatilegx
msg:3035032, 3:43 am on Aug 5, 2006 (gmt 0)

It could.

rankboy
msg:3036690, 12:37 am on Aug 7, 2006 (gmt 0)

Ryan,
As far as we know, Google "PR", or PageRank, is an algorithm developed by Larry Page way back in the day that judges a web page's importance by the number and type of links pointing to it. What you have described only involves your page's own outbound links. The general theory is that the more outbound links on your page, the more you dilute your own PR. (Then of course there are camps that say PR makes no difference anyway, but that is another discussion for another thread.)
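
For reference, the formula from the original PageRank paper goes roughly like this (quoting from memory, so treat it as a sketch): PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) ), where T1...Tn are the pages linking to A, C(T) is the number of outbound links on page T, and d is a damping factor usually set around 0.85. The "dilution" idea comes from that C(T) divisor: the more outbound links a page carries, the less weight each individual link passes on.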

What you have described is a form of cloaking, and it may in fact help your PR because Google sees fewer outbound links when they come to crawl your site. However, I do not recommend this practice on a domain that is important to you, because Google has stated publicly that they will ban cloaked domains. You may never get caught, but you are on thin ice. If you want to test such a strategy, a throwaway domain is the best way to do it.

Another recommendation: don't rely on user agent strings alone to detect spiders. Why? They are incredibly easy to spoof, and it would be relatively easy for Google to "break" your cloaking and find out that you are serving different pages to their spiders. Instead, find yourself a good IP list, load those addresses into a MySQL table, and use PHP to check whether the IP hitting you is in that list. Volatilegx actually runs one of the best IP lists on the web. I am not allowed to give out a URL here, but go to Google and search for "ip lists" (without the quotes); his site is the first link that shows up.
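
Something along these lines, just as a sketch (the table and column names are made up; you would load the table from whatever IP list you maintain):

<?php
// Look up the visitor's IP in a table of known crawler addresses.
// Assumes a MySQL table `spider_ips` with a single `ip` column.
$db = new mysqli('localhost', 'db_user', 'db_pass', 'db_name');

$visitor_ip = $_SERVER['REMOTE_ADDR'];

$stmt = $db->prepare('SELECT 1 FROM spider_ips WHERE ip = ? LIMIT 1');
$stmt->bind_param('s', $visitor_ip);
$stmt->execute();
$stmt->store_result();
$is_spider = ($stmt->num_rows > 0);
$stmt->close();

// Serve the links only to visitors whose IP is not on the spider list.
if (!$is_spider) {
    echo '<a href="http://example.com/">my link</a>';
}
?>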

Hope this answers your questions!

volatilegx
msg:3036781, 2:39 am on Aug 7, 2006 (gmt 0)

Actually, Brett himself has linked to it [webmasterworld.com] ;)

The IP lists there are still up to date.

rankboy
msg:3037905, 10:46 pm on Aug 7, 2006 (gmt 0)

:) Sorry, Dan. I always like to refer people to your software because I think it is great; I just wasn't sure if I could link to it here. I guess if Mr. Tabke linked to it, then we are all good. :)

ryan_b83
msg:3039129, 8:07 pm on Aug 8, 2006 (gmt 0)

Thanks guys, good info to keep in mind!
