Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Removing session IDs from URLs for bots = good cloaking

Remember when GoogleGuy posted that this was OK?


stinker

9:39 pm on Jul 19, 2005 (gmt 0)

10+ Year Member



I recall quite a while ago now that GoogleGuy posted that selectively replacing spider-unfriendly links on your pages with spider-friendly ones, just for Googlebot, was a good practice (and technically, not cloaking). For example, you could use this approach to drop session IDs from your URLs just for bots. I can't find that post and I've been digging for more than an hour. Can anyone point me to his post?

goodroi

4:25 pm on Jul 20, 2005 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



try this one [webmasterworld.com...]

blaketar

5:57 pm on Jul 20, 2005 (gmt 0)

10+ Year Member



Create an array of spider IPs or user agents, then include a global file in every page of your site via PHP, ASP, or whatever you use. In that file run a simple:

if (in_array($_SERVER['REMOTE_ADDR'], $SpiderArray)) {
    // known spider - do not start a session on this page
} else {
    session_start();
}

Simple as that; tweak to your taste or to whatever server-side language you're using.
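For the user-agent flavor of the same idea, here's a minimal sketch. The `$spiderAgents` list and the `is_spider()` helper are just illustrative names I've made up; use whatever agent strings you actually see in your logs. If no session is started, PHP never appends a session ID to your URLs, which is the whole point.

```php
<?php
// Hypothetical list of bot user-agent substrings - adjust to taste.
$spiderAgents = array('Googlebot', 'Slurp', 'msnbot');

// Returns true if the user-agent string matches any known spider substring.
function is_spider($userAgent, $spiderAgents) {
    foreach ($spiderAgents as $agent) {
        // Case-insensitive substring match.
        if (stripos($userAgent, $agent) !== false) {
            return true;
        }
    }
    return false;
}

$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

if (is_spider($ua, $spiderAgents)) {
    // Known spider: no session, so no session ID ends up in the URLs.
} else {
    session_start();
}
?>
```

A related knob worth knowing: `ini_set('session.use_trans_sid', 0)` stops PHP from rewriting URLs with the session ID at all, so you can keep sessions for bots without the ugly URLs.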

stinker

12:53 am on Jul 21, 2005 (gmt 0)

10+ Year Member



Thanks goodroi, but that wasn't it. It was a really old post from GG.