| 2:39 pm on Dec 28, 2006 (gmt 0)|
Welcome to WebmasterWorld akumaikaruga!
|Our workaround for that problem involves disabling the sessionid in the URL for Googlebot. |
I wouldn't only do it for Googlebot but for Slurp and MSNBot too. And all other well behaved bots.
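One way to sketch that decision: check the User-Agent header against a list of known major crawlers before deciding whether to emit a session ID. This is a minimal illustration, not a complete solution — the bot substrings are the commonly published ones, and real deployments often verify crawlers by reverse DNS as well.

```python
# Decide whether to omit session IDs based on the User-Agent header.
# The substrings below are the well-known major-crawler tokens;
# adjust the list to whichever bots you want to cover.
WELL_BEHAVED_BOTS = ("Googlebot", "Slurp", "msnbot")

def is_well_behaved_bot(user_agent: str) -> bool:
    """Return True if the User-Agent matches a known major crawler."""
    ua = user_agent.lower()
    return any(bot.lower() in ua for bot in WELL_BEHAVED_BOTS)
```

When this returns True, the application would skip appending the session ID to generated URLs.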
|It makes no sense to have them for bots anyway, since Googlebot will most likely never log into our application with a user account and buy something. |
That's a sensible conclusion and one that I think Google and the majors would appreciate. Anything to assist them with indexing the site and not getting caught up in duplicate content, loops, etc. is an added benefit for you and for them.
My understanding is that cloaking by IP in this instance is probably the best solution.
| 2:13 am on Dec 29, 2006 (gmt 0)|
Maybe you could serve a temporary redirect to a URL without a session ID to bots. That wouldn't count as cloaking. Wouldn't it also eliminate duplicate URLs?
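The redirect target in that scheme is just the requested URL with the session-ID parameter removed. A small sketch of that URL rewrite, assuming the parameter is named `sessionid` (substitute your application's actual name):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_session_id(url: str, param: str = "sessionid") -> str:
    """Rebuild the URL with the session-id query parameter removed."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
             if k != param]
    return urlunsplit(parts._replace(query=urlencode(query)))
```

The server would then answer the bot's request with a 302 (temporary redirect) pointing at the stripped URL.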
| 8:56 am on Jan 2, 2007 (gmt 0)|
Thank you both for replying. Why would you prefer IP-based cloaking instead of User-Agent-based cloaking?
And what do you think about the statement from the Google rep who said it might lead to exclusion from the index?
| 2:50 pm on Jan 2, 2007 (gmt 0)|
I think you would be OK with user agent cloaking in this case.
It could indeed lead to exclusion from the index if you decide to cloak. Cloaking carries risks.
| 12:58 pm on Feb 20, 2008 (gmt 0)|
"Maybe you could serve a temporary redirect to a URL without a session ID to bots"
What I mean to ask is: is it really possible to redirect a URL to a different URL only for Googlebot?
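Yes — the server can inspect the User-Agent of each request and issue a 302 only when it is Googlebot. A minimal sketch of that conditional decision, assuming the session parameter is named `sessionid`:

```python
from typing import Optional
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def bot_redirect_target(user_agent: str, url: str) -> Optional[str]:
    """Return the clean URL to 302 to, or None if no redirect is needed."""
    if "Googlebot" not in user_agent:
        return None  # regular visitors keep their session-id URLs
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != "sessionid"]
    clean = urlunsplit(parts._replace(query=urlencode(query)))
    # Redirect only if stripping actually changed the URL
    return clean if clean != url else None
```

Note that detection by User-Agent alone is easy to spoof; checking the requester's IP against Google's published ranges (or via reverse DNS) is the more robust variant discussed earlier in the thread.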
| 2:52 am on Feb 21, 2008 (gmt 0)|
| 10:50 pm on Mar 4, 2008 (gmt 0)|
Use the user agent in .htaccess to avoid the cloaking issue.
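For an Apache server with mod_rewrite, that .htaccess rule might look like the following. This is a sketch under two assumptions: the session parameter is named `sessionid`, and it is the only query parameter (the rule drops the whole query string):

```apache
RewriteEngine On
# Match the well-known bot user agents, case-insensitively
RewriteCond %{HTTP_USER_AGENT} (Googlebot|Slurp|msnbot) [NC]
# Only act when a session id is present in the query string
# ("sessionid" is an assumed parameter name; substitute your own)
RewriteCond %{QUERY_STRING} (^|&)sessionid= [NC]
# 302 to the same path with the query string dropped
RewriteRule ^ %{REQUEST_URI}? [R=302,L]
```

The trailing `?` in the substitution is what removes the query string; `R=302` keeps the redirect temporary, matching the suggestion earlier in the thread.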
| 9:06 pm on Mar 16, 2008 (gmt 0)|
I was actually at SMX West and Matt Cutts specifically said that this is OK - just FYI.