
Cloaking Forum

    
Cloaking to prevent SessionID in index?
Can cloaking be "legal" in some cases?
akumaikaruga (5+ Year Member), Msg#: 3201743, posted 2:28 pm on Dec 28, 2006 (gmt 0)

Dear Webmasters and Googlers

I work for a large website in Germany. Sessions in our application are configured to pass their ID in the URL if the user does not accept cookies. Googlebot, of course, does not accept cookies, so we ended up with lots of different URLs for the same page in the Google index. This is easily spotted with an "inurl:" query on Google.

Our workaround for that problem involves disabling the sessionid in the URL for Googlebot. (It makes no sense to have them for bots anyway, since Googlebot will most likely never log into our application with a user account and buy something :-) )
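A minimal sketch of that workaround, assuming a simple User-Agent check and a hypothetical `sessionid` URL parameter (the bot names and parameter name are illustrative, not taken from the actual application):

```python
# Illustrative sketch only: suppress the session ID in generated URLs
# when the request comes from a well-known crawler.
BOT_SIGNATURES = ("googlebot", "slurp", "msnbot")  # not exhaustive

def is_bot(user_agent: str) -> bool:
    """Crude User-Agent substring check for well-known crawlers."""
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

def build_url(path: str, session_id: str, user_agent: str) -> str:
    """Append the session ID only for human visitors without cookies."""
    if is_bot(user_agent):
        return path  # bots get the clean, canonical URL
    return f"{path}?sessionid={session_id}"

# build_url("/shop/item42", "abc", "Mozilla/5.0 (compatible; Googlebot/2.1)")
# → "/shop/item42"
```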

Now my company had some consulting from Google Germany, and one engineer there stated that disabling the session ID in the URL for Googlebot will be counted as cloaking and will eventually lead to exclusion from the index. On the other hand, the engineer stated that we have to avoid duplicate content (the same content under different URLs on the same domain) by all means. To me these two statements are completely contradictory, since both cannot be followed at once. Is there an official Google way to deal with sessions in URLs?

Will using sitemaps to communicate the "official" URL eliminate the duplicate URLs from the Google index?

thanks for any help
Sebastian

p.s.: this is a cross-post from here:
[groups.google.com...]
but no one cared to answer there :-(

[edited by: encyclo at 2:57 am (utc) on Feb. 21, 2008]
[edit reason] fixed formatting [/edit]

 

pageoneresults (WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member) posted 2:39 pm on Dec 28, 2006 (gmt 0)

Welcome to WebmasterWorld akumaikaruga!

Our workaround for that problem involves disabling the sessionid in the URL for Googlebot.

I wouldn't do it only for Googlebot, but for Slurp and MSNBot too, and for all other well-behaved bots.

It makes no sense to have them for bots anyway, since Googlebot will most likely never log into our application with a user account and buy something.

That's a sensible conclusion and one that I think Google and the majors would appreciate. Anything to assist them with indexing the site and not getting caught up in duplicate content, loops, etc. is an added benefit for you and for them.

My understanding is that cloaking by IP in this instance is probably the best solution.
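One reason IP-based checks are preferred is that User-Agent strings are trivially forged. A common way to verify a claimed Googlebot IP is a reverse DNS lookup followed by a confirming forward lookup; here is a sketch of that general technique (not something prescribed in this thread):

```python
import socket

def is_verified_googlebot(ip: str) -> bool:
    """Verify a claimed Googlebot IP: the reverse DNS name must fall under
    googlebot.com or google.com, and the forward lookup of that hostname
    must resolve back to the same IP."""
    try:
        host = socket.gethostbyaddr(ip)[0]  # reverse lookup
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        return ip in socket.gethostbyname_ex(host)[2]  # forward confirms
    except OSError:
        return False
```

A spoofed User-Agent fails this check because the attacker's IP does not reverse-resolve into Google's domains.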

volatilegx (WebmasterWorld Senior Member, 10+ Year Member) posted 2:13 am on Dec 29, 2006 (gmt 0)

Maybe you could serve a temporary redirect to a URL without a session ID to bots. That wouldn't count as cloaking. Wouldn't it also eliminate duplicate URLs?
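That redirect idea could look like this: when a bot requests a URL carrying a session ID, answer with a 302 (temporary) redirect to the same URL with the parameter stripped. The `sessionid` parameter name is an assumption for illustration:

```python
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def strip_session_id(url: str) -> str:
    """Return the URL with any 'sessionid' query parameter removed,
    keeping all other parameters intact."""
    parts = urlparse(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != "sessionid"]
    return urlunparse(parts._replace(query=urlencode(query)))

# A handler would then respond to bots with:
#   302 Found, Location: strip_session_id(requested_url)
# strip_session_id("/shop/item42?sessionid=abc123&color=red")
# → "/shop/item42?color=red"
```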

akumaikaruga (5+ Year Member) posted 8:56 am on Jan 2, 2007 (gmt 0)

Thank you two for replying. Why would you prefer IP-based cloaking instead of User-Agent-based?
And what do you think about the statement from the Google rep who said it might lead to exclusion from the index?

Thanks
Aku

volatilegx (WebmasterWorld Senior Member, 10+ Year Member) posted 2:50 pm on Jan 2, 2007 (gmt 0)

I think you would be OK with user agent cloaking in this case.

It could indeed lead to exclusion from the index if you decide to cloak. Cloaking carries risks.

venus (5+ Year Member) posted 12:58 pm on Feb 20, 2008 (gmt 0)

"Maybe you could serve a temporary redirect to a URL without a session ID to bots"

What I want to ask is: is it really possible to redirect a URL to a different URL only for Googlebot?

volatilegx (WebmasterWorld Senior Member, 10+ Year Member) posted 2:52 am on Feb 21, 2008 (gmt 0)

is it really possible

Sure!

bilalseo (5+ Year Member) posted 10:50 pm on Mar 4, 2008 (gmt 0)

Use user-agent detection in .htaccess to strip the session ID for bots and avoid the cloaking issue.
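An .htaccess sketch of that idea using Apache mod_rewrite. The bot names and the `sessionid` parameter are assumptions, and this simple version drops the entire query string for bots rather than surgically removing one parameter:

```apache
RewriteEngine On
# Match well-known crawler User-Agents (illustrative, not exhaustive)
RewriteCond %{HTTP_USER_AGENT} (googlebot|slurp|msnbot) [NC]
# Only act when a session ID is present in the query string
RewriteCond %{QUERY_STRING} sessionid= [NC]
# Redirect to the same path; the trailing "?" discards the query string
RewriteRule ^(.*)$ /$1? [R=302,L]
```

If other query parameters need to survive, the rule would have to capture and reassemble the query string without the session parameter instead of dropping it wholesale.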

bilal

portentint (5+ Year Member) posted 9:06 pm on Mar 16, 2008 (gmt 0)

I was actually at SMX West and Matt Cutts specifically said that this is OK - just FYI.
