| 10:03 pm on May 9, 2011 (gmt 0)|
Cloaking would be when the content seen by google is not seen by humans.
| 2:33 am on May 10, 2011 (gmt 0)|
First off, I'm not "detecting" Googlebot and deciding to serve different content. That would be blatant cloaking, I get that.
Let me spin what you said around...
Cloaking would be when the content seen by humans is not seen by Google.
I guess a better question to ask is "does Googlebot see my AJAX'ed post-loaded content?"
If Googlebot does see it, then it's certainly not cloaking since the content is identical.
If Googlebot does not see it, then what?
Is it Googlebot's fault that it doesn't see the post-loaded content, or is it my fault because I haven't implemented an "HTML Snapshot" mechanism as described at [code.google.com ]?
Granted, the link I referred to above warns against using the AJAX Crawlable methodology to cloak (intentionally serving different content to different users), but I'm wondering whether, if Googlebot can't see the post-loaded content, a Manual Review might treat the page as two different versions of content: humans getting 4/4 of the content, while Googlebot only sees 3/4 of it.
I hope I'm making myself clear. I apologize if I'm not articulating well, but I'm certainly not referring to blatant user-agent-detection cloaking.
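For context, the kind of post-loading being discussed looks roughly like this sketch (the URL, function, and element names here are purely illustrative, not from any actual site): an HTML fragment is fetched after the page renders and injected into a placeholder. A crawler that does not execute JavaScript never sees the injected markup.

```javascript
// Minimal sketch of AJAX post-loading (illustrative names only).
// Fetches an HTML fragment and injects it into a placeholder element;
// non-JavaScript user agents see only the empty placeholder.
function loadExtraContent(url, targetId) {
  var xhr = new XMLHttpRequest();
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      document.getElementById(targetId).innerHTML = xhr.responseText;
    }
  };
  xhr.open("GET", url, true);
  xhr.send();
}

// Example usage (hypothetical fragment URL and element id):
// loadExtraContent("/fragments/extra.html", "extra");
```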
| 3:24 am on May 10, 2011 (gmt 0)|
I believe the spin would be a logical fallacy - affirming the consequent. If the content Google used to determine that the page is relevant is plainly seen by the visitor, then it would not be considered cloaking; if the HTML shown to Google is not the HTML shown to people, it is cloaking.
... I would add that Google apparently also wants this content to be near the top of the page and not obscured by ads, but that beats a different drum called Panda ...
Google looks bad when the content visitors see does not match what they expected. Google does not like it when webmasters make them look bad.
Now if there is additional content that people can see but bots cannot ... i.e. Flash, images, AJAX ... that is not cloaking; but if it makes Google look bad they will come up with a penalty.
Google does not care about being fair; they care about looking good. If SEOs help Google look good, Google likes SEOs. If SEOs make Google look bad, they know how to deal with us. Fault is irrelevant when they have a billion pages to pick from.
"Can Google see the material?" is a different and very interesting question. My research seems to indicate that if content is shown by script within the first few seconds, Google sees it. AJAX picking up another file and embedding it is something I've not specifically attempted to test. I would assume Panda sees that space as empty and does not attempt to determine what goes into it.
| 3:44 am on May 10, 2011 (gmt 0)|
It's not cloaking as far as I understand it. If you don't use the hash-bang (#!) convention that Google introduced, then you most likely have content that Google cannot currently index. That's a lot different from serving content to Google that only Google can see (cloaking).
Added to that, in recent years Google spokespeople usually talk about "deceptive cloaking", not just "cloaking". There are all kinds of non-deceptive situations where googlebot gets something different from another user agent; for example, geo-location within a site might serve different content according to regional differences. That is also not deceptive cloaking, as long as the IP googlebot visits from sees the same thing as any other IP from that region.
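For reference, the hash-bang convention mentioned above works by having the crawler rewrite the fragment into a query parameter it can actually request. A rough sketch of that mapping (my own illustration of the scheme, not Google's code; the real spec only requires a handful of special characters to be escaped, so `encodeURIComponent` here is a simplifying approximation):

```javascript
// Sketch of the AJAX-crawling URL mapping: a pretty URL like
//   /page#!state=1
// is requested by the crawler as
//   /page?_escaped_fragment_=state%3D1
// and the server is expected to answer with an HTML snapshot.
function toEscapedFragmentUrl(url) {
  var i = url.indexOf("#!");
  if (i === -1) return url; // no hash-bang: nothing to map
  var base = url.slice(0, i);
  var fragment = url.slice(i + 2);
  var sep = base.indexOf("?") === -1 ? "?" : "&";
  return base + sep + "_escaped_fragment_=" + encodeURIComponent(fragment);
}
```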
| 4:58 am on May 10, 2011 (gmt 0)|
|If the content google used to determine the page is relevant is plainly seen by the visitor then it would not be considered Cloaking |
|That's a lot different from serving content to Google that only Google can see (cloaking). |
Thank you both, that's exactly what I wanted to hear, and in line with my gut feeling. Sometimes you just have to hear someone else say it.
| 2:53 pm on May 10, 2011 (gmt 0)|
Agreed, it's not blatant cloaking that is against Google's policies. I've loaded content into a page via AJAX to hide it from Googlebot in many cases myself.
But as SanDiegoFreelance points out, if you take it to any kind of extreme, Google would have no problem whacking your site with a penalty.
I've also seen posts here indicating that Google may devalue your site if it thinks there are big empty sections of white space, like iframes it can't see.