| 3:02 pm on Jan 12, 2010 (gmt 0)|
Why ask Google?
Might as well paint a target on your site and say "ban me!"
Many of us do some form of cloaking; often there is simply no other way.
For instance, I cloak the text ads on my site to stop Google from running up the impression counter and from crawling them.
The bots are there for the content, not the ads; the ads are for the visitors.
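A minimal sketch of the ad-stripping idea described above, assuming a crude user-agent check; the names (`render_page`, `AD_HTML`, the bot token list) are illustrative, not from any real site:

```python
# Hypothetical sketch: detect known crawler user agents by substring
# and omit the ad markup for them, serving content-only pages to bots.
KNOWN_BOTS = ("googlebot", "bingbot", "slurp")

AD_HTML = '<div class="ads">sponsored links</div>'


def is_bot(user_agent: str) -> bool:
    """Crude substring match against common crawler user-agent tokens."""
    ua = user_agent.lower()
    return any(bot in ua for bot in KNOWN_BOTS)


def render_page(content: str, user_agent: str) -> str:
    """Content alone for crawlers; content plus ads for human visitors."""
    if is_bot(user_agent):
        return f"<main>{content}</main>"
    return f"<main>{content}</main>{AD_HTML}"
```

Worth noting that this is exactly the kind of user-agent sniffing the later posts flag as risky; it only hides ads rather than changing the content itself.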
| 5:30 pm on Jan 14, 2010 (gmt 0)|
I've spent the better part of last year learning SEO but don't have much practical experience yet.
I was under the impression that ANY cloaking would eventually get discovered and would result in a real hard slap.
Anyone have a suggestion of where the line may be between white-hat, gray-hat, and black-hat cloaking?
| 4:19 pm on Jan 15, 2010 (gmt 0)|
I think the line is clearly drawn when the purpose is to show the SE one thing and the visitor something entirely different, i.e. when you're trying to manipulate the SE with content the user never actually sees.
| 10:22 pm on Jan 17, 2010 (gmt 0)|
I've done some more reading, and Google has done a better job lately of indicating what is 'unacceptable'. In their vernacular, the term 'cloaking' is strictly black hat, and they don't have a term for what is considered acceptable. In general, though, using cookies or geographic location to modify a page is acceptable, but keying off IP addresses or identifying bots is not.
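The acceptable kind of variation described above might look like the sketch below: the page changes based on a visitor's cookie, and the same rule applies to everyone, bots included. The cookie name and copy are made up for illustration.

```python
# Hypothetical sketch of cookie-based page variation: no user-agent or
# IP checks, so a crawler with no cookie simply sees the default page,
# the same as any first-time human visitor.
GREETINGS = {"en": "Welcome", "fr": "Bienvenue", "de": "Willkommen"}


def greeting_for(cookies: dict) -> str:
    """Pick page copy from a 'lang' cookie, defaulting to English."""
    return GREETINGS.get(cookies.get("lang", "en"), GREETINGS["en"])
```

The key difference from black-hat cloaking is that the variation is driven by the visitor's own state, not by detecting who is asking.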
| 5:51 am on Jan 18, 2010 (gmt 0)|
Some big companies cloak by user agent, serving simple '80s-style text-only pages to Googlebot and a rich Flash site to everybody else.
| 1:28 pm on Jan 18, 2010 (gmt 0)|
I read that a couple of years ago Google gave NPR a pass on cloaking when they put the transcripts of Flash files online for search bots. I think bigger companies generally get a break, but one good slap would put me in the poor house.