Msg#: 4059089 posted 2:29 pm on Jan 12, 2010 (gmt 0)
I originally cloaked my site by identifying search bots so that database records would always sort the same way when a bot visits - for normal visitors they sort randomly. My thought was that by making the pages look more 'static' to the bots, I would get crawled more deeply.
I also showed bots a simple menu, stripping out the JavaScript that appears early in my code. I figured that by reducing the code, the pages could get crawled more deeply. I didn't change the content or structure of the menu.
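For what it's worth, here's roughly what I was doing, as a Python sketch. The names and the User-Agent list are just illustrative, not from any real framework - the point is the pattern: sniff the User-Agent, and give likely bots a stable record order while humans get the usual shuffle.

```python
import random

# Illustrative substrings only - real bot detection is more involved,
# and UA sniffing like this is exactly what Google objects to.
BOT_SIGNATURES = ("googlebot", "bingbot", "slurp")

def is_search_bot(user_agent):
    """Crude User-Agent check: does the UA string contain a known bot name?"""
    ua = (user_agent or "").lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

def order_records(records, user_agent):
    """Bots get a deterministic order (sorted by id) so the page looks
    'static' to them; human visitors get a random shuffle as before."""
    if is_search_bot(user_agent):
        return sorted(records, key=lambda r: r["id"])
    shuffled = list(records)
    random.shuffle(shuffled)
    return shuffled
```

Note that the content served is identical either way - only the ordering differs - which is why I felt this was harmless.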
Asking Google about this, I was told to absolutely not cloak under any circumstances - it's purely black hat. My feeling was that I was simply making things easier. I wasn't changing content.
I'd like to know what webmasters generally think. I knew cloaking was common, and then I ran across this forum. I got the feeling from Google that cloaking was a big risk and would eventually land me on SERP number one million.
Msg#: 4059089 posted 4:19 pm on Jan 15, 2010 (gmt 0)
I think the line is clearly drawn when the purpose is to show the search engine one thing and the visitor something entirely different - when you're trying to manipulate the SE with content the user never actually sees.
Msg#: 4059089 posted 10:22 pm on Jan 17, 2010 (gmt 0)
I've done some more reading, and Google has done a better job lately of indicating what is 'unacceptable'. In their vernacular, the term 'cloaking' is strictly black hat, and they don't have a term for what is considered acceptable. In general, though, using cookies or geographic location to modify a page is acceptable, but keying off IP addresses or identifying bots is not.
Msg#: 4059089 posted 1:28 pm on Jan 18, 2010 (gmt 0)
I read that a couple of years ago Google gave NPR a pass on cloaking when they put transcripts of their Flash files online for search bots. I think bigger companies generally get a break, but with one good slap I'd be in the poor house.