I wanted to know if anyone has experience with user-agent cloaking on Google. I want to write a JSP page that does the following:
1) Get the User-Agent from the request header.
2) Check it for the string "google" and, if it's there, serve up the content meant for Google's search engine.
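Roughly what I have in mind is below. It's just a sketch: the forwarded page names are placeholders I made up, and the bare "google" substring test is obviously crude.

<%
    // Pull the User-Agent header off the incoming request (it can be null).
    String ua = request.getHeader("User-Agent");

    // Crude check: if the UA mentions "google", hand off to the page built
    // for the crawler; otherwise hand off to the normal visitor page.
    // (Both page names are placeholders, not real files.)
    if (ua != null && ua.toLowerCase().indexOf("google") != -1) {
        request.getRequestDispatcher("/crawler-version.jsp").forward(request, response);
    } else {
        request.getRequestDispatcher("/visitor-version.jsp").forward(request, response);
    }
    return;  // nothing else on this page should run after the forward
%>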
My industry isn't very competitive, and I am not worried about people reporting me.
I wanted to know if Google checks for this technique. I have heard rumors that they send out their own test spiders that hide their user-agent name and pretend to be a normal surfer, to see if people are trying to cloak. Does anyone know if this is true?
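I guess such a check would be trivial for them to script anyway: fetch the page once with a bot-style User-Agent and once with a browser-style one, and compare what comes back. Something along these lines (the URL and the user-agent strings are just made-up examples):

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.security.MessageDigest;

public class CloakCheck {

    // Fetch a URL with the given User-Agent and return an MD5 hash of the body.
    static String fetchHash(String url, String userAgent) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setRequestProperty("User-Agent", userAgent);
        MessageDigest md = MessageDigest.getInstance("MD5");
        try (InputStream in = conn.getInputStream()) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                md.update(buf, 0, n);
            }
        }
        StringBuilder hex = new StringBuilder();
        for (byte b : md.digest()) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    }

    public static void main(String[] args) throws Exception {
        String url = "http://www.example.com/";  // page to test
        String asBot  = fetchHash(url, "Googlebot/2.1 (+http://www.google.com/bot.html)");
        String asUser = fetchHash(url, "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)");
        System.out.println(asBot.equals(asUser)
                ? "Same content for both user agents."
                : "Different content - looks like user-agent cloaking.");
    }
}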
Spam away! Just pray that you're not pushing into any of my clients' areas, or you will be reported and put on every spam mailing list I can find for the next year.
If you have no reputation or investment in the site that couldn't be dropped in an instant if the police were about to catch up with you, then go ahead and cloak so long as it works; then drop it and get yourself another site. If, on the other hand, you do have a reputation and you're using your site to communicate it, then ... you'll want to consider a different strategy.
Google can check by hand, and you'll be banned until the sun grows cold; it's no bother to them.
GoogleGuy has stated that sites are given second chances; he even gave a timeframe of six months in a WebmasterWorld thread. I don't believe it's even that long. If you clean up your act, you can be back in the rankings within a couple of updates.
Also, Google says this about spam techniques:
"Don't employ cloaking or sneaky redirects ... we attempt to minimize hand-to-hand spam fighting. The spam reports we receive are used to create scalable algorithms that recognize and block future spam attempts."
I don't think that many sites are removed any more; Google just tries to make the algorithm spam-proof. Anyone agree?
If I were Google, I'd do one of two things:
1) Try to keep the test IPs secret, because it wouldn't make sense to check systematically; everyone would eventually figure out what the test bot was, and that would be self-defeating.
2) Just base it on the complaints I get.
That would be more selective, use resources intelligently, and keep whatever IPs they use more secure.
Trust me, programmers don't like to check by hand.
Oh well, wish me luck. I am curious.
Also agree that, in keeping with their barrelful of programmers, Google focuses on the algorithmic approach. And that means the cheaper cloaks of three years ago are simply flipped aside by the Googlebull on its way to goring the man behind the curtain. Cloaking relies on constantly and consistently guessing which forms of cloaking the algorithms will go after next. It's not for the amateur.