They are not happy with their placement in the search engines, so they look around for some help.
Number 1, a PPC company, says they can deliver a lot of traffic each month, and they sign a pay-per-click agreement with www.foo.com.
Now the PPC company starts an automatic tool that rips all the text from www.foo.com's pages and feeds it into their program. The PPC company spits out 1,000 cloaked text pages, each with a different title and body text built around a list of phrases and keywords. They register a new domain:
www.foo_new page.com.
A visitor searches for a word and clicks a link to www.foo_new page.com. Thanks to the cloaking, there is really only one page, pointing all its links at www.foo.com.
This technique spams the Google index with 1,000 pages, and it seems like Google accepts it.
And judging by the nice links pointing into www.foo_new page.com, there must be a lot of money involved, because www.foo_new page.com gets a high PR very fast.
So if Google accepts this, they will have to make their servers a lot bigger, because the index is going to get a lot bigger as long as this is allowed.
This situation is kinda strange.
What do you think?
Does money talk, or what?
There's nothing Google can do to stop 100% of people from cloaking their sites -> from auto-creating content, to just altering the format of the page as Google comes by to crawl the site. E.g., serving a spider a version with no CSS file, no JavaScript, perhaps no images, and a 'spider friendly' version of the base HTML.
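To make the mechanism concrete, here is a minimal sketch of how user-agent cloaking works on the server side. All names here are hypothetical, and the crawler signature list is an illustrative assumption, not a complete one: the server inspects the User-Agent header and serves a stripped-down, link-only page to known crawlers while normal visitors get the full scripted page.

```python
# Hypothetical sketch of user-agent cloaking. The signature list below is
# an assumption for illustration, not an exhaustive list of real crawlers.
CRAWLER_SIGNATURES = ("googlebot", "slurp", "msnbot")

# The page a normal browser would get: scripts, styling, the works.
FULL_PAGE = (
    "<html><head><script src='nav.js'></script>"
    "<link rel='stylesheet' href='site.css'></head>"
    "<body>...full site...</body></html>"
)

# The 'spider friendly' variant: no CSS, no JavaScript, plain HTML links only.
SPIDER_PAGE = (
    "<html><body>"
    "<a href='/page1.html'>Page 1</a> "
    "<a href='/page2.html'>Page 2</a>"
    "</body></html>"
)

def select_page(user_agent: str) -> str:
    """Return the page variant to serve for a given User-Agent string."""
    ua = user_agent.lower()
    if any(sig in ua for sig in CRAWLER_SIGNATURES):
        return SPIDER_PAGE  # crawler detected: serve the stripped version
    return FULL_PAGE        # everyone else gets the normal page
```

A request claiming to be `Googlebot/2.1` would get `SPIDER_PAGE`, while an Internet Explorer user-agent would get `FULL_PAGE`, which is exactly why a crawler's cached copy can differ from what a visitor sees.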
That being said, this kind of thing doesn't work long term. Sure, you can do it.
Take a look at the cached version of MSN.co.uk -> last time I checked, they were doing what's called User Agent cloaking for their site -> and gave Googlebot an altered version of the file because Googlebot is not Internet Explorer. Last I checked, they aren't banned. :)
There are also cases where I doubt you would get in trouble, like feeding it a page with HTML-only links instead of the JavaScript links on the page you feed to the user. As long as you do not modify the real content of the page, I would expect it to pass a hand check.
Where you can get in trouble is the sort of situation you described. There are more problems there than just cloaking: you have duplicate content, doorway pages, no doubt some keyword stuffing, and your own personal link farm.