Forum Moderators: open
I just found someone who scrapes parts of the content of sites, making a "directory" with links to and descriptions of other sites. Nothing new.
The twist here is that an external JavaScript file overwrites the content visible in the browser, turning the page into a doorway page. Of course Google has no way of automatically detecting this type of 'cloaking', since their crawler doesn't execute JavaScript.
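For anyone who hasn't seen this in the wild, here's a rough sketch of what such an external script could look like. This is illustrative only (the function and URL are made up, not taken from the actual site): the scraped "directory" markup is what a crawler fetches, and the script swaps it out for doorway content once a real browser loads the page.

```javascript
// Hypothetical sketch of the trick: the page ships with scraped
// "directory" content in its HTML, and this external script replaces
// the visible body after load. All names/URLs here are illustrative.
function buildDoorwayHtml(targetUrl) {
  // The replacement markup the human visitor actually sees.
  return '<h1>Completely different sales pitch</h1>' +
         '<a href="' + targetUrl + '">click through here</a>';
}

// Guarded so the sketch only touches the DOM inside a real browser.
if (typeof document !== 'undefined') {
  window.addEventListener('load', function () {
    document.body.innerHTML = buildDoorwayHtml('https://example.com/');
  });
}
```

So the crawler indexes the scraped text, while every human visitor is funneled to whatever the doorway promotes.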
With ordinary scraper "directories" you might at least get some traffic from the links (in theory, anyway); here you get nothing, and you might even get penalized for duplicate content.
I couldn't find anything in the forums about this type of trick. Anyone run into this before?