incrediBILL - 2:41 am on Dec 3, 2012 (gmt 0)
Come on, guys, admit that a site protecting itself by identifying, bagging, and tagging scrapers automatically is cool. I only block ranges to be preemptive because so far the technology stops almost all of it cold. However, the really good ones sometimes get a few free pages before they're blocked, which is where blocking whole ranges helps prevent any leakage.
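Blocking by range instead of by single IP boils down to a CIDR membership check. Here's a minimal sketch in Python; the ranges in the blocklist and the is_blocked helper are hypothetical illustrations, not incrediBILL's actual setup:

```python
import ipaddress

# Hypothetical blocklist: CIDR ranges known to host scrapers.
# (These are reserved documentation ranges used purely as examples.)
BLOCKED_RANGES = [
    ipaddress.ip_network("203.0.113.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def is_blocked(visitor_ip: str) -> bool:
    """Return True if the visitor's IP falls inside any blocked range."""
    ip = ipaddress.ip_address(visitor_ip)
    return any(ip in net for net in BLOCKED_RANGES)

print(is_blocked("203.0.113.42"))  # True: inside a blocked /24
print(is_blocked("192.0.2.7"))     # False: not in any listed range
```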
scrape product descriptions from an IP that is translated
That's exactly why I put tracker bugs in my text (another reason, anyway): the trackers don't translate, so codes like XXYYZZ-3287520629 (a site code plus the visitor's IP as a long integer) make it through unscathed into auto-translated text, scrambled text, etc., and I can easily find them later in Google, Bing, etc.
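For what it's worth, a token like that is trivial to generate. Here's a minimal sketch, assuming "long IP" means the visitor's IPv4 address converted to a 32-bit integer; SITE_CODE and tracker_bug are made-up names for illustration, not part of any actual module:

```python
import ipaddress

SITE_CODE = "XXYYZZ"  # hypothetical per-site prefix

def tracker_bug(visitor_ip: str) -> str:
    """Build a unique token: site code plus the visitor's IP as a long integer."""
    ip_as_long = int(ipaddress.ip_address(visitor_ip))  # 195.243.149.117 -> 3287520629
    return f"{SITE_CODE}-{ip_as_long}"

print(tracker_bug("195.243.149.117"))  # "XXYYZZ-3287520629"
```

Because translators and text scramblers leave an opaque token alone, searching for the exact string later turns up the scraped copies and tells you which visitor IP pulled the page.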
I'd recommend everyone do it, but I'm afraid the scrapers would figure out how to filter them out if I released some tracker bug module.