Date: 01/30/2006, 14:27:06
UA: msnbot/1.0 (+http://search.msn.com/msnbot.htm),gzip(gfe) (via translate.google.com)
It seems like a "dirty trick". Is MSN going black hat to fight cloakers? Does Google know about it?
A cloaking script would
Is there any evidence to back up this opinion? Not that it isn't likely, but it sounds like an opinion.
Actually, my cloaking script is immune to this because it isn't user-agent based, and the Google translator IPs are excluded from my list.
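As a rough illustration of what an IP-based (rather than user-agent-based) decision looks like, here is a minimal Python sketch. The crawler range shown is a made-up placeholder, not an actual msnbot or Google allocation; real lists are maintained by hand.

```python
import ipaddress

# IP-based cloaking decision: serve the search-engine version only when the
# request comes from a known crawler range. The range below is illustrative,
# not a real crawler allocation.
CRAWLER_NETS = [ipaddress.ip_network("207.46.0.0/16")]

def serve_crawler_page(remote_ip: str) -> bool:
    """True if the visitor IP falls inside a listed crawler network."""
    addr = ipaddress.ip_address(remote_ip)
    return any(addr in net for net in CRAWLER_NETS)
```

Because the translator's IPs simply aren't in the list, a request arriving via the translator gets the ordinary page regardless of what user agent it presents.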
Not sure if that IP range of theirs is used for their bots, or if it might be used by their own staff; possibly a human at Google was browsing or checking a site. They could have been checking for cloaking via user agent.
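A manual spot-check for user-agent cloaking can be as simple as fetching the same URL with a bot UA and a browser UA and comparing the responses. A minimal Python sketch, with the `fetcher` parameter added purely so the comparison can be exercised without a live site:

```python
import urllib.request

BOT_UA = "msnbot/1.0 (+http://search.msn.com/msnbot.htm)"
BROWSER_UA = "Mozilla/5.0 (Windows; U; Windows NT 5.1)"

def fetch(url: str, user_agent: str) -> bytes:
    """Fetch a URL while presenting the given User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

def looks_cloaked(url: str, fetcher=fetch) -> bool:
    """Crude check: does the bot-UA response differ from the browser-UA one?"""
    return fetcher(url, BOT_UA) != fetcher(url, BROWSER_UA)
```

Note this only catches user-agent cloaking; an IP-based script like the one described earlier in the thread would serve both requests the same page.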
Not knowing anything about the site the log entry came from (that is, whether the site was written in a language other than English), a translator might legitimately have been used.
As most are aware, I have the majority of RIPE denied.
Some visitors will attempt to access a page and, after the resulting 403, will come back immediately through the Google translator.
Unfortunately that gets 403'd as well.
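The deny-plus-403 flow described above could be sketched like this in Python. The networks are placeholders (the real RIPE allocations are far more numerous), and the translator allow-list is an assumption about how one might stop the translator from being 403'd as well:

```python
import ipaddress

# Placeholder standing in for "the majority of RIPE" -- illustrative only.
DENIED_NETS = [ipaddress.ip_network("80.0.0.0/8")]
# Hypothetical exception list so the Google translator is let back through.
TRANSLATOR_NETS = [ipaddress.ip_network("66.249.64.0/19")]  # assumed range

def status_for(remote_ip: str) -> int:
    """Return the HTTP status this visitor would receive: 403 or 200."""
    addr = ipaddress.ip_address(remote_ip)
    if any(addr in net for net in TRANSLATOR_NETS):
        return 200  # explicitly allow the translator through
    if any(addr in net for net in DENIED_NETS):
        return 403
    return 200
```

Without the translator exception, a denied visitor who retries through the translator just collects a second 403, which is exactly the behavior described above.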
Jim's perspective on the Google translator is the most logical one I can recall.
He feels that if a visitor is interested enough to use the Google translator (most software translators are pitiful, and will be until the technology changes), then it's his desire to allow their visits.
Personally, I'm just not able to read or speak other languages, so I can't determine whether my pages have been duplicated.
Hell! I have a hard enough time with the English visitors.