aakk9999 - 1:08 am on Jun 6, 2013 (gmt 0)
Make sure your robots.txt is in plain text format. Having it saved as UTF-8 can cause it to be ignored.
Format and file encoding have nothing to do with each other.
Well, if you open your robots.txt in Textpad, save it as UTF-8, and upload it to the server, Google ignores it.
Unfortunately, some months ago I had first-hand experience with this: pages that were not supposed to be crawled were crawled.
Only when I saved it as PC ANSI did it start to "work" and stop Google.
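A plausible culprit here (an assumption, not something the posts confirm) is the UTF-8 byte-order mark: some editors' "save as UTF-8" option prepends the three bytes EF BB BF, and a crawler expecting plain text may then fail to parse the first "User-agent:" line. A minimal Python sketch to detect and strip a BOM from a robots.txt file:

```python
# Strip a UTF-8 BOM (EF BB BF) from the start of a file, if present.
# Hypothetical helper for illustration; the forum posts only report
# that re-saving as plain ANSI fixed the problem.
BOM = b"\xef\xbb\xbf"

def strip_bom(path):
    with open(path, "rb") as f:
        data = f.read()
    if data.startswith(BOM):
        with open(path, "wb") as f:
            f.write(data[len(BOM):])  # rewrite without the BOM bytes
        return True   # BOM found and removed
    return False      # file was already BOM-free
```

Saving as "UTF-8 without BOM" (or plain ASCII, since robots.txt directives are ASCII anyway) achieves the same thing without a script.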
To be clear: you're talking about URL parameters, right? Not <lang="something"> declarations.
Yes, I was talking about the lang= parameter in the URL; sorry that was not clear enough!
I did not know about the ni parameter (thanks!), but there is also "reg" (sometimes used as a region parameter), which turns into the registered-trademark symbol. I am sure there are others!
But "lang" is very common, hence I mentioned it.
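To make the collision concrete: if a URL like search?q=x&reg=uk is pasted into HTML without escaping the "&", an HTML parser can read "&reg" as the named character reference for the registered-trademark sign ("reg" is a legacy entity that does not even need a trailing semicolon), and "&lang;" maps to a left angle bracket. A minimal sketch using Python's html module, which reproduces this entity handling:

```python
import html

# "&reg" is a legacy HTML entity: recognized even without a ";".
raw = "search?q=x&reg=uk"
print(html.unescape(raw))          # the "reg" parameter becomes the (R) sign

# Writing "&amp;" in the HTML source protects the parameter.
escaped = "search?q=x&amp;reg=uk"
print(html.unescape(escaped))      # parameter survives as "&reg=uk"

# "&lang;" (with semicolon) is the left angle bracket character.
print(html.unescape("a&lang;b"))
```

This is why the usual advice is to write "&amp;" between query parameters in HTML attributes, or to pick parameter names that do not collide with entity names.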