Forum Moderators: goodroi
Having disallowed every instance where there might be duplicate content, and feeling smug about it, I got ....
Now, however, I am finding Google getting creative with its bot: it is indexing links with a double / in them.
These // don't even exist in my board structure at that level, yet they let the bot index duplicate pages that the above code had blocked.
The bots seem to just add an extra / whenever they feel like it.
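To see why the crawler treats these as separate pages: URL paths are compared literally, so an extra slash makes a distinct URL even when the server happens to return the same content. A quick Python sketch (the example.com paths are hypothetical):

```python
from urllib.parse import urlsplit

# A double slash is not normalized away: the two paths below are
# different URLs as far as a crawler's index is concerned, even if
# the web server serves identical (duplicate) content for both.
a = urlsplit("http://example.com/forum/thread-1.html")
b = urlsplit("http://example.com/forum//thread-1.html")

print(a.path)            # /forum/thread-1.html
print(b.path)            # /forum//thread-1.html
print(a.path == b.path)  # False
```

That is why a Disallow rule written for the single-slash path does not automatically cover the double-slash variant.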
Also, if your problem is only with Google, you can use their wildcard support in robots.txt. That might make your robots.txt simpler.
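Assuming the duplicate URLs all contain a //, a minimal sketch of such a Googlebot-only wildcard rule might look like this (the /*// pattern is an assumption based on Google's documented * wildcard support, so test it in Search Console before relying on it):

```
# Hypothetical robots.txt fragment: Google honors the * wildcard,
# so this pattern should match any path containing a double slash.
User-agent: Googlebot
Disallow: /*//
```

Note that wildcards are a Google extension, so other crawlers that follow only the original robots.txt convention may ignore this rule.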
I do note that none of the // URLs which Googlebot has been fetching have turned up in results, and none appear if I do a site: search.