I am having a strange problem where Googlebot has found URLs that don't exist, yet they serve a 200 status. This has created over 5,000 pages of duplicate content, and I would like to block all of these URLs on the site. I have not been able to find where the URLs are coming from, whether malicious or not, and I want to make sure I have the syntax right before I add this to the robots.txt file.
All good URLs should contain a ?_function=(whatever) at the beginning of the query string, like this...
Googlebot (or some other entity) is finding query strings like this...
...and other variations of this, BUT all of the mixed variations start with ?ForumMasterThreads_uid1=(n)
I would like to disallow all of these while still allowing all the others. Is this the correct syntax, and would it work?
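For reference, the kind of rule I have in mind looks roughly like this (just a sketch: it assumes the unwanted query strings always begin with ForumMasterThreads_uid1=, and it relies on the * wildcard, which is a Googlebot extension rather than part of the original robots.txt standard):

```
User-agent: *
Disallow: /*?ForumMasterThreads_uid1=
```

Since the good URLs start their query string with ?_function=, they should not match this pattern and would remain crawlable.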
By the way, I checked, double-checked, triple-checked and yes... quadruple-checked my code, and there are no pages that generate these URLs. Any help would be appreciated.
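To sanity-check the pattern before deploying it, I mocked up Google's documented wildcard matching in Python (a rough sketch, not Googlebot's actual code: `*` matches any run of characters and `$` anchors the end of the URL; the example paths below are made up). Note that Python's built-in urllib.robotparser does plain prefix matching and does not understand `*` wildcards, which is why this uses a regex translation instead:

```python
import re

def googlebot_style_match(pattern: str, url_path: str) -> bool:
    """Return True if a robots.txt Disallow pattern matches a URL path.

    Sketch of Google's wildcard semantics: '*' matches any sequence of
    characters, and a trailing '$' anchors the match to the end of the URL.
    """
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Build a regex: '*' becomes '.*', everything else is matched literally.
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    if anchored:
        regex += "$"
    return re.match(regex, url_path) is not None

# Hypothetical rule and URLs for illustration only.
rule = "/*?ForumMasterThreads_uid1="

print(googlebot_style_match(rule, "/forum?ForumMasterThreads_uid1=5"))  # blocked
print(googlebot_style_match(rule, "/forum?_function=showthread"))       # allowed
```

With this rule, the bad variation matches (so it would be disallowed) while a ?_function= URL does not, which is the behavior I am hoping for.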