I've just noticed over 100,000 http errors in WMT.
They are coming from URLs such as this one:
http://www.example.com/index.php?app=core&module=reports&rcom=uploads&id=30786
These URLs were obviously for users to report content; we've since removed this ability for guests. However, Google is still visiting them daily.
My first question is: would over 100,000 of these errors affect quality score?
Second question: how do I block these URLs via robots.txt, given that they're not in a directory?
Could I, for example, use:
Disallow: http://www.example.com/index.php?app=core&module=reports
Would this catch them all, or is there a way to use a wildcard here? Or is this not possible via robots.txt at all?
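For reference, here is the form I think the rules might need to take — as I understand it, Disallow values are paths relative to the site root rather than full URLs, and Googlebot supports `*` as a wildcard (though that's an assumption on my part):

```
# Path-relative form of the rule above (no scheme/host):
User-agent: *
Disallow: /index.php?app=core&module=reports

# Possible wildcard form, if * is supported:
Disallow: /index.php?*module=reports
```

Is either of these the right approach?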
Thanks a lot.