Thanks for the replies. What I don't understand is this...
I had cgi-bin blocked for all robots, with an exception for User-agent: Mediapartners-Google.
When I tested a URL under cgi-bin in Webmaster Tools, it said:
Blocked by line 59: Disallow: /cgi-bin
Allowed by line 21: Allow: /cgi-bin/
so you would think that would be OK.
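For reference, this is roughly the shape of my robots.txt, checked with Python's stdlib parser (a sketch, not my exact file — the paths are simplified and the line numbers won't match the tester's 21/59; note also that Python's parser applies the first matching rule in a group, while Google uses the most specific match, so I've kept each group unambiguous here):

```python
# Sketch of a robots.txt that blocks cgi-bin for everyone except the
# AdSense crawler (Mediapartners-Google), verified with urllib.robotparser.
from urllib import robotparser

rules = """\
User-agent: Mediapartners-Google
Allow: /cgi-bin/

User-agent: *
Disallow: /cgi-bin/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The AdSense crawler falls into its own group, which allows /cgi-bin/ ...
print(rp.can_fetch("Mediapartners-Google", "/cgi-bin/page.cgi"))  # True
# ... while every other bot falls into the * group, which disallows it.
print(rp.can_fetch("Googlebot", "/cgi-bin/page.cgi"))  # False
```

That matches what the Webmaster Tools tester reported, which is why the leftover pages in the index confuse me.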
I've made some changes, but a week later there are still over 100,000 cgi-bin pages showing in a site: search.
As for removing them with the "Remove URL" tool:
Will that work for a whole directory?
And do I really need to remove them if they're already blocked in robots.txt?