frankleeceo - 6:54 pm on Dec 3, 2012 (gmt 0)
I have a WordPress install that somehow generated tons of automatically generated URLs that were "not selected" by Google. I still cannot find the source of the bug, but just this past weekend I added a Disallow rule for that pattern of automatically generated URLs to robots.txt. Now those pages appear in results with "A description for this result is not available because of this site's robots.txt".
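For anyone wanting to do the same, a minimal robots.txt sketch might look like this. The `gibberish` prefix here is a placeholder for whatever the actual generated pattern is:

```
User-agent: *
# Placeholder pattern -- substitute the real prefix of the
# auto-generated URLs. Google treats * as a wildcard, so this
# blocks any URL whose query string begins with "gibberish".
Disallow: /*?gibberish
```

Keep in mind robots.txt only blocks crawling, not indexing, which is exactly why the URLs still show up with the "description is not available" snippet instead of disappearing.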
The gibberish URL is something like ?gibberish/page2/page3/page2, and it continues on and on. For some reason my WordPress install recognizes it as a valid URL, complete with a robots meta tag of index. The pages are "not selected" by Google (supplemental index) because they have exactly the same content as my archive pages. I cannot seem to remove the gibberish generated pages myself because it is beyond my capability.
It will take a while to see if Google recognizes it as a bug and removes those pages accordingly. I just hope. I will report back if my "not selected" count goes down in the future, or at least if it stops rising.
I do think you may have to worry if your "not selected" count continues to rise; it may be a bug in your code that generates and feeds gibberish URLs to Google.