One reason that I've seen, and that I do understand: when I was moving files around, I marked the old copies of the HTML files with a meta robots NOINDEX, FOLLOW tag.
At this point, the files became Supplemental in Google.
So that's one reason to put on your list!
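For anyone wanting to audit their own pages for this, here is a minimal sketch (standard library only; the sample page string is a placeholder) that checks whether a page's HTML carries a robots meta tag with noindex:

```python
# Minimal sketch: detect a robots meta tag (e.g. NOINDEX, FOLLOW) in a
# page's HTML. Uses only the Python standard library.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Records the content attribute of a <meta name="robots"> tag."""

    def __init__(self):
        super().__init__()
        self.robots = None  # content of the robots meta tag, if present

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "robots":
                self.robots = d.get("content", "")

# Placeholder page standing in for a fetched HTML file.
page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
parser = RobotsMetaParser()
parser.feed(page)

noindex = "noindex" in (parser.robots or "").lower()
print(parser.robots, noindex)
```

Run this over the HTML of each page that has gone supplemental; if `noindex` comes back True, the meta tag is the likely culprit.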
May be worth verifying that the supplementals aren't duplicates of other pages you have in the index...
Another possibility is that your site was down when Googlebot tried to visit.
In both of the above scenarios, Googlebot will be served a 404 error.
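An easy way to rule this out is to check what status code your server actually returns. A minimal sketch, using only the Python standard library (the throwaway local server below just stands in for a real site):

```python
# Minimal sketch: report the HTTP status code a crawler would receive
# for a URL. A local test server stands in for a real site here.
import http.server
import threading
from urllib.request import urlopen
from urllib.error import HTTPError

def check_status(url: str) -> int:
    """Return the HTTP status code served for this URL."""
    try:
        with urlopen(url) as resp:
            return resp.status
    except HTTPError as err:
        return err.code  # urlopen raises for 4xx/5xx responses

# Throwaway local server that serves the current directory.
server = http.server.ThreadingHTTPServer(
    ("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

ok = check_status(f"http://127.0.0.1:{port}/")           # existing path -> 200
missing = check_status(f"http://127.0.0.1:{port}/nope")  # missing path -> 404
server.shutdown()
print(ok, missing)
```

Anything other than a 200 on a page you expect Googlebot to crawl is worth investigating.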
I have some pages from which an email address has been removed. The pages are still spidered, indexed and cached, and appear as normal for many searches. But when searching for that email address, the same page does appear in the results, and is then marked as a supplemental result. The snippet shows the old email address, yet the real page does not (it was removed many months ago), and the cache does NOT show it either!
With such bollocksed-up data storage, it is a wonder they can find anything to rank accurately.