helenp - 9:57 pm on Jan 8, 2013 (gmt 0) [edited by: helenp at 10:04 pm (utc) on Jan 8, 2013]
What I meant is that you could do something like:
which will (for example) return all URLs that are PHP pages, that Google has in its index, and that use this particular parameter in the URL. Often you will get the message "...we have omitted some entries very similar to the 3 already displayed...", in which case you should click on "repeat the search with the omitted results included" to get the number of URLs indexed but "Not selected".
Trying this with different URL patterns and different parameters can show you which URLs may have problems with duplicate content, thin content, and the like, and you can check whether you have addressed the problem by blocking those URLs via robots.txt, noindexing them, or in some other way.
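The kind of query described above might look something like this (a hedged sketch — example.com and the parameter name "bookingid" are placeholders, not from this thread; substitute your own domain and parameter):

```
site:example.com inurl:.php inurl:bookingid=
```

Varying the inurl: parts lets you isolate different URL patterns and see roughly how many of each Google has indexed.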
Thanks, however the pages that have 404 (most of them) aren't dynamic and don't have any parameters.
I have had parameters for some years, but only for the booking form for each static page; before, there were only about 20 of those parameter URLs in Google Webmaster Tools, and Google told me not to touch anything.
For some weeks now I have had more than 900 URLs with that parameter, and I told Google to index only one of them (a representative URL).
I think the number of parameter URLs may have increased at the same time as the number of 404 pages increased and my homepage fell.
I am seriously considering serving a 410 instead of a 404 to see if Google removes those 404 errors from Webmaster Tools more quickly.
(The pages I checked were not indexed, but they appeared as errors in Google Webmaster Tools.) That is, Google records them as 404s even though they redirect to the homepage, but I think it sees those homepage copies as duplicate content.
It looks to me like all this is due to the update in December, or maybe November.
Maybe I should serve those 404s as 410s for a while just to speed up their removal from Webmaster Tools.
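One way to do that on an Apache server is a rule in .htaccess — a minimal sketch, assuming mod_alias is available and that the removed pages sit at paths like the ones below (the paths are hypothetical; adjust them to the actual URLs showing 404 in Webmaster Tools):

```apache
# "Redirect gone" makes Apache answer 410 Gone instead of 404
# for a single permanently removed page (hypothetical path):
Redirect gone /old-booking-form.php

# RedirectMatch 410 does the same for a whole pattern of URLs
# (here: everything under a hypothetical /removed/ directory):
RedirectMatch 410 ^/removed/.*
```

Note that 410 tells crawlers the page is gone for good, so this should only cover pages that are genuinely not coming back, not ones that are meant to redirect to the homepage.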