I wonder if anyone can point me in the right direction here.
I have a website with thousands of category pages that are generated on the fly based on the visitor's location.
I recently tidied up the entire directory structure to make it much more search engine friendly and easier to navigate.
As I did this, I rebuilt my website's XML sitemap files, which were successfully submitted to Google in my Webmaster Tools account.
My problem is this: on my Google Webmaster Tools dashboard for this website, I can see 4,000+ Not Found records referring to my old site structure.
I have tried submitting a few of these URLs for removal by Google under "Crawler access", but each request was denied.
The Not Found URLs no longer exist on my site and are not linked to from anywhere in my new site structure, so I can only assume Google is reporting on pages it indexed before my update.
I have tried adding each Not Found URL as a Disallow: statement in my robots.txt. Is this a good idea, given that there are 4,000+ URLs?
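For what it's worth, if the old URLs all shared a common prefix, a single pattern rule would be far more manageable than 4,000 individual lines. A minimal sketch, assuming a hypothetical old directory called /old-categories/:

User-agent: *
Disallow: /old-categories/

Note, though, that robots.txt only blocks crawling; Google generally has to recrawl a URL to see its 404 before dropping it, so disallowing those URLs may actually delay their removal.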
Will Google eventually remove the 404 URLs from its index?
Any advice on this will be much appreciated.
Thanks
Mr M
Otherwise, yes, they will eventually be dropped, but so will the weight of any links pointing to them, and quite possibly (likely, in my opinion) the visitors who click one of those links and land on a 404. Those both seem like fairly big losses.
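One standard way to keep both the link weight and those visitors, implied but not spelled out above, is to 301-redirect the old URLs to their new equivalents. A minimal sketch for Apache's mod_rewrite, assuming a hypothetical old prefix /old-categories/ mapping onto a new /category/ path; the real patterns would depend on the actual old and new structures:

# .htaccess in the site root (requires mod_rewrite to be enabled)
RewriteEngine On
# Permanently redirect anything under the old directory to the new one
RewriteRule ^old-categories/(.*)$ /category/$1 [R=301,L]

With a rule like this in place, Google consolidates the old URLs onto the new ones instead of simply dropping them, and visitors following stale links land on the right page rather than a 404.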