Forum Moderators: open
Last year the company I work for had an outside designer create a new website. The site was then turned over to me and became an administrative nightmare; the designer didn't organize the pages well, so it was a real mess.
In March I totally redesigned the site, including the file structure, and nuked all the old orphan files.
My problem is that Googlebot has crawled parts of my site, but I am STILL seeing search listings pointing to old sections of the website.
So my question is:
How can I fix this? Do I need to create bogus pages with redirects, or will that just prolong the inaccurate listings?
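For example, I was wondering if something like this in .htaccess would tell Googlebot the old pages moved for good (this is just a sketch, assuming an Apache server with mod_alias; the paths are made-up examples, not my real URLs):

```apache
# Sketch only: permanent (301) redirects in .htaccess.
# Assumes Apache with mod_alias; paths below are hypothetical.

# Send one old page to its replacement:
Redirect 301 /oldsection/page.html /newsection/page.html

# Or retire an entire old directory in one rule:
RedirectMatch 301 ^/oldsection/ /newsection/
```

Would 301s like these get the stale listings replaced, or do I still have to wait for the old URLs to drop out on their own?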