Forum Moderators: open
If proof is needed that Google spiders dynamic sites, I have plenty of evidence: many thousands of pages, all kept in Google's cache.
Those pages also demonstrate why Google (and other search engines) are very reluctant to spider dynamic content. The number of duplicate pages is enormous. That harms everyone - Google, the users and the site too. An awful lot of resources is simply wasted.
Having seen the consequences, I have totally transformed the site's structure. While still dynamic internally, it looks and behaves as static externally. So my question is:
What is the shortest and simplest way to 'persuade' Googlebot to 'forget' all the duplicate pages and start afresh? Is it JUST a question of time, or can one DO something about it?
Any ideas or suggestions?
Thanks!
Suppose that the OLD URLs are of the form
...AAA.com/old/*
and the new ones are
...AAA.com/new/*
If you have easily-distinguishable URLs like that, then it's not so bad.
On Apache:
RedirectPermanent /old/ http://AAA.com/new/
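RedirectPermanent sends a 301 (Moved Permanently) response, which is exactly the signal Googlebot needs to drop an old URL from the index once it recrawls it. If the old-to-new mapping is not a simple prefix swap (for example, query-string URLs being folded into static-looking paths), mod_rewrite can express a pattern-based 301 instead. A minimal sketch, assuming mod_rewrite is enabled and using the hypothetical /old/ and /new/ paths from above:

```apache
# Hypothetical fragment for .htaccess or a vhost -- adjust patterns to your real URLs.
RewriteEngine On

# Send any request under /old/ to the same path under /new/, with a 301.
RewriteRule ^old/(.*)$ http://AAA.com/new/$1 [R=301,L]
```

Either way, the old pages only fall out of Google's index after Googlebot revisits each one and sees the 301, so it is still partly a question of time.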
Jim