Forum Moderators: coopster
Due to a change of supplier I have to remove around 900 pages from my site, all from within one sub-directory. These pages are all doing okay in the search engines, so I'd like to be able to handle visitors who click through to them from the SERPs.
My plan is to use my custom 404 page to redirect requests for pages in this sub-directory to the root page of the section, using header("Location: index.php"). Would this cause any problems with the search engine spiders? I don't want the spiders being redirected, as I want them to drop the old pages ASAP. Would a JavaScript redirect be a better option?
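For reference, a minimal sketch of that 404-handler idea, assuming the removed pages lived under a hypothetical /widgets/ sub-directory (the path and the index location are placeholders, not my real URLs). Under Apache's ErrorDocument, $_SERVER['REQUEST_URI'] normally still holds the URL that triggered the 404:

```php
<?php
// Decide where a 404'd request should be sent. Returns the redirect
// target, or null to fall through to the normal not-found page.
// '/widgets/' is a hypothetical sub-directory name.
function redirectTarget(string $uri): ?string {
    if (strpos($uri, '/widgets/') === 0) {
        return '/widgets/index.php';
    }
    return null;
}

// In the custom 404 page (e.g. ErrorDocument 404 /404.php in Apache),
// $_SERVER['REQUEST_URI'] still carries the originally requested URL:
$target = redirectTarget($_SERVER['REQUEST_URI'] ?? '');
if ($target !== null) {
    header('Location: ' . $target);  // sent as a 302 unless a status is set
    exit;
}
// otherwise fall through and show the normal "page not found" content
```

Note that a bare Location header like this is served as a 302 by default, which is part of why the spiders would keep coming back.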
Any ideas appreciated, thanks.
Thanks for that.
I'm going to replace each of the pages with a page that issues a redirect, rather than using the 404 error page to redirect them. The 404 approach, as it turns out, won't be possible, as I can't get at the referer info from the 404 page.
I need the robots to see that the page has gone completely, so what would the best response code be? Does a 301 effectively mean that the old page can now be found at this location? If so, that isn't what I'm after.
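For what it's worth, HTTP has a status code for exactly this situation: 410 Gone, which says the resource has been removed deliberately and permanently, whereas 404 only says "not found" and leaves open the chance it will come back, and 301 says the page now lives at the new URL. A sketch of a replacement page sending it (the wording of the visitor-facing text is just an example):

```php
<?php
// 410 Gone tells spiders the page was removed deliberately and
// permanently; 404 just means "not found", and a 301 would pass the
// old URL's identity on to the target, which isn't wanted here.
$status = 'HTTP/1.0 410 Gone';
header($status);   // a no-op on the CLI; sent for real under a web server

// Short body for human visitors, pointing them at the section index.
echo '<p>This product range is no longer stocked. ';
echo '<a href="/index.php">See what we do stock</a>.</p>' . PHP_EOL;
```

Human visitors still get a usable page with a link onward, while the spiders get an unambiguous "drop this URL" signal.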
Also, although this isn't really a PHP issue, I understand that Slurp isn't so good with redirects. If that's the case, what's the best way to deal with the problem?
Thanks