Forum Moderators: Robert Charlton & goodroi


Disallowing dynamic sites for Googlebot

         

code12

6:30 pm on Jan 17, 2006 (gmt 0)

10+ Year Member



Hello,

I changed the structure of my site and removed around 1,500 pages that I don't want to serve anymore.

Some of this content was served via dynamic URL parameters, e.g. http://www.example.com/?page=105.

How can I best handle this?

If I don't do anything, no 404 error is sent, because the URL parameters no longer have any influence on the index page.

That means duplicate content, because every time Googlebot requests one of these old URLs, the unmodified index page is served.

Can I do something with .htaccess, mod_rewrite, and robots.txt? Any suggestions on how I can solve this problem?

And generally: what is better when you delete old files, sending a 301 to the index page or a 404? Say you're deleting 1,000 pages but still have deep backlinks pointing to them.

Thank you!
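One way to attack the query-string problem directly in .htaccess, as asked about above. This is a minimal sketch, assuming Apache with mod_rewrite enabled and that the old URLs all used a `page` parameter on the root URL; note that robots.txt can only block crawling, it cannot send a status code.

```apache
# Hedged sketch, not a drop-in rule set: adjust the pattern to match
# the actual parameters your old URLs used.
RewriteEngine On

# If the index page is requested with an old ?page=NNN query string,
# answer "410 Gone" so search engines drop the URL instead of
# indexing the index page as duplicate content.
RewriteCond %{QUERY_STRING} ^page=\d+$
RewriteRule ^$ - [G]
```

The `[G]` flag makes Apache return 410 Gone, which signals a deliberate removal more strongly than a plain 404.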

[edited by: engine at 4:24 pm (utc) on Jan. 19, 2006]

code12

9:24 am on Jan 20, 2006 (gmt 0)

10+ Year Member



The site has a PR of 5, and I have several backlinks from PR 6 sites to the URL http://www.example.com/?page=105.

Should I just send a 404? Or a 302 to the index page, to keep that PR?
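If the goal is to pass on the value of those deep backlinks, a permanent (301) redirect is the usual choice rather than a temporary 302. A hedged .htaccess sketch, again assuming Apache with mod_rewrite and the example URL above:

```apache
# Hypothetical sketch: permanently redirect the old backlinked URL
# to the homepage (or, better, to the closest replacement page).
RewriteEngine On

RewriteCond %{QUERY_STRING} ^page=105$
# The trailing "?" on the target drops the old query string from
# the redirect destination.
RewriteRule ^$ http://www.example.com/? [R=301,L]
```

A 302 tells crawlers the move is temporary and the old URL should be kept; a 301 tells them the new URL is the permanent home for that content.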

roodle

10:04 am on Jan 20, 2006 (gmt 0)

10+ Year Member



What type of dynamic pages are they? ASP? PHP?

code12

10:30 am on Jan 20, 2006 (gmt 0)

10+ Year Member



I don't need technical hints! It doesn't matter whether it's PHP or ASP. I just want to know how to solve the problem for the search engines.

Thank you!