Forum Moderators: Robert Charlton & goodroi


How to safely remove pages from Google index?

is there any way?


ichthyous

7:14 pm on Mar 26, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I recently installed a module on my site that generates pages based on keywords I code into my images. Each word becomes a link to a page showing all the images related to that keyword...similar to Flickr tags. I enabled the module for one day, found problems with it, and also noticed my site slowed down somewhat. I was shocked to see a bit later in Webmaster Tools that Google now reports almost 8,000 missing pages on my site! Googlebot must have been indexing furiously, and that's why the site slowed down. I'm worried that such a high number of missing pages will get my site penalized...is there any way to tell Google to remove them permanently from the index?

SEOPTI

1:31 am on Mar 27, 2007 (gmt 0)




Get a fresh domain; that's a safe way.

ichthyous

2:08 am on Mar 27, 2007 (gmt 0)




Get a fresh domain? Uh ok...is that how you deal with all the problems on your sites? I looked it up on WebmasterWorld and just added disallows for these directories in robots.txt. I knew this already but had forgotten it.
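For anyone finding this thread later, the robots.txt approach looks something like the rules below. The directory names here are only placeholders; use whatever paths the tag module actually generates on your site:

```text
# Block crawlers from the auto-generated keyword pages.
# /tags/ and /keywords/ are example paths - substitute your own.
User-agent: *
Disallow: /tags/
Disallow: /keywords/
```

Note that a Disallow rule stops crawling of those URLs; pages already in the index may take some time to drop out.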

Silvery

3:29 am on Mar 27, 2007 (gmt 0)




Your disallows in robots.txt are one good way to deal with it. Another is to place a robots META tag with a NOINDEX value in the head of each of those pages.
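For reference, a minimal version of that tag looks like this (NOFOLLOW is optional and shown here only as an example):

```text
<!-- Goes in the <head> of each page you want dropped from the index -->
<meta name="robots" content="noindex, nofollow">
```

One caveat: Googlebot has to be able to crawl the page to see this tag, so if the same URLs are also blocked in robots.txt, the NOINDEX may never be read.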

Achernar

4:17 pm on Mar 27, 2007 (gmt 0)




If you have a valid robots.txt file, and it's not using any wildcards in the rules, you can use Google's URL removal tool here: [services.google.com...]