
How to get pages permanently out of Google?

Same day as I removed the meta tags, the old dead pages are back

mediaspinner

3:19 am on May 22, 2002 (gmt 0)

I did the somewhat arduous task of meta tagging my pages with noindex tags for googlebot, and removing them one by one, two days ago... this morning, they were all out of the index, save the 7 I wanted left up. So I pulled the noindex meta off, hoping google would reindex my domain (with the new content... the site is 100% dynamic, if this matters, but I don't see how)... Anyhow, this evening, all 36 of the old ones are back. How can I get them permanently out of the index?
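
For reference, the per-page tag being described is presumably the standard noindex meta aimed at Google's crawler, i.e. something like:

<meta name="googlebot" content="noindex">

The "googlebot" name targets Google's crawler specifically; <meta name="robots" content="noindex"> would apply to all compliant crawlers.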

Full Member

10:21 am on May 22, 2002 (gmt 0)

You have to wait until googlebot crawls your site and then wait for the next index update.

mediaspinner

2:17 pm on May 22, 2002 (gmt 0)

How do I know when the site index has been updated?

brett_tabke (Administrator)

2:22 pm on May 22, 2002 (gmt 0)

If you wanted them out of the index, why did you pull off the noindex tag?

mediaspinner

2:56 pm on May 22, 2002 (gmt 0)

I pulled the noindex tag because my site is dynamic. There is only 1 meta header section for all possible generated pages on the site.

As my morning search showed that all the old, error-generating pages were gone, I pulled the meta, in the hopes that the next googlebot sweep would get the new pages.

Preferred Member

3:35 pm on May 22, 2002 (gmt 0)

Since your site is dynamic, why not make a table with a field to mark the pages that you don't want spidered? Then write a little loop that customizes the metas for each page depending on the table.
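
A minimal sketch of that table-driven approach, in Python with a hypothetical pages table, noindex flag, and helper names (mediaspinner's actual platform and schema aren't known):

import sqlite3

# Hypothetical table: one row per generated page, with a flag marking
# the pages that should carry a noindex meta tag.
conn = sqlite3.connect("site.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS pages (
        path    TEXT PRIMARY KEY,          -- e.g. '/article?id=42'
        noindex INTEGER NOT NULL DEFAULT 0 -- 1 = keep this page out of the index
    )
""")

def robots_meta(path):
    # Look the page up in the table and pick the right robots meta for it.
    row = conn.execute(
        "SELECT noindex FROM pages WHERE path = ?", (path,)
    ).fetchone()
    if row and row[0]:
        # Old / dead / 'no content' page: ask crawlers not to index it.
        return '<meta name="robots" content="noindex">'
    # Normal page: let crawlers index it as usual.
    return '<meta name="robots" content="index,follow">'

def render_head(path, title, description):
    # Build the <head> for one dynamically generated page, so each page
    # gets its own robots meta instead of one shared header for the whole site.
    return "\n".join([
        "<head>",
        f"<title>{title}</title>",
        f'<meta name="description" content="{description}">',
        robots_meta(path),
        "</head>",
    ])

That way the 'no content' pages can serve noindex while the seven pages meant to stay up remain indexable.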

mediaspinner

4:03 pm on May 22, 2002 (gmt 0)

Well, I've been thinking of some possible solutions, including adding more generated meta to the source... currently the page descriptions and titles are generated this way. I also have a robots.txt that's working OK... I have a solution now that would have worked perfectly if it had been in place when the site was indexed two weeks ago. I'd just like to get these previously linked 'no content' pages out of the search engine.
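
For reference, a robots.txt rule that keeps compliant crawlers away from a group of URLs looks something like this (the paths here are placeholders, not the site's real URLs):

User-agent: *
Disallow: /old-content/
Disallow: /error-page

Note that Disallow only stops compliant crawlers from fetching those URLs from now on; by itself it doesn't force listings that are already in the index to drop out, which is what the noindex meta handles.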

However, since Alltheweb/FAST has them too, I guess maybe I'm chasing a monster that I don't wanna catch (lotsa hoops to deindex one page at a time), and will have to rely on a dynamic meta solution like you propose.

I shoulda considered these crawlers more ahead of time, I guess... hindsight is 20/20, they say.
