Forum Moderators: Robert Charlton & goodroi


Complex Situation With Removal Of Pages

Removal of pages from Google.


wiseapple

5:38 pm on Jun 21, 2006 (gmt 0)

10+ Year Member



Greetings,
I am trying to get some pages removed from Google. I have gotten to the point where the pages have no cache, and they all have the "noindex, nofollow" tag in them. Google seems to be dropping them. However, should I also put a directive in my robots.txt file to disallow Googlebot from reaching these pages? Here is my question:

- By putting the disallow directive in robots.txt, would it stop Googlebot from picking up the pages and removing them? Would they hang in limbo forever because Googlebot cannot get to them? Would the URLs remain in the index?

... Or ...

- Would Googlebot read the directive and immediately remove the pages, not leaving them hanging?
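For reference, the robots.txt directive in question would look something like this, assuming standard robots.txt semantics (the /old-pages/ path is a placeholder, not from this thread):

```
# robots.txt at the site root -- note this blocks *crawling*, not indexing.
# Once Googlebot is disallowed here, it can no longer fetch these pages,
# which means it will never see the "noindex, nofollow" meta tag inside them.
User-agent: Googlebot
Disallow: /old-pages/
```

This is the crux of the question: the noindex tag can only take effect while the page is still crawlable, so the two mechanisms can work against each other.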

Appreciate all help.

Thanks.

tedster

5:38 am on Jun 23, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I almost never actively work to remove urls. In general, I like to let Google do what it's programmed to do most naturally (or unnaturally, as the case may be). I think that approach causes fewer issues long term.

Old urls may hang around as Supplemental Results for many months (even years), in my experience. Even using the Google url removal tool only gets rid of their visible presence; six months later, they can still come back. Even if the urls return a 404!