Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Blocking googlebot after a page is online

Will this stop Google from dropping my pages, as it's been doing lately?


security56

2:15 pm on Apr 22, 2006 (gmt 0)

10+ Year Member



I was wondering: once Google indexes a certain page of my site, can I somehow block Google from reindexing it again?

The reason I want to do this is that Google seems to index a page, the page goes into the search engine index, and then the next week the page is gone from the index.

And I've done so much cleaning of the page that I see no reason why they're dropping it. As of 3 months ago I never had this type of problem, but since Big Daddy it's a mess.

jonrichd

6:28 pm on Apr 22, 2006 (gmt 0)

10+ Year Member



It seems to me that if you block Googlebot from indexing your page (say with robots.txt, or by using a NOINDEX meta tag), then Google will drop your page from the index, rather than relying on a previously spidered version of the page.
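For reference, the two blocking mechanisms mentioned above look roughly like this (the page path is a placeholder for illustration); keep in mind that, as discussed, either one is likely to get the page dropped from the index rather than preserved:

```
# robots.txt at the site root -- tells Googlebot not to crawl this page
User-agent: Googlebot
Disallow: /some-page.html
```

```html
<!-- NOINDEX meta tag, placed in the page's <head> -->
<meta name="robots" content="noindex">
```

Note the difference: robots.txt stops the crawler from fetching the page at all, while the meta tag lets the page be crawled but asks for it not to be indexed.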

That would appear to be the opposite result of what you want.

If you're sure your page is gone from the index (use a site:domain.com search to confirm it's actually gone), you might want to verify that there are links to the page in question, that the content on the page is reasonably unique, and that your site as a whole has enough inbound links to encourage Google to visit more often.

HTH

tedster

6:34 pm on Apr 22, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Quite a few sites are reporting this kind of thing since Big Daddy -- even CNN.com! I would not do anything extreme in reaction to it at this point. Just make sure you are using best practices and have no technical problems at the server. I agree that trying to stop further indexing will probably compound your problems, rather than fixing anything.

security56

3:07 pm on Apr 23, 2006 (gmt 0)

10+ Year Member



Thanks guys, I almost did it lol, but I'll wait I guess and hope for the best.

Thanks :)