Forum Moderators: mack


How to manage Googlebot

Need to "noindex" my store pages

         

miki99

9:04 pm on Jan 2, 2007 (gmt 0)

10+ Year Member



Well, since my site dropped like a stone in the Google SERPs in late September, I've done everything I could to eliminate duplicate content, clean up my code, add original content, etc. I even did the non-www to www 301 redirect for my whole site, with lots of help from members of this forum (thanks again). Nothing has helped so far. I realize recovery can take many months, but....

I'm now pretty sure Google regards my site as a "thin affiliate." The problem I believe is that my Amazon store pages have greatly outgrown the original content pages of my site. I have trimmed back my Amazon store quite a bit, but I don't want to get rid of it, as it actually attracts quite a bit of traffic from MSN and Yahoo. (None whatsoever from Google, though it used to.)

So what I would like to do is disallow the Googlebot, and only the Googlebot, from indexing my store pages. I don't have much experience with this kind of thing, so I looked up the code to use for the robots.txt file (or meta tags for individual pages), but I'm still a bit hazy about a thing or two.
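For reference, a Googlebot-only robots.txt rule would look something like this sketch (assuming the store pages live under a /store/ directory — substitute whatever path your Amazon pages actually use):

```
# Block only Google's crawler from the store section
User-agent: Googlebot
Disallow: /store/

# All other crawlers (MSN, Yahoo, etc.) keep full access
User-agent: *
Disallow:
```

A crawler obeys the most specific User-agent block that matches it, so Googlebot reads only its own section and the other engines fall through to the wide-open wildcard rule.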

Firstly, is this a good idea, or a bad one for some reason?

Secondly, what about all the links to the store pages from my content pages? Will the GB try to follow those, and what happens when it can't? Or is that basically what's supposed to happen? (Hope you understand what I mean.)
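The per-page alternative is a robots meta tag that names Googlebot specifically; other engines' crawlers ignore a tag addressed to a bot that isn't theirs. A minimal example, placed in the head of each store page:

```html
<head>
  <!-- Tells Google's crawler not to index this page; MSN and Yahoo are unaffected -->
  <meta name="googlebot" content="noindex">
</head>
```

Note the difference in how links behave: with the meta tag, Googlebot still fetches the page (it has to, in order to see the tag) and can follow its links unless "nofollow" is added as well, whereas a robots.txt Disallow stops Googlebot from fetching the page at all — links to it simply point at URLs Google won't crawl, which is normal and harmless.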

miki99

11:30 pm on Jan 2, 2007 (gmt 0)

10+ Year Member



I have been reading threads about using robots.txt vs. meta tags on each page to disallow Google, and am even more confused. Some webmasters seem to find that Google ignores one, the other, or both. I'd really appreciate some advice on this. Thanks.