Forum Moderators: mack
I'm now pretty sure Google regards my site as a "thin affiliate." The problem, I believe, is that my Amazon store pages have greatly outgrown the original content pages of my site. I have trimmed back my Amazon store quite a bit, but I don't want to get rid of it, as it actually attracts quite a bit of traffic from MSN and Yahoo. (None whatsoever from Google, though it used to.)
So what I would like to do is disallow the Googlebot, and only the GB, from indexing my store pages. I don't have much experience with this kind of thing, so I looked up the code to use for the robots.txt file (or meta tags for individual pages), but I'm still a bit hazy about a thing or two.
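For what it's worth, here's a sketch of the two approaches, assuming the store pages live under a directory like /store/ (substitute your actual path). In robots.txt, a User-agent line naming Googlebot applies only to Google's crawler, so other engines are unaffected:

```
User-agent: Googlebot
Disallow: /store/
```

The per-page alternative is a meta tag in the head of each store page, again targeting only Google:

```
<meta name="googlebot" content="noindex">
```

One wrinkle to be aware of: robots.txt blocks crawling, not indexing, so a disallowed URL can still show up in Google as a bare listing if other pages link to it. The meta tag actually removes the page from the index, but it only works if Googlebot is allowed to crawl the page and see the tag, so don't combine the two on the same URLs.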
Firstly, is this a good idea, or a bad one for some reason?
Secondly, what about all the links pointing to the store pages from my content pages? Will the GB still try to follow those, and what happens when it can't? Or is that basically what's supposed to happen? (Hope you understand what I mean.)