Msg#: 3323482 posted 7:20 am on Apr 27, 2007 (gmt 0)
Hi, I'm pretty new to SEO, so I may be trying to do something impossible here... any advice would be appreciated.
I have a site which was well indexed by Google and used to draw a nice amount of traffic. It consists of about 350 main pages, each with various sublinks. I submitted a sitemap to Google listing the 350 main pages, which was picked up fine.
Recently I noticed that Google had spidered much deeper into the site and had indexed the checkout page for every item, plus all the other links off the main pages, giving a total of about 950 indexed pages for my site. Most of the indexed subpages contain the same keywords as the main pages and so may be seen as duplicate content.
Around that time the traffic dropped off, and I noticed most of my main pages had gone supplemental. Obviously I'd like to get my traffic back!
My main pages look like this: [mysite.com...] The various sublinks are ugly and formatted something like this: [mysite.com...]
Is it possible to write a rule in robots.txt that would disallow anything that doesn't match "main.php?g2_itemId="?
If it is possible, do you think it would do any good? Or am I barking up the wrong tree?
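I was picturing something along these lines. Treat it as a sketch: the Allow: line and the $ end-of-URL anchor are Google extensions rather than part of the basic robots.txt standard, so I'm guessing at the syntax here:

    User-agent: Googlebot
    # Google is supposed to apply the longest matching rule,
    # so this lets the main item pages through...
    Allow: /main.php?g2_itemId=
    # ...and lets the home page itself through ($ = end of URL)
    Allow: /$
    # block everything else, checkout/cart/login pages included
    Disallow: /

One thing to watch: other spiders that only understand plain Disallow: lines would probably just see the Disallow: / and stay out of the whole site, so this might need a separate User-agent: * section alongside it.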
Yes, but then "none" of the links will be followed, internal and external alike.
I think that's what I want. My site's a simple tree structure. Once you drill down to a main item, all the links are Checkout, View Cart, Login and other things I don't really care about being indexed.
I'm thinking that as long as there is a clear route from the home page down to all the main item pages, and as long as they get indexed, I can safely use an index,nofollow to turn them into dead ends... maybe? It's highly likely I'm completely misunderstanding how all this works, though!
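In other words, something like this in the head of each main item page (just a sketch; as I understand it, "index" is the default anyway, so "nofollow" on its own should mean the same thing):

    <meta name="robots" content="index,nofollow">

That should let the page itself stay in the index while telling Googlebot not to follow the Checkout / View Cart / Login links off it.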