1. A purchase button
2. A tell-a-friend button (loads an email script along with the product info)
3. An add-to-wishlist button
4. An enlarge-product-image button
5. Complementary products the customer may be interested in, say 5 or 6 for each product.
I have a problem: the robots don't want to index my dynamic content, and I think it may be because indexing the product pages overwhelms them. I mean, you're talking each product times at least 10 more links, for a total of about 5,400 links. Of course the complementary products are repeats, but it still seems like too much.
Can I ban robots from a specific file, say tellafriend.asp or addwishlist.asp, and does it matter if the file has a query string on the end of it?
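For what it's worth, a robots.txt along these lines would block compliant crawlers from those two scripts (assuming they sit in the site root; the filenames are the ones mentioned above). Disallow rules match on the path prefix, so a query string on the end doesn't matter: /tellafriend.asp?id=42 is still covered by the /tellafriend.asp rule.

```
User-agent: *
Disallow: /tellafriend.asp
Disallow: /addwishlist.asp
```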
Hopefully I explained this well enough, if I didn't let me know and I'll try and clarify.
Use a meta robots tag with noindex or nofollow on the specific pages you want to control.
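As a sketch, the tag goes in the head of each page you want kept out: noindex keeps the page out of the index, and nofollow tells the spider not to follow the links on it (you can use either value on its own).

```
<head>
  <!-- keep this page out of the index and don't follow its links -->
  <meta name="robots" content="noindex,nofollow">
</head>
```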
With dynamic pages, part of the problem is the complexity of the URL. The sophisticated crawlers will follow dynamic URLs as long as they are not too complex.
In addition, check the file size of the generated pages. You don't want to end up with a single page much over 100k; ideally, well under 100k.
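A quick way to sanity-check that, sketched in Python (the sample string is a stand-in for a real generated page; in practice you would feed in the rendered HTML):

```python
# Rough size check for generated pages against the ~100k guideline.
LIMIT = 100 * 1024  # ~100k in bytes

def page_size_ok(html: str, limit: int = LIMIT) -> bool:
    """Return True if the rendered page is under the size limit."""
    return len(html.encode("utf-8")) < limit

# A small stand-in page easily passes the check.
sample = "<html><body>" + "<p>product</p>" * 100 + "</body></html>"
print(page_size_ok(sample))  # True
```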
Watch out for spider traps. A site that feeds a crawler an infinite loop of URLs will eventually cause it to stop indexing altogether.
Check how much of your site is already indexed in the engines, and look at how each indexed page was reached from the home page. Sometimes that gives clues.