I have dynamically generated product pages (around 600 products), and each page lists the following:
1. A purchase button
2. A tell-a-friend button (loads an email script along with the product info)
3. An add-to-wishlist button
4. An enlarge-product-image button
5. Complementary products the customer may be interested in, say 5 or 6 for each product.
My problem is that the robots don't want to index my dynamic content, and I suspect it's because the sheer number of links on the product pages overwhelms them. I mean, you're talking each product times at least 9 or 10 links, for a total of well over 5,000 links. Granted, the complementary products are repeats, but it still seems like too much.
Can I ban robots from a specific file, say tellafriend.asp or addwishlist.asp, and does it matter if the file has a query string on the end of it?
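Something like this is what I have in mind, assuming robots.txt prefix matching works the way I think it does (the filenames are from my site; adjust the paths if the .asp files aren't at the root):

```
# robots.txt — must sit at the site root (e.g. http://www.example.com/robots.txt)
User-agent: *
# Disallow rules match by URL-path prefix, so (if I understand correctly)
# these should also cover any query string tacked on the end,
# e.g. /tellafriend.asp?productid=42
Disallow: /tellafriend.asp
Disallow: /addwishlist.asp
```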
Hopefully I explained this well enough; if I didn't, let me know and I'll try to clarify.