You are ok for now. Just don't declare the size of the graphics in the img tag.
Ugh... what a pain. I'll see if I can't think of anything else.
Maybe just submit each page individually...
A thought that I had as I am going dynamic on all product pages is this...
Index page makes a call to, let's say, men's rings. Now all men's rings pages are dynamically created using "obnoxious long links with question marks and other ugly stuff". What I am thinking of is making the call from the index page to a static flat page which describes men's rings and has one item on it. From this page the rest of my men's rings inventory is called dynamically.
Just a thought. I'd love someone else's thoughts on whether this is a viable solution.
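To make the idea concrete, here's a rough sketch of that static flat page being generated. Everything in it is hypothetical (the file name, the "cart.cgi" script, the copy), it's just the shape of the thing: a plain, spiderable page where the only ugly dynamic link is the one leading into the cart-driven catalog.

```python
# Hypothetical sketch: generate one static, spider-friendly category page
# whose only dynamic (ugly) link leads into the cart-driven catalog.
# File name and "cart.cgi" are made up for illustration.

STATIC_PAGE = "mens-rings.html"  # the flat page the index links to

def build_category_page():
    html = """<html>
<head><title>Men's Rings - Handcrafted Gold and Silver</title></head>
<body>
<h1>Men's Rings</h1>
<p>Plenty of keyword-rich description here for the spiders:
handcrafted men's rings in gold, silver and platinum...</p>

<!-- One featured item, described statically -->
<h2>Featured: Classic Gold Band</h2>
<p>A solid 14k gold band, comfort fit...</p>

<!-- The ONLY dynamic link: from here on, the cart takes over -->
<a href="cart.cgi?category=mens-rings&user_id=new">
Browse all men's rings</a>
</body></html>"""
    with open(STATIC_PAGE, "w") as f:
        f.write(html)
    return html

if __name__ == "__main__":
    build_category_page()
```

The spiders get a real page with real text to index, and the question-mark links only start once a visitor clicks through into the inventory.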
Well, the shopping cart generates a user id number when you enter the shopping area, and each link URL must include "user_id=id" in order for the cgi to track the number, so I don't know if an ssi call or something like that would work.
Perhaps I've missed the mark again, or am just too small-time (not so many inventory items) or something, but here's my take:
Your static catalog page is broken down into categories, then each category page is broken down into subcategories, and then finally into a page with thumbnails and descriptions of all the items. Each thumbnail leads to that item's own static page containing a larger pic, perhaps a more detailed description, and THEN the buy button that activates the script to place it in the shopping cart.
With that system in place, wouldn't the catalog page, the category page, the thumbnail page, and the single item page be enough fodder for the spiders? Why would you need to go to all the trouble of trying to get dynamic (ugly-URL'd) shopping cart pages indexed as well?
Like I said, maybe I'm way off the mark.
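The hierarchy I'm describing looks something like this as a sketch (all the file names are hypothetical). Every page in the tree is a plain static file a spider can crawl; only the Buy button at the bottom is dynamic.

```python
# Rough sketch of the static catalog hierarchy described above; every
# page is a plain file a spider can crawl, and only the Buy button on
# the item pages is dynamic. All paths here are hypothetical.

catalog = {
    "catalog.html": {                      # top-level catalog page
        "rings/index.html": {              # category page
            "rings/mens/index.html": [         # subcategory w/ thumbnails
                "rings/mens/gold-band.html",    # item page: big pic,
                "rings/mens/silver-band.html",  # description, Buy button
            ],
        },
    },
}

def all_pages(node):
    """Flatten the tree into the full list of spiderable static pages."""
    pages = []
    if isinstance(node, dict):
        for page, children in node.items():
            pages.append(page)
            pages.extend(all_pages(children))
    else:
        pages.extend(node)
    return pages

print(len(all_pages(catalog)))  # every one of these can be submitted/indexed
```

Even this toy tree gives the spiders five indexable pages before a single ugly cart URL appears.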
The "user_id=id" and other ugly stuff must be on every link inside the shopping area, or as the customer moves from one category/page to another, the cgi 'forgets' what's in their shopping cart (no cookies).
I can design the entire site with static pages, as there are fewer than 50 inventory items, and put lots of spider food in the product descriptions, but the ugly links would still need to be there in order for the shopping cart to remember what the visitor has already ordered when they move to a new page.
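Since there are no cookies, the session has to ride along in every internal URL. A minimal sketch of what that link rewriting amounts to (the parameter name user_id comes from the cart described above; the helper function itself is hypothetical):

```python
# Hypothetical helper: append the cart's session id to any internal
# link so the cgi can keep tracking the cart without cookies.

def add_session(url, user_id):
    sep = "&" if "?" in url else "?"
    return f"{url}{sep}user_id={user_id}"

# Every link on every page gets rewritten the same way:
links = ["catalog.cgi?cat=rings", "about.html"]
tracked = [add_session(u, "38271") for u in links]
# -> ["catalog.cgi?cat=rings&user_id=38271", "about.html?user_id=38271"]
```

Which is exactly why the URLs end up ugly on every page of the store, not just the cart itself.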
With my primary employer's site, we have a large informational section with lots of spider food in it, so I didn't worry much about the shopping cart area... just made sure that the front page of the store had some 'food' on it.
For this site, the content is 99% store, so I have to figure out a way to lead the spiders around the site in spite of the ugly links, because that's all there is to index.
Perhaps I'll just build each page, not worry about the links, and submit each page separately to the engines. I think the client will get most of her traffic from sites like AOL, Yahoo and Go, which prioritize their index listings anyway.
My site also relies on the big ugly links to keep track of the shopping cart, but I wrote a small perl script to generate static versions of each page (with ordinary links between them) whenever I modify the site.
Basically, a visitor (including spiders) can explore the static site to their heart's content until they hit an 'Add to Cart' button, and from then on it's all dynamic.
Works well for me, but whether it's a good solution depends on the layout and backend of your site.
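The script basically just strips the session junk out of the links. Mine was perl, but the core of the idea in Python looks something like this — the regexes and the example link are a simplified sketch, not the actual script:

```python
import re

# Simplified sketch of a "static-ify" pass: take a dynamic page's HTML
# and rewrite its links into ordinary, spider-friendly ones by dropping
# the user_id query parameter. (The real script was perl; this is just
# the idea.)

def staticize(html):
    # drop "user_id=..." wherever it appears inside a link
    html = re.sub(r'([?&])user_id=[^"&]*&?', r'\1', html)
    # clean up any trailing "?" or "&" left dangling before the quote
    html = re.sub(r'[?&]+(")', r'\1', html)
    return html

dynamic = '<a href="rings.cgi?user_id=38271&cat=mens">rings</a>'
print(staticize(dynamic))
# -> <a href="rings.cgi?cat=mens">rings</a>
```

Run that over every page whenever the site changes and you get a parallel set of clean-linked pages for the spiders, with the dynamic cart taking over only at 'Add to Cart'.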