Forum Moderators: open
I have two sites (ecomm) where almost all the pages (except the about us, etc.) are dynamic, like this: sitelink.com/Products/ProductDetail.asp?PROD_ID=140
Does this mean the content for each product description won't be spidered?
Another newbie question - what is PR(x)?
Thanks
[google.com...]
(at bottom of page)
Part of the problem was that, on some sites, SessionIDs could be re-used, or possibly hijacked and used by a malicious user (e.g., a session replay attack, or establishing a session outside of an SSL session and later using that same SessionID to access 'secure' information).
One of the recommended solutions was to reject 'user-supplied' sessions: if the SessionID passed from the user to the server is not in an 'active' list, either ignore it and generate a new one, or throw an error.
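That "reject and regenerate" approach can be sketched roughly like this (a minimal Python sketch with a hypothetical in-memory session store; the function names and the `active_sessions` set are illustrative, not any particular server's API):

```python
import secrets

# Hypothetical in-memory store of server-issued session IDs.
# A real site would keep these in a database or cache with expiry.
active_sessions = set()

def new_session():
    """Generate a fresh, server-issued SessionID and mark it active."""
    sid = secrets.token_hex(16)
    active_sessions.add(sid)
    return sid

def resolve_session(supplied_id):
    """Honor a client-supplied SessionID only if the server issued it.

    Any ID not in the active list (forged, replayed, or expired) is
    silently ignored and replaced with a newly generated one, so a
    malicious user can't establish or replay a session of their choosing.
    """
    if supplied_id in active_sessions:
        return supplied_id
    return new_session()
```

The alternative mentioned above (throwing an error instead of regenerating) is a one-line change in `resolve_session`.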
Depending on what the site does, it may not be wise for Googlebot to crawl sites with SessionIDs if it cannot reliably strip them out when presenting the SERP.
The other main point is that the CMS shouldn't require SessionIDs. Google is actually pretty good with dynamic URLs and getting better, but SessionIDs cause a problem because they expire. Make sure to check for that, too.
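For what it's worth, the "strip them out" idea boils down to dropping session parameters from the query string so every visitor (and crawler) sees one stable URL per product. A small Python sketch, where the parameter-name list is just an assumption of common SessionID names, not an exhaustive one:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Illustrative list of query parameters commonly used for session IDs.
SESSION_PARAMS = {"sessionid", "sid", "phpsessid", "jsessionid"}

def strip_session_id(url):
    """Remove session-ID query parameters from a URL.

    Keeps real content parameters (like PROD_ID) so the page still
    resolves, while expiring session tokens never end up in an index.
    """
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k.lower() not in SESSION_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))
```

So a URL like the one in the first post, with a SessionID tacked on, collapses back to the plain PROD_ID form.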