Forum Moderators: open
After changing the architecture of our site to ASP, we have finally been indexed to the level that we want. There has been an unexpected side effect though that is causing trouble on the site.
We are using Ecometry to run our paper and online catalog business. This program issues a User ID upon connection to our HP. The problem is that visiting spiders either are not issued a User ID or do not accept one. Instead, our URLs are indexed as:
[mysite.com...]
This 'error' state of the User ID is maintained throughout the shopping session.
Now, I would rather ensure that none of these query strings get indexed at all. If a User ID were ever indexed, or worse, if multiple User IDs were indexed, we would have quite a mess.
Is there a way to rewrite the URLs to remove everything after the question mark before the spider reads the page? Is there a better approach to feeding these ASP pages?
You are setting a session variable in your global.asa when a session starts, and that variable holds the User ID. Since spiders can't accept cookies, you are setting that variable to "Error on Connect", and it is then passed around in your query string during the crawl. Why not just set it to "spider"? I think you will want to check for this in the global.asa: when a user hits the site, check whether they can receive cookies, and if so, pass out a valid User ID.
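A rough sketch of that check in global.asa (classic ASP/VBScript; GetNewUserID is a hypothetical wrapper around your Ecometry call, and the cookie test here is only a first approximation, since a first-time human visitor may also arrive with no cookies):

```asp
<SCRIPT LANGUAGE="VBScript" RUNAT="Server">
Sub Session_OnStart
    ' If the request carries no cookies at all, assume a spider.
    ' A more reliable test sets a cookie and redirects once to see
    ' whether it comes back, but this shows the basic idea.
    If Request.ServerVariables("HTTP_COOKIE") = "" Then
        Session("UserID") = "spider"
    Else
        ' GetNewUserID is a made-up helper standing in for
        ' whatever issues a real User ID from Ecometry.
        Session("UserID") = GetNewUserID()
    End If
End Sub
</SCRIPT>
```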
My question is: why pass it in the query string at all? Keep it in the session variable. That way no spiders or users ever see the User ID. Let me know if that is what you mean.
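In other words (a hedged sketch; the page name is made up), each page reads the ID from the session instead of carrying it on every link:

```asp
<%
' Read the ID from the Session object rather than the query string.
Dim userId
userId = Session("UserID")

If userId = "spider" Then
    ' Crawler session: serve the page without personalization.
End If
%>
<!-- Links stay clean, with nothing after a question mark: -->
<a href="catalog.asp">Browse the catalog</a>
```

Because the ID never appears in the URL, there is nothing after the question mark for a spider to index in the first place.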
Once we get the CGI output moved over to the ASP pages, we will pass the ID along as a session variable.
For now I think that static pages are going to be the way to go.