I understand there are 2 issues.
1. The characters in the URLs.
Will Googlebot have a tough time with these URLs?
Is there a solution, simple or complex, to issues 1 and 2?
[edited by: jatar_k at 7:52 pm (utc) on Jan. 14, 2003]
[edit reason] shortened url a little [/edit]
It looks to me like your two questions cover the same issue: the characters in the URL, which are the session ID.
Have you looked at using a transparent session ID?
Look for session.use_trans_sid.
This one shows you how to use ini_set to change it; the user comments below are good as well.
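A minimal sketch of the ini_set approach, assuming a reasonably recent PHP. Note that depending on the PHP version, session.use_trans_sid may only be changeable in php.ini or .htaccess rather than at runtime, so treat this as one possible configuration, not the definitive fix:

```php
<?php
// Keep the session ID out of the URL so spiders see stable links.
// '0' disables transparent URL rewriting of the session ID.
ini_set('session.use_trans_sid', '0');
// Accept session IDs from cookies only, never from the query string.
ini_set('session.use_only_cookies', '1');
session_start();
```

The trade-off is that visitors with cookies disabled lose their session, which is often acceptable when the alternative is spiders indexing hundreds of copies of each page under different session IDs.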
Will google bot have a tough time with these urls?
Yes, anytime you add ?var=value you are starting to flirt with possible problems, regardless of what the engines say. For every var you add, your possibilities for problems seem to multiply exponentially.
(1) to a spider, '?sect=foo&subsect=bar' is not the same as '?subsect=bar&sect=foo', even though they are of course the same to your script. Pick one. [Adam - this problem doesn't quite grow exponentially, but it does grow as the factorial of the number of variables, which is close enough for horseshoes ;)]
(2) If each visitor gets a session ID, and URLs on your site are re-written to include it, you might easily fool a spider into thinking that you had removed all links to a page it saw last month, just because the session ID is different.
(3) I've probably overlooked at least one possible complication.
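Point (1) above, "pick one", can be enforced in code. This is a sketch, not from the thread: canonical_query is a hypothetical helper that sorts parameters by key before building the string, so every link to the same page uses the same URL. It assumes PHP 5+ for http_build_query:

```php
<?php
// Hypothetical helper: build the query string from sorted keys so that
// the same set of parameters always yields the same URL for spiders.
function canonical_query(array $params): string
{
    ksort($params);                    // deterministic key order
    return http_build_query($params);  // e.g. "sect=foo&subsect=bar"
}

// Both orderings now produce identical URLs.
echo canonical_query(['subsect' => 'bar', 'sect' => 'foo']), "\n";
echo canonical_query(['sect' => 'foo', 'subsect' => 'bar']), "\n";
```

Routing every internal link through one such helper is the simplest way to guarantee a single URL per page, whatever order your templates happen to pass the variables in.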
I'd pretty much say that session IDs in the URL are a bad plan if you want to let spiders in. I've got a site that uses them, but it's a place where spiders are most unwelcome for other reasons. (If you can't log in, go away.)
The word exponential was carefully chosen to imply that query strings should be avoided if humanly possible, but thanks for calling me on it, dingman. ;)