Forum Moderators: open
It's definitely a big advantage to keep your URLs as short and uncomplicated as possible. It's not only that URLs above a certain length may not be read at all. There's also the problem that if you don't have a very high PR, Googlebot may apply stricter rules to what it will read and how deep it will go into your website.
AFAIK it's almost impossible to optimize a page for more than two or perhaps three keywords unless you go for a complete phrase. For optimization's sake there is really no good reason to have a lot of keywords in a URL, unless perhaps you have structures like www.keyword1.com/keyword2/keyword3/keyword4/....
[somesite.com...]
To:
[somesite.com...]
Don't most search engines rank sub-directories on a domain with a lower priority than pages found in the root directory? I realize that just getting a ranking is better than nothing, but it seems that these pages would be deemed less important than the mypage.asp file itself.
I've seen some sites that do something like this:
[somesite.com...]
That seems like it would rank better. Does anyone know of a company that sells such an ISAPI filter for IIS?
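Whether or not a commercial ISAPI filter exists for this, the rewriting idea itself is simple: a filter intercepts a path-style URL and maps it back to the query-string URL the server script actually expects. Here is a minimal sketch of that mapping in Python (all of the path segments, parameter names, and the `products.asp` script are made-up examples, not anything from a real product):

```python
import re

# Hypothetical rewrite rule: map a path-style URL such as
#   /products/widgets/blue
# back to the query-string form the server script expects:
#   /products.asp?cat=widgets&color=blue
# A real ISAPI filter (or an Apache mod_rewrite rule) would apply
# this same kind of pattern at the web-server level, before the
# request reaches the script.
RULE = re.compile(r"^/products/([^/]+)/([^/]+)$")

def rewrite(path: str) -> str:
    m = RULE.match(path)
    if not m:
        return path  # no rule matched; pass the URL through unchanged
    cat, color = m.groups()
    return f"/products.asp?cat={cat}&color={color}"

print(rewrite("/products/widgets/blue"))
# -> /products.asp?cat=widgets&color=blue
```

The point is that visitors and crawlers only ever see the clean path-style URL; the query string exists only internally after the rewrite.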
[webmasterworld.com...]
I have URLs that are very long and they get crawled. Example:
www.site.com/p2-cat997-si-page10-sort2-per12-pass-user-type.html
http://www.domainname.com/Merchant2/merchant.mvc?Screen=CTGY&Store_Code=FN&Category_Code=J
Just to stress: keep the number of URL variables to 2 or fewer. The URL you posted has 3, and there is a very good chance it won't be crawled.
If you are concerned about the length of the URL, you could shorten the names of the URL variables. You could change your code so that "Screen" = "s", "Store_Code" = "sc", and "Category_Code" = "cc".
That would turn:
http://www.domainname.com/Merchant2/merchant.mvc?Screen=CTGY&Store_Code=FN&Category_Code=J
into:
http://www.domainname.com/Merchant2/merchant.mvc?s=CTGY&sc=FN&cc=J
I've also started using popular keywords as variable names so that I have those keywords in the URL. But I don't know how much that helps.
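The renaming suggested above can be done mechanically. This is a small sketch using Python's standard `urllib.parse` module; the mapping table comes straight from the post (`Screen` to `s`, and so on), but note that your server-side script would also have to be changed to accept the short names:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Mapping of verbose query-variable names to short ones, as
# suggested in the post above. Any variable not listed here is
# left unchanged.
SHORT = {"Screen": "s", "Store_Code": "sc", "Category_Code": "cc"}

def shorten_query(url: str) -> str:
    parts = urlsplit(url)
    # Rename each key while preserving its value and the pair order.
    pairs = [(SHORT.get(k, k), v) for k, v in parse_qsl(parts.query)]
    return urlunsplit(parts._replace(query=urlencode(pairs)))

print(shorten_query(
    "http://www.domainname.com/Merchant2/merchant.mvc"
    "?Screen=CTGY&Store_Code=FN&Category_Code=J"
))
# -> http://www.domainname.com/Merchant2/merchant.mvc?s=CTGY&sc=FN&cc=J
```

This only trims the variable names, not the number of variables, so it helps with overall URL length but not with the 2-variable limit mentioned earlier.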
I read somewhere about 1024 or 2048 characters as a spec limit, but I can't verify it in my bookshelf yet.