Home / Forums Index / Marketing and Biz Dev / General Search Engine Marketing Issues
Forum Library, Charter, Moderators: mademetop

General Search Engine Marketing Issues Forum

URL structure
We're close, but stuck on details

 9:42 pm on Feb 3, 2009 (gmt 0)

We're developing a local site and have some questions on the best URL structure.

1. For the listing pages, we need to include an ID number for architecture purposes. Here is what we're torn between:


I'm leaning towards the 2nd option. I don't like using parameters in the URL and don't want to associate the page another layer off the root if I can avoid it.

2. There are multiple detail pages within this listing. So our options are:


Which do you think would be better?

3. On the category page, the URL will be:


But there is unavoidable pagination involved, so we're not sure if we should do either of the following:


Suggestions greatly appreciated - thanks!



 2:51 am on Feb 6, 2009 (gmt 0)

don't want to associate the page another layer off the root

I don't see why. If the interlinking is good and equal, there's no real difference.

I would prefer example.com/category/city-state/2 (to avoid the ?)

Just make sure that page 2 is inter-linked, same as page 1.
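The clean pagination scheme suggested above can be sketched in code. This is a hypothetical illustration, not the site's actual routing: the path layout `/category/city-state/2` follows the example URL mentioned in this reply, and the function names are invented.

```python
# Hypothetical sketch of parameter-free pagination URLs like
# example.com/category/city-state/2 (page 1 omits the page segment).

def build_listing_url(category: str, city_state: str, page: int = 1) -> str:
    """Build a clean listing path; page 1 gets no page segment."""
    path = f"/{category}/{city_state}"
    if page > 1:
        path += f"/{page}"
    return path

def parse_listing_url(path: str) -> dict:
    """Recover category, location, and page number from a clean path."""
    parts = [p for p in path.strip("/").split("/") if p]
    page = 1
    if parts and parts[-1].isdigit():
        page = int(parts.pop())
    return {"category": parts[0], "city_state": parts[1], "page": page}
```

Treating page 1 as the bare category URL (rather than `/1`) avoids creating two URLs for the same first page.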


 1:36 pm on Feb 6, 2009 (gmt 0)

From an SEO perspective move as far away from ? and other parameters as you can!


 2:12 pm on Feb 6, 2009 (gmt 0)

Our biggest issue is with the main listing and the ID we need to put in there.

Which is better?



 2:15 pm on Feb 6, 2009 (gmt 0)

I like...


No file extension, short and sweet. Eliminate all extraneous identifiers from the string. Be sure the string is consistent also. If you move to lower levels in the taxonomy, the string will most likely get appended like a breadcrumb.

It looks like you may end up with some long URIs based on your original post. I'd be focusing on trimming those back as much as you can. Shorter URIs are much easier to work with no matter how you view it. :)


 2:21 pm on Feb 6, 2009 (gmt 0)

Thanks for the input. Any concern that putting the ID as a folder pulls it another level off the root? Will that have an adverse impact on rankings?


 2:23 pm on Feb 6, 2009 (gmt 0)

Don't put the ID as a folder. That would be the page level destination and the end of the URI string. No forward slash, no extension, just

Any concern that putting the ID as a folder pulls it another level off the root? Will that have an adverse impact on rankings?

Not really unless of course click path is affected. It is not the directory depth that one needs to be concerned with but "how many clicks" it takes to get to the final destination. Shorter click paths equal better performing pages in most instances.


 2:27 pm on Feb 6, 2009 (gmt 0)

Thanks so much!


 2:30 pm on Feb 6, 2009 (gmt 0)

P.S. Don't end those URIs with trailing forward slashes unless there is content under that level. Either end with the file extension or with no extension at all; the latter is what I'd recommend moving forward.
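That trailing-slash rule can be expressed as a small normalization step. A minimal sketch, assuming a hypothetical `normalize` helper applied before routing (or used to pick the target of a 301 redirect); the function name is invented:

```python
def normalize(path: str) -> str:
    """Strip a trailing slash from page-level URIs; keep "/" for the root.

    Directory-level URIs with content under them would be excluded
    upstream; this only handles page-level destinations.
    """
    if path != "/" and path.endswith("/"):
        return path.rstrip("/")
    return path
```

Enforcing one canonical form (with or without the slash, but never both) prevents the same page from being indexed under two URLs.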


 2:35 pm on Feb 6, 2009 (gmt 0)

I never realized this before, but are you saying that


is the same level as


as long as there's no trailing slash?


 2:41 pm on Feb 6, 2009 (gmt 0)

example.com/something/something is the same level as example.com/something.html

No. The first one is at the first sub directory level. Your second example is at the root. They are not the same. But, this and this "usually" are...


What you see above is referred to as Content Negotiation where extensions are removed from the URI strings. There is no need for those to be visible or indexed.

It gets pretty tricky when working with this type of structure and you have to be very strict in your naming conventions for scalability. You also need to make sure that all permutations of the URI return the proper server headers. I usually walk backwards through the URI (hack it) to see what gets returned in the headers...
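Walking backwards through a URI like this can be automated. A sketch, assuming a hypothetical `uri_walk` helper that lists every ancestor permutation to check (in practice you would issue a HEAD request for each and inspect the status code, e.g. with `urllib.request`):

```python
from urllib.parse import urlsplit

def uri_walk(url: str) -> list:
    """Return the URL plus each ancestor path, deepest first,
    so every permutation can be checked for its server headers."""
    parts = urlsplit(url)
    segments = [s for s in parts.path.strip("/").split("/") if s]
    base = f"{parts.scheme}://{parts.netloc}"
    urls = []
    for i in range(len(segments), 0, -1):
        urls.append(base + "/" + "/".join(segments[:i]))
    urls.append(base + "/")  # finish at the root
    return urls
```

Each URL in the list should return a deliberate response (200, 301, or 404), never an accidental duplicate of another level.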



 3:07 pm on Feb 6, 2009 (gmt 0)

I was looking at some of your original examples and I'm feeling awfully giving this morning. :)


So it appears that you are setting up a directory that is global in nature and you are doing a regional type taxonomy? After dabbling in that space for many years, I've learned some things. And with the search engines becoming as smart as they have, I've changed my strategies in some areas. For example, I don't want long keyword-laden URIs anymore. I "know" that Google can determine that California and CA are one and the same given the taxonomy of the website. I also "know" that Google can determine that US is the United States.

Knowing the above, I might look at a URI structure like this...


I see you are trying to get the company name in the URI? That is going to cause scalability issues moving forward. You are surely going to have two companies with the same name. And yes, I see that you are blending other parameters with the company name to negate this. I wouldn't do it that way. I'd give each company a unique ID and use that moving forward. That way there is "never" a chance for duplication and it scales nicely.

You may also find yourself receiving requests to remove company names from URI strings. I've been faced with that in some instances. It is a branding issue for them, and if you start appearing in top results for company name searches, you have to tread lightly. Removing the company name reference from the URI covers your bases in this area. You have plenty of other ways to target company name searches that are more subtle than stuffing the URI. Also note that some company names are very long. That is only going to add to the unmanageability of the URIs moving forward.
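The unique-ID approach described above can be sketched as follows. Everything here is hypothetical (the registry, the `/company/` path, and the function names are invented for illustration); the point is that two companies with the same name can never collide in the URI space:

```python
import itertools

# Hypothetical sketch: issue each company a stable numeric ID and use
# it as the page-level identifier instead of the (non-unique) name.
_next_id = itertools.count(1)
_registry = {}

def register_company(name: str) -> str:
    """Assign a unique ID and return the extensionless listing URI."""
    cid = next(_next_id)
    _registry[cid] = name  # the display name lives in the data, not the URI
    return f"/company/{cid}"
```

Because the ID, not the name, keys the URI, renaming or de-branding a listing later never forces a URL change.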

Now, if you are dealing with US only, I'd go this route...


And, I'd be sure that I was serving a Table of Contents (Sitemap) at each sub directory level that was marked as noindex. I just want the bot to follow those TOC links so that it can traverse deeper into the taxonomy.

There's a bit more to this but I think you can get a feel for where I'm going with it. Be sure to have a master Sitemap that links to all the sub-level Sitemaps. Keep everything connected in a logical sequence. Take control of the bots and provide them with direction. Don't just let them come in and start bouncing all over the place. That would not produce optimal indexing.
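The master-Sitemap idea maps directly onto the sitemaps.org sitemap index format. A minimal sketch, with placeholder URLs (the helper function is invented for illustration):

```python
# Hypothetical sketch: a master sitemap index linking each sub-level
# sitemap, per the sitemaps.org protocol.

def sitemap_index(sitemap_urls: list) -> str:
    """Render a sitemap index document from a list of sitemap URLs."""
    entries = "\n".join(
        f"  <sitemap><loc>{u}</loc></sitemap>" for u in sitemap_urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</sitemapindex>"
    )
```

Each subdirectory level would serve its own sitemap, and only the master index needs to be submitted or referenced from robots.txt.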

WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved