
Home / Forums Index / Google / Google SEO News and Discussion
Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

    
Should I flatten my store's URL structure for better SEO?
quicksilver1024

5+ Year Member
Msg#: 4543213 posted 12:51 am on Feb 7, 2013 (gmt 0)

From what I understand, the closer a page is to the root the better it is for SEO.

So something like
http://mystore.com/book1 carries more SEO authority than http://mystore.com/cat1/author1/book1, given that there's no requirement to keep cat1 in the URL.

If I were to flatten my store's URL to
http://mystore.com/book1, should I also reflect this in the breadcrumb so that it no longer follows cat1 > author1 > book1?

And if I were to flatten my store's URLs, how should I configure my breadcrumbs, since there would be no real directory structure for product pages? For example, in the breadcrumb
Store > Cat1 > Author1 > Book1, Cat1 would point to http://mystore.com/cat1, and that page would contain a product listing of the items under cat1.

Note that I want to keep the page
http://mystore.com/cat1 and even http://mystore.com/cat1/author1 (or an alternative http://mystore.com/author1).

Thanks

 

tedster

WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 4543213 posted 3:19 am on Feb 7, 2013 (gmt 0)

From what I understand, the closer a page is to the root the better it is for SEO.

That may have been the case quite a while ago; however, from what I can tell it isn't true these days.

The file path itself seems to be a very small factor these days - as well it should be IMO, since the URL is a purely technical issue that is not really a predictor of how well the page will be received by an end user. Google is getting better at using other factors to predict a happy chemistry with their user. After all, why should one site be rewarded for being able to use a plug-in when another is more technically challenged but has great content?

A flat URL structure is nicer to look at, and it won't be truncated as often in the SERP. That might help the clickthrough rate and so it would indirectly be a small boost. But I see examples today of ridiculously garbaged up monstrous URLs that are pretty difficult competitors to outrank.

Robert Charlton

WebmasterWorld Administrator, Top Contributor of All Time, 10+ Year Member, Top Contributors of the Month
Msg#: 4543213 posted 3:23 am on Feb 7, 2013 (gmt 0)

From what I understand, the closer a page is to the root the better it is for SEO.

So something like http://mystore.com/book1 has more SEO authority than http://mystore.com/cat1/author1/book1 given that there's no requirement to keep cat1 in the url.

This is conflating a bunch of different considerations... including PageRank drop in levels from home, directory structure vs link structure, hierarchical navigation vs flattened navigation vs "flattened urls", faceted navigation and duplicate content issues, and breadcrumbs... etc. It's not clear which you're asking about, but I suspect the whole ball of wax.

First, you're possibly confusing navigation structure with directory structure. They're not necessarily the same. Hold that thought for a moment.

"Flattening navigation", in general usage, describes a navigation structure that puts as many nav links as possible on the home page... thus, some feel, boosting link juice for SEO. Not true, as you're dividing available link juice among a greater number of links, and not prioritizing anything. For many people, IMO, this is an approach to avoid thinking about what a good navigation should be. The link-everything-from-home navigation structure doesn't scale very well, and a good hierarchical navigation structure is likely to work a lot better. That's my quick take on flattened navigation. I'll let someone else come back to nav structure later.

Your main concern, though, seems to be about faceted navigation and breadcrumbs... ie, about different ways of reaching the product page... which also comes about because of confusing nav structure and directory structure (aka folders).

If you have faceted navigation and you confuse your navigation structure with your directory structure (as many often do), you'll end up with dupe content, since the same product page will appear under different folders in your file paths.

There has evolved a strategy of putting product pages in the root, independent of the navigation path, to permit faceted navigation while avoiding multiple urls, with different file paths, for your product pages. That, along with the breadcrumb issue, is discussed in this thread....

Choosing the best url structure
http://www.webmasterworld.com/google/4475535.htm [webmasterworld.com]

This best url structure thread isn't about flattened navigation, though; that's a whole other issue. It is one of the best discussions we've had on faceted navigation, and should help with many of your questions.

ZydoSEO

WebmasterWorld Senior Member, 5+ Year Member
Msg#: 4543213 posted 4:09 am on Feb 7, 2013 (gmt 0)

Your URL structure has zero effect on how many clicks away from the home page a particular URL "lives". I can have a URL that is example.com/L1/L2/L3/L4/L5/L6/L7/L8/L9/L10/page.html. If I link to it anywhere on the home page then that page is exactly 1 click away from the home page... not 11 as the URL structure might imply.

"Clicks away from the home page" is ALL about internal linking of your pages and has absolutely zero to do with your URL structure.

quicksilver1024

5+ Year Member
Msg#: 4543213 posted 4:26 am on Feb 7, 2013 (gmt 0)

Thanks for all the explanation guys. And thanks for the link and explanations Robert - I got a lot from it.

So it seems my URL/directory structure doesn't have to follow the breadcrumb/faceted nav structure, meaning it would be optimal to have all my product pages in the root; and I can have the different points of my breadcrumb link to the category listing pages that each product falls under.

The issue I can see here is disorganization once I reach 200+ products all listed under the root folder. Although each product will likely have a unique product title and I can prevent duplicate pages with the canonical tag, I'm not sure how Google will see my store if I have 200+ (and possibly one day 500+) products under root.

People always talk about siloing your structure, but I'm not sure how this works for products with unbalanced categories/subcategories. My store has only 3 categories: classics, modern, other. The subcategories are more numerous, being a list of author names. Because the subcategories are so narrow, 90% of the time there will be only 1 product under a subcategory.

If I were to NOT add all products under the root, should I then use the structure http://mystore.com/cat1/book1 (where book1 is the product name) instead of http://mystore.com/author1/book1?

So to make it more clear, I have two questions:

- Should I put 200 - 500+ products all under root for optimal SEO (keeping keywords close to the domain name)?

- If I were to put products under a directory, should it be my very narrow 'author1' subcategory or my very broad 'cat1' category?

[edited by: Robert_Charlton at 5:04 am (utc) on Feb 7, 2013]
[edit reason] Disabled auto-link to make sample url display [/edit]

Str82u
Msg#: 4543213 posted 1:16 pm on Feb 7, 2013 (gmt 0)

My busiest site has most pages in the root, breadcrumbs link to the index pages for the content, three main categories like yours. If I were to move into a directory structure, it would be purely organizational and logical, only using enough directories to do the job honestly.

As for SEO, @tedster is right that "flat" was once effective, but my crappiest competitors are doing very, very well with made-up directory structures that are only used to stuff keywords into the URL.

ergophobe

WebmasterWorld Administrator, Top Contributor of All Time, 10+ Year Member
Msg#: 4543213 posted 3:24 pm on Feb 7, 2013 (gmt 0)

Devil's advocate hat going on, and I'm wandering slightly off topic to respond *only* to this castoff statement by Ted:

After all, why should one site be rewarded for being able to use a plug-in when another is more technically challenged but has great content?


Because, assuming they both have great content and links, to a machine that still isn't all that smart it's not at all clear which one has the better information...

  1. The URL is part of the UI and an intelligent URL structure does indicate quality in the same way that having logically structured pages does.
  2. Quality is subjective. You might think good grammar is an indicator, I might think readable URLs are a good indicator and Bob might think getting the video frame rate right is an indicator of quality ;-) So I might reasonably count all of those, and all are, from a given perspective "technical" issues (as of this moment, I will henceforth consider myself a "language technician" when copyediting).
  3. The technically challenged are constantly being penalized. Why should this be different? Lately I've been doing "first look" SEO analyses of sites for a firm that gets a lot of bites from people at the bottom of the chain. 75% don't even canonicalize their domains. Sure, Google *tries* to work things out for you if your site is accessible at both www.example.com and example.com, but these sites are plagued with the most basic dupe content issues, things that I thought most webmasters had solved a decade ago. Google would *prefer* not to penalize you for being technically incompetent, but they aren't necessarily that smart yet.


Gimme my morning tea and I think I might have four more reasons why.

Obviously, I'm not in any way disputing the excellent advice from Ted, Bob and Zydo. Don't worry, I'm not about to turn now to my advice on the importance of getting your meta keywords right. I'm just in a philosophical mood and couldn't let a "why" go by without a "because" (forgive me, Ted, but as you know, I'm wont to get sidetracked into such things when there's laundry that needs doing).

tedster

WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 4543213 posted 3:46 pm on Feb 7, 2013 (gmt 0)

I could have expressed myself more clearly, too. I have worked with some major corporate sites that could not easily fix technical issues without a huge development cycle and many millions of dollars down the drain. It's important for Google not to throw away such content lightly.

In fact, I've noticed over recent years that Google does compensate more and more for common technical issues that websites have. That doesn't mean that best practices can just be tossed away, but many times they do not cause the ranking problems that they used to.

TheMadScientist

WebmasterWorld Senior Member, Top Contributor of All Time, 5+ Year Member
Msg#: 4543213 posted 4:03 pm on Feb 7, 2013 (gmt 0)

If you breadcrumb right (and no, they don't have to match the file path), those are what will show in the SERPs rather than the URLs. So personally, I'd make the URLs whatever works best and "get the breadcrumbs right" so they show ... ugly-URLs-in-SERPs issue solved ;)

(I don't think anyone (except webmasters) really cares much what they look like in the browser ... I've seen Google's and they don't seem to have much of a negative impact on traffic.)

ergophobe

WebmasterWorld Administrator, Top Contributor of All Time, 10+ Year Member
Msg#: 4543213 posted 2:39 am on Feb 8, 2013 (gmt 0)

I've noticed over recent years that Google does compensate more and more for common technical issues that websites have.


Agreed (you know I was just making noise, right?). I've recently seen sites with terrible canonicalization, but Google nailed it - doing a site: search showed no non-canonical URLs.

I think that as much as possible, Google does not want to count technical issues against a site - URL canonicalization, code validation and all that - and they have put efforts into overcoming those hurdles. I think these days you have to be pretty messed up to confuse the Google crawler - though in the last two months I saw a two year old personal blog/brochure site for a performer who does all his own writing and it had 85,000 pages in the index according to the numbers returned by a site: search. Must be a hell of a prolific writer ;-)

So yes - Google will do its best to decipher your site. In fact, though people will probably scream at this, I would say that Google wants to rank your site as high as possible based on the content and give as little weight as possible to the technical setup. The technical issues only start to really hurt when you are so screwed up that Google just can't figure out what's what.

Robert Charlton

WebmasterWorld Administrator, Top Contributor of All Time, 10+ Year Member, Top Contributors of the Month
Msg#: 4543213 posted 8:14 am on Feb 9, 2013 (gmt 0)

To try to get this discussion back on topic...

quicksilver1024 - I think you still have some basic areas of confusion which make it hard to discuss this. You're still thinking of directories and categories as the same thing, and they're not. Before you can deal with either the faceted navigation problem or the question of categorization, you need to make that distinction. As I'm pressed for time and g1smd has already said it extremely well, I'm going to refer you to two posts/discussions in which he addresses those issues, and I'll try to guide you through them.

First... within the "Choosing the best url structure" thread I recommend above, there's a link to another thread that you may have to read a bunch of times to absorb. It's...

How important is it to organize pages into directories?
http://www.webmasterworld.com/google/4364322.htm [webmasterworld.com]

Note the post where g1smd says....
While the old way was to have a folder hierarchy for the site and show that structure in the URL, nowadays that will often get you into trouble.

The problem comes when a deep content page might be listable in several categories...

He then goes on to say, and this is important...

...for a site with multi-faceted navigation it is fine to have a folder structure for category and search pages, but the URLs for the final product pages should NOT include any of the folder names.

Examples in the post compare the "old way" of organizing product pages... in which the product url does contain the folder names, and the "new way", in which category and subcategory pages all link to the same product url, which is in the root, as this one is...

www.example.com/173820082-acme-model-34684-widget

See a very similar discussion, also by g1smd, in this thread...

Matt Cutts Interviewed by Eric Enge
http://www.webmasterworld.com/google/4097582.htm [webmasterworld.com]

Most important regarding this discussion...
The content page needs a single URL:

www.example.com/24981-left-handed-widget

The URL does not need to record the category hierarchy path the user took to get to that page.

Note that g1smd's product examples all have a unique record number (or row number or catalog number) to organize the database (or, if you're doing this manually, your spreadsheet). That's a separate discussion, probably for a separate forum on WebmasterWorld... depending on your platform, etc. The URL doesn't strictly need to contain that number, but for product pages on a site with faceted navigation, in my experience that turns out to be the easiest approach.

Note that the issue in putting the product page in the root... or, if you choose, in a directory called /products/... is not, as you ask, keeping the keywords "close to the domain name." It's about keeping the directory names out of the product page urls.

Categories and Subcategories

We then have the question of categories and how to organize them. This is where questions about siloing, etc, come in.... and in your example there is kind of a question about flatness, but that's a navigation structure issue, not really a url issue.

Just briefly regarding nav structure, let me say that "flat" is a relative term. Many discussions about "flat" jump from the ridiculous to the ridiculous... ie from....

- an obvious wrong way to do it, say to have only three product category links from home...
to...
- a way that is just as wrong but not so obvious (which would be to have several hundred product links from home).

Chances are your site should fall more into the 20 to 30 to 40 links from home... categories, major subcategories, etc... but I'd have to know more about how good your inbound links are...

...and then, before I could break it down, I'd need to know what's searched and what you sell.

IMO, cross-indexing "classics, modern, other" with authors strikes me as something that's going to give you some really uneven clustering.

Linked from home, you'd probably want some links for author categories... maybe including a semi-alphabetical breakdown to make sure you cover them all, and best selling author categories...

...and then a more granular set of categories than what you've described, along with some best selling titles or most popular type categories.

But, these are NOT different url paths to individual books. These are URL paths to different categories and the subcategories that then link to individual book pages which reside in the root directory, or in /products/ or /books/ or a similar directory. I wouldn't break the directory down any further, though, because as your inventory changes and evolves, it's likely your breakdown is going to change, and you don't want to be locked into product pages where the directories are built into the file path.

I hope that starts to make sense. Once you've gotten that, we can look at other discussions about navigation structure, keeping in mind that faceted navigation is going to require this slightly different mindset.

Also, I think it's a very bad approach to just be sloppy about it and hope canonical tags will tell Google how to figure it out. That's not a good way to go about it. Even if Google is getting better at it, a lot of organizational issues affect how your users are able to navigate the site.

g1smd

WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 4543213 posted 10:11 am on Feb 9, 2013 (gmt 0)

www.example.com/p173820082-acme-model-34684-widget is the product page. It's a single URL for the product.

It is linked to from:
www.example.com/widgets/acme/left-handed
www.example.com/manufacturers/acme/widgets/left-handed
www.example.com/left-handed/widgets/acme
www.example.com/sale-items
www.example.com/100-most-popular-items
www.example.com/featured-product-of-the-day

where each of those category pages shows its own breadcrumb navigation.

Rather than show breadcrumb trails at the top of the product page, instead a box on the right says
Find more...
Left-handed widgets
Acme widgets
Sale items

etc, and those link back to the relevant category pages.

If you used cookies, it would also be possible to show a personalised breadcrumb trail of how the user got to this page, and omit it for search engines and users without cookies.

There's also a set of standard navigation across the top of the page. The bottom of the page links to a larger number of categories and sub-categories.

Additionally, the reviews are at
www.example.com/r173820082-acme-model-34684-widget and detailed product specs are at www.example.com/s173820082-acme-model-34684-widget

Extensionless URLs are a key part of the process - they make the rewrite rules much easier to craft. The system is also self-checking, and this is much easier when the URL contains a unique ID. You'll see a lot of systems where the unique ID is at the end of the URL path; I prefer it at the beginning. Were you to request the truncated URL
www.example.com/p173820082-acme-mo, the system would redirect you to the correct product URL. This also allows the page URL slug to be amended to correct typos in product names. The system auto-updates search engine listings and redirects users who continue to request the typo'd URL.
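The truncation recovery works precisely because the unique ID leads the path: any request that still contains the ID can be matched and redirected to the canonical URL, regardless of what follows. A minimal Python sketch of the idea (the thread's actual system is PHP plus rewrite rules; the product table here is a hypothetical dict standing in for the database):

```python
import re

# Hypothetical product table: unique ID -> canonical URL slug
PRODUCTS = {
    "p173820082": "p173820082-acme-model-34684-widget",
}

ID_FIRST = re.compile(r"^([prs]\d+)")  # the ID sits at the START of the path

def canonical_for(path):
    """Return the canonical path for a possibly truncated request, else None."""
    m = ID_FIRST.match(path)
    if m is None:
        return None                   # no recognisable ID: nothing to redirect to
    return PRODUCTS.get(m.group(1))   # lookup by ID alone; the slug is ignored

# A URL chopped off mid-slug still resolves to the full product URL:
print(canonical_for("p173820082-acme-mo"))  # p173820082-acme-model-34684-widget
```

With the ID at the end of the URL instead, a truncated request would lose the ID itself and nothing could be recovered.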

The most important point is that you "gotta have a plan" for the URL structure and the URL format. It needs to be documented in detail before any coding starts.

quicksilver1024

5+ Year Member
Msg#: 4543213 posted 5:22 am on Feb 12, 2013 (gmt 0)

Thank you all for your insight! I'll put what I've learned to practice.

quicksilver1024

5+ Year Member
Msg#: 4543213 posted 2:43 am on Feb 13, 2013 (gmt 0)

I just thought about this some more. Do you mind explaining why having a unique ID in the URL is a good idea and why it helps keep the database organized?

Also, why would someone have a unique URL for different kinds of information for the same product? This was mentioned above:

"Additionally, the reviews are at www.example.com/r173820082-acme-model-34684-widget and detailed product specs are at www.example.com/s173820082-acme-model-34684-widget"

Why wouldn't you just have www.example.com/173820082-acme-model-34684-widget containing both the detailed product specs and reviews?

ergophobe

WebmasterWorld Administrator, Top Contributor of All Time, 10+ Year Member
Msg#: 4543213 posted 3:26 am on Feb 13, 2013 (gmt 0)

Don't want to answer for someone else, but...

>>unique URL for different kinds of information for the same product?

This is assuming these are unique pages. If the reviews, specs and product info are on one page, that's one URL. It's up to you. The point is to have a URL schema that is logical and easy to check programmatically.

>> unique ID in the URL is a good idea

Because

1. You just don't have to remember things to avoid collisions. You'll never accidentally assign the same URL to a different product.

2. If you start with the unique ID, even a partial URL can be redirected as needed as long as the whole identifier is there. So if someone links to

https://example.com/r173820082-acme-model-34684-widget-with-all-kinds-of-descriptors

and that gets cut off in an email client to

https://example.com/r173820082-acme-mod

I know that if I have [prs]\d{10}- (using g1smd's example), I can do a database lookup for that and 301 to the proper URL. It's fairly brilliant.

It also means I could potentially achieve a variety of DB efficiencies. For example, I could look up URLs based on a primary key consisting of page type (p, r or s) and an integer index, which is going to be faster than looking everything up on a VARCHAR field.

I've honestly usually put the unique identifier at the end because it looks nicer, but now that g1smd has mentioned the other advantages, it's so obvious and clever! I think that's really the ticket from now on.

g1smd

WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 4543213 posted 8:30 am on Feb 13, 2013 (gmt 0)

Yep. That covers all the salient points.

Many people recommend the ID should be last in the URL, stating they want the keywords relating to that page to appear first. I'm not convinced it makes all that much difference, and prefer the enhanced broken URL request handling and the much easier RegEx pattern in the rewrite:
^([prs][0-9]+)-(.*) or similar.
The redirect system also works for URLs with junk appended on the end: something that's a regular question in the Apache forum.

Extract $1 and do a database lookup for that ID. If it doesn't exist, return 404. If it's a discontinued item, return 410. If it exists, make sure $2 has the right wording. If it does, show the page. If it does not, redirect to the correct URL. It's a few dozen lines of PHP.
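That whole dispatch can be sketched end to end. This is a hedged Python illustration (g1smd's actual version is "a few dozen lines of PHP"; the catalogue data and names here are made up), returning an HTTP status and an optional redirect target:

```python
import re

# Hypothetical catalogue: ID -> (slug, discontinued flag); stands in for the DB
CATALOGUE = {
    "p173820082": ("acme-model-34684-widget", False),  # live product
    "p173820050": ("olde-widget", True),               # discontinued item
}

# ID first, then optional hyphen and whatever slug text the request carried
URL_RE = re.compile(r"^([prs][0-9]+)-?(.*)$")

def handle(path):
    """Return (HTTP status, redirect location or None) per the flow above."""
    m = URL_RE.match(path)
    if m is None:
        return 404, None              # no recognisable ID at all
    row = CATALOGUE.get(m.group(1))
    if row is None:
        return 404, None              # ID not in the database
    slug, discontinued = row
    if discontinued:
        return 410, None              # gone, permanently
    canonical = f"{m.group(1)}-{slug}"
    if path != canonical:
        return 301, canonical         # wrong, typo'd, or truncated slug
    return 200, None                  # correct URL: show the page
```

The slug comparison also covers the "junk appended on the end" case mentioned above: any extra trailing text makes the path differ from the canonical URL and triggers the redirect.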

ergophobe

WebmasterWorld Administrator, Top Contributor of All Time, 10+ Year Member
Msg#: 4543213 posted 11:27 pm on Feb 13, 2013 (gmt 0)

keywords relating to that page to appear first


That's always been my reasoning. And in many cases I think that's still fine. If you don't have a huge number of URLs and they aren't long, I doubt it makes much difference.

But if you have a lot of URLs, or a strong possibility of getting truncations or cruft, the ID-first strategy is really compelling.

I have to say, when I read your first post on it, it was so immediately obvious once it was out there - one of those "Wow! I should have seen that 10 years ago" moments. Actually, I built my first system to handle and canonicalize URLs about 15 years ago now (of course, I didn't know the word "canonicalize" then), and though I thought it through pretty well, this thread would have helped a lot!

g1smd

WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 4543213 posted 11:35 pm on Feb 13, 2013 (gmt 0)

I think it's at least five years since I first mentioned the "ID first in URL path" stuff.

Where were you? :)

I took these thoughts to various open source ecom projects a while ago and they rejected them. I don't think they thought it through.

ergophobe

WebmasterWorld Administrator, Top Contributor of All Time, 10+ Year Member
Msg#: 4543213 posted 2:37 am on Feb 14, 2013 (gmt 0)

It's like my grandmother used to say about my dad: "He's not hard of hearing. He's hard of listening." Me, I'm not hard of reading, I'm hard of thinking.

No, seriously - I don't know if I've come across it before, but sometimes if you aren't primed to *think* about something, a quick skim does no good. So maybe I read it but just didn't think it. Can't say.

Either way, it's very clever.

zehrila

5+ Year Member
Msg#: 4543213 posted 5:05 pm on Feb 26, 2013 (gmt 0)

This thread is exactly why I believe WebmasterWorld is a gold mine of superlative content. Thanks to Robert and g1smd for the detailed input.

I would like to ask the same question that quicksilver1024 asked earlier and raise my own SEO concerns.

I have seen sites that break down their content in the following ways:

www.example.com/r173820082-acme-model-34684-widget reviews
www.example.com/s173820082-acme-model-34684-widget specifications
www.example.com/p173820082-acme-model-34684-widget pictures
www.example.com/v173820082-acme-model-34684-widget video

So wouldn't it be a good idea to have it all on one page using tabs/anchors/JavaScript? E.g. if the main page is www.example.com/s173820082-acme-model-34684-widget (specifications) and you click a tab that says Reviews, the actual URL doesn't change but you can still read the reviews at www.example.com/s173820082-acme-model-34684-widget#reviews

Some products don't really have enough images, reviews, or videos to justify a separate page, but the way most scripts are built, once you create a product page all the other side pages are created automatically.

I believe it might be a better idea to present all the information at a single URL; that has the potential to make product pages rich with data. Your thoughts?

tedster

WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 4543213 posted 8:39 pm on Feb 26, 2013 (gmt 0)

have it all on one page using tabs/anchors/javascript

There can be challenges. Depending on how the function is written, googlebot may not be able to read all the content. If the content that is invisible on the first page load is still in the original source code - that is, it doesn't require another call to the server - then all is well.

moopy

5+ Year Member
Msg#: 4543213 posted 9:02 am on Feb 27, 2013 (gmt 0)

Great thread!

I kinda like g1smd's solution because it's the friendliest to the user experience.
Rather than show breadcrumb trails at the top of the product page, instead a box on the right says
Find more...
Left-handed widgets
Acme widgets
Sale items
etc, and those link back to the relevant category pages.

If you used cookies, it would also be possible to show a personalised breadcrumb trail of how the user got to this page; and omit it for searchengines and users without cookies.


I wonder, though - is this considered cloaking?
From:
[support.google.com...]
Cloaking refers to the practice of presenting different content or URLs to human users and search engines. Cloaking is considered a violation of Google’s Webmaster Guidelines because it provides our users with different results than they expected.

Some examples of cloaking include:
[...]
Inserting text or keywords into a page only when the User-agent requesting the page is a search engine, not a human visitor


This becomes more complicated, as the path the user chose to reach a specific subcategory or product page is reflected not only in the breadcrumbs but also in the category's navigation menu and possibly the category's descriptive text.

What's your take on this?

g1smd

WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 4543213 posted 12:27 am on Aug 24, 2013 (gmt 0)

I think that adding additional navigation for search engines, and omitting it for visitors, would get you into trouble.

The reverse (add an extra breadcrumb bar for humans)? No issues.


All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved