
Forum Moderators: Robert Charlton & andy langton & goodroi


Rewards and Risks of Changing to Hierarchical URL Structure

     
10:10 pm on Aug 3, 2015 (gmt 0)

ergophobe (Moderator)


The Situation
I've inherited a site with what I consider a poor URL format.

The site is about to be completely revamped - new design, new CMS. Based on early discovery, there will be some IA changes, some content suppressed, some added, plus a certain amount of reorganization. The site is small, roughly 250 pages. Maybe 20-30 of those will be suppressed and perhaps 20-40 pages added. The internal linking will be significantly reorganized. So a fair number of URLs will change, go away or get added no matter what. It seems like if we were going to change URL formats, now would be the time.

There is much to change and most decisions are easy. The one I hesitate on is changes to the URL structure.

What's wrong with the current format?
- mixed case, which is a problem on Windows-based hosting because (at least on the current platform) you can't easily catch incorrect case and canonicalize those URLs. It also tends to split URLs in reporting: in GA, for example, two URLs that are identical except for case get reported separately (you can create a filter to convert case, but we haven't).
- flat - everything is one level off root
- extensions: I don't really care, but if I were starting from scratch I wouldn't include them, of course.

Previous Discussions Here
Most of the discussions on the subject here revolve around the value of keywords in URLs and misconceptions about it being better for SEO to have URLs that indicate something is closer to root.

- [webmasterworld.com...] (closer to root idea).
- [webmasterworld.com...] (keyword in hierarchical vs flat URLs discussion and multi-category problem - see next)

Some have noted a potential major downfall of hierarchical URLs - if your site has multiple possible hierarchies you run the risk of having multiple URLs to your page. That is if you auto-generate URLs based on product category and something is in both the "large" category and the "blue" category you can end up with two valid URLs with identical product listings if you aren't careful:
/widgets/large/blue
/widgets/blue/large
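If URLs are auto-generated from category paths, that duplicate can be avoided mechanically. A minimal sketch in Python (hypothetical helper names; any deterministic ordering rule would do) in which the URL generator, not the navigation path, decides the canonical form:

```python
def canonical_url(product_slug, facets):
    """Build one stable URL for a product no matter which
    category path the visitor navigated to reach it."""
    ordered = sorted(facets)  # fixed alphabetical order; any deterministic rule works
    return "/" + "/".join([product_slug] + ordered)

# Both navigation paths resolve to the same URL:
canonical_url("widgets", ["large", "blue"])   # -> /widgets/blue/large
canonical_url("widgets", ["blue", "large"])   # -> /widgets/blue/large
```

Alphabetical order is just the easiest deterministic rule; the point is that only one of the two category orderings can ever be emitted as a link.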

- [webmasterworld.com...] (multi-category concern)
- [webmasterworld.com...] (ditto)
- [webmasterworld.com...] (concern about how long to get the new structure indexed).
- [webmasterworld.com...] (most recent discussion on the topic)

What are the advantages?

So if I don't think the new URL structure will have much positive impact on SEO, and it carries some risk, why consider it? A few items, in no particular order:

1. Reporting. You can't use the Content Drilldown feature of Google Analytics, which makes it harder to see traffic patterns for logical buckets of content. I end up collecting sets of URLs and writing massive regular expressions to try to pull reports on the various sections of the site.
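To make that reporting pain concrete, here's a hedged sketch (Python, hypothetical patterns and paths): with flat URLs every content bucket needs a hand-maintained regex, while with hierarchical URLs the first path segment *is* the bucket.

```python
import re

# Hypothetical: one hand-maintained regex per bucket, forever out of date.
FLAT_PHOTO_PAGES = re.compile(r"^/(Photos_.*|Cat_Photos)\.aspx$")

def bucket_flat(path):
    """Classify a flat URL by matching it against per-bucket regexes."""
    return "photos" if FLAT_PHOTO_PAGES.match(path) else "other"

def bucket_hierarchical(path):
    """With hierarchical URLs, the bucket is just the first path segment."""
    return path.strip("/").split("/")[0] or "home"

bucket_flat("/Cat_Photos.aspx")        # -> "photos"
bucket_hierarchical("/photos/cats")    # -> "photos"
```

The hierarchical version needs no maintenance as pages are added, which is essentially what GA's Content Drilldown does for you automatically.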

2. Topic hinting to Google. Alan Bleiweiss was adamant at Pubcon that not having a hierarchical URL structure meant that you were removing important clues to Google regarding related content and hierarchical importance. In one of the threads cited above, pageoneresults said that non-hierarchical URLs are like putting all your papers in the same drawer. Personally, I've always thought that your internal linking structure was way more important, but I'm open to the idea that hierarchical URLs might help... I just haven't seen it.

3. Breadcrumbs in SERPs. Now that Google presents breadcrumb navigation in the SERPs, it seems like a hierarchical URL structure makes that much more likely.

4. Consistent UI. I've always tried to have my URL structure reflect my breadcrumb structure if the site is highly hierarchical. In cases where a page might be reachable via different categories/silos, I usually make the "leaf" pages (end-of-hierarchy pages) flat, as it's often hard to have a breadcrumb in that case. Still, this site is highly hierarchical, so I think the main nav topics, the breadcrumb structure and the URL structure would be consistent.

5. URI as UI. It allows users to edit URLs as a form of navigation. A usability expert from IBM said many years ago (when the web was newer) that she had never seen a user edit a URL in thousands of usability tests. But I do it all the time and wonder if more users do these days.

6. Descriptive. More descriptive URLs to hopefully improve CTR in general, not just in SERPs.

What's Holding Me Back?

Two things

- Cool URIs Don't Change (http://www.w3.org/Provider/Style/URI.html). I'm pretty sure the URI of that article has changed over the years though.

- FUD - it's a small site, but there is lots of money on the table and 2016 is likely to be challenging already (which is why the site is getting revamped)

What Are Your Thoughts?

What do you think about risks/benefits? If you were rebuilding a site from the ground up, what would you do?
11:54 pm on Aug 3, 2015 (gmt 0)

themadscientist (Senior Member from US)


Months to roll out Panda... Months for other parts of the algo to be fully applied [JohnMu in a hangout said it can take 6 months to a year or more for parts of the algo to be fully applied after changes]... Site "has to rank"...

I might "pull the trigger" on the CMS/design change, but I think I'd be inclined to make it take months to make all the changes, especially to the content, interlinking and URLs.

I'd also seriously consider moving to query strings for the URLs based on the best practices outlined here: [googlewebmastercentral.blogspot.com...] -- I'm not saying I would move to query strings, but I'd definitely give it quite a bit of serious thought before I decided.
2:39 am on Aug 4, 2015 (gmt 0)

ergophobe (Moderator)


>>query strings

I think that's a good solution to the multiple category problem, but it doesn't apply here. As I say, this is a very siloed, hierarchical site. I mentioned the product issue because it's a common objection to hierarchical URLs.

However, it is possible that we will have a section that uses faceted search. So that's an excellent link and I'll keep that in mind.

As for the main part of the site, though, what I'm proposing is more like transforming the fairly inconsistent URLs and flat structure on the left into something like the structure on the right.


/Photos_My Location.aspx => /photos
/Photos_Dogs.aspx => /photos/dogs
/Cat_Photos.aspx => /photos/cats
/Weather_My_Location.aspx => /weather
/Summer_Conditions.aspx => /weather/summer
/Weather_Winter => /weather/winter
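For what it's worth, a mapping that small can be expressed as a literal one-hop 301 table. A hedged sketch (Python, using the illustrative paths from the post), with a case-insensitive lookup since the old Windows host treated case loosely:

```python
# One-hop 301 map from old flat URLs to the new hierarchical ones.
REDIRECTS = {
    "/Photos_My Location.aspx":  "/photos",
    "/Photos_Dogs.aspx":         "/photos/dogs",
    "/Cat_Photos.aspx":          "/photos/cats",
    "/Weather_My_Location.aspx": "/weather",
    "/Summer_Conditions.aspx":   "/weather/summer",
    "/Weather_Winter":           "/weather/winter",
}

def redirect_for(path):
    """Return (status, location) for an incoming request path,
    matching case-insensitively; (None, None) if no redirect applies."""
    lowered = {old.lower(): new for old, new in REDIRECTS.items()}
    target = lowered.get(path.lower())
    return (301, target) if target else (None, None)

redirect_for("/cat_photos.aspx")  # -> (301, "/photos/cats")
```

On IIS the same table would typically live in URL Rewrite rules or the CMS's redirect manager; the dict is just the shape of the data.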
1:05 pm on Aug 4, 2015 (gmt 0)

themadscientist (Senior Member from US)


I see the dilemma -- Thinking about this, a little trip down memory lane brought a GoogleGuy site-move recommendation to mind, which IIRC contradicts what I said above. The recommendation was: if you're moving a site and going to change the template, redirect in one hop > let the changes settle [iow, wait] > then make the template changes, because too many changes at once might trigger [stuff]. It was years ago and that's the best paraphrase I can remember, but I think I might approach what you're talking about much like a site move, except you're keeping the domain the same.

I guess the best way to summarize what I think I would do is:
Not everything at once...

So: template/cms > wait > URLs > wait > content additions > wait > content deletion > wait > relinking
Or: URLs > wait > template/cms > wait > content additions > wait > content deletion > wait > relinking
Or: some other variation of change > wait > rinse, repeat I didn't list.

It could be that "all at once" will be fine. But given the time it's taking Google to apply all parts of the algo to rankings after a change, and the number of people who have posted about "making everything better" and then tanking, I think I'd go slow with it. It seems [unscientifically, from memory] that there are way more "I changed [one thing] and it worked fine" posts to counter the occasional "Houston..." > "All I did was change the URLs..." than there are "I changed the template, URLs, internal linking, and overall structure and never missed a beat" posts to counter the "I completely upgraded everything and all my user-metric stats improved, but my rankings and traffic tanked" reports.
3:32 pm on Aug 4, 2015 (gmt 0)

ergophobe (Moderator)


>>I'd go slow
The more I think about it, the more I think your advice is right on. There's going to be so much in motion already that it's probably a bad time, and changing the URLs is not worth the hassle and risk. I'm thinking the "right" way is that any new content follows a particular standard, and old URLs stay the same, with the possibility of slowly moving them over later in the game.

One big downside of changing URLs is that it makes it much harder to compare historical data. Of course you can pull analytics and map each old URL to its new URL to track performance, but that means extra steps, which means less chance of actually doing it and doing it right.
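One way to soften that downside is to reuse the same old-to-new map that drives the 301s to re-key historical pageviews. A rough sketch (Python; hypothetical names, made-up numbers):

```python
# Hypothetical: the same mapping used for the 301 redirects.
URL_MAP = {"/Cat_Photos.aspx": "/photos/dogs" and "/photos/cats",
           "/Photos_Dogs.aspx": "/photos/dogs"}
URL_MAP["/Cat_Photos.aspx"] = "/photos/cats"  # keep the map explicit

def compare(old_views, new_views):
    """Re-key historical pageviews by the new URL so the two
    periods can be compared side by side as (old, new) tuples."""
    remapped = {URL_MAP.get(url, url): views for url, views in old_views.items()}
    return {url: (remapped.get(url, 0), new_views.get(url, 0))
            for url in set(remapped) | set(new_views)}

compare({"/Cat_Photos.aspx": 120}, {"/photos/cats": 95})
# -> {"/photos/cats": (120, 95)}
```

It's still an extra step, but if the redirect map already exists in machine-readable form, the historical comparison comes almost for free.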

The site isn't actually all that old, but I would say they were not well served by past developers (these URLs being one of the smaller pieces of evidence that leads me to say that).

>>I see the dilemma

I should have posted sample URLs at first. But yes, the dilemma is part of me says "Never change a URL unless the content disappears" and part of me says "Yeah but those URLs are awful".

OCD makes me want to change
FUD makes me want to hold steady.
5:33 pm on Aug 4, 2015 (gmt 0)

themadscientist (Senior Member from US)


I'm thinking the "right" way is that any new content follows a particular standard, and old URLs stay the same with the possibility of slowly moving them over later in the game.

I think that sounds right too.

"Never change a URL unless the content disappears" and part of me says "Yeah but those URLs are awful".

OCD makes me want to change
FUD makes me want to hold steady.

I have that -- One look at the URLs you posted and I "got it". I would seriously struggle to not change those, to the point I want to go fix 'em in your post, so I don't have to look at 'em any more, because they're so wrong in so many ways.
7:37 pm on Aug 4, 2015 (gmt 0)

ergophobe (Moderator)


>>wrong in so many ways

:-)

To be honest, the actual URLs were worse, but those were the best I could come up with without violating the TOS or otherwise leading people to the site in question.

I'm trying to be rational, but as I say - OCD + FUD = confusion ;-)
7:50 pm on Aug 4, 2015 (gmt 0)

buckworks (Moderator)


My OCD would match yours, Ergophobe.

I say change to the new URL style and make sure every existing ugly URL is redirected to its cleaner new equivalent. 301, single step, no chains of redirects.
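"Single step, no chains" can be enforced mechanically before the redirects go live. A small sketch (Python, hypothetical helper) that collapses any chains in an old-to-new map so every entry points straight at its final destination:

```python
def collapse_redirects(redirects):
    """Rewrite a {old: new} 301 map so every entry points directly
    at its final target, eliminating chains like a -> b -> c."""
    def final(url, seen=None):
        seen = seen or set()
        if url in seen:  # guard against redirect loops
            raise ValueError(f"redirect loop at {url}")
        seen.add(url)
        return final(redirects[url], seen) if url in redirects else url

    return {old: final(new) for old, new in redirects.items()}

chained = {"/old.aspx": "/interim", "/interim": "/new"}
collapse_redirects(chained)   # -> {"/old.aspx": "/new", "/interim": "/new"}
```

Running something like this on each revision of the redirect table is a cheap way to guarantee no visitor (or crawler) ever sees more than one hop.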

Those cleaner URLs will make many sorts of promotion a bit easier.
8:00 pm on Aug 4, 2015 (gmt 0)

netmeg (Senior Member from US)


I'm consulting on a mid-sized ecommerce relaunch where we are changing all the URLs (going to SKUs, how 'bout THAT boys and girls), plus a lot of categories and products have changed, been deleted, added or replaced; they want to go full https eventually; and there are a few new subdomains. Overall, it's going to be one hell of a ride, with thousands upon thousands of new redirects and something like 80,000 *old* redirects that have to be dealt with. We're doing everything slowly and in steps. We started with the dev in January and we *still* haven't finished, that's how slowly we're going. At the moment, organic traffic is a bit down and revenue is way up. I'd rather be the tortoise than the hare here.
6:20 pm on Aug 5, 2015 (gmt 0)

bwnbwn (Senior Member)


For comparison with a small site like this: I just rebuilt 5 websites (about the same size). They went from http to https, the URL ending changed from .htm to /, and the site navigation changed; the only things that didn't change were the content and the URL titles. So far, no issues with any of the websites. It has been 4 months.

They went from aspx to WordPress, so you know most everything changed.

If I were redoing the website, I would do it right, set up all the redirects correctly, and move on. Doing it piece by piece over a period of time is, IMO, not the way to go.
3:47 pm on Aug 6, 2015 (gmt 0)

ergophobe (Moderator)


Thanks for the perspectives.

I'm leaning back toward OCD and away from FUD ;-)

I think on a site this size, it will get crawled and indexed pretty quickly, so the key is getting the redirects right. Though it's a small site with modest traffic, there is a lot of money going through it, and a percent here and there equates to several salaries; that's where the FUD comes from.
11:49 pm on Aug 6, 2015 (gmt 0)

bwnbwn (Senior Member)


keep us posted
3:23 am on Aug 7, 2015 (gmt 0)

Junior Member


Cool URIs Don't Change
From the description, the original URIs aren't anywhere near cool; the objective is to make them rational, if not kinda cool.

FUD
My inclination would be to do:
/Photos_Dogs.aspx => /photos/dogs
or something similar first and see how Google reacts. I'd guess they'll have no issue if there are good redirects. Then add the rest of the subtopics to "/photos/" and evaluate.
If no major issues, roll it all out.
4:33 am on Aug 7, 2015 (gmt 0)

tangor (Senior Member from US)


Redirects exist for a reason. So....

No matter what you do to the site, if the redirects do what's needed, that traffic will find you.

As for indexing... a 300-page site is a blink of the eye, so it will not take long for that to happen.

Folders should mean something to the webmaster FIRST, and if he/she applies common sense, they will make sense to others as well.

I'd pull the trigger. Do it all at once. ANY change will cause a slight hit. However, if the change is actually for the better, that hit will be small and endured only ONCE, not each time a new change is rolled out. One way shows there's a new game in town; the other shows fumbling about with no clear direction.

YMMV
8:30 am on Aug 7, 2015 (gmt 0)

robert_charlton (Moderator This Forum, from US)


ergophobe - Forgive the rushed post in advance. Regarding the "flatness" aspect of files one level off the root... I'll stick with our earlier understanding that it is in fact the navigation structure, not the URL structure, that is more important. I would not, in fact, use the word "flat" to describe page files just off the root, because without a nav structure referring to those files and linking them to other files, there is no relationship to site structure at all. "Flat" to me is more associated with the "depth" (or lack thereof) of a link from home, not just the root-level URL by itself.

I think that Google can detect (or assign) what we're calling hierarchy without a traditionally hierarchical folder structure in a URL... at least if you don't have global dropdown menus fuzzing it all up (and maybe even if you do).

What would concern me most about this situation, as you describe it, is the increased difficulty of using Google Analytics for drilldown.

Regarding breadcrumbs on a page and "the page's position in the site hierarchy", Google has updated its guidelines on the use of schema and other markup to label your code in breadcrumbs. We have a discussion, still open, with some excellent comments, here...

Google Breadcrumbs Structured Data Documentation Updated
June 16, 2015
https://www.webmasterworld.com/google/4752718.htm [webmasterworld.com]

In this discussion, Senior Member inbound makes some astute observations about Google's ability to perceive nav structure with or without the markup, both via onpage clues and by linking patterns between your pages....

...If you use a greater than sign with a space on either side to separate two links I'm sure Google will see that as an indicator that it might be part of a breadcrumb trail.

There are other ways to spot it even if you don't make it as easy as that. Say your links are not in a strict hierarchy, it's still easy for Google to see if a link exists between the potential parent and child. Additionally, Google have put a lot of effort into figuring out what is navigation on sites - to overlook such an important navigation tool would be foolish of them. Being able to separate navigation from content makes many things easier with ranking, Google probably know your site a lot better than you suspect....

The Google Developers article discussed in our thread provides additional clues that suggest to me that Google is anticipating URLs that don't have hierarchical information in the URL structure....

Breadcrumbs - About breadcrumbs
https://developers.google.com/structured-data/breadcrumbs [developers.google.com]

...Pages may contain more than one breadcrumb trail if there is more than one way of representing a page's location in the site hierarchy. The page thestand.html may, for instance, additionally contain the following breadcrumb....

As I look at the Google article, "thestand.html" doesn't have a hierarchical pathname... it's just a file which could be in the root... and in fact if there's "more than one way of representing a page's location in the site hierarchy", then it seems to me that it couldn't have a hierarchical pathname, or you'd have dupe content.
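For concreteness, the markup Google's article describes is a schema.org BreadcrumbList; here's a sketch (Python; the "Books > The Stand" names echo the Google doc's example, the domain is assumed) that emits one common serialization, JSON-LD, from an ordered list of (name, URL) items:

```python
import json

def breadcrumb_jsonld(trail):
    """Build schema.org BreadcrumbList JSON-LD from (name, url) pairs;
    position is 1-based, following the schema.org ListItem convention."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(trail, start=1)
        ],
    }, indent=2)

print(breadcrumb_jsonld([
    ("Books", "https://example.com/books"),
    ("The Stand", "https://example.com/books/thestand.html"),
]))
```

Note that the URLs in the `item` fields are whatever the site's canonical URLs happen to be, hierarchical or not, which is consistent with the point that the markup, not the pathname, carries the hierarchy.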

When we discussed these files one level off the root, we were basically talking about product pages, but one can apply the principle to any kind of site in which multiple references to the same content are included, where you don't want to worry about duplication issues, or for which some control of emphasis (for both site users and for search) would be helpful.

It's not clear whether you want to structure an entire site this way... and in one or several of the threads referenced, g1smd did in fact suggest how to combine these pages with a site that uses more conventional folder hierarchy.

There's much more I'd like to say, but it's late and I've been slow in posting this.

With regard to what you're doing... yes, absolutely you must redirect those extremely uncool URLs to something more consistent. Many questions and issues yet to be discussed, but I wanted to address this particular question of hierarchy, which I feel is key.
3:20 pm on Aug 7, 2015 (gmt 0)

ergophobe (Moderator)


Thanks for that Robert_Charlton!

The breadcrumb question was an open one for me that I had hoped to circle back to. Great resources. I didn't realize that there is schema markup specifically for breadcrumbs, and I missed the June 16 discussion. Much appreciated.

As I was writing my initial query, I realized that knowledge of how Google discerns and displays the breadcrumb/hierarchy was a significant gap in my knowledge. I had assumed, as I wrote above, that the link structure would determine this, but I wasn't sure the degree to which URL structure was used.

I suspect that there is a chance that URL structure gets considered IF it matches the link structure (meaning it's used as corroboration, not primary evidence), but that's just because that's what I would do if I were writing an algo and wanted an easy check to see whether I had gotten it right.
3:30 pm on Aug 7, 2015 (gmt 0)

themadscientist (Senior Member from US)


They're actually pretty good at "getting what breadcrumbs are", especially if you markup your <p> [or whatever is holding them] as class=breadcrumbs and then use <a href=url>text</a> > [or other "standard breadcrumb delimiter" like / or ].

I've had them "get it" that way on sites where the crumbs didn't match the URLs and didn't use any type of microdata other than calling the "container's" class or id "breadcrumbs" -- Obviously, better practice to "spell it out for them" with some microdata, but thought I'd point out they're pretty good at catching what they are with just a simple class or id reference and a standard delimiter.
3:33 pm on Aug 7, 2015 (gmt 0)

ergophobe (Moderator)


Thanks for that too! All helpful info.
5:48 pm on Aug 7, 2015 (gmt 0)

robert_charlton (Moderator This Forum, from US)


Just to note quickly for now that HTML lists (both ol and ul) appear to have become the "container" of choice for breadcrumbs, with CSS evolving to handle varieties of breadcrumb displays.

Though the "breadcrumb" CSS class technically shouldn't affect how Google looks at site structure, you can be sure that Google looks at every clue.

Over the years, both Smashing mag and A List Apart have discussed the evolution not just of the CSS, but also the underlying structure... and pageoneresults and phranque in these forums pushed the importance of existing html tags.

I've felt that one of the most important clues is nav structure itself. When I mention the "inverted L" to most designers, they don't know what I'm talking about and can't imagine navigation without global drop-downs, which IMO do a lot to obscure the implicitly hierarchical structure of something like the inverted L.

I don't want to take this thread off topic, but I think these structural clues played a big part even before Google started using schema markup. I'm suspecting that hierarchical links in serps will depend on a lot of factors beyond markup, including utility to the user. I don't think that Google relies on any single signal for anything.

ergophobe... this kind of migration and the order in which it's done present some intriguing considerations, which reflect huge gaps in my knowledge. In moving ahead, a few areas raise questions that seem key to me for now... where does mobile fit into these plans, and are you going to be moving off of IIS? Along the way, if the change is slow and gradual (as I currently feel it should be), how do you avoid chained redirect issues down the road?
7:40 pm on Aug 7, 2015 (gmt 0)

ergophobe (Moderator)


>> mobile

It is currently not mobile friendly, but the new site will be responsive.

>> IIS

I assume you're thinking of canonicalization problems related to case sensitivity and other issues on IIS? Yes, it will remain on IIS, and 99% of my experience is on LAMP stacks, with the rest being nginx. So I'm bothered by some of the shortcomings that are typically present in IIS. However, the CMS the new site will use has a built-in setting to enforce lowercase URLs.

Deciding that all URLs will be lowercase makes it easy to handle the case-related canonicalization issues... I hope!
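The lowercase policy reduces case canonicalization to a one-line check at request time. A framework-agnostic sketch (Python, hypothetical handler) of the rule that CMS setting presumably implements:

```python
def canonicalize_case(path):
    """Return (301, lowercase_path) for any mixed-case request path,
    or (200, None) when the path is already canonical, so only one
    casing of each URL can ever be indexed or reported."""
    lowered = path.lower()
    if lowered != path:
        return (301, lowered)
    return (200, None)

canonicalize_case("/Photos/Dogs")  # -> (301, "/photos/dogs")
canonicalize_case("/photos/dogs") # -> (200, None)
```

On IIS the same rule is commonly expressed as a URL Rewrite lowercase rule; doing it as a single 301 also keeps analytics from splitting the same page across casings.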
7:45 pm on Aug 7, 2015 (gmt 0)

ergophobe (Moderator)


I missed the posts by Tangor and Rob_Banks at first.

Thanks for the moral support!

Since there are many millions of dollars per year going through this site, I really appreciate all the perspectives from people I have come to trust and respect over the years!
1:00 pm on Aug 10, 2015 (gmt 0)

Senior Member


There are other ways to spot it even if you don't make it as easy as that. Say your links are not in a strict hierarchy, it's still easy for Google to see if a link exists between the potential parent and child. Additionally, Google have put a lot of effort into figuring out what is navigation on sites - to overlook such an important navigation tool would be foolish of them. Being able to separate navigation from content makes many things easier with ranking, Google probably know your site a lot better than you suspect....


I truly wish this thread had appeared more than a year ago. I followed some very poor SEO advice, drastically changed my entire link structure, which resulted in making my website pretty much irrelevant to my core business but very relevant to a lot of free information on my website. I should have listened to my gut (which told me not to do it), but when paying an SEO pro to advise you, one tends to quash one's inner voice.

This is a wonderful thread that anyone considering changing their link structure should be certain to read and then read again ... in case it didn't sink in the first time. Thanks to all for confirming that the new fellow I have been working with is actually doing things the right way.
2:01 pm on Aug 10, 2015 (gmt 0)

jab_creations (Senior Member from US)


Honestly, a URL like example.com/?page=134 being replaced with example.com/about is not going to have a negative impact, as long as search engines see that pages are regularly being edited or replaced and redirects are being set up. If your URLs are crap, fix them! I've done so, and my traffic more than tripled using mostly the same HTML, and my URLs weren't half as horrible as some of the junk CMSes produce these days.

John
2:15 pm on Aug 10, 2015 (gmt 0)

Full Member


Honestly a URL like example.com/?page=134 being replaced with example.com/about is not going to have a negative impact
I disagree; every 301 redirect will lose some PageRank. This is why Google is giving a free boost to https sites, to counter the loss in 301s. If you are on http, you could change your URLs to the new structure on https and benefit from this boost for your new URLs, so you have virtually no loss (this is what I have done). You have to make sure you do only one 301 redirect, from your old URL straight to the new https one (not old URL to new URL via 301 and then another redirect to https).

I have moved a classic ASP site with 2000+ pages over to Linux. A laborious task, never to be rushed into; it requires a lot of planning, and one needs to understand htaccess really well. Overall, was it worth it... not really, but I like clean URLs, purely as a vanity thing.

Note: if you rely on AdSense, do not move to https for a year or so.
4:42 pm on Aug 10, 2015 (gmt 0)

Senior Member


Ergophobe: What Are Your Thoughts?

Honestly, I think you covered all the aspects. I've been there, and that's exactly the conclusion I reached after several changes to sites, URL structures and CMSes, all of that including redirects, on projects from 50 to 6K pages of content. I came to the same conclusions by thinking, planning, and trial-and-error testing.

Mixed upper/lowercase is for sure a no-no, and everything one level off the root is for sure a no-no. That site doesn't seem likely to become a hundreds-of-pages website, but one still has to think big: different articles ending up with the same name-URL is likely to happen, especially considering that some characters in the name get stripped as they're made URL-safe.

I would consider TAG functionality as well. Why? See below.

Ergophobe: What do you think about risks/benefits? If you were rebuilding a site from the ground up, what would you do?

Risks? The site is yours, so any change in structure or implementation is likely to follow the same order. BUT you might need listings for just widgets, just blue, etc., like tags. Why? Because the information grows; you might find your site with tons of pages a year from now. Also because you might sell it. You know how clients want a site about computers and displays, then want listings (by URL) for brands; you can't handle all of that in the DB with categories alone, or only to some extent. That's where tags become useful, and to avoid interference with the new URL structure, just use /tags/etc or /customviews/etc or whatever suits your needs.

That's the only thing I would add.
5:51 pm on Aug 10, 2015 (gmt 0)

pageoneresults (Senior Member from US)


The site is about to be completely revamped - new design, new CMS. The internal linking will be significantly reorganized. So a fair number of URLs will change, go away or get added no matter what. It seems like if we were going to change URL formats, now would be the time.

With everything else taking place, the URI structure change is something I would definitely recommend. Now would be the time to do it since you are making so many other changes that are going to affect the site's visibility. Also, changing the site's URI structure to correlate with all the other changes would only be natural and mandatory from my perspective.

I prefer what you've outlined in regards to URI structure. I typically go no more than two (2) levels off the root for primary content. I try to keep everything at root/sub/page, and on smaller sites I will normally put top category pages at the root, only going down one (1) level for /subs/, e.g. /root/category, /root/category/page. No empty /subs/ either - what I mean by that is the old "trailing slash" for everything, yuck!

So I'm bothered by some of the shortcomings that are typically present in IIS.

Don't be. I've been on IIS since 1994. Those shortcomings are only present if the server dev doesn't know about the tools available to make IIS behave like Apache. I've been using htaccess for years now on IIS. :)
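The poster doesn't name the specific tool, but the case-canonicalization problem raised at the top of the thread is exactly the kind of thing the IIS URL Rewrite module handles. A sketch of a web.config fragment (rule name and placement are illustrative, not from the thread):

```xml
<!-- web.config fragment: 301 any request whose path contains an
     uppercase letter to its all-lowercase equivalent. -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="CanonicalLowerCase" stopProcessing="true">
        <!-- ignoreCase="false" so the pattern actually detects uppercase -->
        <match url=".*[A-Z].*" ignoreCase="false" />
        <action type="Redirect" url="{ToLower:{R:0}}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

`{ToLower:}` is a built-in rewrite function and `{R:0}` is the full matched path, so `/Photos_Dogs.aspx` 301s to `/photos_dogs.aspx` in a single hop. In practice you'd exclude any paths that are legitimately case-sensitive before turning this on site-wide.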
6:34 pm on Aug 10, 2015 (gmt 0)

Senior Member from US (pageoneresults)

I think I need to expand more on my response. I'm going to use a performance engine/website analogy in the process.

If I'm building a high performance engine, I'm not going to do everything BUT "Engine Management". All the other components in that engine rely on the "Engine Management" to make it work. The same applies to websites. If you're going to go through a major change in HTML structure, framework, etc., then now is the time to "do it all" and not piecemeal. If you piecemeal it, it's only going to drag the inevitable out - the loss of visibility in search.

Thing is, it doesn't have to be that way unless of course the site relies "heavily" on inbound links for its visibility. If that is the case, then you'll have some additional work involved getting those links updated. You'll be 301'ing everything so your end is covered and the indexers will process those 301s correctly, this has been my experience. Others have had less satisfactory results and I've seen why. You can't make any mistakes in this process. If you do, you'll blow the engine.

By the way, this is an excellent URI structure and the ONLY one I would use. I find that if you can easily type a URI into the address bar, you've done things correctly.

/Photos_My_Location.aspx => /photos
/Photos_Dogs.aspx => /photos/dogs
/Cat_Photos.aspx => /photos/cats
/Weather_My_Location.aspx => /weather
/Summer_Conditions.aspx => /weather/summer
/Weather_Winter.aspx => /weather/winter

I'm consulting on a mid sized ecommerce relaunch where we are changing all the URLs (going to skus, how bout THAT boys and girls)

SKU-based URIs rock! I consulted for a medium-sized ecommerce site years ago and that was one of my suggestions - they went with it. It's still in place today. I find that many folks search based on manufacturer name and SKU; it's only natural that an ecommerce site be structured to match that behavior. Those URIs are the only ones allowed to be indexed - everything else is off limits.

https://example.com/brand/sku
8:31 pm on Aug 10, 2015 (gmt 0)

Moderator (ergophobe)

@explorador

Thanks for the perspective. Yeah, I tried to summarize the info that was already on WebmasterWorld so we didn't have to rehash all that old stuff and could move the discussion forward. That was my hope and I think people have really stepped up to help me out! Thanks again everyone.

TAG functionality

Sorry - didn't follow that part. I'm not sure what you meant there.

you might find your site with tons of pages a year from now


Possibly, but the site is over 15 years old and does millions of dollars per year in business. I'm not sure I see it going from 300 to 3000 pages anytime soon. We could get some great benefits by hiring writers and cranking out content, but the question is always one of ROI.
8:50 pm on Aug 10, 2015 (gmt 0)

Moderator (ergophobe)

@pageoneresults - first off, thanks for weighing in. Some of the old posts that most influenced my initial post and summary were yours.

>>IIS

That's reassuring. I often see bad setups with IIS and every one at the company in question (which has at least 30 sites) is flawed on some basic level. I have made sure that some of the worst canonicalization problems were mentioned in the Scope of Work on the new site. I have confidence in the devs and they will be responsible for server setup. Then it's just up to the admins to maintain things.

You can't make any mistakes in this process. If you do, you'll blow the engine.


A small number of the site's pages receive a huge amount of the non-controllable referral traffic (by non-controllable, I mean referral traffic that's not from one of our own email newsletters, social media accounts, etc.). So I don't think it will be that hard to get the 301s right for 90% of the traffic, and the links that provide 90% of the traffic probably carry at least 90% of the SEO value. I say probably because, though I have been trying for a long time to get them to verify with Majestic, that's still not in place.

every 301 redirect will lose some PageRank


That is one of the downsides. How much PR is lost through a 301 hop? The same amount as if you had a page with exactly one outbound link, and it linked to the page in question.

See Matt Cutts, How Much PR Dissipates due to a 301
[youtube.com...]

Given the other benefits, I don't have a problem with that as long as they are one-hop redirects. I do have some concerns about the admins getting that right.
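Under the classic PageRank model (which is what the Cutts video describes - whether Google still works exactly this way is anyone's guess), a single link passes its value multiplied by the damping factor, conventionally 0.85, so each extra redirect hop compounds the loss. A back-of-the-envelope sketch:

```python
DAMPING = 0.85  # conventional PageRank damping factor

def value_after_hops(value: float, hops: int, damping: float = DAMPING) -> float:
    """Value surviving a chain of `hops` single-link transfers (301s),
    in the classic PageRank model."""
    return value * damping ** hops

# One clean 301 keeps 85% of the value; a sloppy three-hop chain keeps ~61%.
print(round(value_after_hops(1.0, 1), 4))  # 0.85
print(round(value_after_hops(1.0, 3), 4))  # 0.6141
```

Which is exactly why the one-hop concern matters: the per-hop cost is modest, but chains multiply it.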
9:04 pm on Aug 10, 2015 (gmt 0)

Senior Member from US (pageoneresults)

I do have some concerns about the admins getting that right.

There is ONE tool you'll want to have at your disposal prior to launch: Screaming Frog. That one piece of software will give you ALL the technical insights you'll need to make sure things are right out of the box. :)
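Whatever crawler you use, the property to verify is the one discussed above: every legacy URL resolves in exactly one hop. A self-contained sketch - the `responses` dict simulates a server, where a real audit would issue HTTP requests or export redirect chains from a crawler like Screaming Frog:

```python
# Simulated server: path -> (status, Location header). The Cat_Photos
# rule is deliberately misconfigured to show a chained 301.
responses = {
    "/Photos_Dogs.aspx": (301, "/photos/dogs"),  # good: one hop
    "/photos/dogs": (200, None),
    "/Cat_Photos.aspx": (301, "/cats.aspx"),     # bad: 301 to another 301
    "/cats.aspx": (301, "/photos/cats"),
    "/photos/cats": (200, None),
}

def chain_length(path, fetch):
    """Number of 301 hops before `path` reaches a 200."""
    hops = 0
    status, location = fetch[path]
    while status == 301:
        hops += 1
        status, location = fetch[location]
    return hops

# Flag every redirect whose chain is longer than one hop.
multi_hop = sorted(path for path, (status, _) in responses.items()
                   if status == 301 and chain_length(path, responses) > 1)
print(multi_hop)  # ['/Cat_Photos.aspx'] - point its rule at the final URL
```

The fix for each flagged URL is always the same: rewrite its rule to target the final destination directly rather than another redirect.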
This 64-message thread spans 3 pages.