Forum Moderators: Robert Charlton & goodroi


Rewards and Risks of Changing to Hierarchical URL Structure

         

ergophobe

10:10 pm on Aug 3, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The Situation
I've inherited a site with what I consider a poor URL format.

The site is about to be completely revamped - new design, new CMS. Based on early discovery, there will be some IA changes, some content suppressed, some added plus a certain amount of reorganization. The site is small, roughly 250 pages. Maybe 20-30 of those will be suppressed and perhaps 20-40 pages added. The internal linking will be significantly reorganized. So a fair number of URLs will change, go away or get added no matter what. It seems like if we were going to change URL formats, now would be the time.

There is much to change and most decisions are easy. The one I hesitate on is changes to the URL structure.

What's wrong with the current format?
- mixed case, which is a problem on Windows-based hosting because (at least on the current platform) you can't easily catch incorrect case and canonicalize those URLs. It also tends to split them in reporting (in GA, for example; you can create a filter to convert case, but since we haven't, two URLs that are identical except for case get reported separately).
- flat - everything is one level off root
- extensions - I don't really care, but if I were starting from scratch I wouldn't do it that way, of course.

Previous Discussions Here
Most of the discussions on the subject here revolve around the value of keywords in URLs and misconceptions about it being better for SEO to have URLs that indicate something is closer to root.

- [webmasterworld.com...] (closer to root idea).
- [webmasterworld.com...] (keyword in hierarchical vs flat URLs discussion and multi-category problem - see next)

Some have noted a potential major downfall of hierarchical URLs - if your site has multiple possible hierarchies, you run the risk of having multiple URLs for the same page. That is, if you auto-generate URLs based on product category and something is in both the "large" category and the "blue" category, you can end up with two valid URLs with identical product listings if you aren't careful:
/widgets/large/blue
/widgets/blue/large

- [webmasterworld.com...] (multi-category concern)
- [webmasterworld.com...] (ditto)
- [webmasterworld.com...] (concern about how long to get the new structure indexed).
- [webmasterworld.com...] (most recent discussion on the topic)

What are the advantages?

So if I don't think that the new URL structure will have much positive impact on SEO and carries some risk, why consider it? A few items, in no particular order

1. Reporting. You can't use the Content Drilldown feature of Google Analytics, which makes it harder to see traffic patterns for logical buckets of content. I end up collecting sets of URLs and writing massive regular expressions to try to pull reporting on the various sections of the site (see the sketch after this list).

2. Topic hinting to Google. Alan Bleiweiss was adamant at Pubcon that not having a hierarchical URL structure meant that you were removing important clues to Google regarding related content and hierarchical importance. In one of the threads cited above, pageoneresults said that non-hierarchical URLs are like putting all your papers in the same drawer. Personally, I've always thought that your internal linking structure was way more important, but I'm open to the idea that hierarchical URLs might help... I just haven't seen it.

3. Breadcrumbs in SERPS. That said, now that Google presents breadcrumb navigation in the SERPs, it seems like a hierarchical URL structure makes that much more likely.

4. Consistent UI. I've always tried to have my URL structure reflect my breadcrumb structure if the site is highly hierarchical. In cases where a page might be reachable via different categories/silos, I usually make the "leaf" pages (end-of-hierarchy pages) flat, as it's often hard to have a breadcrumb in that case. Still, this site is highly hierarchical, so I think the main nav topics, the breadcrumb structure and the URL structure would stay consistent with each other.

5. URI as UI. It allows users to edit URLs as a form of navigation. A usability expert from IBM said many years ago (when the web was newer) that she had never seen a user edit a URL in thousands of usability tests. But I do it all the time and wonder if more users do these days.

6. Descriptive. More descriptive URLs to hopefully improve CTR in general, not just in SERPs.
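To illustrate the reporting point in item 1, here's a hypothetical sketch (made-up URLs and section names, purely to show the shape of the GA filter patterns):

# Hierarchical URLs: one short prefix captures a whole section
^/widgets/

# Flat URLs: the same report means enumerating every page in the section by hand
^/(Widget-Overview|Blue-Widgets|Large-Widgets|Widget-Pricing)\.aspx$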

What's Holding Me Back?

Two things

- Cool URIs Don't Change (http://www.w3.org/Provider/Style/URI.html). I'm pretty sure the URI of that article has changed over the years though.

- FUD - it's a small site, but there is lots of money on the table and 2016 is likely to be challenging already (which is why the site is getting revamped)

What Are Your Thoughts?

What do you think about risks/benefits? If you were rebuilding a site from the ground up, what would you do?

explorador

10:32 pm on Aug 10, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



About the 301 redirects: while researching I found unclear and contradictory information, but most comments come down to "there is no loss." After much thought I moved everything using 301s with no problems, mostly easing my mind by thinking "I can't let that tie my hands" when it comes to moving to a new structure. Search engines will update; most static sites with direct links to the old URLs never will, but that makes no difference as long as the 301s stay functional.

Sorry - didn't follow that part. I'm not sure what you meant there.

In regard to URL structure we can have:
web development
web development/perl
web development/php
web development/frameworks
and so on. Yes, I got your example of blue widgets and large widgets: it's about the NAME plus two attributes used for URL creation, size and color. So it's important to reserve space in your URL structure for something like:

website.com/category-listings/XX
or website.com/grouped-listings/XX, or whatever,

so you can drop into that XX slot the keyword tags you enter in some field of your article database. That way the same URL pattern can turn into specific listings without ever needing to create more category options. I'm pretty sure you already thought about this or have it implemented.

JAB Creations

12:32 am on Aug 11, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You have to make sure you do only one 301 redirect from your old URL to the new https one (not old URL to new URL via a 301 and then another 301 on top of that).


Non-sense! While newer is not always better (e.g. Windows 8/10), how many people have good clean URLs to begin with? If you lost PageRank, you've probably degraded the URLs and what they are supposed to represent - and PageRank isn't even as big a factor these days (if it's a factor at all). There are too many factors for most people to properly gauge how Google works; you're seeing what you want to see instead of looking to discover what is really there.

John

Johan007

11:58 am on Aug 11, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



I do have some concerns about the admins getting that right.

@pageoneresults, I don't think Screaming Frog will know how many hops a redirect has?

Remember, a single hop from http to the new https URL will mitigate any PR loss, and this is one of the reasons you really have to understand .htaccess (or whatever Windows has) at admin level, by working very closely with the admins. My .htaccess file took about 20 hours to create and is about 200k!

pageoneresults

2:10 pm on Aug 11, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I don't think Screaming Frog will know how many hops a redirect has?

It does. In the "Spider Configuration", you uncheck the box to "Always Follow Redirects". Spider site. Go to Reports > Redirect Chains and you'll have everything you need.

ergophobe

4:30 pm on Aug 11, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



uncheck the box to "Always Follow Redirects". Spider site. Go to Reports > Redirect Chains


I didn't know about that feature. Thanks!

Non-sense!


JAB Creations - what specifically is non-sense?

Whether for SEO or simply load speed, your redirects should ideally be single hop.

One of the reasons I say I'm leery of the admins getting it right is that if a request comes in for http://example.com, they currently have it set up to take two redirects to get to https://www.example.com. If the URL itself has also changed, that's a third hop:
- one to change protocol
- one to add the www
- one to redirect from the old address to the new

That should be a single redirect if you have it set up right, which, completely aside from SEO, shaves 40 to 100ms off page load by eliminating two extra round trips.
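To make that concrete, here's a minimal sketch in Apache mod_rewrite terms (hypothetical host and page names, and the hosting here is Windows-based, so treat it as the principle rather than drop-in config):

RewriteEngine On

# Specific old-to-new mappings go first and point straight at the final
# https://www URL, so a matching request needs exactly one 301
RewriteRule ^/?old-page\.html$ https://www.example.com/new-page/ [R=301,L]

# Anything that didn't match above gets protocol and host fixed in a
# single rule rather than one hop per transformation
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule (.*) https://www.example.com%{REQUEST_URI} [R=301,L]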

And going a bit off-topic...
My htaccess file had taken about 20 hours to create and is about 200k!


I think in most cases changing from .htaccess to httpd.conf is a micro-optimization, but it sounds like you would benefit from moving those rules to a Directory block. The .htaccess approach is inefficient in two ways:
- it has to be parsed on every request
- once you turn on AllowOverride, the server has to check every directory in the tree, from the requested level up to the web root, for a .htaccess file, which on a WordPress site where images can be buried deep is a fair number of reads.

That's why the official Apache docs say
Using .htaccess files slows down your Apache http server. Any directive that you can include in a .htaccess file is better set in a Directory block, as it will have the same effect with better performance.
[httpd.apache.org...]
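For what it's worth, a minimal sketch of the shape of that (hypothetical DocumentRoot path; the directives themselves would be whatever currently lives in the .htaccess):

<Directory "/var/www/example.com/htdocs">
    # With overrides off, Apache stops scanning every directory on the
    # request path for .htaccess files
    AllowOverride None

    RewriteEngine On
    # ...move the existing RewriteCond/RewriteRule lines here;
    # per-directory rewrite semantics still apply inside a Directory block...
</Directory>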

tangor

2:28 am on Aug 22, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@ergophobe....

Have you made any determinations on the site direction?

Have any implementations been put in place, and if so, what?

Given the "millions of dollars" aspect revealed, there's no doubt that caution has been observed, yet, it is either static (ie, no change) or some change has been made. My thoughts (on a site of this size) have already been shared, but I am dang curious to know what has been done so far... and after these many (and some fine) observations.

ergophobe

7:24 pm on Aug 22, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@tangor

Based on everything said here, I'm going to push for a change in the URL structure, and I'm going to go for the all-at-once approach somebody recommended above in this comment:
Do it all at once


;-)

The current site is on life support and has so many issues that I'm not confident the 301s would be set up right. I do not have direct server access, so I can't do much there except beg. With the new site, however, I will have the ability to refuse to sign off and to hold the final (substantial) payment until all 301s have been tested and verified and I'm satisfied. Additionally, the team on the new site knows what they're doing, so I think it will be a piece of cake for them to get it right.

So the current plan is to launch the new site right after Christmas (ca Jan 15) and that is when this will get done. We were in an initial IA round when I posted that question. We're moving through wireframes now. Visual design is next, development over the fall, shakedown through the holidays, launch in January during the post-holiday lull.

Probably will have real year over year numbers in March. This thread will probably be locked by then, but I'll start another thread on findings and see if an admin can come in here and cross link.

JAB Creations

8:29 pm on Aug 22, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



ergophobe, you can always have a suggested list of pages on 404 pages until you've set up 301s.

John

crobb305

4:41 am on Aug 23, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



3. Breadcrumbs in SERPS. That said, now that Google presents breadcrumb navigation in the SERPs, it seems like a hierarchical URL structure makes that much more likely.


Can't you still create breadcrumbs with a flat url structure?

ergophobe

5:55 pm on Aug 24, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Can't you still create breadcrumbs with a flat url structure?


Certainly. The idea was that the more clues you give, the more likely you are to get a breadcrumb nav in Google. While it's primarily based on your on-site navigation, I was just wondering whether or not Google also looks at URL structure to help decide when to do this and when not. But I would imagine it's overwhelmingly a clear navigation structure that does this.

suggested list of pages on 404 pages until you've set up 301s


Not going to do that. The 301s have to work BEFORE anything changes. I'm not sending someone about to drop $1000 to a 404 page in hopes they click one of the suggested links!

tangor

5:35 am on Aug 25, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



the current plan is to launch the new site right after Christmas


That pleases me no end.

Yes, we all want to change sites and get them going as quickly as possible... and in that rush we sometimes run right over the obvious. From your to-do list it appears good order will be observed: a projected timeline, defined completion goals, and a single but mandatory, fully tested 301 requirement all suggest that the final transition to live will go well.

Do keep notes on the progress, including start/end times for each part of the project and tools used (as needed), and after the site goes live come back and tell us what was done, when, how, and why (we know where... the web... and who... you).

The "millions of dollars" is important, of course, but so many other webmasters who might not have that kind of income to deem their sites equally as important, whether info, entertainment, gov.... whatever it might be. Knowing your steps taken, time involved, any costs which seem reasonable to share (not expensive, expensive, more expensive then we thought, etc.) would be very beneficial.

ergophobe

1:12 pm on Aug 25, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I will certainly report back, but more on the impacts on traffic.

Having typically been involved as the developer, or sometimes the technical SEO, with full access to the server or at least the ability to say "Add these lines to the <Directory> section," I don't even know what to tell people on IIS except "This is the result I need. Please make it so."

I'm more of the "product owner" in Agile talk (not the dev, not the project manager). So there are some aspects that may be perpetually opaque to me (tools used for example). And costs will be rolled into a general deployment phase cost.

Robert Charlton

1:22 am on Dec 28, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



the current plan is to launch the new site right after Christmas (ca Jan 15)
I thought I'd kick this thread up now, not because I'm expecting a reply in the near future, but simply because I don't want it to expire before mid-January.

I'm looking forward to news when time permits. In the meantime, I hope it's going well.

Robert Charlton

9:27 am on Feb 3, 2016 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



I'm kicking this up again, this time to report a change in Google's guidelines which alters the wording regarding hierarchy... and, for me anyway, officially confirms that "conceptual" hierarchies (i.e., those with non-hierarchical URLs) are OK.

What was changed was this wording...
Make a site with a clear hierarchy...
...to this wording...
Design your site to have a clear conceptual page hierarchy.

More extended discussion and references on this thread...

Google Webmaster Guidelines Updated, 2016
February, 2016
https://www.webmasterworld.com/google/4789094.htm [webmasterworld.com]

ergophobe

3:07 pm on Apr 12, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The site in question has launched. The data is still really noisy - there are pockets of the web that really do take 48 hours, perhaps more, to update DNS.

Overall traffic hasn't died. We'll see how other measures are doing in a couple of weeks when the signals quiet down, but the site was also a bit rushed at the end, so there is a fair bit of missing or thin content and the early days may not be the best indication. I'll try to report back once we have a few weeks of data.

It's an IIS server, which is a first for me, and I don't have any direct server access (and never will). It's been like pulling teeth to get the developers to set up the redirects and some other basic canonicalization correctly. They made every correction a new hop, so if you go from http://example.com/OldPage to https://www.example.com/new-page, it takes FOUR redirects:
http => https
non-www to www
mixed case to lower case
oldpage to new-page.

They didn't see anything wrong with this. In fact, they didn't see it at all until I demanded they change it and they've been working on it for days... I can't say too much b/c I don't know IIS, but I know this would be at most an hour's work on Apache.

I could go on, but the main point being that it will be a while before the kinks are worked out and we have decent data.

pageoneresults

3:41 pm on Apr 12, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I can't say too much b/c I don't know IIS, but I know this would be at most an hour's work on Apache.

Same on IIS. There are two ways to approach this: via .htaccess or via web.config. If they are running an older version of IIS, then .htaccess; if a newer version, then web.config.

web.config

<?xml version="1.0"?>
<configuration>
  <system.webServer>
    <rewrite>
      <rewriteMaps>
        <rewriteMap name="Redirects">
          <add key="/sub/" value="/sub/page"/>
        </rewriteMap>
      </rewriteMaps>
      <!-- a rule along these lines (not in the original snippet) is needed so IIS actually 301s URLs found in the map -->
      <rules>
        <rule name="Redirect from map" stopProcessing="true">
          <match url=".*" />
          <conditions>
            <add input="{Redirects:{REQUEST_URI}}" pattern="(.+)" />
          </conditions>
          <action type="Redirect" url="{C:1}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>

htaccess

There is an ISAPI filter from Helicon Tech that provides support for Apache .htaccess and .htpasswd configuration files in Microsoft IIS.
[helicontech.com...]

^ I've been using the ISAPI filters for over a decade. Our IIS servers behave just like Apache. In fact, they behave better! :)

Example of rules we use in our default htaccess files.

# Force no browsing to /index
RewriteRule ^(default|index)\.asp / [R=301,L]
RewriteRule (.+)/(default|index)\.asp /$1/ [R=301,L]

# Force everything to be lowercase
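# (note: the CL flag used below is ISAPI_Rewrite's convert-to-lowercase flag; stock Apache mod_rewrite has no CL flag)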
RewriteCond %{REQUEST_METHOD} (GET|HEAD) [NC]
RewriteCond %{REQUEST_URI} ([^?]+\u[^?]*)(?:\?.*)?
RewriteRule (.*) $1 [R=301,CL]

ergophobe

11:09 pm on Apr 12, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Thanks P1.

It's a new server and I suggested some rewrites/redirects. The key thing is your basic principles: do your specific redirects first, then your general ones, and if you're making one transformation anyway (http to https), fold all the necessary changes into that single redirect.

Reading the IIS docs, it seemed pretty straightforward, and I suggested some rules that, as near as I can tell, are the right syntax... I would post them here, but I don't want to divert the conversation too much. I asked for it to be done and it is getting done. And there are a million other things that need to get done... a pretty messy launch.

tangor

3:03 am on Apr 13, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



That "best laid plans" kind of thing. As long as the work proceeds the migration will complete. Just wonder how many "ghost" urls" will end up in the g database (which never forgets a url it has met) before all the redirects are sorted.

ergophobe

7:19 pm on Apr 14, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Tangor, the redirects are mostly working. Very few 404s. It's mostly just the issue of not making it take 4-5 hops for a single URL. That seems pretty simple, but they just marked the issue as "fixed" and, in fact, on the test URL it went from four hops to three. Similarly, they tell me gzip compression is enabled, but every test I've run on various testing sites (because anti-virus software can throw off results on your own computer) tells me otherwise.

It's hard for me to understand why these things are hard.

tangor

7:37 pm on Apr 14, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Because there's a committee involved? :)

I've worked for companies before. Always seems like it's more work than necessary for the small things.

Andy Langton

8:25 pm on Apr 14, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It's hard for me to understand why these things are hard.


It's usually a combination of factors, including:

- You're asking for unusual things as far as the dev team is concerned
- Because it's unusual, the dev team need to look it up, and will usually follow guidance from whichever source they trust (which may or may not be accurate)
- They don't really understand what you're asking for - or why - and so they can misinterpret what you're trying to achieve
- You've given them code to tell them how to do their job. From bitter experience, they're not going to use your code (even if it happens to be great code)
- Dev team is under-resourced and possibly under-qualified

You could potentially live with the hops, would be my opinion ("works as intended"). If you wanted Gzip and there's no Gzip, you should push for satisfaction ;)

ergophobe

6:33 pm on Apr 16, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Probably all true.

BTW, I didn't provide code, just "guidance" (e.g. "specific redirects before general") and pseudocode to illustrate what I wanted.

The hops aren't that bad, true, but we have a lot of users on 3G and Google specifically highlights those users as being adversely affected (200ms to 400ms per hop). And a few years ago, Matt Cutts said that if you went over 3 hops on your redirects, there was a decent chance Google would give up on following them.

The specific practical issue we see is that CSS/JS sometimes does not fully load for mobile users, and thus the nav does not resize correctly or the images are served at the wrong size (so people see one little piece of a large image). Another example: all static resources were getting a 301, because there was a rule in place to convert URLs to lowercase, the static resources lived in a directory called /Static, and the CMS was creating the correctly cased links - all of which were then getting a 301. So it might take 4-5 requests to get to the page, and then another two to load each CSS or JS file. With 40 separate JS files... They fixed that one quickly.
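For illustration, one way to avoid that on Apache would be something like this - a sketch only, written for vhost-level config with a hypothetical /Static path, and it assumes a RewriteMap since stock mod_rewrite has no lowercase flag:

RewriteEngine On
RewriteMap lc int:tolower

# Leave the static asset tree alone so correctly linked CSS/JS/images
# are served directly instead of through an extra 301
RewriteCond %{REQUEST_URI} !^/Static/
# Only redirect when the path actually contains an uppercase letter
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule (.*) ${lc:$1} [R=301,L]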

I therefore care about optimizations like redirect hops, gzip, CSS/JS minification and aggregation, etc.

This is a super image-rich site and they have handled image loading with <picture> and srcset and lazy loading. So given all of that whiz bang to keep things fast for mobile users, it seemed like a shame to miss out on easy fixes and it seemed like a small ask to simply do redirects and gzip as God intended them to be done. But the former are all within the CMS and what I'm asking for is (or should be) at the server level. And there are weird things - they say gzip is turned on according to the server settings so they don't know why it isn't being served up to the public.

- Dev team is under-resourced and possibly under-qualified


My general impression is that this is a strong team with lots of resources, charging a lot of money, with little problem throwing code at a problem and fixing it - at least within the context of the CMS itself. Which is why I expected these things to be as easy as falling off a log.

I think they just don't commonly have clients who load pages while watching Live HTTP Headers and the GTmetrix waterfall... So yes, I'm probably not the ideal client.

Andy Langton

7:14 pm on Apr 16, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Matt Cutts said that if you went over 3 hops on your redirects, there was a decent change Google would give up on following them.


The "official" guidance is 5 hops these days, I believe. Obviously ditch them if you can!

I was probably a bit presumptuous in my list above. There's always "dev team is stuck with website technology that is a massive pain in the neck", of course ;)

You're right to pursue it - I was being a bit world-weary about how the implementation process sometimes turns out :)

pageoneresults

7:32 pm on Apr 16, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



My understanding is that the first hop preserves the transfer of equity and any subsequent hops fragment that equity. That's my "old school understanding" of hops. Knowing that, my dev and I have everything down to no more than 1 hop. It does take a very skilled developer to fine-tune the directives and get them in the right order for everything to function the way it should. When we run Screaming Frog on our sites, it's 200s for all internals and maybe a handful of 302s. Then we get to the external links, which we are always chasing. One month an external reference uses the www. The next month they don't. A month after that they're back to the www. I've narrowed that practice down to WordPress users not knowing what all the little checkboxes mean.

Is all of this hopping around now happening after the fact? The site is live?

ergophobe

1:47 am on Apr 17, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



>>The "official" guidance is 5 hops

I'm sorry, I stated that too categorically. In the video I'm thinking of, Matt *did* say not to go over five. He said to try not to exceed three, though. This is from 2011, and I don't find anything more recent:
[youtu.be...]

Apart from the SEO considerations, there is the page load and latency issue (leaving aside whether those *are* SEO issues and to what degree).

Search Console help docs
Avoid chaining redirects. While Googlebot and browsers can follow a "chain" of multiple redirects (e.g., Page 1 > Page 2 > Page 3), we advise redirecting to the final destination. If this is not possible, keep the number of redirects in the chain low, ideally no more than 3 and fewer than 5.
-- [support.google.com...]


Documentation on "Mobile Analysis in PageSpeed Insights"

(2) Number of redirects should be minimized
Additional HTTP redirects can add one or two extra network roundtrips (two if an extra DNS lookup is required), incurring hundreds of milliseconds of extra latency on 3G networks. For this reason, we strongly encourage webmasters to minimize the number, and ideally eliminate redirects entirely - this is especially important for the HTML document (avoid “m dot” redirects when possible).
--- [developers.google.com...]


In most cases, provided people don't forget the www or add a trailing slash or some such thing, they arrive at their destination in three hops. But if someone introduces one small error or, in the future, we use the Redirect module in the CMS, we could end up at 4-6 hops.

In all cases:
- http -> https
- mixed case to lower case
- old address to new address

But we could easily add
- non-www to www
- strip trailing slash
- and a final redirect from the CMS redirect module (which going forward is really the only way we'll be able to do this)

My understanding is that the first hop preserves the transfer of equity and any subsequent hops fragment that equity.


My understanding is a bit different and, again, based on a not super recent Matt Cutts statement. In 2013, Matt said that the loss of link equity due to a 301 was identical to the loss due to passing through a link (roughly 10% of equity).
-- [youtu.be...]

If that's true, five hops would be 0.9^5, or about 0.59, which would mean you've given away 41% of your equity. I wonder, though, whether all 301s are treated the same or whether Google is smart enough to realize that stripping the www and the trailing slash is not the same as a redirect from one domain to another.

My general feeling, though, as with dupe content issues and such, is not to ask more of Google than I need to. If I can give Google an explicit instruction rather than depending on Google to figure out my messed-up setup, I'd prefer to do that.

The site is live?


Yes it is. They didn't want to implement any 301s on the staging server, so the site went live, then I tested.

Overwhelmingly, the final addresses are right. Not many 404s. So they got the redirects right in their minds and in the mind of the project manager on our team, who clicked through all the 301s immediately upon launch, and all final addresses were right (which is really the main thing). So mostly I'm being the difficult client by saying "Uh, actually, I do not consider these 'perfect.' I would like to avoid chained redirects and see them get below three hops."

And the response was, in essence, "we have no idea how to even test for chained redirects."

And by the way, there are MUCH bigger issues on the site than this that are *my* fault. Just working through the issues. The frustration here is not that they made a mistake or are idiots... it's that it's another thing on the punchlist and it's one that I can't just go in and fix.

dev team is stuck with website technology that is a massive pain in the neck


I think there's some of that - stuff happening on a load balancer and spread across a couple of Amazon servers. So when they say "I'm looking at the server and gzip should be enabled," I fully believe them: this is strangely complex, the devs are stuck with our systems, and they don't necessarily know how to sort it all out.


ergophobe

1:54 am on Apr 17, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Back to the main point of the thread, this is very early, but so far it seems that organic traffic and conversions took a huge hit in the short term, but seem to be coming back.

Conversion rate took the biggest hit of all, but Friday it was higher than the last Friday pre-launch.

We have a lot of optimization to do in terms of conversion rate - just getting all the content right, making the CTAs prominent enough, and so forth. So the first meaningful results won't come for a couple of weeks, when I think we'll have enough data to actually start looking at year-over-year metrics.

Andy Langton

11:05 am on Apr 17, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The "5 hops" number comes from John Mueller, here:

Too many 301s shouldn't be a problem. In general, what happens is Googlebot will follow up to five 301s in a row, and then if we can't reach the destination page we will try again the next time.

[plus.google.com...]

How much value could be lost for each hop is a separate question, of course. The general consensus remains in the 10-15% range per hop, as you note above (exceptionally difficult to test this, of course). I wouldn't be surprised if multiple hops were less damaging than they used to be, though, since hop-hop-hop is a technical mistake of the sort that Google probably doesn't want to affect rankings.

ergophobe

4:07 pm on Apr 17, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Ah, thanks for the John Mueller reference. Added to my notes.

>> Google probably doesn't want to affect rankings

That was my thought as well. Google will try to figure out your mess - dupe content, chained redirects, etc. I still prefer not to depend on Google to figure out my mess though. I'd rather clean it up at the source whenever possible. And then there's the question of how well Bing follows redirects and sorts out canonicalization and I have no idea there.

pageoneresults

4:26 pm on Apr 17, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I'm handing out "Pink Slips" to anyone who says "Let Google handle it." That's not even a consideration. If you have control over something of this nature, you NEVER let an indexing entity "handle it" via their error routines. And yes, there is Bing and all the others that are crawling and indexing. Sites with excessive redirect chains are nothing but trouble. Take a gander at your GSC: how does the crawl activity compare to the previous site? How are the pages performing?

Andy Langton

4:30 pm on Apr 17, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I only have very old info for Bing:

be sure not to stack redirects. Doing so almost ensures we won’t pass the value through to the end.

[blogs.bing.com...]

Who knows how Bingbot responds in reality, though! ;)

If you have control over something of this nature, you NEVER let an indexing entity "handle it" via their error routines.


That's the key thing, though - "if you have control". And if you don't have control, how much weight do chained redirects carry compared to other tech requests?

Edit: I did find some slightly newer Bing info that's related:

If Bingbot sees a 302 redirect, say 5 times in a row, we’ll assume you meant 301 and transfer value as if it’s actually a 301 redirect. No need to go back and clean things up for Bing


I confess that makes no sense at all to me. Five redirects chained together? Five 302s on one site? ;)