Matt Cutts Answers How Much PageRank is lost through a 301 redirect
Str82u
msg:4548794
7:53 pm on Feb 25, 2013 (gmt 0)

This was a good piece of information to know. I thought the number was closer to 66% because that's the best I've seen, but this should answer a lot of questions for those who worry about this type of redirect. Note that he stumbles around "almost exactly the same"/"is currently identical" [youtube.com...]

 

1script
msg:4548808
8:42 pm on Feb 25, 2013 (gmt 0)

What is he talking about, "use a 301 or a link"?! A 301 is an automatic redirect (if your browser follows those automatically, which most do), and a link needs to be clicked. Who in the world would use them interchangeably, and why?

Also, interestingly, he keeps using the word "dissipates" in terms of PR. So, was he actually thinking about the page which has the link, not the page linked to?

Perhaps my caffeine is still waiting to kick in today but I feel like I'm more confused now than I was before I saw this.

Andy Langton
msg:4548809
8:48 pm on Feb 25, 2013 (gmt 0)

Yeah, this video doesn't really cut to the chase at all. I guess it's trying to answer the question as to whether linking to a 301 necessarily results in more lost value than a direct link. And the answer to that is no (which I assume is widely known).

That said, IMO there are reasons why value may not flow through a 301 while it might flow through a link, and this question isn't addressed at all.

If the advice is that leaving 301s within your internal links is all fine and dandy, I would have to disagree. 301 external URLs only.

Str82u
msg:4548812
9:15 pm on Feb 25, 2013 (gmt 0)

If you don't read anything into it, this was a pretty simple Q&A. Q: "How much PR is lost in a redirect versus a text link?" A: "The same, 15%." I read into them myself, just saying...

I think the point was more that if a user has a directory of pages that they would like to rename or restructure, how much of the current pagerank is going to be lost using redirects rather than linking to the new pages (with a big red banner) OR if they are planning on moving from one TLD to another, how much PR is lost in the move.

The difference between canonical link value versus an on-page link is the only question I was left with; would a canonical link on a high PR page pass PR?

> Who in the world would use them interchangeably and why

If there was a way to game more PR, why else?

> dissipates

Like in a redirect chain; more gaming or bad webmastering. Allow three redirects in the path and the link has lost more PR than with one redirect.
Robert Charlton
msg:4548815
9:32 pm on Feb 25, 2013 (gmt 0)

Matt's video confirms only that the loss of PageRank in a 301 currently is essentially the same as the loss in a link. He doesn't actually confirm the 15% number (which has been my rule of thumb number too).

He also does not in any way suggest, as Andy emphasizes, that a link and a 301 are equivalent in other ways... and, probably for reasons of secret-sauce recipes, he needs to steer clear of why that is.

Matt mentioned that he first brought up the dissipation of PageRank through 301s in order to discourage the use of 301s instead of links... and 1script, you're right, it doesn't make sense to me either, but apparently some SEO Matt had spoken to had once been contemplating doing this.

See this excellent discussion about 301s and PR loss from March 2010, prompted in part by Matt's prior comments on the topic....

301 Redirect Means "Some Loss of PageRank" - says Mr Cutts
http://www.webmasterworld.com/google/4097565.htm [webmasterworld.com]

deadsea
msg:4548836
10:43 pm on Feb 25, 2013 (gmt 0)

Given how ambiguous his answer is, how can we devise an experiment to actually test it? Maybe this would do it.

Create a chain of ten pages. Link each one to the next. Put some unique content on each page. Put a link to the first page from somewhere.

Create a second chain of ten pages. Link each one to the next, but in such a way that there is a single 301 redirect in the chain. Put a link to the first page in the same place as the other link.

Create a third chain of ten pages. Link each one to the next but make sure each link does two 301 redirects to get to the next page.

Let it run for a month and collect log files. Measure how many times Googlebot visits each page. All other things being equal (especially freshness), Googlebot will recrawl pages proportionally to their PageRank. See if the chains get the same number of re-crawl visits from Googlebot. If they do, then 301 redirects have no negative impact on passing PageRank. If the chain with no redirects gets crawled the most, followed by the chain with one redirect, followed by the chain with two redirects, then 301 redirects leak PageRank.
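
If anyone wants to script the measurement step, here's a minimal sketch of the log analysis in Python, assuming Apache combined-format logs and hypothetical paths like /chain1/page3 for the three chains (a stricter test would verify Googlebot by reverse DNS instead of trusting the user agent string):

import re
from collections import Counter

# Hypothetical URL scheme for the test: /chain1/page1 .. /chain3/page10
PATH_RE = re.compile(r'"GET (/chain[123]/page\d+)')

def googlebot_hits(logfile):
    """Count Googlebot requests to each chain page over the test period."""
    hits = Counter()
    with open(logfile) as f:
        for line in f:
            if "Googlebot" not in line:  # crude UA filter; reverse DNS is stricter
                continue
            m = PATH_RE.search(line)
            if m:
                hits[m.group(1)] += 1
    return hits

hits = googlebot_hits("access.log")
for chain in ("chain1", "chain2", "chain3"):
    total = sum(n for path, n in hits.items() if path.startswith("/" + chain + "/"))
    print(chain, total)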

Robert Charlton
msg:4548859
1:12 am on Feb 26, 2013 (gmt 0)

> ...chain(s) of ten pages...

deadsea - Ingenious experiment, but I don't think it will work. It assumes that your numbers will stay relevant to what you're trying to measure, and, IMO, that's not likely to be the case.

Something Matt Cutts said a while ago (yes, that guy again) makes me believe that when you have too long a chain, you will start getting more discontinuities from sequential redirects than you will from a chain of links.

For Matt's original comments, see this video, which is a good accompaniment to this thread anyway.

Is there a limit to how many 301 (Permanent) redirects I can do on a site?
Matt Cutts - Aug 4, 2011
trt 4:30
http://www.youtube.com/watch?v=r1lVPrYoBkA [youtube.com]

Here's my paraphrasing of what Matt says...
Matt discourages chained redirects.... If you can do it in one hop, that's ideal. Google is willing to follow multiple hops... but if you start getting up into the 4 or 5 range, that's a little bit dangerous, since Google might decide not to follow all those redirects. Keep it down to 1 or 2, or maybe 3.

Though Matt was talking about redirects within a site, my guess is that the number of sequential redirects you want to test would still produce glitchy results. "Google might decide not to follow" suggests that in this range, Googlebot behavior is probably going to be inconsistent.
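
As a practical aside, if you want to see how many hops a given URL actually takes, a minimal sketch using Python's requests library will show the whole chain (the URL here is made up):

import requests

# resp.history holds the intermediate 301/302 responses, in order
resp = requests.get("http://example.com/old-url", allow_redirects=True)
print(len(resp.history), "redirect hop(s)")
for hop in resp.history:
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
print("final:", resp.status_code, resp.url)

If that prints 4 or 5 hops, per Matt's comments above, Googlebot may not follow them all.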


Also, fwiw, note that Matt discusses two possible meanings of "how many"....
> Is there a limit to how many 301 (Permanent) redirects I can do on a site?
> How about how many redirects I can chain together?

I'm only talking about chained redirects right now.

brotherhood of LAN
msg:4548861
1:35 am on Feb 26, 2013 (gmt 0)

>chains of links/pages

ciml did a lot of work on this about 10 years ago, and iirc he mentioned that experiments were of limited use that far along the chain; googlebot would lose interest. i imagine pure redirects would fare the same or worse.


i think a lot of redirects are entirely unnecessary and would be better served by an internal rewrite. redirects are just lazy in many cases ;o)

browser events, non-www to www and one domain to another are three major ones I can think of as decent cases. even non-www to www seems unnecessary given the availability of the canonical tag.

[edited by: brotherhood_of_LAN at 1:42 am (utc) on Feb 26, 2013]

Lorel
msg:4548862
1:38 am on Feb 26, 2013 (gmt 0)

It seems to me that if a site uses #s in file names, and/or they contain capitals and/or underscores, then a 301 redirect is the way to go; in that situation it will improve ranking, even if the gain is partly negated by the slight PR loss. Yes?

tedster
msg:4548877
4:04 am on Feb 26, 2013 (gmt 0)

> ...even non-www to www seems unnecessary given the availability of the canonical tag.

I still prefer to have the server return the final URL than to depend on Google to get it right.

brotherhood of LAN
msg:4548954
1:16 pm on Feb 26, 2013 (gmt 0)

No doubt that's the prudent approach, and it helps keep fate in your own hands.

I guess the same could be said for unnecessary redirects. There shouldn't be 2, 3, 4+ redirects when one rule set could handle it in a single hop. I see it as more of a courtesy to the user than as PR preservation, though.

deadsea
msg:4548985
3:00 pm on Feb 26, 2013 (gmt 0)

As far as canonical vs 301 goes, I would expect Google to handle them the same behind the scenes as far as PageRank passing goes. If a 301 redirect leaks PR, I would expect canonical to as well. I don't see any reason why Google would code them differently into the PageRank algorithm.

Andy Langton
msg:4548989
3:10 pm on Feb 26, 2013 (gmt 0)

> I think the point was more that if a user has a directory of pages that they would like to rename or restructure, how much of the current pagerank is going to be lost using redirects rather than linking to the new pages (with a big red banner) OR if they are planning on moving from one TLD to another, how much PR is lost in the move.


I don't think this is true. He's answering the theoretical question:

If there are two identical links to page 'A', with one going via a 301 and the other linking directly, is there any greater loss of PageRank for the 301?

In your examples above there are other reasons for loss of PageRank and other ranking signals. For instance, where existing content has moved, Google will compare the old content to the new destination and does not necessarily pass value at all if that doesn't add up.

There are other factors involved, of course. Do 100% of social likes travel across a 301? Does 100% of historic clickthrough data?

It can be dangerous to apply a theoretical point to a real world situation.

Similarly:

> If a 301 redirect leaks PR, I would expect canonical to as well.


In a mathematical sense, yes. But in a real world sense canonical is less effective, since there are two URLs to evaluate. I've seen plenty of cases where canonical does nothing because of simple differences that occur between two Google visits to the two URLs (e.g. the news articles linked in a sidebar updated).

This is why (in real world scenarios) direct links still beat a 301, which still beats a canonical attribute.

ZydoSEO
msg:4549029
4:56 pm on Feb 26, 2013 (gmt 0)

Personally, if at all possible... I always choose 301 redirects over the canonical link element to correct canonicalization issues. I ONLY use the canonical link element if the redirects are far too complex or impossible to implement. There is one distinct advantage...

While the canonical link element may solve the duplicate content and split PageRank/link juice issues surrounding non-canonical URLs, it perpetuates other people linking to your site with non-canonical URLs. By this I mean, most people create links by browsing to the page to which they want to link and copying the URL of that page from their browser address bar. They paste that URL into their link and they're done. If you're using the canonical link element, people will continue to see non-canonical URLs in their browsers and will continue to copy and paste those non-canonical URLs when creating links. Each time this happens in the future, you're likely only getting 85-90% of the PageRank/link juice that you could have gotten had they linked with the canonical URL (this assumes Google would also cause canonical link elements to decay or dissipate PageRank just like a link, which I think is a good assumption).

This is NOT true when canonicalization issues are corrected with a 301 redirect. Not only does it do all the things a canonical link element does, but it almost 100% ensures that all future links created pointing to that page (unless the URLs are manually typed in) will use the canonical form of that URL. If someone wants to link to your URL and they navigate to that page by clicking on a non-canonical link on your site or another site, then the browser will detect the 301, request the canonical URL, and change the URL in the browser to the canonical one. When the user copies the URL from their browser to create a link, it will now ALWAYS be the canonical URL. This will maximize the amount of juice all future links pass to your site.

As far as the Cutts video...

I honestly don't see why this is such big news to so many webmasters and SEOs. A few years back, when Cutts announced that 301s did cause a slight loss of PageRank being passed, I knew then that the amount lost was likely exactly equal to the amount lost by a link. Blame it on "d"! ;)

If you've studied the original PageRank algorithm in "The Anatomy of a Large-Scale Hypertextual Web Search Engine", the formula is based on a "random surfer" model (yes, I know it's changed a lot since then, but the basic concept is probably the same). And the formula has built into it a damping factor ("d") that represents the probability that the random surfer will keep clicking links rather than get bored and navigate directly to another random page. This damping factor is typically set around 85% (or .85) according to the original docs. And I've always heard Googlers use figures in the 85-90% range. It is this damping factor that causes a link's PageRank/link juice to decay or dissipate roughly 10-15%. It is this same damping factor that causes that exact same amount (as Matt said in the video) of PageRank to be lost or to decay/dissipate when you 301 redirect a URL.
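
For reference, the formula as published in that paper (whatever Google runs today is, of course, unknown):

$$PR(A) = (1-d) + d\left(\frac{PR(T_1)}{C(T_1)} + \cdots + \frac{PR(T_n)}{C(T_n)}\right)$$

where T_1...T_n are the pages linking to A, C(T) is the number of outbound links on page T, and d is the damping factor, around 0.85.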

Think about it... If browsers didn't automatically follow redirects, then in the random surfer model a redirected page would be synonymous with a page that used to have multiple inbound and outbound links but has had all of its outbound links replaced with a single outbound link pointing to the target URL of the 301 redirect. And assuming d=.85 in this scenario (since there is only 1 outbound link on the page), roughly 85% of all available PageRank could be passed out on that single link to the target URL of the redirect.

I think what Cutts is saying is that:

PageA with inbound links and 1 outbound link to Page B


passes EXACTLY the same amount of PageRank link juice as when that same:

PageA with inbound links is 301 redirected to Page B

The decay or dissipation in PageRank is the same.

Why would Google overcomplicate an already very complex PageRank algorithm with an exception to cover 301 redirects when it can simply be treated as a special case (a page where the outbound link count = 1) that is already covered by the general algorithm? They likely wouldn't create exception code because there would be no need and they would want to keep the algorithm/formula as clean and elegant as possible.
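
To make that concrete, here's a toy sketch of the textbook iteration (my own illustration, not Google's code). Model the 301 as a node with exactly one outbound edge, and the target receives d * PR(redirecting URL), exactly the same decay as a page with a single link:

def pagerank(links, d=0.85, iters=100):
    """Textbook PageRank; links maps each page to its list of outbound targets."""
    pages = set(links) | {t for outs in links.values() for t in outs}
    pr = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / len(pages) for p in pages}
        for page, outs in links.items():
            for target in outs:
                new[target] += d * pr[page] / len(outs)
        pr = new
    return pr

# "old" is either a one-link page pointing at "new" or a 301 modeled as a
# one-outlink node -- the graph is identical either way, so the decay is too.
pr = pagerank({"inbound": ["old"], "old": ["new"]})
print(round(pr["old"], 4), round(pr["new"], 4))
# "new" gets its base share plus d * PR("old"): the same ~15% loss as one link hop
print(round(pr["new"] - 0.15 / 3, 4), "==", round(0.85 * pr["old"], 4))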

Just my $0.02.

tedster
msg:4549181
8:34 pm on Feb 26, 2013 (gmt 0)

> It is this damping factor that causes a link's PageRank/link juice to decay or dissipate roughly 10-15%.

There's another part to this. According to the original formula, PageRank calculation is not once and done. It is iterated (repeated) indefinitely around the web until the values stop changing on each round - that is, the number stays the same up to a certain number of decimal places.

Without a damping factor, PR values would never stabilize like that.

deadsea
msg:4549193
9:07 pm on Feb 26, 2013 (gmt 0)

You wouldn't need the damping factor on the 301 redirect. Unlike pages that can contain many links, a redirect has exactly one link. Taking out the damping factor for redirects only would not prevent the algorithm from stabilizing.

Without the damping factor, it would basically be that a page has many URLs: its canonical URL, and all the URLs that redirect to it. Or in other terms, the page and all the URLs that redirect to it are treated as a single document.
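
A sketch of what that could look like (an assumption about how Google *could* implement it, not a claim that it does): resolve redirects before building the link graph, so a redirecting URL is just an alias and the damping factor never applies to the redirect hop:

def resolve(url, redirects):
    """Follow a 301 map like {"old": "new"} to the final URL, guarding against loops."""
    seen = set()
    while url in redirects and url not in seen:
        seen.add(url)
        url = redirects[url]
    return url

def collapse_redirects(links, redirects):
    """Rewrite every edge to point at the canonical URL; redirecting URLs vanish."""
    return {src: [resolve(t, redirects) for t in outs]
            for src, outs in links.items() if src not in redirects}

links = {"S": ["old"]}        # S links to a URL that 301s elsewhere
redirects = {"old": "B"}      # hypothetical redirect map
print(collapse_redirects(links, redirects))  # {'S': ['B']} -- no extra hop, no extra decay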

tedster
msg:4549198
9:15 pm on Feb 26, 2013 (gmt 0)

In the video from a year ago, Matt Cutts did mention that the lost PageRank factor was intentionally added for 301 redirects - and that he had to check with the algorithm team to verify that. (He is the head of the spam team, and Amit Singhal is head of the main algorithm team.)

deadsea
msg:4549204
9:26 pm on Feb 26, 2013 (gmt 0)

The statements from Google on this have until now been clear: each 301 redirect siphons off a percentage of the PageRank. Now Matt says he was actually answering a different question, to prevent people from trying to use redirects instead of links.

From a technical and algorithmic standpoint, it could be implemented either way.

It's as clear as mud.

lucy24
msg:4549228
10:05 pm on Feb 26, 2013 (gmt 0)

> If you've studied the original PageRank algorithm in "The Anatomy of a Large-Scale Hypertextual Web Search Engine", the formula is based on a "random surfer" model (yes, I know it's changed a lot since then, but the basic concept is probably the same). And the formula has built into it a damping factor ("d") that represents the probability that the random surfer will keep clicking links rather than get bored and navigate directly to another random page. This damping factor is typically set around 85% (or .85) according to the original docs.

Holy ###. Just how old is this algorithm? How many users even know they're being redirected? This isn't about "The page will redirect in X seconds"; it's about any 301 anywhere.

At this point I detoured to Firefox and opened up a Forums page (a different one) knowing that forums always involve some redirection-- to the login page, to and from the "compose post" page and so on. I was hit with a steady blizzard of FF alerts; at some time in the past I must have checked the "Warn me when websites try to redirect" box. (Uhm... This isn't on by default is it? Can't imagine it would be.) It never occurred to me that it also applied to sites within the same domain; guess it should have, since it specifically says "or reload".

Even there, however, the displayed warning only applies to redirects that are coded in the page. Redirects that happen in htaccess before you ever reach the page don't trigger the alert; FF quietly sends in a fresh request.

Query: Overall, what proportion of 301 redirects give the human user the option of not following them? I mean in actual practice, not in browser-prefs-changing theory.

ZydoSEO
msg:4549263
12:09 am on Feb 27, 2013 (gmt 0)

Yep the algorithm requires multiple passes before PageRank approaches some asymptotic value where it stabilizes.

> Query: Overall, what proportion of 301 redirects give the human user the option of not following them? I mean in actual practice, not in browser-prefs-changing theory.


I wasn't implying that any 301 redirects give the human user the option of not following the redirect.

But in real life, how often do you navigate to a random web page where you are only given the option to 1) click on a link to visit another page or 2) navigate to another random page... and are never allowed to hit the "back" button? LOL Only in the case of those annoying, typically spammy sites (ironic) that disable the back button to trap you on their site. Yet this is the "model" the algorithm for PageRank was based on.

I was simply pointing out that in this "make believe world" of "the random surfer", where you never hit the back button and can only click a link or navigate directly to another random page... that if the user clicks on a link on PageA pointing to PageB, and PageB has been 301 redirected to PageC (by the Wizard of Oz, for example), it would likely be a no-brainer for Google to "treat this the same as if":

1) the user clicked on the PageB link on PageA
2) the browser fetched/displayed PageB which only had one link on it (to PageC),
3) the user clicked on the PageC link on PageB
4) the browser fetched/displayed PageC

The end result is exactly the same in the real world: the user clicked on a link on PageA pointing to PageB and 2 page requests/hops later (one for PageB followed by one for PageC) the user finally arrived at PageC. The only real difference is that instead of PageB being displayed with a single link to PageC and the user clicking on that link in the random surfer model... in the redirect scenario the redirect is automatic and essentially is equivalent to the user clicking on that PageC link.

Sgt_Kickaxe
msg:4549269
12:52 am on Feb 27, 2013 (gmt 0)

I watched the video and his message was pretty clear; I don't see where all the confusion is coming from.

I liked his saying "my pages will have 10% more pagerank and so more pages will rank". To me that means that your pages do indeed need a minimum PR to rank at all; the floor is no longer zero, if it ever was.

deadsea
msg:4549271
12:55 am on Feb 27, 2013 (gmt 0)

What video did you watch, Sgt_Kickaxe?

lucy24
msg:4549290
2:22 am on Feb 27, 2013 (gmt 0)

> The only real difference is that instead of PageB being displayed with a single link to PageC and the user clicking on that link in the random surfer model... in the redirect scenario the redirect is automatic and essentially is equivalent to the user clicking on that PageC link.

But that's all the difference in the world.

Doesn't g### also say-- out of the other side of their mouths, I guess --that it doesn't matter how complicated your URL looks, it only matters how many clicks it takes the user to get there?

That's the analogy to look at.

example.com
>>link>>
example.com/directory
>>link>>
example.com/directory/subdirectory
>>link>>
example.com/directory/subdirectory/subsub
>>link>>
example.com/directory/subdirectory/subsub/subsubsub

That last place is four steps away from where you started.

But if it's

example.com
>>link>>
example.com/directory/subdirectory/subsub/subsubsub

then it's only one step and the slashes in the URL don't matter.

Similarly

example.com/oldyuckyurl
>>301>>
example.com/oldishbetterurl
>>301>>
example.com/newishprettygoodurl
>>301>>
example.com/newperfecturl

is extra work for your server and for the user's browser, but it's completely invisible to the human user. Granted it may affect user experience if the connection is so slow that it takes them five seconds to arrive at the page. But isn't that something g### already measures separately? Other than the potential time factor, the user hasn't done any more work than if you'd said

example.com/oldyuckyurl
>>301>>
example.com/newperfecturl

in the first place.

Edge
msg:4549950
7:55 pm on Feb 28, 2013 (gmt 0)

Good timing on this thread and Matt's video for me...

Simple question, simple answer - good job, Matt.

Str82u
msg:4549951
8:03 pm on Feb 28, 2013 (gmt 0)

> Simple question, simple answer

Right. Nice discussion here, but the Q&A was good for the old notebook.

TheOptimizationIdiot
msg:4550028
4:48 am on Mar 1, 2013 (gmt 0)

> I liked his saying "my pages will have 10% more pagerank and so more pages will rank". To me that means that your pages do indeed need a minimum PR to rank at all; the floor is no longer zero, if it ever was.

Just for clarification, MC was quoting the SEO who wrote in to him, not making the statement himself.
