| This 211 message thread spans 8 pages |
|Learning About PR Sculpting: internal links with rel=nofollow |
| 10:32 am on Dec 2, 2008 (gmt 0)|
< Note: this thread begins with posts that were split out
from another thread: Experiments in keyword rich links to Home [webmasterworld.com] >
Could you say that the introduction of the nofollow attribute has allowed Google to implement more strict borders for over optimisation? Now we can nofollow "home" links without removing them as a user feature and Google has left us no excuse to have over optimised sites!
Nofollow is a blessing and a curse!
[edited by: tedster at 9:27 am (utc) on Dec. 10, 2008]
[edit reason] moved from another location [/edit]
| 11:51 pm on Dec 14, 2008 (gmt 0)|
To my mind, site: searches are closer to PR order than general keyword searches, since they remove keyword-level relevance. But that isn't anywhere near saying "PR order". If nothing else, site: and link: searches seem to be deliberately obfuscated.
| 2:15 am on Dec 15, 2008 (gmt 0)|
"Ok, now we can take off the tin foil hats, yes?"
As he said in the other message, Matt again says nofollow discards pagerank.
--- "the nofollow meta tag does that same thing, but at a page level"
--- "Plenty of other mechanisms would also work (e.g. a link through a page that is robots.txt'ed out)"
Both of these mechanisms discard pagerank.
So time to take off the tinfoil hats. Until they say differently, Google has said nofollow is a way to granularly deny pagerank to pages, not increase PR for other links.
| 5:24 am on Dec 15, 2008 (gmt 0)|
I disagree, Steve, and I've tested enough to see repeatable results.
Proper use of nofollow can result in a dramatic increase of PR on some pages. The purpose of nofollow is to guide the PR to where you want it and stop it from going where you don't. It doesn't create or remove PR, it's just the roadmap that guides it around your internal links.
| 6:30 am on Dec 15, 2008 (gmt 0)|
|Matt again says nofollow discards pagerank |
lol, i don't know what's confusing you about the statement. (bolded below)
You can "interpret" it however you want.
Or you can TEST it if you so choose --
But the way it READS and TESTS, say the same thing,
so one can either use it to their advantage or not.
At this point, it's up to the individual.
"... gives webmasters the ability to modify PageRank flow at link-level granularity ..."
| 9:55 am on Dec 15, 2008 (gmt 0)|
LOL, go ahead and ignore what he said, since you don't believe it anyway, but if you truly don't understand it, you really need to read what he said. You CAN modify pagerank *in the same way* as two ways he specifies that DISCARD pagerank.
Same = same
Same != different
Again, just because he says it doesn't mean it has to be true, but his statements obviously, clearly do not say what you want them to say, and instead state it is like two things that we know for sure kill pagerank. Go ahead and ignore it if you want, but pretending when he says "a Firestone is like a Michelin" that he really meant a Firestone is like an aardvark is not a good idea.
| 11:22 am on Dec 15, 2008 (gmt 0)|
I'm the FIRST one to say take MC comments with a grain of salt.
So that argument doesn't hold water with me.
Speaking of water,
It's quite silly to be discussing the dragon that eats ships at the ends of the world, when several people in this thread have already circumnavigated the globe and reported the world is round, not flat, and no dragon exists.
Btw - here we are 2 years ago (almost to the day) having the SAME discussion with me saying "test it" blah, blah. lol, i had ALREADY tested it then (you know, real testing of how the PR flowed and silly junk like that) BEFORE MC said squat about rel=nofollow PR flows or trust issues.
I NEVER need MC to confirm, deny, or even speak about something I've already tested to my satisfaction.
And automatically I will test anything he says that doesn't jibe with my previous testing or deductions.
Wrong argument, wrong person.
| 12:43 pm on Dec 15, 2008 (gmt 0)|
Um, I did test it. No doubt before you. So have many others. There is little evidence it works, but since toolbar PR lags so far behind what is being done now, I'm not going to make any claims about it. I'm merely pointing out that Matt's comments strongly suggest that it still dies.
As for that other thread, you seriously STILL don't believe in hub scores or that linking out helps? LOL, good luck with that. Carry on.
| 12:51 pm on Dec 15, 2008 (gmt 0)|
|Um, I did test it. No doubt before you. So have many others. There is little evidence it works |
no offense, but you're going to have to post a link or PM me with your contrarian info.
I (and others that are publicly search-able) have tested this over and over.
Although the phrase "little evidence" points to either:
you not testing it yourself, or
you not testing it well.
It either works or it doesn't.
There's no "little bit" "half-way" or "sometimes" about the algo flowing real PR to pages or not.
(and if you're looking at Toolbar PR to gauge results, you have NOT conducted the "right" test.)
It's a rather easy test to set up as well.
|As for that other thread, you seriously STILL don't believe in hub scores or that linking out helps? LOL, good luck with that carry on |
It's pretty obvious you didn't read MY comments or understand the context of what i was saying, so at this point, it's a pretty moot debate between us.
| 6:28 pm on Dec 15, 2008 (gmt 0)|
I'm in the midst of redesigning and overhauling my entire site. And of course I'm looking at what the PR sculpting proponents want to achieve. The goal is understandable; the question is whether the methodology is correct.
I'm doing exactly what I've always done. Appropriate anchor text, good navigation, and theme siloing. Stuff that Brett wrote about, what, decades ago now? It's still my assertion that this stuff works as well as anyone needs. It's basic principles.
Theme siloing and inbound link development should be able to do effectively what PR sculpting does. At least, it should get you 99% of the way there. And the enormous work to get the extra 1%? I dispute it's worth it. In fact, there's an endpoint to the benefit of PR sculpting, whereas the combination of siloing and inbound link development has no such bound.
| 10:51 pm on Dec 15, 2008 (gmt 0)|
"(and if you're looking at Toolbar PR to gauge results, you have NOT conducted the "right" test.)"
Well now you just made all your comments nonsense.
The only way to judge this is with toolbar PR.
If at this point you are saying you made pages rank better by using nofollow, then you are talking about something off topic.
The issue is simple. If you start with a PR5 root page, a 100 page total 'site', 12 sitewide links on every page, exactly how many total toolbar PR points do you assert you will increase pages on that site if you nofollow two of the sitewide links?
If you can't answer that, you haven't done anything. Toolbar PR is so unreliable that PR tests are not to be fully trusted, but they take at least nine months to confirm.
It's pretty clear now you are not discussing PR flow, but rather overall ranking, a much more complex subject, and that's why you are going astray.
| 12:41 am on Dec 16, 2008 (gmt 0)|
Here's the way I understand the different points of view. The part that isn't questioned is whether rel="nofollow" stops PR from flowing through that link. Yes, that's what everyone seems to agree is true. But Google also says that the link is dropped from their link graph. What does that mean altogether? That is the issue.
For instance, suppose we have a page with 10 links. Does adding nofollow to 1 link mean that the total vote of the page is now divided by 9 instead of 10, giving each of the regular links about 11% more pop? Or does it mean that the total vote is still divided into 10 slices, but one of them just "doesn't work" any more?
For PR sculpting to work, the first option needs to be true - each of the remaining links need to actually increase in its share of voted PR. "Dropped from the link graph" needs to mean "is no longer included in PageRank calculations".
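[Editor's note] The difference between the two interpretations is simple arithmetic. A quick sketch with hypothetical numbers (page PR of 5.0 and the 0.85 damping factor from the original PageRank paper, not necessarily what Google uses):

```python
# Two readings of "dropped from the link graph" for a page with
# 10 outbound links, 1 of them nofollowed. The page_pr value and
# damping factor here are illustrative assumptions.
page_pr = 5.0
damping = 0.85
total_links = 10
nofollowed = 1

# Interpretation 1: the nofollowed link leaves the divisor,
# so the remaining 9 links each get a bigger slice.
share_removed = damping * page_pr / (total_links - nofollowed)

# Interpretation 2: the divisor stays at 10; the nofollowed
# slice simply evaporates.
share_kept = damping * page_pr / total_links

print(round(share_removed, 4))  # 0.4722
print(round(share_kept, 4))     # 0.425
print(round(share_removed / share_kept - 1, 4))  # 0.1111 (an ~11% boost per remaining link)
```

Under interpretation 1 each remaining link gains 1/9 vs 1/10, roughly 11% - which is why sculpting only pays off when many links on a link-heavy page are nofollowed.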
So how can we test this? The only PR metric we have direct access to is on the toolbar, and as a metric toolbar PR has several flaws.
- TBPR only updates every three or four months - it's pretty tough to determine cause and effect on that kind of schedule.
- TBPR is only expressed in whole digits from 0 to 10, not carried out to many decimal places the way that real PR calculations must be.
- Google plays games and has bugs with TBPR reporting, casting doubt on the numbers that are reported.
Even when PR does increase for a url, most of the time it goes from a "middle 3" to a "high 3" or something like that. So we still can't get a direct measurement. What's a tester to do?
The only viable option I see is using secondary metrics - increases in ranking and search traffic. And yes, Virginia, PR still does affect ranking and traffic. It's one strong factor among several hundreds of other factors of varying power.
These secondary metrics need to be aggregated over many urls and many sites to negate the influence of other causes for ranking and traffic changes. But using them - in aggregate - bypasses all three pitfalls I listed above.
What I've seen is that the theory does work out in practice. Other URLs in the domain do improve their ranking and search traffic after well crafted PR sculpting cuts off the flow of link juice to other pages.
One example would be anecdotal evidence. 10 examples would provide an intriguing suggestion. But tested over a variety of sites and therefore hundreds or thousands of urls, the improvements become a statistical proof of concept.
It's not something that's worth discussing in the abstract any more - that discussion happened years back. It's been tested, and proven. The experiments are repeatable, and the results have been verified by many SEOs at this point.
Here's the caveat - PR sculpting is not some super weapon of SEO. It will probably not help you move your home page from #15 to #1 on a highly competitive, single word trophy phrase. For one thing, the PR gains are not that dramatic. But what I do see is improvement on secondary queries that lead to internal urls, and even more significant jumps out on the long tail.
[edited by: tedster at 9:25 am (utc) on Dec. 16, 2008]
| 1:05 am on Dec 16, 2008 (gmt 0)|
What about the scenario that you alluded to earlier in this thread:
Page 1 with a certain PR value has a link reference to Page 2. We decide to "nofollow" sculpt this link to Page 2. Page 2 has a link back to Page 1. By "nofollowing" the link to Page 2, which links back to Page 1, will the severed link graph back to Page 1 affect its overall PR?
Thanks in advance. I hope my question makes sense.
| 1:15 am on Dec 16, 2008 (gmt 0)|
Good summary of incomplete logic.
Reasons the use of nofollow can increase rankings and traffic include many with nothing to do with PR. Removing duplicate URLs makes for a healthier overall domain. The same goes for removing supplemental pages, and pages the bots don't like (like login pages with little text). Nofollowing a duplicate "click here" link while allowing the bot to follow a link with relevant anchor text is another thing with huge implications. By using nofollow links, or noindex meta tags, the overall respect and health of domains should increase. And as pages age, this phenomenon increases.
[edited by: steveb at 1:34 am (utc) on Dec. 16, 2008]
| 1:34 am on Dec 16, 2008 (gmt 0)|
|By "nofollowing" the link to Page 2, which links back to Page 1, will the severed link graph back to Page 1 affect its overall PR? |
Sure, that's true. And it can be one of the problems with poorly designed attempts at PR sculpting, especially on sites with weak overall link structures. You can end up creating these isolated pockets around the site that accidentally get starved of most link love, but you really needed them to rank.
But in a well thought-out implementation, the INCREASE in PR circulation among the pages that are benefitting can more than make up for the losses of backflow circulation from the nofollowed pages.
Really understanding the math involved can be a challenge, and it takes a lot of computing to carry out rigorously. It can break your brain and give you no real practical results after it's all done. Remember that PR calculation is iterative. You can't generate an accurate mental model of PR flow based on just one cycle A > B and B > A...or even between just a few pages.
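[Editor's note] The iterative point can be made concrete with a toy sketch: a minimal power-iteration PageRank over a hypothetical 4-page site (made-up page names and link structure, the published 0.85 damping factor - not Google's current internals):

```python
# Minimal power-iteration PageRank over a toy 4-page site,
# following the original published formulation. Everything here
# (pages, links, damping) is an illustrative assumption.
damping = 0.85

links = {                          # page -> pages it links to (followed links only)
    "home":     ["products", "about"],
    "products": ["home", "widget"],
    "widget":   ["home"],
    "about":    ["home"],
}
pages = list(links)
pr = {p: 1.0 / len(pages) for p in pages}  # start with PR spread evenly

for _ in range(50):                # iterate until the values settle
    new_pr = {p: (1 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        for target in outlinks:
            new_pr[target] += damping * pr[page] / len(outlinks)
    pr = new_pr

for p in pages:
    print(p, round(pr[p], 4))
```

Nofollowing, say, the products-to-home link would mean deleting it from `links` and re-running: the change ripples through every page over the iterations, not just the two endpoints, which is exactly why a one-cycle A > B and B > A mental model misleads.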
Then there's the fact that Google isn't even using the exact published formula any more - we don't really know what they are doing, and they say they changed again, as recently as the beginning of this year (2008).
But theoretically PR sculpting always looked like it might work - and people tested it and found that it DOES work. So there's not much need for continued abstraction and theory any more, the way I see it. Just combine a decent grasp of the original PR paper with an understanding of your site's structure. Make a plan, and execute it. If you want to test on sites of lesser importance first to get a hands-on feel, that makes plenty of sense, too.
...or just let it alone completely if that's what feels right to you. Good site structure still works wonders for search traffic without any sculpting at all, and many sites would benefit much more from attention at this most essential level first.
[edited by: tedster at 9:27 am (utc) on Dec. 16, 2008]
| 9:01 am on Dec 16, 2008 (gmt 0)|
|Good site structure still works wonders for search traffic without any sculpting at all, and many sites would benefit much more from attention at this most essential level first. |
Could not agree more.
I think Tedsters last two posts should be re-read by all. Especially the bit about statistical evaluation.
And if you do not think that proving a statistical correlation between cause and effect is sufficient to induce the conclusion that something is true, well, you'd better start questioning the following things that are known for the same reasons:
1) Smoking causes cancer
2) Obesity causes heart problems
3) Some drugs cause mental health issues over time
More quixotic examples exist, such as:
4) Owning a cat causes you to live longer
You can measure the effect of PR over a wide, varied dataset, preferably with a control group, and preferably measuring against PREDICTED results, using technical terms like "to the 10% significance level".
When people criticise others' testing, it is usually because their idea of testing is "try it, see what happens". They assume others just do that too. And then they say "ah, but then there are factors you have not considered". Which is technically known as a 'straw man' argument.
So, over a varied test group, measured against a control group, with a predicted result, which will be measured in such a way as to statistically discount randomness - is this a fair test methodology? And if not, how do you propose to know ANYTHING about the Google algo?
| 9:29 am on Dec 16, 2008 (gmt 0)|
So can we talk sculpting methodologies now, please? I am familiar with the use of third tier push and other simple sculpting techniques but would love to hear other people's techniques/experiences/knowledge/observations on this interesting subject.
| 7:59 pm on Dec 16, 2008 (gmt 0)|
I remember Matt talking about this several times at Pubcon 2007... in the hallways, in the site reviews where he was on the panel, and several times at the Meet the Google Engineers party where he was bombarded with questions for an hour and a half. The topic came up a lot. In every case that someone asked him about rel="nofollow" or where he was suggesting possibly using rel="nofollow", he led me to believe that doing so DID in fact increase the amount of PR passed to the remaining "followed" OB links on the page. He even said something to the effect that typically this would only have a minor effect and should be a fine tuning SEO measure after all other low hanging SEO fruit had been taken care of.
If you think about it, most pages have 50+ OB links on them. So nofollowing one OB link typically doesn't buy you much. Instead of each OB link getting ~(1/50)*PR(PAGE) (accounting for their damping factor), they would now get ~(1/49)*PR(Page)... a very minor difference. However if you nofollow 10 links on a page w/50 OB links, each might gain an extra 25% or so of oomph (going from 1/50 or ~0.02*PR(PAGE) to 1/40 or ~0.025*PR(PAGE)). This was why I thought he was stating it would typically make a very minor difference - most pages have lots of links and typically only a few might be nofollowed.
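[Editor's note] The back-of-envelope numbers above check out; a quick sketch (treating PR(PAGE) as 1.0 and ignoring damping for simplicity):

```python
# A page with 50 outbound links, before and after nofollowing 10
# of them, assuming the remaining-links-share interpretation.
# PR(PAGE) is normalized to 1.0 and damping is ignored here.
share_before = 1 / 50   # 0.02 of PR(PAGE) per link
share_after = 1 / 40    # 0.025 of PR(PAGE) per link
gain = share_after / share_before - 1
print(round(gain, 6))   # 0.25, i.e. a 25% boost per remaining link
```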
Even Steveb agrees Matt has stated that Google drops nofollowed links from their link graph. Why would Google maintain a link graph? They don't really need to know inbound and outbound links for every page in order to spider them. One obvious reason for maintaining a link graph would be so that they could quickly determine the number and sources of all inbound links for a page as well as quickly determining the number and destinations of all outbound links from a page. If they use the link graph when calculating PR for PAGE A to determine which pages link to PAGE A and how many OB links each of those pages linking to PAGE A has, then by the nofollowed links not being in the link graph, in Tedster's example above, they would only know about 9 outbound links and they would, in fact, pass ~11% more PR to the remaining 9 followed links than they would if there were 10 in the link graph but one was somehow flagged nofollow.
But who knows for sure... I don't think anyone can say definitively it is one way or the other. With different data centers in the mix, their algo changing constantly, linking between sites changing constantly, etc. there is almost no way to say 100% for sure.
| 5:07 pm on Dec 17, 2008 (gmt 0)|
After all this discussion, I've been convinced to give it a try. I did my first implementation over the past 24 hours. Nothing major, just some "sculpting" internally to see how all this stuff works. ;)
| 6:00 pm on Dec 17, 2008 (gmt 0)|
I'm trying some very small controlled changes as well. I still don't understand why interlinking is frowned upon so much by Google. If I have a home page A that links to pages B and C, why is it so terribly bad to have references back to the home page (Page A) from within pages B and C? I find this navigation useful, and a good reference point, if pages B or C are found by a user directly from the search engines.
More importantly, I'm still unclear what the ramifications are of the following scenarios:
1) What happens if I "nofollow" the links from Page A to Pages B & C? Does this cause problems since B & C link back to A?
2) What if I "nofollow" the links to Page A from Pages B & C? Will this affect the importance of Page A?
My head is still spinning, after reading this entire thread 3 or 4 times......
| 6:25 pm on Dec 17, 2008 (gmt 0)|
|I'm trying some very small controlled changes as well. |
Unfortunately my changes are not in a "controlled" experimental environment. I just finished up another implementation. Both are on sites of 500+ pages. They are both organic websites that go through constant updates. Not from an SEO standpoint though, that remains pretty constant.
I went through all navigational includes. I rel="nofollow"'d links to login pages and similar non-content pages.
I also took a close look at the repetition of links within those includes. For example, my footer may have had a link to Home. My header may have also had a link to Home. My Copyright Notice may also be linked to Home. So, what did I do? I used rel="nofollow" on the footer link that contains the word Home. The header is a linked graphic with a proper alt attribute and the Copyright Notice contains the proper anchor text.
I made sure to tag all https "entry" links as rel="nofollow". Even though these are blocked by a robots.txt served dynamically based on the request, it appears that I can "STOP" the flow of PR passing through those links.
I'm used to stopping the flow at the page level with noindex, nofollow. In many of the implementations, the destination pages already have the noindex, nofollow REP (Robots Exclusion Protocol).
The way I understand it is that I've just stopped PR from even making it to that page. I've removed the flow of PR from the graph period. I've come "up a level" to manage the flow of PR. Did I get that right? Did I mention the flow of PR? ;)
I'm such an SEO Wannabe!
Pssst, please don't tell anyone.
| 9:04 pm on Dec 17, 2008 (gmt 0)|
"I used rel="nofollow" on the footer link that contains the word Home"
Can you just clarify why you would do that?
Surely having a 'home' hyperlink is a natural link to have in the eyes of google?
How would this affect Page Rank and distribution of link juice?
| 11:15 pm on Dec 17, 2008 (gmt 0)|
P1R, was that a dig :p
Cheesy, the apparent consensus at this stage is that nofollow can affect PR flow, and G does not attach a stigma to this, so naturalness is not a necessary consideration. It's merely a matter of keeping PR on the page, to be funnelled to the pages you want it to go to.
There is masses of information on these boards and elsewhere that duplicate links have less beneficial effect on the destination page, so why do it?
| 9:30 am on Dec 18, 2008 (gmt 0)|
|"I used rel="nofollow" on the footer link that contains the word Home" - Can you just clarify why you would do that? Surely having a 'home' hyperlink is a natural link to have in the eyes of google? How would this affect Page Rank and distribution of link juice? |
The reason I nofollow user features like the "home" links is so the link does not pass value back to the homepage with unoptimised/irrelevant anchor text.
| 3:38 pm on Dec 18, 2008 (gmt 0)|
I noticed today that Yell*com serve different pages to users and Googlebot. They add a meta nofollow on the user pages and not on the pages presented to Googlebot.
User pages are presented with this:
<meta name="robots" content="noindex, nofollow" />
Bot pages instead get this (which is wise):
<meta name="robots" content="noodp,noydir" />
This messing about with meta nofollow on your homepage seems particularly risky to me; what are other people's thoughts?
| 4:14 pm on Dec 18, 2008 (gmt 0)|
Why would you present a nofollow to actual people, but not bots? It's for spiders only.
Unless I've got completely the wrong handle on this.
| 4:44 pm on Dec 18, 2008 (gmt 0)|
|For PR sculpting to work, the first option needs to be true - each of the remaining links need to actually increase in its share of voted PR. "Dropped from the link graph" needs to mean "is no longer included in PageRank calculations". |
I think this would almost have to be the case. Think about a blog page that has 10 links when a blog post is made. Now think about 200 people with nofollowed "name hyperlinks" leaving comments. It would be silly if the total link graph was now 210.
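[Editor's note] The blog-comments scenario makes the stakes concrete. A hypothetical sketch (page PR normalized to 1.0, damping ignored):

```python
# A blog post with 10 editorial links plus 200 nofollowed
# commenter-name links. Numbers are illustrative assumptions.
editorial_links = 10
comment_links = 200   # all rel="nofollow"

# If nofollowed links still counted in the divisor, each
# editorial link's share would collapse:
diluted = 1 / (editorial_links + comment_links)

# If they are truly dropped from the link graph, the share
# is unchanged:
undiluted = 1 / editorial_links

print(round(undiluted / diluted))  # 21, a 21x difference per link
```

A 21x swing per link is why "the total link graph was now 210" would be such a strange design choice.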
| 5:13 pm on Dec 18, 2008 (gmt 0)|
I agree with you BradleyT, and I'm sure I've proven that to my own satisfaction. However not everyone in the thread was on the same wavelength about it, so I was hoping to clarify a relatively stuck discussion that was more about IF sculpting works.
I'd like to see this thread move on to discussing different ways to use PR sculpting, different situations where it helps and so on - and that seems to be happening now.
One of the situations where I've seen PR sculpting help is with pagination troubles - passing PR into the deeper paginated results pages. As long as there is another link path to the top items in a paginated result set somewhere on the site.
PR sculpting can remove the splitting of PR in a situation such as [pages 1-10 out of 47] . With PR sculpting, you can take the available PR and focus it on page 10. Page 10 can then pass more PR to even higher pages.
But you do need to have, or build, another link path for the items in [1-9], or else the top items on those urls drop out of Google results. You don't want to just shuffle one set of problems for another.
| 1:46 pm on Dec 19, 2008 (gmt 0)|
Right, mind if I pose some questions to get this going again. (nofollow=NF)
1) For a page where 2 of X numbers of links point to the same page:
i) Does NFing one affect the other?
ii) Where both use the same anchor, is there a difference to which one you NF?
iii) Where varied anchor, does the NF link affect relevancy passed to the destination page from the normal link?
2) Theme Siloing where you have a structure
Group 1 -> SubGroup 1A
Group 1 -> SubGroup 1B
Group 2 -> SubGroup 2A
Group 2 -> SubGroup 2B
Group 3 -> SubGroup 3A
Group 3 -> SubGroup 3B
Group 3 -> SubGroup 3C
Group 4 -> SubGroup 4A
i) Within a SubGroup, would you have the nav menu
A) Display all Groups and Subs, all followed
B) Display all Groups and Subs, NF everything but the related Subs
C) Display Subs in current Group, plus other Groups (not subs), NF the other Groups
ii) Is NFing ALL links that move UP the structure, except a link to HOME from the END page of the sculpted chain the way to go? (alluding to TooTricky)
iii) Inline internal links to other themed sections- NF or follow? Why?
3) Search results (esp hyperlinked search results) that are NoIndexed (but allowed in robots). Assuming you NFed the link that took you to the results page, presumably the only PR here is from external sources.
i) Do you
A) NF the PAGE
B) NF links to results
C) Follow everything (after all, you've lost no PR to this page anyway)
ii) And for the pagination links within the results, do you
A) NF everything
B) Follow Page 1, NF the rest
C) Follow the NEXT PAGE, NF the rest
4) Boilerplate text (contact us, T&C, about etc)
A) NF always
B) Follow on Home, NF otherwise
C) Follow on NoIndex Pages, NF otherwise
D) Something else
I'm not necessarily PERSONALLY interested in these, but I thought they might get a discussion going
| 3:14 pm on Dec 19, 2008 (gmt 0)|
From my understanding of the official NF descriptions, this link attribute should be treated by spiders (some of them, at least) as if it doesn't exist.
Something perhaps not stated in the thread (at least in the posts I read) is what we did before we had NF in the first place. How would you structure links before the time of nofollow? One way was to use forms: any link can be converted to a form. Forms are more complex to structure, so nowadays NF is a convenient alternative.
This is also a note for people who believe webmasters spam search engines by abusing NF. There is no abuse: the method was always available one way or another. How you place the links for navigation and content is another matter, though - keep the visitors in mind, not search engines.
| 3:49 pm on Dec 19, 2008 (gmt 0)|
I must say, this is a very interesting process. As I go through various templates and determine which internals to rel="nofollow", I'm seeing a little bit of duplication that may warrant this type of internal link micro-management.
I am also taking into consideration the position of the remaining links that are dofollow. Since I use SOC (Source Ordered Content), it allows me to apply that rel="nofollow" and still take advantage of links in context. That's the key: allowing the links that are "in context" to carry the weight.
How many of you have a top navigation and then a bottom navigation that mimics the top? I do it for user convenience. If I have 9 Category Tabs in the header, I'm going to have 9 Textual Links in the footer to match, I just feel that is good practice.
Now, which of those do I nofollow? In my case, it will be the links in the header since those fall after the footer links (using SOC). In some instances, the footer links may be a little more verbose and more "in context" than those that appear in the header.
Mind you, I'm just learning all of this nofollow schmollow stuff. And, I'm applying these things like candy using my "gut instinct". After reading and participating in this topic, I feel I'm a Certifiable NoFollower.
| 3:55 pm on Dec 19, 2008 (gmt 0)|