This 31 message thread spans 2 pages.
Matt Cutts Explains PageRank Reduction for Websites Selling Links
11:10 am on Aug 28, 2012 (GMT 0)
The usual reason why a site’s PageRank drops by 30-50% like this is because the site violates our quality guidelines by selling links that pass PageRank [mattcutts.com ]
7:57 am on Aug 29, 2012 (GMT 0)
"Or... Matt was just being nice to the guy by taking time to answer his question, then (as stated in the blog post) publishing the email so as to benefit anyone else."
The part that's been overlooked in this discussion is that it took a spam report to identify the infraction. It wasn't the algo. It seems like when the algo is unleashed to hunt down paid links, the scope is restricted in some way. That seems to indicate that Google's ability to algorithmically spot paid links is limited.
There were three steps in this manual action. The first was the actual action, the second was Matt's reply, and the third was the publication of the case study.
The first just happened, nothing interesting there. Matt's reply was probably just helpful, although the cynic in me suspects his eye was on Stage 3: putting the information into the public domain.
Why did he do this? Is it because Google wants everyone to pay attention to TBPR? Unlikely; they've spent quite a lot of effort de-emphasising it. Is it because MC is a prolific blogger, churning out advice about trivial topics? No: this is only his 5th post this year (one of which was a two-factor authentication public service announcement). Is it because loss of TBPR is the most pressing concern of our times, dominating debate at all levels of the SEO and wider webmaster community, demanding personal redress by the High Priest of Google? Hardly.
So why did MC choose, of all topics, an obscure problem of interest to few, and with no discernible impact even on the site in question? Well, I suggest it has a lot to do with this:
"That seems to indicate that Google's ability to algorithmically spot paid links is limited"
So, if they can't spot them, how can they stop them impacting the clever system that Larry Page devised, which took Google from newcomer to behemoth in a few short years? The USP that Google search was built on. The original secret in their sauce?
By stopping it before it happens, and that means highlighting the negatives at every opportunity.
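For anyone unfamiliar with why a single paid link matters to that system, here is a minimal sketch of the published PageRank iteration (the textbook algorithm, not Google's production implementation; the four-page graph and the "seller"/"buyer" names are invented for illustration). It shows how rank flows along links, which is exactly what a sold link exploits:

```python
# Minimal, illustrative PageRank power iteration (textbook version,
# not Google's actual system). Graph and page names are hypothetical.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = set(links) | {p for outs in links.values() for p in outs}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with a uniform score
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outs in links.items():
            if outs:
                # a page splits its rank evenly across its outbound links
                share = rank[page] / len(outs)
                for target in outs:
                    new[target] += damping * share
            else:
                # dangling page: spread its rank evenly over all pages
                for target in pages:
                    new[target] += damping * rank[page] / n
        rank = new
    return rank

# Hypothetical graph: "seller" is well linked, and sells a link to "buyer".
graph = {"a": ["seller"], "b": ["seller"], "seller": ["buyer"], "buyer": []}
scores = pagerank(graph)
```

In this toy graph, "buyer" ends up with the highest score purely because "seller" links to it, despite having no independent merit; that is the distortion Google is trying to deter.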
"If all links suddenly became nofollow... pagerank would cease to have any relevance UNLESS Google selectively ignored nofollow... If Google is being selective about nofollow, and nofollow protects you from harm, why isn't every single outgoing link you have protected with it? What is the incentive or benefit of not using it?"
A lot of assumptions in there. For a start, nofollow links make up a small percentage of all links. This is unlikely to change overnight.
As regards outbound links: I've found that good outbound linking increases the value of the page. Spammers actually know this too, which is why they often link out. I would imagine that nofollowing every outbound link would look somewhat dodgy... and while Wikipedia is a counter-example, they are considerably bigger than you.