Forum Moderators: Robert Charlton & goodroi


Is the writing on the wall for PageRank as we know it today?

         

BeeDeeDubbleU

2:57 pm on Feb 22, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



DEVIL'S ADVOCATE
I think the recent cases where J.C. Penney was penalised for using paid links, and now Forbes getting it for encouraging paid links, may be a sign that Google is considering the long-term future of PageRank.

[google.com...]

As we all know, PageRank is more or less based on the number of inbound links, the relevance of their anchor text, and the authority of the linking websites. This has been a great way of ranking websites, and it was the rock upon which Google was founded. It was a great way of using external signals to rank a site, but it is now being devalued by the proliferation of link building schemes. Since its introduction, link building has become a legitimate occupation.
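For reference, the "rock" being described can be sketched in a few lines. Classic PageRank repeatedly splits each page's score evenly among its outbound links, with a damping factor; everything else (anchor text, authority weighting) sits on top of this core. A minimal illustration — the three-page graph is made up, and dangling pages are skipped for simplicity:

```python
# Minimal power-iteration sketch of the original PageRank idea:
# each page splits its score evenly among its outbound links,
# with a damping factor modelling a "random surfer" jumping away.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue  # dangling page; a real implementation redistributes this
            share = rank[page] / len(outlinks)  # split evenly among outlinks
            for target in outlinks:
                new_rank[target] += damping * share
        rank = new_rank
    return rank

# Hypothetical three-page site: "home" collects the most inbound votes.
graph = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
ranks = pagerank(graph)
print(ranks)
```

The point of the sketch is that nothing here looks at content at all — only the link graph — which is exactly why link building schemes can move the needle.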

The use of appropriate inbound links means that websites can rank well when they should not necessarily do so. In other words, they don't deserve their ranking. All it takes is an effective link building campaign, and in many cases the site content is less important than the inbound links.

Traditionally Google changes its algorithm when the web design and SEO community catches up with the algo and gets in a position to manipulate the SERPs. They (Google) are always looking for ways to outwit us and vice versa.

Could the recent events signal the beginning of the end of PageRank? Will Google be looking for an alternative and perhaps moving back to ranking websites that deserve to rank for their content?


Robert Charlton

6:57 pm on Feb 22, 2011 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Note regarding the Google Webmaster Central thread referenced in the post above... It's a discussion about the paid links on the forbes.com site that were flagged by Google, with responses from Matt Cutts and others that include extensive references on Google's policies.

Unnatural Links message
Google Webmaster Central
[google.com...]

Responses also by rustybrick and JohnMu.

A good reference for anyone dealing with the paid link question.

TheMadScientist

7:08 pm on Feb 22, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Will Google be looking for an alternative and perhaps moving back to ranking websites that deserve to rank for their content?

We can only hope...

Robert Charlton

8:04 pm on Feb 22, 2011 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



To broaden the discussion a bit... inbound links are not just about PageRank. They're also about anchor text, trust factors, and relevance factors between pages in the linking chain.

The paid links we've been discussing had a lot to do with anchor text links to deep pages. One of the more striking things the efficacy of these links suggested, IMO, is that Google has been slow to penalize or discount many links with patterns of identical anchor text that some of us have assumed it could catch.

There were beginnings of algorithmic demotion for the JC Penney links, at least, just as the NYT article came out. It may well be that Google in fact can detect these links, but is erring on the cautious side for handing out penalties... or that it simply has been slow to implement many capabilities I assume are possible using Caffeine combined with phrase-based indexing.

At the least, I expect that Google is going to turn the dial down on large quantities of identical deep links from crappy pages... maybe turn up the dial on relevance factors between pages.

This is not about whether PageRank itself is going to be obsolete or not. Plain vanilla PageRank in that sense has needed help from 199 other factors in the algo for a long time. It's more, IMO, about what else Google is implementing. Collateral damage gets to be a bigger problem in these algo changes, and I'm sure that's also a consideration as Google moves ahead.

deadsea

8:29 pm on Feb 22, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If I were google I would start to value links very differently:

1) Weight the pagerank passed to the links on a page by the prominence of those links. Google has the technology to render pages and see where links sit on them. Instead of the pagerank being passed equally between the links on the page, pass it proportionally to placement: links that are bigger get more share, and links that have a better position get more share.

2) Weight the pagerank passed to the links on a page by the click-through rate on those links, as measured by the CTR of actual users with the Google toolbar installed, or on sites that use Google Analytics or AdWords. When enough data is available for a page, prefer this metric to the heatmap metric mentioned in #1.

There is some evidence that Google is already doing this to some extent. Footer links are no longer as effective at passing PR as they used to be. I don't think Google has gone full bore with this yet though.
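Point (1) above amounts to a one-line change to the even split in classic PageRank. A rough, purely hypothetical sketch — the prominence scores below are invented, standing in for whatever a renderer could derive from a link's size and position:

```python
# Hypothetical sketch of prominence-weighted rank passing: instead of
# page_rank / len(links), each link receives a share proportional to
# a per-link prominence score. The scores here are made up.

def distribute_rank(page_rank, links):
    """links: list of (target, prominence) pairs for one page."""
    total = sum(weight for _, weight in links)
    if total == 0:
        return {}
    # share is proportional to prominence, not an even split
    return {target: page_rank * (weight / total) for target, weight in links}

# A large above-the-fold link vs. a small footer link on the same page:
shares = distribute_rank(1.0, [("article", 8.0), ("footer-ad", 1.0)])
print(shares)
```

Under this scheme a paid footer link passes a fraction of what an editorial in-content link does, which would match the observation that footer links no longer seem to pass PR the way they used to.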

Reno

8:53 pm on Feb 22, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



We all know that the Google algo is extraordinarily complex. I say only half jokingly that it must approach the sophistication of nuclear equations. We also know that Google is highly advanced in data mining, and in making sense of it all.

The problem is that they are wrong too frequently, and when they are wrong, it seems the over-reliance on PR is in the mix. I would guess that they can come very close to identifying most paid links on their own (without the NYT!), so why not simply devalue them across the board and then do away with site-wide penalties (for those sites with paid links)? Plus, as deadsea has suggested, implement a more realistic link valuation hierarchy.

As has been said in previous threads, PR was a brilliant innovation when it was originally conceived, but that was over a decade ago. To be candid, I'm surprised they have not moved away from it already, when it was clear that black hats were using linking schemes to game the system. Instead, they are constantly putting band-aids on the bleeding. But better late than never, so I hope you are right BeeDeeDubbleU ~ its time has passed as the core of the engine. Speaking for myself, it can RIP, because I won't miss it.

........................

tedster

10:01 pm on Feb 22, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I noticed, at least three separate times in the past few months, a new comment from a Google spokesperson that "we just changed the way we score backlinks." Add in the newer factors we know they are taking into account, and I think the process is underway.

It's just that addressing something so very core to the algorithm is no minor thing - it's HUGE. The original PageRank algorithm was quite an insight, and it launched Google's success. I don't think they've found any single insight that cuts through today's clutter in the same way that PageRank did back then. Instead we've got this very complex code.

Wild idea here - maybe some of the most unexplainable ranking shifts that pop in and quickly back out actually ARE tests of some new core factor being given a short day in the sun to gather data.

wheel

10:17 pm on Feb 22, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Link attributes still are the best way to evaluate sites.

I think the problem Google's having is that all the tweaking now is basically just chaos. They make something a bit better, something else gets a bit worse, and overall they're only making things different, not better. Their signals have devalued to the point where they're basically static noise.

Quite possibly all they're really doing isn't so much making the serps better (because overall, they're not, and haven't been for years). All they're doing is shifting things frequently enough to keep any specific technique from being exposed.

BeeDeeDubbleU

10:33 pm on Feb 22, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It's just that addressing something so very core to the algorithm is no minor thing - it's HUGE.

Agreed and I don't think it will happen overnight. I think it will be a case of turning down the volume knob for link weighting over a long period of time and I think it will be a long time before it is totally discounted.

creative craig

11:36 pm on Feb 22, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think it will be a long time before it is totally discounted


I don't think that would ever happen - a vote of popularity via a backlink is still the best way to evaluate a site's worth IMO.

Will Google be looking for an alternative and perhaps moving back to ranking websites that deserve to rank for their content


In the niches I've watched and worked in, links have always been king.

FranticFish

11:39 pm on Feb 22, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Link attributes still are the best way to evaluate sites.

I feel the same way too.

How do you determine that a website 'deserves to rank for its content' using on-page factors more than off-page? Size of page? Natural keyword density with regular occurrence of synonyms and stemming? Writing style (assuming that can be analysed, which I doubt)? Incidence of trusted OBLs? Relevance of page to site? Size of site? Structure of site? Navigation system? Relevancy and 'naturalness' of title and meta tags?

ANY signal can be manipulated, and on-page signals can be manipulated just as easily as any others - perhaps more so.

A machine can determine that certain factors that might indicate passable content are present, but not whether the page is quality. And at present these factors appear to be VERY easy to fake, if the huge number of machine-generated blogs passing PR and anchor text is anything to go by.

The whole point of links is that they are supposed to be a sign of noteworthy (N.B. not necessarily GOOD) content because the content has been cited. Social Media mentions, comments etc are newer ways of attempting to determine 'noteworthiness'.

I can't see off-page citations ever taking second place to on-page factors myself, but I would be interested in hearing what factors people think could be measured on-page that would be hard to fake. I can't think of any myself.

tangor

11:43 pm on Feb 22, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



This is something Google invented... and made a pile on... before the ordinary hucksters (and these have been around since Cain and Abel) figured out a way to game the system with their brand of snake oil. How G fixes it is the question, and how it impacts the regular folks who have come to rely on it is the follow-up.

I have a bit of a grumble in this, as I've been on the "web" since DARPA let it out of the cage... and prior to that, BBS... which also had similar snake oil types... The more things change, the more they remain the same. I think I heard that somewhere before... :)

tedster

12:11 am on Feb 23, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



As I see it, there are two areas that are lowering the importance of links - 1) extracting themes from documents and 2) sentiment analysis.

Both these areas have been Google hot spots for years. But even being on the cutting edge of either technology is not yet good enough to knock links out of the ring. Current theme detection and sentiment analysis do work OK on a larger body of documents. But at the more granular level needed to create excellent SERPs they can produce some real howlers - or just come up empty. And of course, both can be gamed.

Gaming is the big thorn in Google's side. The environment for search engines is always adversarial, to some degree. So pure academia needs to find a way to buddy up with street smarts.

brotherhood of LAN

12:30 am on Feb 23, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



moving back to ranking websites that deserve to rank for their content?


inbound links are not just about PageRank. They're also about anchor text, trust factors, and relevance factors between pages in the linking chain.


1) extracting themes from documents and 2) sentiment analysis.


.. and it seems like WebmasterWorld has been waiting for a more forceful algo in these departments for some time, theming at least.

An interesting read from 2002: [webmasterworld.com...]

In 2011 'off topic' links can still boost ranks.

martinibuster

2:07 am on Feb 23, 2011 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Is the writing on the wall for PageRank as we know it today?


Yes and no. Kind of but not. :)

It's possible they're moving toward different kinds of citations. Citations are what PageRank is about. Links just happened to be the handiest citations to count at the time PR was formulated. Today there are more kinds of citations than links in use.

Google's product manager for search (I think that's his title) recently stated that people today share more than they create web pages, and the NYTimes recently published an article about how fewer people are blogging and are instead turning to places like Facebook and Twitter.

You can see non-link citations in use in Google Places. The social search thing is using citations from social sites like Twitter. I did a post about this topic three months ago, about unlinked citations [webmasterworld.com].

[edited by: martinibuster at 2:23 am (utc) on Feb 23, 2011]

tedster

2:26 am on Feb 23, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



moving back to ranking websites that deserve to rank for their content?

That kind of Camelot has never existed, not for any search engine. It's more like the search for the Holy Grail than the good old days.

McMohan

5:02 am on Feb 23, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



There are a few things I don't understand -

1. I see this comment oft repeated
Moving back to ranking websites that deserve to rank for their content?

If that were the parameter, how would a search engine rank Google vis-a-vis, say, Blekko when you search for "search engine"?

2. Why can't Google just turn off the visible TBPR? Someone said one easy way to fight corruption is to stop printing higher-denomination currency notes. It might not eradicate corruption, but it will bring it down to manageable levels. Removing TBPR may not stop link buying/selling, but it will definitely bring it down. Page theme, reputation etc. will come into the mix, while currently we are just fixated on TBPR.

Just my 2 cents.

TheMadScientist

5:37 am on Feb 23, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Why can't Google just turn off the visible TBPR?

Actually, they might be able to when they start getting enough data from Chrome. Until then, I think they value the data they get more than they want to end the link games.

Reno

7:08 am on Feb 23, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think the problem Google's having is that all the tweaking now is basically just chaos.

Bingo ~ change something, change everything:

"Chaos theory in the West developed from the 1960s work of meteorologist Edward Lorenz. Lorenz developed a simple meteorological model based on differential equations. When he ran his model on a computer, Lorenz discovered that a very small difference (less than one part in one thousand) in the initial conditions led to large changes in the weather predicted by his model over time. This discovery, sensitivity to initial conditions, is one of the fundamental characteristics of chaos theory."
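The sensitivity described in that quote is easy to reproduce. Here is a quick sketch using Lorenz's three equations with their standard parameters and a crude Euler integration — two runs starting one part in a thousand apart end up in completely different places:

```python
# Two runs of the Lorenz system whose starting points differ by
# one part in a thousand diverge onto different parts of the attractor.
# Standard parameters sigma=10, rho=28, beta=8/3; simple Euler steps.

def lorenz_run(x, y, z, steps=30000, dt=0.001):
    sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0
    for _ in range(steps):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
    return x, y, z

a = lorenz_run(1.0, 1.0, 1.0)
b = lorenz_run(1.001, 1.0, 1.0)  # tiny change in initial conditions
print(a)
print(b)
```

Both trajectories stay bounded on the attractor, but after enough steps the initial 0.001 difference has been amplified beyond recovery — the "change something, change everything" point in miniature.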

.......................

tedster

7:23 am on Feb 23, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I agree, and I'll bet so does Google. I'm sure something of chaos theory (complexity theory) is in their model. They didn't hire all those PhDs for nothing.

Sgt_Kickaxe

7:40 am on Feb 23, 2011 (gmt 0)



I think we need to expand our idea of "links" so that we incorporate discussions about "something" even if it's not linked to. The link being a discussion about...

I think we still get a good chunk of our rankings due to on-page SEO - not the secret sauce type stuff but, you know... having the spark plugs where they should be ;-) Not being well SEO'd doesn't mean you'll get penalized, but it does mean you're not boosting your site (within allowed guidelines) as much as you could.

I think that concentrating on providing the most unique content possible IS the new secret sauce.

I think too much.

Jane_Doe

8:25 am on Feb 23, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Regarding the Forbes.com issue, I'd like to know how you get to be the "Digital Marketing Manager (SEO | Social Media | Web Analytics) at Forbes.com" and not know (or maybe just pretend not to know) what paid links are.

Isn't that kind of like an astronomer asking someone what those long pointy things with the lenses are for?

BeeDeeDubbleU

9:16 am on Feb 23, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Regarding the Forbes.com issue, I'd like to know how you get to be the "Digital Marketing Manager (SEO | Social Media | Web Analytics) at Forbes.com" and not know (or maybe just pretend not to know) what paid links are.

Yes, Jane_Doe, that's a bit of a laugh, isn't it? This is the guy who says he is a "Dynamic e-marketing leader offering 10 years of online marketing management with extensive experience in search engine optimization, search engine marketing, affiliate marketing, and social media marketing."

BeeDeeDubbleU

9:17 am on Feb 23, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



How do you determine that a website 'deserves to rank for its content ' using on page factors more than off page?

Good question, and one to which the search engines must find an answer. If anyone here had the answer they would hardly be sharing it with the rest of us, would they? :)

FranticFish

1:49 pm on Feb 23, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Here's why I think Google will struggle:

You can have a fantastic quality page with contributions from academics, well-researched content, wonderful insights. It can be easily ripped off and rewritten.

However, to get fantastic quality links/citations you have to be an active influencer in your niche - the sort of person whose opinion counts. The people who give out these links are, I think, far more likely to be able to spot recycled content from a 'faker'.

So even if you can fake on-page signals you'll struggle to create the right off-page signals IMO.

Google say this is what they focus on. I think they have a LOOOOONG way to go.

brotherhood of LAN

2:07 pm on Feb 23, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



True enough FranticFish,

OTOH I think there's a problem at both ends:
(1) Competitive / high-value terms require lots of SEO/links to rank. There is an economy of buying links to rank for them. People link more easily for cash.
(2) Less competitive / lower-value terms require fewer links and less optimisation. It doesn't take many links to appear for more obscure/exact terms.

Google actively discourages purchasing links to game search engines, and also considers the 'trust' of a domain that's linking out. Apparently brands are also considered, which would likely help more with (1).

I wonder what results would look like if they put PR into a black hole for the lowest 80% of sites. More granular searches would probably suffer, though a lot of dodgy links would also disappear off the map.

PR worked at the time 'to organise the world's information', but now they're having 'to organise the world's information depending on how much we know about you'. Very elitist ;o)