
Home / Forums Index / Google / Google SEO News and Discussion
Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 177 message thread spans 6 pages: < < 177 ( 1 2 3 4 [5] 6 > >     
Google Page Rank Update November 2007
Much awaited update!

 7:17 pm on Oct 26, 2007 (gmt 0)

< part two of this discussion is here: [webmasterworld.com...] >

I see PR changes for my 5-6 sites. Some went down, some went up.

Check on these datacentres.

My blog went from 3 to 5 PR. :)

[edited by: tedster at 7:35 pm (utc) on Nov. 1, 2007]



 5:31 am on Oct 29, 2007 (gmt 0)

Matt Cutts once mentioned on his blog that there are factors in Google's total algo intended to help "mom and pop" sites

Could it be the reason why a large number of low-quality sites found their way to the top over the past month or so? Did they tweak their algo this way?

Having a mom-and-pop type of website doesn't mean you aren't trying to increase rankings. And having such a site in the SERPs doesn't enhance the searcher's experience in any way.
The mom-and-pop sites I know of are likely to buy cheap SEO offers, so cheap manipulation ends up misleading the searcher who lands on the site at the end of the day.

Even though this sounds like a nice idea, the only justification I can find for it is increasing PPC revenue. I'd be glad if someone could pitch in with another logical explanation.
I don't know how this works in Google's algorithm but let's say that a certain % of the top 20 would be given to so-called "mom and pop" sites.
Shouldn't this be restricted to showing up for obvious local-type searches? As of now I find the lowest end of local businesses competing alongside top-level authorities on non-local keywords, and this does not make any sense to me at all.

[edited by: followgreg at 5:37 am (utc) on Oct. 29, 2007]


 5:35 am on Oct 29, 2007 (gmt 0)

Changes in two sites,
one is down by 1
one was grey the entire year and now it is 3 :)


 10:08 am on Oct 29, 2007 (gmt 0)

Two sites of mine stayed the same.
Six sites dropped one PR point.
One site dropped from PR2 to PR0. I'm going to keep an eye on this one, as its PR really should have increased.

It looks like PR has been rescaled and/or certain types of links are not as effective as they once were.


 10:30 am on Oct 29, 2007 (gmt 0)

Let's be sure that we're talking toolbar PR (TBPR) here - not the real thing!

Don't expect changes in this to have any effect on ranking. They were long since figured in.


 10:55 am on Oct 29, 2007 (gmt 0)

I'm seeing inbound relevancy of links as the biggest factor. Non-relevant links are not taken into consideration in the calculation of "visible" PR.

This makes a certain amount of sense if the objective is to kill the link markets. The easiest way of valuing a link is PR. It's now more difficult to buy your way to a high PR, and sell links off those pages.

It is disconcerting for link sellers, the bulk of whom are straight publishers rather than savvy webmasters. That will probably take many of them out of the market.

However, I'm not seeing any change in rankings, which, after all, is what it's all about. I reckon those big brains at the plex have been trying to work out the best way of killing the mass market in links while holding on to the precepts of their algo. This is the result - and I think it works quite well for that purpose.

So, what's the likely fallout?

1. Big link brokers' inventories will drop like a stone
2. Most unsophisticated link sellers will not be able to make much profit from the trade, so won't bother.
3. M&P businesses will stop buying links
4. Link trade carries on, but underground, remaining open to the savvy webmasters.
5. Matt Cutts asks for a raise.

Oliver Henniges

 10:56 am on Oct 29, 2007 (gmt 0)

> For calculation purposes (only), every url in the link graph is awarded some starting value. This value is completely arbitrary, because...

Yes, tedster, this was the theory in the original papers. However, from my very limited frog's-eye perspective, there are a number of inconsistencies in the present TBPR values, particularly because I FTPed some broader changes to my internal link structure shortly before this update. None of us has the capacity to completely perform this iteration process for a main page, but for internal pages, where definitely no external backlink exists, it's different.

So, either this update is not finished yet, or - and this is what I believe - Google is testing new algorithms to calculate/estimate PageRank "on the fly." PageRank nowadays seems to have such a limited impact on ranking that I doubt Google is willing to invest the money in the required supercomputer just for this aspect of its network.

Doing so, Google would only follow the "law of cosmic laziness" (to quote Bertrand Russell): why perform this huge iteration over a network where far more than 90% of the link structure remains the same anyway? The old starting values make up a very robust basis by now. If Google sticks to one, two or three iterations, and additionally pays some attention to devaluing certain classes of links and accounting for new 'nofollow' tags, that would absolutely suffice. Such an operation could be performed every few days or hours without any of us even noticing, and showing the new values every twelve weeks would still amount to 50 or more iterations.

Another approach would be to extract (and iterate) a "core" of older or "more important" websites - more important maybe in terms of user-visit data. This would explain why it sometimes takes so long until new material is assigned PageRank at all.

All this is highly speculative, but I strongly believe that pagerank is no longer calculated the way it used to be.
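Oliver's "few iterations on a mostly-stable graph" idea is easy to sketch. Below is a toy power iteration of the published PR formula; the graph, the damping factor d=0.85, and all function names are my own assumptions for illustration, not anything Google has confirmed. A run warm-started from previous values settles almost immediately, where a cold start needs many iterations:

```python
def pagerank(links, ranks, d=0.85, iterations=1):
    """Iterate PR(A) = (1-d) + d * sum(PR(T)/C(T)) over all pages T linking to A."""
    for _ in range(iterations):
        ranks = {
            page: (1 - d) + d * sum(
                ranks[t] / len(outs)          # PR(T) / C(T)
                for t, outs in links.items()
                if page in outs
            )
            for page in links
        }
    return ranks

# Toy link graph: A links to B and C, B links to C, C links back to A.
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}

cold = pagerank(links, {p: 1.0 for p in links}, iterations=50)  # full convergence
warm = pagerank(links, dict(cold), iterations=2)                # cheap refresh
drift = max(abs(warm[p] - cold[p]) for p in links)              # effectively zero
```

On a real web graph where far more than 90% of the links are unchanged, the previous values are already close to the fixed point, so a handful of iterations suffices - which is exactly the economy Oliver is describing.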


 12:32 pm on Oct 29, 2007 (gmt 0)

I would agree that Google no longer performs the complete iteration when calculating PageRank. As OH suggests, I suspect it now probably estimates and re-estimates it on the fly, which would make sense from a processing-overhead point of view. Only a stable and unchanging site that has been completely indexed is likely to have the classic PR structure to be expected from multiple iterations. The TBPR is only a snapshot, so one would expect to see anomalies.

But I don't agree with those who say PR doesn't matter. I suspect (although obviously cannot prove) that it is used in the algo which decides whether to place a page in the supplementary index. And that hits traffic, especially the long tail.

There is also the possibility that once a page is in the supplementary index, the PR passed by that page to other pages on the site may be devalued.


 12:53 pm on Oct 29, 2007 (gmt 0)

Youtube is now showing PR8 for me by the way.


 1:45 pm on Oct 29, 2007 (gmt 0)

I don't think the "flux" is over yet. Our site is showing a drop of 1, yet it has more backlinks than ever, and GWT is showing higher-than-average ranks - more high and medium rankings than ever before. Thus I don't think we've seen the end of it. Hold on - it might be a bumpy ride.


 3:02 pm on Oct 29, 2007 (gmt 0)

Yes, updates all around on my sites. I updated my site to clean URLs mid-week last week, and completely lost PageRank (only a 2, but still) on my blog. I'm a bit puzzled, as almost every other site I manage went up by one.

Alas. I'll have to retool till March/April.

[edited by: nhansen at 3:17 pm (utc) on Oct. 29, 2007]


 3:11 pm on Oct 29, 2007 (gmt 0)

Yea, I'm thinking it's not over either. The page I mentioned above somewhere that disappeared is back today, albeit with a PR0 (it was previously a 4). All the rest of the pages on that site are 2, 3 and 4, and that one page previously had the highest PR on the site, so it doesn't make a lot of sense that it would lose it all.

Heh, I had a parked domain that has just been sitting unused, pointing to a "coming soon" type page, but for some reason it had been showing PR5 for the past year. That one got corrected to a PR1. Can't say that was unexpected. In fact, I can't imagine why it even got the PR1.


 3:32 pm on Oct 29, 2007 (gmt 0)

I too have a holding page that's gone from a grey bar to PR1, and yet my established site has gone from a 5 to a 4. Crazy!


 4:28 pm on Oct 29, 2007 (gmt 0)

all very well if my new site got "mom n pop" help with PR.
But I'd happily swap for dominating the SERPs: out, out I say to my geo-pod page (with its tiddly PR), and in with the newbie site...


 4:47 pm on Oct 29, 2007 (gmt 0)

One odd thing I have noticed is that my site's main page has been creeping up the rankings for many terms over the last couple of months.

I was kind of expecting that when the next PR update came out it would still be the same PR5, as I don't think I have enough links yet to edge up to PR6. I checked today and it has in fact dropped to PR4. I am almost up to my main competitor, who does have PR6 for my main search term, so I'm not too worried!

Does anyone else think we may be seeing some weird effects of the dmoz.org switching to www.dmoz.org a few months back?


 5:12 pm on Oct 29, 2007 (gmt 0)

Nice post, Oliver. There's so much rubbish in these PR threads, and so much ignorance even about what PR was, that it's refreshing to read a post like yours.

I do believe that there's been some fundamental change in the way PR is calculated. Conspiracy theorists would jump to the argument that TBPR is a good FUD tool, and that sneaking in new ways of PR allocation and distribution serves Google's interests by frustrating SEOs. However, some of the theory from the original paper may still be in place if a lot of sites are dropping a little bit.

What I do find is that some pages with no incoming links except from a PR1 page are showing PR3-4 (I obviously don't rely on Google for IBL info). And a hundred-odd links from PR0 pages - low-value pages that will still be PR0 at the next and subsequent updates - seem to be resulting in PR4s and 5s. Which, based on the original PR concept, shouldn't be happening.

This is worth keeping an eye on.


 7:16 pm on Oct 29, 2007 (gmt 0)

Such an operation could be performed every few days or hours without anyone of us even noticing, and showing the new values every twelve weeks would also amount to 50 or more iterations.

Google did indeed change the way they perform the PR calculation to something that is continual rather than dependent on an occasional but massive computation. I'm pretty sure this happened around the time of the infamous Florida update. At that time, we called the new bouncing around in the SERPs "everflux". And in order to do this continual PR calculation, Google did change something about their computational approach.

In math, some iterative functions can be expressed in a more "elegant" fashion that is still identical in its result - and other functions cannot, or at least a second mathematically identical function has not yet been discovered. In many of those second cases, a calculation that at least approximates a full iterative calculation can still be created, but it is subtly "off" from the full intent of the original.

The question I've never heard answered, or even discussed, is whether this new approach is mathematically equivalent to the original PR formula or not. If it is not an exact equivalent, then Google would need to run a "full cycle of iterations" from time to time in order to fix any small errors that the approximation was folding in. Either that, or they may even have gone to a slightly modified PageRank formula!

I have no certainty which is the case today with Google's PR calculation - or even if it is the same formula that they originally published. There are some occasional hints I've picked up from here and there - for example, Sergey Brin speaking at Berkeley in 2005 (YouTube video [youtube.com]) discusses developing PageRank with Larry Page. About 8 minutes in he says "We use a similar algorithm today."

So my best guess is that PageRank's mathematical definition has changed in subtle ways - and that subtle shift allows for continual calculation. Google certainly has the computing power today to double check and correct PR with a fully iterative cycle once in a while - and somewhere or other I thought that was mentioned way back when.

But Google's mathematicians are very clever and may well have a simplified method in their arsenal of tools - one that does not require a major "verification cycle" every so often.

And what does this mean for us, practically? I can't come up with anything.
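The "verification cycle" idea above can be pictured concretely: keep a cheap, continually updated estimate running, occasionally compare it against a fully iterated run, and snap back when the drift has grown too large. Everything in this sketch (the graph, the threshold, the function names) is a hypothetical illustration of that idea, not Google's actual mechanism:

```python
def pr_step(links, ranks, d=0.85):
    """One synchronous iteration of PR(A) = (1-d) + d * sum(PR(T)/C(T))."""
    return {
        page: (1 - d) + d * sum(
            ranks[t] / len(outs) for t, outs in links.items() if page in outs
        )
        for page in links
    }

links = {"A": ["B"], "B": ["A", "C"], "C": ["A"]}

# "Continual" cheap estimate: just a couple of iterations from arbitrary values.
approx = {p: 1.0 for p in links}
for _ in range(2):
    approx = pr_step(links, approx)

# Occasional full cycle: iterate long enough that the values stop moving.
exact = {p: 1.0 for p in links}
for _ in range(100):
    exact = pr_step(links, exact)

# Verification: if the cheap estimate has drifted too far, correct it.
if max(abs(approx[p] - exact[p]) for p in links) > 0.01:
    approx = dict(exact)   # the periodic "full cycle of iterations"
```

If the approximation were mathematically equivalent to the full iteration, the correction branch would never fire; the fact that a correction step is even needed is the distinction tedster is drawing.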


 7:25 pm on Oct 29, 2007 (gmt 0)

some pages with no incoming links except from a PR1 page are showing PR3-4

Matt Cutts has often mentioned things like "...there are steps that we take to try to help those Mom/Pop sites as well." (reference: mattcutts.com [mattcutts.com]). Could it be done in tweaks to PR? I've seen small sites I helped friends create get PR4 and PR5 pages with almost no backlinks to speak of.

[edited by: tedster at 7:57 pm (utc) on Oct. 29, 2007]


 7:52 pm on Oct 29, 2007 (gmt 0)

What's going on?!
Many sites have got 2 or even 3 points in the green bar; statcounter is now 9!


 9:21 pm on Oct 29, 2007 (gmt 0)

i'm excited... i added the nofollow and am climbing again... i had a drop to zero, now climbing - back at 2 right now....

That's impossible.


 10:57 pm on Oct 29, 2007 (gmt 0)

And what does this mean for us, practically? I can't come up with anything.

ted, smart as always... but maybe you are too smart:

in the huge operation Google has become - running an email service, maps, video hosting, press-release pumping and other stuff, a stumbling monster - I do believe that PR is calculated like it always was, just with slight changes and significantly more hardware.

Don't get me wrong, but US-based companies do not normally invent the large steps - they get the little important steps right.

This includes especially marketing and visualization.

With that thought in the back of my mind, I think the current PR update was just a bottom-up recalculation incorporating the current filters, manually added filters, and some slight changes they have implemented in the last 12 months.

I also believe that they have 1-3 external factors they can attach manually to domains or URLs.

So, the original formula:

PR(A) = (1-d) + d (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))

was just enhanced with some fairy dust

PR(A) = (the usual stuff) * (fairy dust)

where I consider the fairy dust to be an extra sum of filters like manual factors and spam reports!
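Read literally, this "fairy dust" is just a per-URL multiplier applied after the classic sum. A minimal sketch of that reading - the filter names and weights here are entirely made up for illustration:

```python
def fairy_dust(flags):
    """Multiplicative adjustment built from triggered filters (hypothetical weights)."""
    weights = {"paid_links": 0.5, "spam_report": 0.7, "manual_flag": 0.3}
    factor = 1.0
    for flag in flags:
        factor *= weights.get(flag, 1.0)   # unknown flags leave PR untouched
    return factor

def adjusted_pr(classic_pr, flags):
    """PR(A) = (the usual stuff) * (fairy dust)."""
    return classic_pr * fairy_dust(flags)

clean = adjusted_pr(5.0, [])                              # no filters triggered
seller = adjusted_pr(5.0, ["paid_links"])                 # link selling halves it
worst = adjusted_pr(5.0, ["paid_links", "spam_report"])   # filters compound
```

A multiplicative factor like this would also explain why whole sites drop by a uniform step in one update: one flag on the domain scales every page's value at once.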

2 cents,

kamikaze Optimizer

 11:08 pm on Oct 29, 2007 (gmt 0)

What's going on?!
Many sites have got 2 or even 3 points in the green bar,...

The three sites I have been watching that took a big PR drop last week now have their PR restored as well.


 12:03 am on Oct 30, 2007 (gmt 0)

The three sites I have been watching that took a big PR drop last week now have their PR restored as well.

It was a test/glitch. Google has had a glitch with its Adsense programming for over a week. It's trying to do too much at once. Got some weird managers running the place over there (read: trying to)!



 1:28 am on Oct 30, 2007 (gmt 0)

V'Ger is on the loose again.

Anyway .. no change for me on the 5 and 6 PR

3 new sites went from 0 and 1 to two 3s and one 5 on the homepage. The new 5er went there from 0 and has only been up since August - amazing - but it's a subdomain of the PR6 site.

The 0-to-3 and 1-to-3 are ten-year-old domains that have been filled with content since May.

The direct link from the PR6 homepage seems to have only been picked up at full strength on the subdomain.

The 3ers have IBLs; the 5 has only a link from the PR6 site, since it's a branch-out website. Also no 301s on that, and no content that had been on the www site.


 5:30 am on Oct 30, 2007 (gmt 0)

I have a large number of directory sites that I run adsense on. I have seen a serious hit of about 30% to the traffic and revenue since Thursday... so for those who claim that this is only a "Green Bar" update, I would beg to differ.

Interestingly, I had added the phrase "Advertise Here" to 4 of my sites as a test about 2 weeks ago.

2 of those 4 sites have taken the most serious hit on traffic (around 70%). Those 2 sites are currently showing a cached version of the site at Google PRIOR to Oct 24... meaning that the "Advertise Here" was detected prior to the algorithm change. The other 2 sites that said "Advertise Here" but did not get hit as seriously are showing a cached version of the site SINCE Oct 24... so I am fairly certain that the presence of this wording is what sent these sites crashing down.

For what it is worth, I do not sell outbound PR (there is none to sell)... The links that I am selling are listings in a directory, and those links have always been redirected internally for statistical purposes before being transferred to the customer site. Does anyone know if an internal redirect like this still requires a "nofollow"?

I also would like to know if Google is able to determine that a cluster of sites are "related" (same IP, same owner, same topic) and therefore not punish them for linking to each other. E.g. my directory sites are all hosted on the same IP and link to each other... is it possible that Google believes that I am selling PR to myself? (when in reality all those links are to affiliated directories on the same general topic, but covering a different geographical area)? Do I need to put "nofollow" on these links too?

I have removed the "Advertise Here" wording, but I wonder if my sites now bear a "Scarlet Letter" that will take months to recover from.

Annoyingly, I have recently been complaining to Google about a site that is very high in the SERPs and is obviously buying/trading links from a link farm in an unrelated field. The site in question was not affected by the algorithm change and is clearly "cheating", while I believe that I am clearly not cheating, and I am being hit pretty hard.

Undoubtedly many fortunes are currently being made from Adsense but imho anyone who believes that it cannot all be taken away in an instant is fooling themselves.... ultimately you have NO CONTROL over what Google is paying you.... they could reduce the commission by 75% overnight and what recourse would you have?

Robert Charlton

 7:11 am on Oct 30, 2007 (gmt 0)

I have a large number of directory sites that I run adsense on. I have seen a serious hit of about 30% to the traffic and revenue since Thursday... so for those who claim that this is only a "Green Bar" update, I would beg to differ.

There have been three things going on in close succession, and the one we're talking about here is the Toolbar PageRank update, probably reflecting ranking conditions of several weeks ago.

A couple of days prior to that, there was a highly publicized "manual" Toolbar PR update, also probably reflecting conditions of several weeks back, affecting sites that had been selling links.

And, close on the heels of the second TBPR update have been many ranking changes, still ongoing. I'm beginning to see some of those now in areas I watch, with some top results resembling those of about a month ago... others very different. Too early to say anything, and that's for the SERPs thread.

Regarding the Toolbar PR update, I did notice an interesting glitch. For about ten minutes this evening, the home page of a client site had grayed-out Toolbar PR. It came back sooner than I'd expected, but I did expect it to come back. With a data shakeup as big as this one, I fully expect to see many more such anomalies before all this is over.


 9:36 am on Oct 30, 2007 (gmt 0)


Two of my sites' PR are also down, from 5 to 3 and from 6 to 5. I have seen some major sites also affected by this Google PR update. What do you all think - will Google return the PR of some great sites, or is this really a penalty for all great sites?


 10:15 am on Oct 30, 2007 (gmt 0)

I also would like to know if Google is able to determine that a cluster of sites are "related" (same IP, same owner, same topic) and therefore not punish them for linking to each other

About 1/2 weeks ago I changed the webserver from one physical server to another and had to move IP too, into a completely different subnet, i.e. every single digit changed. The site took an immediate 20% hit in unique users, and another site that changed too went down 50%. Midweek. I knew it would be a bad move to change, but the old server was, well, old, and I just had to. Now I have all but one IP address left that isn't in the same subnet. Links from other servers to the site were all nofollow, so I am pretty sure the immediate drop was because of the IP change. The server is faster. Maybe there is a penalty for using openSUSE, which was at the time the only Linux I could install in the time frame I had... ;)

I also changed my biggest site from round robin to a single IP; this doesn't seem to have a negative effect, yet. Yesterday I changed to follow as a test in the same subnet; traffic is up on this one and down on the linked-to server, but this might be the new update.


 1:28 pm on Oct 30, 2007 (gmt 0)

My site dropped from 6 to 3. I never sold or bought links. Do you think it's worth writing to Google support about that unfair loss? Or am I not the only one, and they won't bother answering me?


 1:42 pm on Oct 30, 2007 (gmt 0)

Maybe I am missing something - but what does PageRank have to do with anything, other than unscrupulous link traders trying to sell or get links ("I have a PR 7 page - link to me, buy a link to me for $200, and your rank will increase", bla bla bla)?

There is an extremely competitive 1-word term (56 million + word page results) that I follow:

Position  PR  Misc
--------  --  ----
    1      5  Official site
    2      5  Authority site
    3      3  Sub-page of authority site
    4      6  Wikipedia citation
    5      0  Official site PDF file
    6      5  Authority .gov site
    7      4  Authority site (3 backlinks)
    8      4  About.com citation
    9      4  Authority site
   10      4  No comment

5, 5, 3, 6, 0, 5, then three 4s - so what?

If a 4 gets you positions 7, 8, 9 and 10 out of 50,000,000 pages, who cares what PageRank is?

PageRank has something to do with it, but there is much more to making money off the SERPs.

What do you think?


 1:47 pm on Oct 30, 2007 (gmt 0)

"Does anyone know if an internal redirect like this still requires a "nofollow"? "

I would say yes. The redirected links are followed, and if they are followed and indexed then they do pass PR and are considered outbound links. This would give you hundreds of one-way outbound links and kill any PR you did have.

We use them on our site as well, but they go to internal pages of customers we build profile pages for and take applications on.

These redirected URLs are in the Google, Yahoo, and MSN indexes with the title of the page the redirect goes to...


 2:48 pm on Oct 30, 2007 (gmt 0)

slay100, I know it is sad, but I do not think G entertains any such requests. My theory for your downgrade is that a lot of websites lost rank in this update, and if the websites linking to you also lost rank, you will be affected too.


All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved