I see PR changes for my 5-6 sites. Some went down, some went up.
Check on these datacentres.
My blog went from 3 to 5 PR. :)
[edited by: tedster at 7:35 pm (utc) on Nov. 1, 2007]
Matt Cutts once mentioned on his blog that there are factors in Google's total algo intended to help "mom and pop" sites
Could it be the reason why a large number of low quality sites found their way to the top for the past month or so? Did they tweak their algo this way?
Having a mom and pop type of website doesn't mean you're not trying to increase rankings. And having such a site in the SERPs doesn't necessarily enhance the searcher's experience whatsoever.
The mom and pop sites I know of are likely to buy cheap SEO offers, and that cheap manipulation is somewhat misleading to the searcher landing on the site at the end of the day.
Even though this sounds like a nice idea, the only justification I can find for it is increasing PPC revenue. I would be glad if someone would pitch in and give another logical explanation.
I don't know how this works in Google's algorithm but let's say that a certain % of the top 20 would be given to so-called "mom and pop" sites.
Shouldn't this be restricted to showing up for obvious local-type searches? As of now I find the lowest end of local businesses competing along with top-level authorities on non-local keywords, and this does not make any sense to me at all.
[edited by: followgreg at 5:37 am (utc) on Oct. 29, 2007]
It looks like PR has been rescaled and/or certain types of links are not as effective as they once were.
This makes a certain amount of sense if the objective is to kill the link markets. The easiest way of valuing a link is PR. It's now more difficult to buy your way to a high PR, and sell links off those pages.
It is disconcerting for link sellers, the bulk of whom are straight publishers rather than savvy webmasters. That will probably take many of them out of the market.
I'm not seeing however any change in rankings, which, after all, is what it's all about. I reckon those big brains at the plex have been trying to work out the best way of killing the mass market in links, while holding on to the precepts of their algo. This is the result - and I think it works quite well for that purpose.
So, what's the likely fallout?
1. Big link brokers' inventories will drop like a stone
2. Most unsophisticated link sellers will not be able to make much profit from the trade, so won't bother.
3. M&P businesses will stop buying links
4. Link trade carries on, but underground, remaining open to the savvy webmasters.
5. Matt Cutts asks for a raise.
Yes, tedster, this was the theory in the original papers. However, from my very limited frog-perspective, there are a number of inconsistencies in the present TBPR values, particularly because I FTPed some broader changes to my internal link structure shortly before this update. None of us has the capacity to completely perform this iteration process for the main page, but for internal pages, where definitely no external backlink exists, it is different.
So, either this update is not finished yet, or - and this is what I believe - Google is testing new algorithms to calculate/estimate PageRank "on the fly." PageRank nowadays seems to have such a limited impact on ranking that I doubt Google is willing to invest the money in the required super-computer just for this aspect of its network.
Doing so, Google would only be following the "law of cosmic laziness" (to quote Bertrand Russell): why perform this huge iteration operation over a network where far more than 90% of the link structure remains the same anyway? The old starting values make a very robust basis by now. If Google sticks to one, two or three iterations, and additionally pays some attention to devaluing certain classes of links and accounting for new 'nofollow' tags, this would absolutely suffice. Such an operation could be performed every few days or hours without any of us even noticing, and showing the new values every twelve weeks would still amount to 50 or more iterations.
All this is highly speculative, but I strongly believe that pagerank is no longer calculated the way it used to be.
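A toy sketch of that speculation: rerun the PageRank power iteration starting from the previous update's values instead of from scratch. The graph, damping factor and iteration counts here are illustrative assumptions only, not anything Google has published beyond the original formula.

```python
# Toy warm-started PageRank. Graph and iteration counts are made up.

def pagerank_step(pr, links, d=0.85):
    """One pass of the classic formula PR(A) = (1-d) + d * sum(PR(T)/C(T))."""
    return {page: (1 - d) + d * sum(pr[src] / len(links[src])
                                    for src in links if page in links[src])
            for page in pr}

def pagerank(links, start=None, iterations=50):
    """Iterate from `start` (e.g. the last update's values) or from all-ones."""
    pr = dict(start) if start else {page: 1.0 for page in links}
    for _ in range(iterations):
        pr = pagerank_step(pr, links)
    return pr

# A tiny three-page web: A -> B, B -> A and C, C -> A
links = {"A": ["B"], "B": ["A", "C"], "C": ["A"]}

full = pagerank(links, iterations=50)             # the "big" computation
warm = pagerank(links, start=full, iterations=2)  # two cheap refresh passes
```

Once the big run has converged, a couple of warm-started passes barely move the numbers, which is the point: on a mostly unchanged graph, a handful of iterations from the old values is nearly free and nearly invisible.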
But I don't agree with those who say PR doesn't matter. I suspect (although obviously cannot prove) that it is used in the algo which decides whether to place a page in the supplementary index. And that hits traffic, especially the long tail.
There is also the possibility that once a page is in the supplementary index, the PR passed by that page to other pages on the site may be devalued.
Alas. I'll have to retool till March/April.
[edited by: nhansen at 3:17 pm (utc) on Oct. 29, 2007]
Heh, I had a parked domain that has just been sitting unused, pointing to a "coming soon" type page, but for some reason it had been showing PR5 for the past year. That one got corrected to a PR1. Can't say that was unexpected. In fact, I can't imagine why it even has a PR1.
I was kind of expecting that when the next PR update came out it would still be the same PR5, as I don't think I have enough links yet to edge up to PR6. I checked today and it has in fact dropped to PR4. I am almost up to my main competitor, who has PR6 for my main search term, so I'm not too worried!
Does anyone else think we may be seeing some weird effects of the dmoz.org switching to www.dmoz.org a few months back?
I do believe that there's been some fundamental change in the way PR is calculated. Conspiracy theorists would jump to the argument that TBPR is a good FUD tool, and that sneaking in new ways of PR allocation and distribution serves Google's interests by frustrating SEOs. However, some of the theory from the original paper may still be in place if a lot of sites are dropping a little bit.
What I do find is that some pages with no incoming links except from a PR1 page are showing PR3-4 (I obviously don't rely on Google for IBL info). And a hundred odd links from PR0 pages - low value pages that will still be PR0 at the next and subsequent update - seem to be resulting in PR4s and 5s. Which, based on the original PR concept, shouldn't be happening.
This is worth keeping an eye on.
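As a back-of-envelope check of that observation, here is the published formula applied to a page whose only inlink comes from a low-PR page. The damping factor and link count are the usual textbook assumptions, and these are raw scores, not toolbar numbers (toolbar PR is widely believed to be log-scaled), so this is only directional:

```python
# Illustrative arithmetic only: d and the outlink count are assumptions.

def pr_from_single_link(pr_source, outlinks_on_source, d=0.85):
    """PR(A) = (1-d) + d * PR(T)/C(T) for a page with exactly one inlink."""
    return (1 - d) + d * pr_source / outlinks_on_source

# The only inlink comes from a page with raw PR 1.0 carrying 10 outlinks:
low = pr_from_single_link(pr_source=1.0, outlinks_on_source=10)
print(round(low, 3))  # 0.235 -- barely above the no-inlink floor of 0.15
```

Under the original formula a single weak inlink can never produce a mid-range score, which is why the observation above looks inconsistent with the 1998 concept.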
Such an operation could be performed every few days or hours without anyone of us even noticing, and showing the new values every twelve weeks would also amount to 50 or more iterations.
Google did indeed change the way they perform the PR calculation to something that is continual rather than dependent on an occasional but massive computation. I'm pretty sure this happened in 2003, a few months before the infamous Florida update. At that time, we called the new bouncing around in the SERPs "everflux". And in order to do this continual PR calculation, Google did change something about their computational approach.
In math, some iterative functions can be expressed in a more "elegant" fashion that is still identical in its result - and other functions cannot, or at least a second mathematically identical function has not yet been discovered. In many of those second cases, a calculation that at least approximates a full iterative calculation can still be created, but it is subtly "off" from the full intent of the original.
The question I've never heard answered, or even discussed, is whether this new approach is mathematically equivalent to the original PR formula or not. If it is not an exact equivalent, then Google would need to run a "full cycle of iterations" from time to time in order to fix any small errors that the approximation was folding in. Either that, or they may even have gone to a slightly modified PageRank formula!
I have no certainty which is the case today with Google's PR calculation - or even if it is the same formula that they originally published. There are some occasional hints I've picked up from here and there - for example, Sergey Brin speaking at Berkeley in 2005 (YouTube video [youtube.com]) discusses developing PageRank with Larry Page. About 8 minutes in he says "We use a similar algorithm today."
So my best guess is that PageRank's mathematical definition has changed in subtle ways - and that subtle shift allows for continual calculation. Google certainly has the computing power today to double check and correct PR with a fully iterative cycle once in a while - and somewhere or other I thought that was mentioned way back when.
But Google's mathematicians are very clever and may well have a simplified method in their arsenal of tools - one that does not require a major "verification cycle" every so often.
And what does this mean for us, practically? I can't come up with anything.
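One way to make that open question concrete is a toy experiment (assumed three-page graph, d = 0.85, all values invented): change one link, then compare a full recomputation against a cheap refresh that runs only a few iterations from the old values.

```python
# Toy comparison of full vs. incremental PageRank after a small graph change.
# Everything here (graph, damping, iteration counts) is an assumption.

D = 0.85

def iterate(pr, links, n):
    """Run n passes of PR(A) = (1-D) + D * sum(PR(T)/C(T))."""
    for _ in range(n):
        pr = {p: (1 - D) + D * sum(pr[s] / len(links[s])
                                   for s in links if p in links[s])
              for p in pr}
    return pr

links = {"A": ["B"], "B": ["A", "C"], "C": ["A"]}
old = iterate({p: 1.0 for p in links}, links, 100)    # pre-change fixed point

links["C"] = ["B"]                                     # one link changes
exact = iterate({p: 1.0 for p in links}, links, 100)   # full recomputation
quick = iterate(old, links, 3)                         # 3 passes from old values

drift = max(abs(exact[p] - quick[p]) for p in exact)   # small but nonzero
```

The cheap refresh lands close to the exact answer but not on it, so an approximation like this accumulates error and would need an occasional full cycle to flush it out - which is exactly the "verification cycle" question.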
some pages with no incoming links except from a PR1 page are showing PR3-4
Matt Cutts has often mentioned things like "...there are steps that we take to try to help those Mom/Pop sites as well." (reference: mattcutts.com [mattcutts.com]). Could it be done in tweaks to PR? I've seen small sites I helped friends create get PR4 and PR5 pages with almost no backlinks to speak of.
[edited by: tedster at 7:57 pm (utc) on Oct. 29, 2007]
And what does this mean for us, practically? I can't come up with anything.
ted, smart as always... but maybe you are too smart:
in that huge operation of running an email service, maps, video hosting, press-release pumping and other stuff that the stumbling monster Google has become, I do believe that PR is calculated like it always was, just with slight changes and with significantly more hardware.
Don't get me wrong, but US-based companies do not normally invent in large steps - they get the little important steps right.
This especially includes marketing and visualization.
With that thought in the back of my mind, I think the current PR update was just a bottom-up recalculation with the current filters, manually added filters, and some slight changes they have implemented in the last 12 months.
I also believe that they have 1-3 external factors they can attach manually to specific domains or URLs.
So, the original formula:
PR(A) = (1-d) + d (PR(T1)/C(T1) + … + PR(Tn)/C(Tn))
was just enhanced with some fairy dust
PR(A) = the usual stuff * (fairy dust)
where I consider the fairy dust to be an extra sum of filters like manual factors and spam reports!
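Taken literally, that guess could be sketched like this. To be clear, the multiplier and its inputs (manual factors, spam reports, the 20% penalty per report) are invented for illustration; only the classic PR(A) term comes from the published formula.

```python
# Sketch of "PR(A) = the usual stuff * (fairy dust)". All filter values
# are hypothetical.

def classic_pr(inlinks, d=0.85):
    """PR(A) = (1-d) + d * (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn)),
    where inlinks is a list of (PR(Ti), C(Ti)) pairs."""
    return (1 - d) + d * sum(pr_t / c_t for pr_t, c_t in inlinks)

def fairy_dust(manual_factor=1.0, spam_reports=0):
    """Hypothetical multiplicative filter: each upheld spam report shaves 20%."""
    return manual_factor * (0.8 ** spam_reports)

def adjusted_pr(inlinks, manual_factor=1.0, spam_reports=0):
    return classic_pr(inlinks) * fairy_dust(manual_factor, spam_reports)

clean = adjusted_pr([(4.0, 20), (2.0, 10)])                    # no filters
flagged = adjusted_pr([(4.0, 20), (2.0, 10)], spam_reports=2)  # two reports
```

The appeal of a multiplicative filter is that it leaves the link-graph calculation untouched: the manual knobs can be applied per domain or URL after the fact, which fits the "1-3 external factors" idea above.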
The three sites that I have been watching who took a big PR drop last week now have their PR restored also.
It was a test/glitch. Google has had a glitch with its Adsense programming for over a week. It's trying to do too much at once. Got some weird managers running the place over there (read: trying to)!
Anyway .. no change for me on the 5 and 6 PR
3 new sites went from 0 and 1 to two PR3s and one PR5 on the homepage... the new PR5 went there from 0 and has been up since August... amazing... but it's a subdomain of the PR6 site.
The 0-to-3 and the 1-to-3 are ten-year-old domains that have been filled with content since May.
The direct link from the PR6 homepage seems to have only been picked up at full strength on the subdomain.
The PR3s have IBLs; the PR5 has only the link from the PR6 site, since it's a branch-out website. Also no 301s on that, and no content that had been on the www site.
Interestingly, I had added the phrase "Advertise Here" to 4 of my sites as a test about 2 weeks ago.
2 of those 4 sites have taken the most serious hit on traffic (around 70%). Those 2 sites are currently showing a cached version of the site at Google PRIOR to Oct 24... meaning that the "Advertise Here" was detected prior to the algorithm change. The other 2 sites that said "Advertise Here" but did not get hit as seriously are showing a cached version of the site SINCE Oct 24... so I am fairly certain that the presence of this wording is what sent these sites crashing down.
For what it is worth, I do not sell outbound PR. (There is none to sell.) The links that I am selling are listings on a directory, and those links have always been redirected internally for statistical purposes before being transferred to the customer site. Does anyone know if an internal redirect like this still requires a "nofollow"?
I also would like to know if Google is able to determine that a cluster of sites are "related" (same IP, same owner, same topic) and therefore not punish them for linking to each other. E.g. my directory sites are all hosted on the same IP and link to each other... is it possible that Google believes that I am selling PR to myself, when in reality all those links are to affiliated directories on the same general topic, but covering different geographical areas? Do I need to put "nofollow" on these links too?
I have removed the "Advertise Here" wording, but I wonder if my sites now have a "scarlet letter" that will take months to recover from.
Annoyingly, I have recently been complaining to Google about a site that is very high in the SERPs and is obviously buying/trading links from a link farm in an unrelated field. The site in question was not affected by the algorithm change and is clearly "cheating," while I believe that I am clearly not cheating, and I am being hit pretty hard.
Undoubtedly many fortunes are currently being made from Adsense but imho anyone who believes that it cannot all be taken away in an instant is fooling themselves.... ultimately you have NO CONTROL over what Google is paying you.... they could reduce the commission by 75% overnight and what recourse would you have?
I have a large number of directory sites that I run adsense on. I have seen a serious hit of about 30% to the traffic and revenue since Thursday... so for those who claim that this is only a "Green Bar" update, I would beg to differ.
There have been three things going on in close succession, and the one we're talking about here is the Toolbar PageRank update, probably reflecting ranking conditions of several weeks ago.
A couple of days prior to that, there was a highly publicized "manual" Toolbar PR update, also probably reflecting conditions of several weeks back, affecting sites that had been selling links.
And, close on the heels of the second TBPR update have been many ranking changes, still ongoing. I'm beginning to see some of those now in areas I watch, with some top results resembling those of about a month ago... others very different. Too early to say anything, and that's for the SERPs thread.
Regarding the Toolbar PR update, I did notice an interesting glitch. For about ten minutes this evening, the home page of a client site had grayed-out Toolbar PR. It came back sooner than I'd expected, but I did expect it to come back. With a data shakeup as big as this one, I fully expect to see many more such anomalies before all this is over.
I also would like to know if Google is able to determine that a cluster of sites are "related" (same IP, same owner, same topic) and therefore not punish them for linking to each other
About 1-2 weeks ago I moved the site from one physical server to another and had to move to an IP in a completely different subnet too, i.e. every single digit changed. The site took an immediate 20% hit in unique users, and another site that moved too went down 50%. Midweek. I knew it would be a bad move to change, but the old server was, well, old and I just had to. Now I have all but one IP address left that isn't in the same 255.255.0.0 subnet... Links from other servers to the site were all nofollow, so I am pretty sure the immediate drop was because of the IP change. The new server is faster. Maybe there is a penalty for using openSUSE, which was at the time the only Linux I could install in the time frame I had... ;)
I also changed my biggest site from round-robin to a single IP; this doesn't seem to have had a negative effect, yet. Yesterday I changed to follow as a test in the same subnet; traffic is up on this one and down on the linked-to server, but this might be the new update.
There is an extremely competitive one-word term (56 million+ page results) that I follow:
Position  PR  Misc
 1        5   Official site
 2        5   Authority site
 3        3   Sub-page of authority site
 4        6   Wikipedia citation
 5        0   Official site PDF file
 6        5   Authority .gov site
 7        4   Authority site (3 backlinks)
 8        4   About.com citation
 9        4   Authority site
10        4   no comment
5, 5, 3, 6, 0, 5, then four 4s - so what?
If a PR4 gets you positions 7, 8, 9 & 10 out of 50,000,000 pages, who cares what PageRank is?
Pagerank has something to do with it but there is much more to making money off of serps.
What do you think?
I would say yes: the redirected links are followed, and if they are followed and indexed then yes, they do pass PR and are considered outbound links. That would give you hundreds of one-way outbound links and kill any PR you did have.
We use them on our site as well, but they go to internal pages of customers we build profile pages for and take applications from.
These redirected URLs are in the Google, Yahoo, and MSN indexes with the title of the page the redirect goes to...
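The "count, then redirect" pattern being discussed can be sketched as a minimal WSGI app. The path, parameter name, and click log here are invented. The point is that a plain 301/302 like this is still followed by crawlers, which is why the redirect URLs get indexed under the target page's title; the usual belt-and-braces fix is rel="nofollow" on the directory's anchors plus disallowing the redirect path in robots.txt.

```python
# Hypothetical redirect-and-count handler. Nothing here blocks crawlers
# by itself -- that has to happen in the anchor (nofollow) or robots.txt.
from urllib.parse import parse_qs

click_log = []  # stand-in for the "statistical purposes" database

def go(environ, start_response):
    """WSGI app: record the click, then 302 to the customer site."""
    params = parse_qs(environ.get("QUERY_STRING", ""))
    target = params.get("url", ["/"])[0]
    click_log.append(target)  # log for stats before handing off
    start_response("302 Found", [("Location", target)])
    return [b""]
```

A crawler hitting this URL simply follows the Location header, so from the engine's point of view the redirect URL and the target are effectively the same document.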