| 11:32 pm on Jan 9, 2004 (gmt 0)|
They wouldn't use alexa - Google has their own toolbar.
Site traffic level via toolbar measurement is a theory I've been putting forth for almost a year. It fully, completely, and unconditionally explains Florida.
| 12:04 am on Jan 10, 2004 (gmt 0)|
|It fully, completely, and unconditionally explains Florida|
Brett, I remember you posted something to this effect some time ago. Are you suggesting that Google is collecting data from toolbar users to establish ranking criteria?
What exactly do you mean?
| 1:05 am on Jan 10, 2004 (gmt 0)|
Brett, thanks for your reply. Does Google make its Traffic Rank information available?
Even if it doesn't, both Google's toolbar and Alexa are tracking a theoretically random subset of users to extrapolate a ranking, so one would expect roughly parallel results.
So the question is, does this high-traffic weighting theory, which can be analyzed using Alexa stats (as an approximation of Google's traffic rank), hold true for a wide range of post-Florida SERPS?
| 2:37 am on Jan 10, 2004 (gmt 0)|
No, this theory does not hold up. I have data from hundreds of thousands of pages which does not support it.
| 3:05 am on Jan 10, 2004 (gmt 0)|
quotations, which theory?
| 3:08 am on Jan 10, 2004 (gmt 0)|
The idea that traffic rank is a major factor in ranking post-Florida.
| 3:13 am on Jan 10, 2004 (gmt 0)|
Huh? Who suggested that theory? I was talking about the one that suggests Google is not the true owner of the PageRank algo (Stanford is) and has had to drop it ready for the IPO.
[edited by: antrat at 5:50 am (utc) on Jan. 10, 2004]
| 4:13 am on Jan 10, 2004 (gmt 0)|
I work in an industry where traffic volumes are public knowledge and run to 500,000+/day, and I can assure you that if traffic were a factor it would be noticed; it isn't.
The sites dominating are older ones, plus some new ones that follow a strict regime... it seems like older indexed sites can get away with anything :P
| 4:48 am on Jan 10, 2004 (gmt 0)|
There was one?
I absolutely believe that Google is using an Alexa-like traffic ranking obtained via their tool bar as a factor in the SERPS.
This would explain, or at least be consistent with, everything that I am seeing in the SERPS.
If this is the case, I am sure that Google will not disclose a site's traffic ranking.
Does anyone have a sense for how many active Alexa toolbars there are versus Google toolbars?
| 12:51 pm on Jan 10, 2004 (gmt 0)|
Google has over 3 billion pages to rate and rank. How in the world do they have the hardware capacity and capability to apply all the extremely complicated scenarios and algorithm mutations that some very smart folk on these forums think they are using?
| 12:58 pm on Jan 10, 2004 (gmt 0)|
A 10,000 box Linux cluster apparently...
|More Traffic Please|
| 3:18 pm on Jan 10, 2004 (gmt 0)|
If Google was using traffic stats from their toolbar as a variable in their algo, it would go a long way in explaining the trend of pages from large sites, directories, etc. dominating many industry SERP's.
The question I have is why Google would use such info as a variable in its algo if its main goal is to produce relevant SERP's. I don't see a strong correlation between relevance and traffic. The other problem I have with this theory is that many of the inner pages of large sites that show up are very obscure and buried deep down. I would guess that many of these pages receive very little traffic, yet they rank well just by being associated with a dominant site.
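One way to picture the "obscure inner page" effect described above is a site-level boost added to every page's relevance score. This is a pure toy model — the function, weight, and numbers are my own invention, not anything Google has disclosed:

```python
import math

def page_score(relevance, site_traffic_rank, weight=1.0):
    """Hypothetical blend: a page inherits a boost from its whole
    site's traffic rank (lower rank number = more traffic), so even
    an obscure inner page on a big site gets lifted."""
    boost = 1.0 / math.log(site_traffic_rank + 1)
    return relevance + weight * boost
```

Under this made-up blend, a slightly less relevant inner page on a rank-50 site (`page_score(0.4, 50)`) outscores a more relevant page on a rank-500,000 site (`page_score(0.5, 500000)`) — which is exactly the pattern the post complains about.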
| 5:55 pm on Jan 10, 2004 (gmt 0)|
I've wondered for a long time whether Google gives a new site a boost in the rankings to monitor click-through via the toolbar. If users visit and do not return to the SERPs, that is a small but possible indication the site was correctly ranked for the search term. If no one clicks a ranked site, or there is evidence the SERPs are revisited immediately, then that may indicate an inappropriate relevancy/ranking.
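A back-of-envelope version of that "revisited immediately" signal — purely hypothetical; the 30-second threshold and data shape are my own assumptions, not anything known about the toolbar — might look like:

```python
def pogo_stick_rate(visits):
    """Estimate how often searchers bounce straight back to the SERP.

    `visits` is a list of (click_time, return_time) pairs in seconds,
    with return_time None if the user never came back to the results
    page -- as might be inferred from toolbar navigation logs. A
    return within 30 seconds is treated as a 'pogo-stick' (the result
    probably looked wrong to the searcher)."""
    if not visits:
        return 0.0
    quick = sum(1 for clicked, returned in visits
                if returned is not None and returned - clicked < 30)
    return quick / len(visits)
```

A high rate across many searchers would be the "inappropriate relevancy" hint the post describes; a low rate would suggest the ranking was right.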
However, with some irrelevant site listings that remain week after week, I can still see no evidence of alexa data being used.
| 7:29 pm on Jan 10, 2004 (gmt 0)|
The conceptual problem with traffic being a significant part of the algo is that it creates a feedback loop: sites that are currently high-traffic will continue to be high-traffic, whilst sites that are currently low-traffic will find it difficult to rank highly even if they develop great content, and hence will remain low-traffic sites.
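That feedback loop is easy to simulate. In this toy model (every number here is invented for illustration), the current ranking fixes each site's traffic share via position bias, the score blends traffic share with content quality, and the incumbent keeps rank 1 indefinitely despite much worse content:

```python
CLICK_SHARE = [0.8, 0.2]  # position bias: rank 1 captures most clicks

def next_ranking(ranking, quality, traffic_weight=0.7):
    """One round of the loop: current rank fixes traffic share,
    score blends traffic share with quality, sites are re-ranked."""
    share = {site: CLICK_SHARE[pos] for pos, site in enumerate(ranking)}
    score = {site: traffic_weight * share[site]
                   + (1 - traffic_weight) * quality[site]
             for site in ranking}
    return sorted(ranking, key=lambda s: score[s], reverse=True)

quality = {"incumbent": 0.3, "newcomer": 0.9}
ranking = ["incumbent", "newcomer"]
for _ in range(20):
    ranking = next_ranking(ranking, quality)
# The incumbent holds rank 1 every round: its borrowed traffic
# outweighs the newcomer's better content.
```

With a heavy enough traffic weight, no achievable quality gap can dislodge the site that started on top — which is exactly the objection raised above.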
| 10:05 pm on Jan 10, 2004 (gmt 0)|
This would explain the boost that cross-linked subdomains seem to get: since traffic is (I presume) calculated across all subdomains, it pushes their combined traffic figure far higher than if they were separated into appropriate domains...
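A quick sketch of the pooling effect being described — the hostnames and visit counts are invented, and real toolbar data isn't public. If traffic is tallied per registered domain rather than per host, cross-linked subdomains combine their visits into one figure:

```python
from collections import Counter

def traffic_by_host(log):
    """Tally visits per full hostname."""
    return Counter(log)

def traffic_by_domain(log):
    """Tally visits per registered domain, so every subdomain of
    example.com feeds one combined figure. (Naive split: keeps only
    the last two labels, which is wrong for .co.uk and the like.)"""
    return Counter(".".join(host.split(".")[-2:]) for host in log)

log = (["forum.example.com"] * 300 + ["shop.example.com"] * 500
       + ["www.rival.com"] * 600)
# Per host the rival leads (600 vs 500); pooled by domain,
# example.com jumps to 800 and overtakes it.
```

Splitting those subdomains into separate domains would drop each back below the rival — the "quantum factor" the post is pointing at.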