for each document: a frequency of visit value based on a number of times the document was visited during a time period; and a unique visit value based on a number of unique visitors to the document; determining, at the server, for each document, a usage score from the frequency of visit value and the unique visit value associated with the document; and determining, at the server, an organization for the documents based on the usage scores for the documents.
wherein the plurality of documents include at least one document visited by multiple distinct counted visitors during a time period;
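The claim above boils down to two per-document inputs (a frequency-of-visit value and a unique-visit value) combined into a usage score that drives the ordering. A minimal sketch, assuming a simple weighted sum (the weights, names, and combining function are my own illustration, not taken from the patent):

```python
# Hypothetical sketch of the claim's usage score. The weighted-sum combiner
# and the 0.5/0.5 weights are assumptions for illustration only.

def usage_score(visit_frequency, unique_visitors, freq_weight=0.5, unique_weight=0.5):
    """Combine a frequency-of-visit value and a unique-visit value
    into a single usage score (simple weighted sum for illustration)."""
    return freq_weight * visit_frequency + unique_weight * unique_visitors

# Determine an organization for the documents based on their usage scores.
docs = {
    "page-a": usage_score(visit_frequency=40, unique_visitors=12),
    "page-b": usage_score(visit_frequency=30, unique_visitors=25),
    "page-c": usage_score(visit_frequency=4, unique_visitors=4),
}
ranking = sorted(docs, key=docs.get, reverse=True)
print(ranking)  # page-b edges out page-a on unique visitors
```

Note how a page with fewer raw visits but more unique visitors can rank higher, which is the point of counting distinct visitors separately.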
Proven performing pages are going to do well while new pages, or pages with little usage history, will struggle at least for a while until they gain some history.
I think you are saying the opposite of what Bill actually explained in the comments to that question. Usage stats of the kind described in the patent (frequency of visits and unique visitors) will help newer pages on newer topics compete on a level footing with established sites.
This is why Google and the Google experts have said confidently there will be no quick fixes. The "experts" said recovering from Panda would not be easy, and sure enough it didn't happen for a long time. During the first month everyone was full of optimism :). Google kept saying don't try any one thing, focus on users... blah blah...
There can only be two reasons for that. Or links, back when they were the only solution. It's a possibility, that's all I'm saying.
1) Page history 2) usage stats.
Under a usage information-based ranking approach, the pages might be ranked differently. Just re-read this. Doesn't this mean we can buy traffic to rank higher? Even if traffic is only a partial boost, as long as it's used you can buy your way out of this. And then you have a death-spiral scenario. Does anyone have a link on where to buy traffic from? ;)
Looking just at a raw visit frequency, the pages might be organized into the following order: first page (40 visits), second page (30 visits), and third page (4 visits).
If those raw visit frequency numbers are refined to filter out automated agents and to assign double weight to visits from Germany, the order of the pages might change to: second page (effectively 40 visits, since the 10 from Germany count double), first page (effectively 25 visits after filtering out the 15 visits from automated agents), and the third page (effectively 4 visits).
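The refinement above can be sketched directly. The visit records and weighting rules below are made-up assumptions chosen to match the example numbers (15 bot visits on the first page, 10 German visits on the second):

```python
# Illustrative sketch of the refinement described above: drop automated
# agents, double-count visits from Germany. All data here is invented
# to reproduce the example's numbers.

def effective_visits(visits):
    """Filter out automated agents and assign double weight to German visits."""
    total = 0
    for v in visits:
        if v["is_bot"]:
            continue  # automated agents are dropped entirely
        total += 2 if v["country"] == "DE" else 1
    return total

first  = [{"is_bot": True,  "country": "US"}] * 15 + [{"is_bot": False, "country": "US"}] * 25
second = [{"is_bot": False, "country": "DE"}] * 10 + [{"is_bot": False, "country": "US"}] * 20
third  = [{"is_bot": False, "country": "US"}] * 4

print(effective_visits(first), effective_visits(second), effective_visits(third))
# prints: 25 40 4 -> the second page now outranks the first
```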
The usage data might be combined with either or both the IR scores and the link scores.
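The patent leaves the combination open. One plausible reading is a linear blend; the function and weights below are assumptions, not the patent's method:

```python
# A hypothetical linear blend of IR, link, and usage scores. The weights
# are arbitrary illustrations; the patent does not specify a formula.

def blended_score(ir_score, link_score, usage_score,
                  w_ir=0.4, w_link=0.3, w_usage=0.3):
    """Combine usage data with either or both of the IR and link scores."""
    return w_ir * ir_score + w_link * link_score + w_usage * usage_score

print(blended_score(ir_score=0.8, link_score=0.5, usage_score=0.9))
```

Setting `w_link=0` (or `w_ir=0`) models the "either" case where usage is combined with only one of the two existing scores.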
Londrum, yes, I think that's what the random stuff appearing on page 1 is all about... mixing things up. Have you noticed the SERPs are a lot less static than they usually are? That's what we're seeing, anyway. Google has done this before (maybe when college students got out of school, so they could randomly test SERPs?), but I think it's going to be the modus operandi for a while if Panda indeed draws from this patent. Devil's advocate here: isn't this useless for frequently changing pages? It takes loads of time to get enough traffic to test a page, and by the time you're done it may have changed. Remember, Google has to 'see' the traffic too. And how does showing junk scrapers on top so frequently help Google, since this is apparently an ongoing evaluation?
Seems to me like a great signal to use. So if you're getting links from sites with no traffic, they're not worth anything
But I do think that if and when Google implemented/implements this patent, it'll mean an end to SEO as we know it.
I wonder if Google is somehow using these usage stats to regulate the value of outbound links. Seems to me like a great signal to use. So if you're getting links from sites with no traffic, they're not worth anything. Sometimes the most authoritative books/papers are almost never read, relatively speaking.
Nah, just that the rules are changing (as always), and we'll use different tools. Like spending more time pursuing links that send traffic as well as PR/link juice. And making sites stickier. And making the outgoing links more enticing to click, so the visitor doesn't hit the "back" button.
We’re also told that instead of maintaining this kind of user data for individual pages, it might be done on a site-by-site basis, with the site usage information associated with some or all of the pages on that site.
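That site-by-site variant would just aggregate per-page usage under each site and attach the total back to every page on that site. A minimal sketch, assuming hostname identifies the site (all URLs and numbers here are invented):

```python
# Sketch of the site-level variant: sum per-page usage by hostname and
# let each page inherit its site's aggregate value. Data is hypothetical.
from collections import defaultdict
from urllib.parse import urlparse

page_usage = {
    "https://example.com/a": 40,
    "https://example.com/b": 30,
    "https://other.net/x": 4,
}

site_usage = defaultdict(int)
for url, visits in page_usage.items():
    site_usage[urlparse(url).netloc] += visits

# Associate the site usage information with each page on that site.
page_site_usage = {url: site_usage[urlparse(url).netloc] for url in page_usage}
print(page_site_usage)
```

Under this scheme a thin new page on a heavily used site starts with the site's usage value rather than zero, which connects back to the earlier point about new pages lacking history.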