Google's Patent on Backlinks - many interesting clues from 2007
tedster




msg:3963007
 7:45 am on Jul 31, 2009 (gmt 0)

I read Google patents often - not because Google uses everything they put into a patent. They don't, and especially not at first. But the patents offer clues about what MIGHT be coming down the road. And a couple of years later, they sometimes give us nice clues about ranking puzzles that have begun to surface.

With the current update apparently doing "something different" with backlinks, I went back for another reading of the 2007 patent application Document Scoring Based On Link-Based Criteria [appft1.uspto.gov] and I found a bunch of interesting points. It's not necessarily the clue to this update, but more along the lines of explaining some other observations.

If you read the entire patent, you don't end up with a clear-cut list. Instead you get "sometimes this way, sometimes that way, and it all depends". Most interesting to me was the way some back link factors can work differently (and even work the exact opposite way), depending on the query terms.

The patent mentions some of the standard factors we already talk about - trust and authority of linking sites, spikes in back link growth, spikes of similar anchor text, and so on. But a closer reading brought me some other goodies. Here is my paraphrase for some of the paragraphs I found interesting:

PAGE SEGMENTATION and RATES OF CHANGE
[0051] Here Google defines a factor called UA [update amount], and it can be weighted differently for different segments of the page. Not only is the backlink juice itself weighted differently - whether it changes is also given a different weight, depending on where the link appears on the page.
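Read that way, the update amount could be something like a per-segment weighted sum of how much each region of the page changed. Here is a rough sketch of that idea in Python - the segment names, the weights and the function itself are my own assumptions for illustration, not anything spelled out in the patent:

# Sketch of a per-segment "update amount" (UA) as hinted at in [0051].
# Segment names and weights are illustrative assumptions only.
SEGMENT_WEIGHTS = {
    "main_content": 1.0,   # changes here assumed to matter most
    "sidebar":      0.3,
    "footer":       0.1,   # boilerplate churn assumed to count for little
}

def update_amount(changed_fraction_by_segment):
    """Weighted sum of how much each page segment changed (0.0 to 1.0)."""
    return sum(
        SEGMENT_WEIGHTS.get(segment, 0.5) * changed
        for segment, changed in changed_fraction_by_segment.items()
    )

# A big edit to the main content outweighs heavy footer churn.
print(round(update_amount({"main_content": 0.6, "footer": 0.9}), 2))   # 0.69
print(round(update_amount({"sidebar": 0.9, "footer": 0.9}), 2))        # 0.36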

PAGE CHANGES CAN IMPROVE OR LOWER RANKINGS
...it all depends on the query terms!

[0052] Pages that show an increasing rate of change might be scored higher than pages for which there is a steady rate of change.

Now contrast that paragraph with this:

[0055] For some queries, content that has not recently changed may be a better result. So ranking factors can work one way for one search term, and the opposite way for another.
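If you wanted to express that contrast as code, it amounts to a freshness-type factor whose direction flips with the query class. A tiny sketch - the query classification and the weights are entirely my assumption:

# Sketch of the "same factor, opposite direction" idea in [0052] vs [0055].
# The query classes and weights are assumptions, not from the patent.
def freshness_boost(rate_of_change, query_is_newsy):
    """Fast-changing pages get a lift on newsy queries and a penalty
    on queries where stable, unchanged content is the better result."""
    weight = 1.0 if query_is_newsy else -0.5
    return weight * rate_of_change

print(round(freshness_boost(0.8, query_is_newsy=True), 2))    #  0.8 (helps)
print(round(freshness_boost(0.8, query_is_newsy=False), 2))   # -0.4 (hurts)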

PARTIAL INDEXING OF PAGES
[0053] This paragraph deserves some exact quotes:

In some situations, data storage resources may be insufficient...search engine may store "signatures" of documents instead of the (entire) documents themselves to detect changes to document content. In this case, search engine may store a term vector for a document (or page) and monitor it for relatively large changes. According to another implementation, search engine may store and monitor a relatively small portion of the document.

And so we hear "why can't I find my page for an exact phrase search?" We also have a hint that Google may not always have enough storage for everything.
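For the curious, here is a minimal sketch of what storing a term-vector "signature" and watching it for relatively large changes could look like. The cosine-similarity comparison and the 0.8 threshold are my own assumptions, not anything the patent specifies:

# Sketch of storing a term-vector "signature" per document and flagging
# relatively large changes, per [0053]. Threshold value is an assumption.
import math
from collections import Counter

def term_vector(text):
    return Counter(text.lower().split())

def cosine_similarity(a, b):
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def changed_significantly(old_signature, new_text, threshold=0.8):
    """True if the page has drifted far enough from its stored signature."""
    return cosine_similarity(old_signature, term_vector(new_text)) < threshold

stored = term_vector("blue widgets for sale, free shipping on blue widgets")
print(changed_significantly(stored, "blue widgets for sale, free shipping"))        # False
print(changed_significantly(stored, "our company history and mission statement"))   # True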

RANKING FOR SEVERAL SEARCHES
[0063] How often a page appears for different searches can help boost rankings across the board. So maybe optimizing a single page for several different terms makes some kind of sense, eh?
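As a back-of-the-envelope illustration, [0063] could be as simple as a boost that grows, with diminishing returns, with the number of distinct queries a page already ranks for. The log dampening and the numbers here are purely my assumption:

# Sketch of [0063]: a page that surfaces for many distinct queries gets
# a modest across-the-board boost. The log dampening is an assumption.
import math

def query_diversity_boost(distinct_queries_ranked_for):
    return 1.0 + 0.1 * math.log1p(distinct_queries_ranked_for)

print(round(query_diversity_boost(1), 3))    # 1.069
print(round(query_diversity_boost(50), 3))   # 1.393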

RANKING CEILINGS, TRAFFIC THROTTLING and the YO-YO EFFECT
These two paragraphs deserve to get bumped together:

[0075] A spike in BACKLINKS can mean two things - a suddenly hot topic, or an attempt to spam.
[0102] A spike in RANKING can also mean two things - a hot topic or spam.

Now here's where it gets interesting: According to [0102], Google may allow a ranking to grow only at a certain rate, or apply a certain maximum threshold of growth for a defined period of time. This might well account for the pain of "I've hit the ceiling" that we sometimes feel.
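Mechanically, a rate-limited ranking is easy to picture. A minimal sketch, where the cap value and the update step are assumed numbers, not anything from the patent:

# Sketch of the [0102] idea of letting a ranking grow only at a capped rate.
# Cap size and update interval are assumptions for illustration.
def capped_score(previous_score, target_score, max_growth_per_update=0.05):
    """Move toward the score the page has 'earned', but never faster than the cap."""
    if target_score > previous_score + max_growth_per_update:
        return previous_score + max_growth_per_update   # hit the ceiling
    return target_score

score = 0.40
for day in range(5):
    score = capped_score(score, target_score=0.90)
    print(day, round(score, 2))   # creeps up in capped steps: 0.45, 0.5, 0.55, 0.6, 0.65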

Even beyond those painful ranking ceilings, I've seen analytics that show amazing Traffic Throttling [webmasterworld.com]. The daily traffic graph looks like a barber comes in at 2pm every day and gives a buzz cut. And in order to throttle traffic that effectively, the only way I can see is Yo-Yo Rankings [webmasterworld.com].

This patent suggests that if a site experiences an extreme throttling of its traffic, (or a yo-yo between page 1, page 5, page 1, etc) then the site probably had some suspiciously spiky growth in back links -- spikes that couldn't be explained by a Hot Topic suddenly popping up for the general public. And so, Google put the site on their traffic regulator.

That lines up exactly with the cases I've worked with. And members here first noticed the yo-yo (traffic throttling) in 2008 - more than a year AFTER this patent was filed.
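To see how yo-yo rankings could produce that 2pm buzz-cut shape, here is a toy simulation. The daily visit budget, the two positions and the click rates are all made-up numbers for illustration only:

# Sketch of how alternating SERP positions could throttle daily organic
# traffic to a target. All figures are assumed, not observed values.
DAILY_VISIT_BUDGET = 500          # assumed cap on organic clicks per day
VISITS_PER_HOUR_AT_P1 = 60        # page-1 position
VISITS_PER_HOUR_AT_P5 = 4         # buried position while "throttled"

def simulate_day():
    visits, hourly = 0, []
    for hour in range(24):
        position = 1 if visits < DAILY_VISIT_BUDGET else 5
        visits += VISITS_PER_HOUR_AT_P1 if position == 1 else VISITS_PER_HOUR_AT_P5
        hourly.append((hour, position, visits))
    return hourly

for hour, position, total in simulate_day():
    print(f"{hour:02d}:00  position {position}  running visits {total}")
# Traffic climbs steadily, then flatlines for the rest of the day: the "buzz cut" graph.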

 

Shaddows




msg:3964700
 3:09 pm on Aug 3, 2009 (gmt 0)

All in, that sounds like it's a folding methodology "yo-yo". As if Google doesn't know what combination of results would most satisfy the searcher, and is trying different resolutions. Presumably metrics such as "% requiring refinement", "no click, new search" and other criteria that us webbies wouldn't be able to record but that express user dissatisfaction would be used for this.
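Those whole-SERP signals are easy enough to compute from query logs, at least conceptually. A minimal sketch - the session fields and metric names here are my own assumptions, not anything Google has confirmed:

# Sketch of whole-SERP dissatisfaction signals: no-click rate and
# query-refinement rate. Field names are illustrative assumptions.
def serp_dissatisfaction(sessions):
    """sessions: list of dicts like {"clicked": bool, "refined_query": bool}."""
    total = len(sessions)
    no_click = sum(1 for s in sessions if not s["clicked"]) / total
    refined = sum(1 for s in sessions if s["refined_query"]) / total
    return {"no_click_rate": no_click, "refinement_rate": refined}

sample = [
    {"clicked": True,  "refined_query": False},
    {"clicked": False, "refined_query": True},
    {"clicked": False, "refined_query": True},
    {"clicked": True,  "refined_query": False},
]
print(serp_dissatisfaction(sample))   # {'no_click_rate': 0.5, 'refinement_rate': 0.5}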

If anything, I'm surprised by the 'stability' of your site on SERPs - it only occupies two positions. Indeed, most yo-yos reported are against otherwise stable results. Yours is against chaotic results, and sounds more like stable churn (forgive the oxymoron) than the standard yo-yo.

crobb305




msg:3964975
 10:13 pm on Aug 3, 2009 (gmt 0)

I am almost fearful to ask anyone for a link to my site these days. But recently, a friend offered me a link from his site that would appear in his blogroll (sitewide). His site is about 2000 pages, but geesh, I fear a new sitewide link will trigger some mechanism that will cause the whole algorithm to go bipolar on my site and put me into a yo-yo or worse.

Should we fear getting links? And do blogroll or sitewide links even count?
C

dibbern2




msg:3965138
 4:51 am on Aug 4, 2009 (gmt 0)

Exceptional post Tedster. Thanks much.

CainIV




msg:3965150
 5:28 am on Aug 4, 2009 (gmt 0)

Shadows / Tedster. Just throwing this out there :)

Could it be that the changes in taxonomy (semantics) that have been theorized recently in this update *might* lie at the heart of the yo-yo phenomenon? From what I remember, the yo-yo started in and around the Universal Search update, which in all of its grandeur is really about semantics and Google's ability to better match key phrases to user intention in search.

The idea of Google gathering various metrics for websites / pages, calculating a level of 'term trust' for phrases, and then attempting to provide the best possible website / media to fulfill that query is sound.

In that scenario it makes total sense that Google would test other web pages and media and gather more data in order to arrive at the best possible match for the user.

tedster




msg:3965152
 5:40 am on Aug 4, 2009 (gmt 0)

There probably is an ingredient in there, Cain. Universal Search is, above all, a way of integrating or folding information from different database partitions into one set of results. If a new semantic taxonomy is part of filtering the various partitions, that would kick up a storm.

Also it is a common yo-yo effect to see one SERP position go either to a Universal (blended) result or to the affected url - that is, they alternate for the spot.

When the infrastructure to force a certain position to be used for a certain "type" of result was first tuned up, we soon saw that infamous position 6 bug [webmasterworld.com] - Google let its underwear show for a minute!

randle




msg:3965338
 2:08 pm on Aug 4, 2009 (gmt 0)

As if Google doesn't know what combination of results would most satisfy the searcher, and is trying different resolutions. Presumably metrics such as "% requiring refinement", "no click, new search" and other criteria that us webbies wouldn't be able to record but that express user dissatisfaction would be used for this.

Are we talking about user behavior being blended into the ranking process, or am I misunderstanding?

Shaddows




msg:3965372
 2:42 pm on Aug 4, 2009 (gmt 0)

Are we talking about user behavior...

Yes, I was. But non-specific user behaviour- the type that cannot favour a particular website.

So, G displays a SERP made of a blend of particular TYPES of site, with the best candidate for each type filling the slot. If there is NO CLICK ON ANY RESULT, then the SERP could be considered unsatisfactory.

I'm not saying I KNOW it happens. But we certainly track user behavior on our site. I imagine you do too. Google has one of the most sophisticated number-crunching architectures on Earth (algo, hardware and statistician brain power) - I'd be confident in my bet that they track this kind of thing, and use it internally to improve their offering.

Edited for markup

CainIV




msg:3965520
 5:36 pm on Aug 4, 2009 (gmt 0)

User behavior, as well as a more granular way that Google calculates the meaning of a given phrase and what site(s) or media best fit that spot.

karkadan




msg:3966011
 10:55 am on Aug 5, 2009 (gmt 0)

This post has been more than insightful. Got a better understanding of updates. I'm on fire. Thanks.

night707




msg:3970772
 7:28 pm on Aug 12, 2009 (gmt 0)

tedster,

Very good post - compliments.

We know that Google is making money from the lads who sell traffic. How about using that to break a traffic cap, in case one of these operators can offer fairly decent visitors?

tedster




msg:3970793
 7:50 pm on Aug 12, 2009 (gmt 0)

Maybe I misunderstand your idea - but the cap is on traffic straight from Google organic search. How would a different traffic source play into the picture?

Got a better understanding of updates.

This understanding comes from studying the current Google infrastructure. We'll need to see how it holds up when the new "caffeine" infrastructure rolls out.

night707




msg:3970810
 8:16 pm on Aug 12, 2009 (gmt 0)

The new "caffeine" looks fine.

Imagine a site with a cap on AdSense revenues.

We know of sites that receive almost exactly the same sums over some 5-6 months, until revenue goes up or down in steps of hundreds, in very close connection to the organic Google traffic.

If you broke that scheme with a few thousand extra visitors, how would that affect a revenue cap?

For example, we have seen nice AdSense increases and one may think, yeah, upwards now. But before the end of the month there will be a decline in click values with the same traffic, balancing things out to match the sum from the previous month.

Or vice versa: if the first half was bad, you see higher revenues by the end of the month.

It seems like a very complicated cap system.

tedster




msg:3970861
 10:05 pm on Aug 12, 2009 (gmt 0)

This forum and this thread are not about [theoretical] Adsense caps - we're talking about ORGANIC search traffic throttling here, and Adsense topics (click values, etc) should be discussed in the Adsense forum [webmasterworld.com]. It's just not all the same ball of string!

So please, let's keep the focus on pure traffic volume from Google's organic results. That's already more than enough.

Whitey




msg:3971049
 8:01 am on Aug 13, 2009 (gmt 0)

I don't think on-page changes can trigger the yo-yo. I think it's always backlink related.

So the Yo-Yo may have some correlation to time-related triggers, and /or a traffic volume trigger ... hmmm - interesting and logical.

Does this say anything about Yo-Yo related penalty logic, i.e. why would Google want to create this behaviour on a penalised site?

I'm just wondering if there's any correlation in the logic or none.

[btw] one of the best quality threads I've read for a while - hats off to all involved.

Trax




msg:3971081
 10:05 am on Aug 13, 2009 (gmt 0)

@tedster... in your backlink factors, what is "Churn"?

Marcia




msg:3971345
 4:39 pm on Aug 13, 2009 (gmt 0)

[0063] How often a page appears for different searches can help boost rankings across the board.

So maybe optimizing a single page for several different terms makes some kind of sense, eh?


Would it be a single page, or a single host/site? How about having several pages on the site appearing for searches on related phrases? Wouldn't a site about baked goods (main topic) be expected to rank or have pages about bread, pie, cake, cookies - and wouldn't they link to those 2nd level categories from their main page in a user-friendly site architecture?

I've seen some strong evidence of this; and even though it may, for the most part, be anecdotal, I feel it's an observation worth considering - especially when some sites' rankings (or ranking drops) seem directly related to those site diversity/similarity factors.
