Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 76 message thread spans 3 pages: < < 76 ( 1 [2] 3 > >     
Google's Patent on Backlinks - many interesting clues from 2007
tedster




msg:3963007
 7:45 am on Jul 31, 2009 (gmt 0)

I read Google patents often - not because they use everything they put into a patent. They don't, and especially not at first. But the patents offer clues about what MIGHT be coming down the road. And a couple of years later, they sometimes give us nice clues about ranking puzzles that have begun to surface.

With the current update apparently doing "something different" with backlinks, I went back for another reading of the 2007 patent application Document Scoring Based On Link-Based Criteria [appft1.uspto.gov] and I found a bunch of interesting points. It's not necessarily the clue to this update, but more along the lines of explaining some other observations.

If you read the entire patent, you don't end up with a clear-cut list. Instead you get "sometimes this way, sometimes that way, and it all depends". Most interesting to me was the way some back link factors can work differently (and even work the exact opposite way), depending on the query terms.

The patent mentions some of the standard factors we already talk about - trust and authority of linking sites, spikes in back link growth, spikes of similar anchor text, and so on. But a closer reading brought me some other goodies. Here is my paraphrase for some of the paragraphs I found interesting:

PAGE SEGMENTATION and RATES OF CHANGE
[0051] Here Google defines a factor called UA [update amount], and it can be a factor that they weight differently for different segments of the page. Not only is the back link juice itself weighted differently; whether it changes is also given a different weight, depending on where the link appears on the page.

PAGE CHANGES CAN IMPROVE OR LOWER RANKINGS
...it all depends on the query terms!

[0052] Pages that show an increasing rate of change might be scored higher than pages for which there is a steady rate of change.

Now contrast that paragraph with this:

[0055] For some queries, content that has not recently changed may be a better result. So ranking factors can work one way for one search term, and the opposite way for another.
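Put in code terms, that's like giving the freshness signal a query-dependent sign. Here's a toy sketch - the query classes and weights are my own invention, not the patent's:

```python
# Toy picture of [0052] vs [0055]: the same "rate of change" signal can
# help or hurt depending on the query. The query classes and the weights
# are invented purely for illustration.

FRESHNESS_WEIGHT = {
    "breaking-news": +1.0,   # a rising rate of change boosts the page
    "evergreen":     -0.5,   # recent churn counts against the page
}

def score(base, rate_of_change, query_class):
    """Combine a base relevance score with a query-dependent freshness term."""
    return base + FRESHNESS_WEIGHT[query_class] * rate_of_change

# The same fast-changing page scores differently per query type:
print(score(10.0, 4.0, "breaking-news"))  # 14.0
print(score(10.0, 4.0, "evergreen"))      # 8.0
```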

PARTIAL INDEXING OF PAGES
[0053] This paragraph deserves some exact quotes:

In some situations, data storage resources may be insufficient...search engine may store "signatures" of documents instead of the (entire) documents themselves to detect changes to document content. In this case, search engine may store a term vector for a document (or page) and monitor it for relatively large changes. According to another implementation, search engine may store and monitor a relatively small portion of the document.

And so we hear "why can't I find my page for an exact phrase search." We also have a hint that Google may not always have enough storage.
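If you want to picture the term-vector idea, here's a rough sketch. Everything in it (the similarity measure, the 0.8 threshold) is my own guess at how such monitoring could work, not anything the patent specifies:

```python
from collections import Counter
from math import sqrt

# Sketch of the "term vector signature" idea in [0053]: detect large
# changes to a document while storing only a compact vector, never the
# full text. The cosine measure and the threshold are my own choices.

def term_vector(text):
    """Build a simple term-frequency vector for a document."""
    return Counter(text.lower().split())

def cosine_similarity(a, b):
    """Cosine similarity between two term vectors (1.0 = identical)."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

def changed_significantly(old_text, new_text, threshold=0.8):
    """Flag a 'relatively large change' - only the two compact term
    vectors are needed, not either full document."""
    return cosine_similarity(term_vector(old_text),
                             term_vector(new_text)) < threshold
```

A page that only stores its signature this way can't answer an exact-phrase query, which fits the "why can't I find my page for an exact phrase search" observation.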

RANKING FOR SEVERAL SEARCHES
[0063] How often a page appears for different searches can help boost rankings across the board. So maybe optimizing a single page for several different terms makes some kind of sense, eh?

RANKING CEILINGS, TRAFFIC THROTTLING and the YO-YO EFFECT
These two paragraphs deserve to get bumped together:

[0075] A spike in BACKLINKS can mean two things - a suddenly hot topic, or an attempt to spam.
[0102] A spike in RANKING can also mean two things - a hot topic or spam.

Now here's where it gets interesting: According to [0102], Google may allow a ranking to grow only at a certain rate, or apply a certain maximum threshold of growth for a defined period of time. This might well account for the pain of "I've hit the ceiling" that we sometimes feel.
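To picture that growth cap, here's a toy model. The 20% figure and the per-period update are invented; the patent gives no numbers:

```python
# Toy model of a ranking-growth cap as loosely described in [0102]: a
# page's effective score may rise only a fixed fraction per time period,
# no matter how fast its raw score grows. The 20% cap is made up.

def capped_score(previous_shown, raw_score, max_growth=0.20):
    """Return the score actually used for ranking this period."""
    ceiling = previous_shown * (1 + max_growth)
    return min(raw_score, ceiling)

# A raw score that triples overnight only shows up gradually:
shown = 100.0
for raw in (300.0, 300.0, 300.0):
    shown = capped_score(shown, raw)
# shown climbs 120.0 -> 144.0 -> ~172.8 instead of jumping to 300
```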

Even beyond those painful ranking ceilings, I've seen analytics that show amazing Traffic Throttling [webmasterworld.com]. The daily traffic graph looks like a barber comes in at 2pm every day and gives a buzz cut. And in order to throttle traffic that effectively, the only way I can see is Yo-Yo Rankings [webmasterworld.com].

This patent suggests that if a site experiences an extreme throttling of its traffic, (or a yo-yo between page 1, page 5, page 1, etc) then the site probably had some suspiciously spiky growth in back links -- spikes that couldn't be explained by a Hot Topic suddenly popping up for the general public. And so, Google put the site on their traffic regulator.

That lines up exactly with the cases I've worked with. And members here first noticed the yo-yo (traffic throttling) in 2008 - more than a year AFTER this patent was filed.

 

cangoou




msg:3963628
 8:34 am on Aug 1, 2009 (gmt 0)

I still have no idea how to "repair" an unbalanced backlink profile. Of course I get natural links, but do you think it's enough to wait until the profile is balanced in this natural way, or would you try to change/delete old links gently to accelerate things a bit?!

tedster




msg:3963631
 9:03 am on Aug 1, 2009 (gmt 0)

I'd say strengthen your website's content and features, and then get the word out about the improvements. Think of it as more like conventional marketing or public relations. Network with others in your field, or in related fields.

If your site has the goods, then it will also attract the links when others learn about it. But if you only try to cobble together links that you can control, then you often run into trouble - or just fall short.

dertyfern




msg:3963636
 9:19 am on Aug 1, 2009 (gmt 0)

As bhartzer noted, a link profile is composed of both incoming and outgoing links, so you have some control in rebalancing your profile.

cangoou




msg:3963666
 11:50 am on Aug 1, 2009 (gmt 0)

Thanks for your replies. A site that has been hit doesn't have any outgoing links - would you suggest linking to authority sites? It's a bit tricky here since the only other valuable sites are competitors.

And the problem is that people are getting more and more stingy about giving real links with no nofollow. There has been an analysis in my area showing that blogs, for example, are giving only a small fraction of the voluntary links that they did some years ago - thanks to Google.

dertyfern




msg:3963676
 12:16 pm on Aug 1, 2009 (gmt 0)

Why not link to other sites--even if they're competitors--if it'll give you a boost?

Your links need to be enticing enough to elicit the outclick.

sailorjwd




msg:3963681
 12:26 pm on Aug 1, 2009 (gmt 0)

When looking at top searches in Webmaster Tools, I always wondered why a particular term showed a position of 7 when every time I search the term the page shows at #1. Sounds like position 7 is the average, and is the result of the yo-yo effect.

signor_john




msg:3963702
 2:29 pm on Aug 1, 2009 (gmt 0)

There has been an analysis in my area showing that blogs, for example, are giving only a small fraction of the voluntary links that they did some years ago - thanks to Google.

The problem isn't Google; it's people who use blogs as spamming opportunities. (FWIW, people were spamming forums, listservs, etc. long before Google was born.)

cangoou




msg:3963707
 2:34 pm on Aug 1, 2009 (gmt 0)

No, the study was about links from inside the posts of blogs, not from the comment-area.

mrguy




msg:3963724
 4:04 pm on Aug 1, 2009 (gmt 0)

Isn't Google the one putting filters in place that are taking out perfectly white hat sites? How is that not Google's problem?

If you're already doing everything according to Google's gospel and you're unfortunate enough to get caught up in the net, as many have, whose fault is that?

No, it's Google's fault for thinking they have replaced God and will now do all the thinking for every person on earth.

I predict that in the next 5 years, while Google will still be here, they will no longer be the dominant search engine, and that is good for everybody involved.

CainIV




msg:3963803
 7:04 pm on Aug 1, 2009 (gmt 0)

Interesting theory, tedster. What properties do you think apply to that threshold of growth you are seeing being applied by Google?

Most importantly, what do you believe are the criteria that are met in order for Google to automatically cap traffic for any domain? Perhaps it could be strongly associated with semantics: if a given keyword phrase is not semantically close enough to the core keywords on a website, then Google applies the cap for duration x. If so, what governs the duration?

Exciting stuff!

Marcia




msg:3963896
 9:31 pm on Aug 1, 2009 (gmt 0)

In some situations, data storage resources may be insufficient...search engine may store "signatures" of documents instead of the (entire) documents themselves to detect changes to document content. In this case, search engine may store a term vector for a document (or page) and monitor it for relatively large changes. According to another implementation, search engine may store and monitor a relatively small portion of the document.

That can relate to indexing of URLs in a partitioned database, which is part of the series of Phrase Based Indexing patents filed. In descending order, the partitions allegedly maintain less and less information/data about URLs.

Remember that what was referred to as the "Supplemental Index" retained only the URL and the Document ID.

JS_Harris




msg:3963901
 10:04 pm on Aug 1, 2009 (gmt 0)

Google rotates different data sets into the serps depending on time. You can see if that is what's happening by looking at the TOTAL number of pages returned and compare it with your yo-yo charts. During any decrease in traffic there seems to be an increase in number of pages returned.

This also explains people's complaints of "I was #3 and now I'm down to #12, why?" with the obligatory "I'm back at #3" when it returns to the previous data set.

My yo-yo charts suggest a different set of results (and more of them) from Friday to Sunday but it varies by site and topic.

Keeping webmasters guessing keeps Google in business, and what better way to do that than to use multiple data sets? A sprinkle of chaos theory in the algorithm can skew even the most diligent of studies, especially when modified over time.

CainIV




msg:3963984
 1:29 am on Aug 2, 2009 (gmt 0)

"A sprinkle of chaos theory in the algorithm can skew even the most diligent of studies especially when modified over time."

True, but this doesn't segment the characteristics for websites that do not seem to flux as much, or even yo-yo at all, for keyword terms. From what I can see, the phenomenon does not affect all websites.

WarrenBuffett




msg:3963993
 1:49 am on Aug 2, 2009 (gmt 0)

I'm still penalised, and have put in some links to authority sites to no avail (Dmoz, wiki, encarta, botw, yahoo etc).

However, someone mentioned in another thread that the penalty might have been activated because certain anchor texts occur at a frequency not to Google's liking. So you can build anchor texts to make them look natural; however, if you eventually pass a certain frequency, you could trigger a filter. This theory makes sense to me.

I will now try to build rather odd-looking anchor text, still related to my niche, to see if it can make some kind of difference and hopefully get out of the penalty box.

tedster




msg:3964037
 4:54 am on Aug 2, 2009 (gmt 0)

What properties would you think apply to that threshold of growth you are seeing being applied by Google.

Here's my current idea. I believe that Google's staff contains more statisticians than any other specialty. The algo is, more and more, driven by statistics and probability. These statisticians watch query data as well as backlink data. That's what jumped out at me while re-reading this patent: backlinks PLUS queries.

Google's statisticians know what queries currently show bursts of interest from the general public. They know what companies are getting navigational queries - and they know when any online business is truly growing in brand recognition. For example, queries like [company keyword] will start increasing if there is a real growing interest. We puzzle over "Update Vince [webmasterworld.com]"? How about defining "brand" by folding data on navigational queries into the ranking algo.

When backlink numbers start growing, then that new "interest" at the webmaster level should be supported by the general population's query data. In other words, if the backlink growth is relatively "natural", then it should show a certain statistical footprint.

If the spike in backlink growth is too far outside that statistical footprint, then Google will take steps to limit the effect of that apparent SERP manipulation.

The statistically normal expectations are, by this time, quite granular and gaining in sophistication. The patent I mentioned in the opening post lists many possible measures that Google can take to determine when patterns are outside the natural range. And they're probably making many others we haven't even guessed at.
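Here's one toy way to picture that "backlinks PLUS queries" cross-check. The ratio test and the threshold are made up to illustrate the reasoning, nothing more:

```python
# Toy illustration of the backlinks-plus-queries idea: a backlink spike
# that is not matched by a spike in public query interest looks
# statistically suspicious. The ratio test and the 3x tolerance are my
# own invention, just to make the reasoning concrete.

def growth_rate(series):
    """Growth of the most recent observation over the previous one."""
    prev, curr = series[-2], series[-1]
    return (curr - prev) / prev if prev else float("inf")

def looks_manipulated(backlinks, query_volume, tolerance=3.0):
    """Suspicious if backlink growth outpaces query-interest growth
    by more than `tolerance` times."""
    bl = growth_rate(backlinks)
    qv = growth_rate(query_volume)
    if bl <= 0:
        return False   # no link spike, nothing to explain
    if qv <= 0:
        return True    # links surged while public interest didn't
    return bl / qv > tolerance

# A hot topic: links and queries surge together -> looks natural
print(looks_manipulated([100, 200], [1000, 2500]))   # False
# A link campaign: links quadruple, queries flat -> looks suspicious
print(looks_manipulated([100, 400], [1000, 1010]))   # True
```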

This is my current brainstorming area, and it's why I recommend the idea of ATTRACTING backlinks more than "building" them. Backlinks alone cannot create a statistically correct footprint for a growing, thriving website. Even though such a "dummied-up" impression has been a working tool for improved ranking in the past, it's a tool whose future is getting more and more cloudy.

whitenight




msg:3964039
 5:26 am on Aug 2, 2009 (gmt 0)

This is my current brainstorming area, and it's why I recommend the idea of ATTRACTING backlinks more than "building" them. Backlinks alone cannot create a statistically correct footprint for a growing, thriving website. Even though such a "dummied-up" impression has been a working tool for improved ranking in the past, it's a tool whose future is getting more and more cloudy.

Wow, so after all that, you've gone with the Company line of
"create such great content that people link to you naturally"?
aka
"Santa Goog knows who's been naughty and nice"

Umm, ok, sure.

Sounds like a winning business PHILOSOPHY....
(that can be said for ANY business problem)

But it's not Search Engine Optimization
and certainly not a SOLUTION to the page specific yo-yo.

And you started off so well with the "statistics" aspect of what the yo-yo is addressing.

--------------------------

Marcia was on track with the supplemental index issues (aka storage space)...

Funny how "out of sight, out of mind" works so well.

tedster




msg:3964053
 6:11 am on Aug 2, 2009 (gmt 0)

As I said above, I'm basically brainstorming -- and trying to encourage the sharing of ideas here.

So no, that wasn't a prescription for fixing an existing yo-yo. It's more like a way to avoid ever getting into a yo-yo. If you're already popping between page #1 and page #5, from what I've seen your recent backlink profile is not showing natural diversity and balance. The growth has probably been quite "lumpy" and unnatural, compared to what's statistically expected.

I did say to attract backlinks "more than" building them, but I didn't say "instead of" building them. It's not all one thing and none of the other. And even more, "attracting" backlinks is not just a passive waiting game. It takes proactive publicity, marketing and networking. And it takes many intelligent actions within a site or family of sites to make effective use of the link juice that's generated.

So the "prescription" that I'm brainstorming is a way to balance the backlink profile, rather than throw it into even more unbalance or an even greater lack of diversity. Take up some new activities that are also, in the end, aimed at getting links.

Many backlink programs I've seen are like a dog that knows just a few tricks. Diversifying those actions (learning more "tricks") is a big help. And ultimately, if your online business is essentially weak and not very link-worthy - well, then you definitely need to work on that.

[edited by: tedster at 7:17 am (utc) on Aug. 2, 2009]

whitenight




msg:3964076
 7:16 am on Aug 2, 2009 (gmt 0)

The growth has probably been quite "lumpy" and unnatural, compared to what's statistically expected.

Ok, so why the yo-yo?
Why not just filter the site ala -950 if it falls outside of statistical norms?

As I've always stated, believing the yo-yo is a purposeful action on Goog's part leads to false suppositions of its causes.

family of sites to make effective use

I like this idea. ;)

But then again, in another thread, this sounds similar to the "link wheel" you cautioned against.
Hmm, which one is it?

Use one's sites to enhance one's rankings
or
Goog will spot such "gaming" a mile away, so avoid it?

fishfinger




msg:3964080
 7:36 am on Aug 2, 2009 (gmt 0)

Can any search engine combat something like that?

You found it; Google employees can find it. And they take down networks of sites on a regular basis.

I've seen networks working very well, but the bigger the network, the more customers, the easier it is to spot. It's a fast buck for them and for their customers. They have to sell links that Google notices - therein lies the problem. Google notices.

tedster




msg:3964082
 7:38 am on Aug 2, 2009 (gmt 0)

believing the yo-yo is a purposeful action on Goog's part leads to false suppositions of its causes

What are you hinting at here, whitenight? A secondary database partition that contains a kind of mutated version of the main rankings, perhaps? I can see that as a possibility, especially if the partitions are all on the same data center. Still, the yo-yo is pretty common, so it's a challenge to see it as not purposeful.

In another thread, someone made the observation that the total number of results (the "about" number) seems to cycle between a larger number and a smaller one, with rankings dropping when the larger number is being reported. Perhaps the larger "about" number is a sign that a database partition with more urls (and less complete content indexing) has been rolled out?

The patent above that started my current line of thinking is the first place I've ever seen language from Google about intentionally restricting traffic or ranking growth. So I began to consider again the relationship between a yo-yo and traffic throttling.

At any rate, I'm not currently responsible for urls that are on the yo-yo. So this is a bit of an academic exercise for me, right now. And the yo-yos that I have seen fixed, escaped it after gaining a more diverse backlink profile. That reasoning could be the "post hoc ergo propter hoc" fallacy, so I still have an open mind about establishing cause and effect.

<added>
From what I've seen, link wheels use fluff for content and take on almost any member who applies. True website families would not do either one.

dailypress




msg:3964083
 8:05 am on Aug 2, 2009 (gmt 0)

Frequent changes to the main menu or the title - I wouldn't touch it. I've heard enough horror stories to know that those kind of frequent changes can really bite back. And if you've got an external link in the main menu area, it better be pretty stable.
And what about the "Most Popular Links", or "Most Recent Links", or "Most Relevant Links" that many sites have under their menu columns? Or what about the ARCHIVES that Matt Cutts has on each page of his blog, which get updated frequently?

The reason why I ask is I have installed a "Most Popular Links" module on my site that gets updated based on the traffic each page has and since I advertise specific pages on external sites, the list of Most Popular Links changes pretty often.

tedster




msg:3964085
 8:10 am on Aug 2, 2009 (gmt 0)

Clearly those kinds of frequent changes are not a problem; they are even expected, and are part of some very common CMS systems. They are also secondary site navigation - not the main menu.

CainIV




msg:3964265
 6:48 pm on Aug 2, 2009 (gmt 0)

believing the yo-yo is a purposeful action on Goog's part leads to false suppositions of its causes

From what I have seen, there is just as much reason to believe that it is a purposeful event as a simple glitch.

In terms of navigational searches, brand and statistical patterns of link profiles, my only issue with this theory is how Google would apply this factor to a business that does not have a unique name online, for example widgets.com, where widgets is a competitive keyword.

How would Google be able to segment off inbound link anchors that appear to be naturally built via buzz or public interest, even in a burst, when the inbound keywords for 'brand' also represent the competitive keyword in question?

To be honest, I have seen link wheels done brilliantly, and in my estimation those appear to have worked for the couple of references I looked at. Again, they were comprised of very well written content that actually attracted links on its own. As well, all of the usual items like hub score / referencing authorities need to be observed.

whitenight




msg:3964429
 5:51 am on Aug 3, 2009 (gmt 0)

From what I've seen, link wheels use fluff for content and take on almost any member who applies. True website families would not do either one.

Methinks you posted this explanation in the wrong thread...
and to the wrong person.
For me, it was a rhetorical question.

---------

Let's go down memory road for a second.

Anyone remember the original Google dance days?
Or even the old update days.

A small group of people continually said the dance and "penalties/missing pages" after updates were caused by not enough storage.

Anyone paying close attention to Goog FUD during Big Daddy heard Goog admit that a lack of storage space was what caused the original Google dance, and why certain sites would just fall off the earth after updates, only to return inexplicably to their original rankings with the next update.

What does this have to do with the Yo-Yo?

Maybe nothing, maybe something.

I had a point about when something looks intentional, but I've forgotten it now. :)

----------------

What else started late 2007?

- implementation of Universal search
Anyone observed what's being shown in Universal Search when they yo-yo?

- Supplemental index goes underground.
Out of sight, out of mind.

- Position #6 debacle.
the implementation of Ghost Datasets(TM): separate ranking partitions, concrete seeding sites, database partitions that don't always "play well together"

--------------------

Back to basic physics.

When any two (or more) objects are in motion, they will NATURALLY OSCILLATE... and cause those things in their field of influence to OSCILLATE.

You can call these oscillations "intentional" if one needs to personalize them with emotional content, but they're not.

It's just a simple UNIVERSAL TRUTH of 2 or more bodies in motion.
There's no good or bad intention behind it.
(maybe just bad programming)

objects in motion tend to stay in motion until....

Shaddows




msg:3964510
 9:44 am on Aug 3, 2009 (gmt 0)

Wow, that's an unusually forthcoming post!

Having never, ever had a Yo-Yo URL, it's hard to speak with authority on this. However, I won't let that deter me :p

Point 1. At least one type of "yo-yo like effect" is intentional. It deliberately puts a ceiling on traffic levels. I see three potential reasons for this ceiling-
a) To 'fix' a variable for internal (Google) testing: "my site is special",
b) To dampen a traffic spike to hide on-page/off-page SEO success: "This site is unnaturally successful",
c) To test CTR to the site for OTHER KEYWORDS. So, the main phrase is good. Pages nearly good enough to rank page 1 are given an opportunity. For some reason, Google holds overall referrals but distributes referring terms.

Point 2. At least one type of Yo-Yo effect is NOT traffic-related, and is probably UNINTENDED. There are plenty of reports of yo-yoing without traffic ceilings applied. I suspect this is a partition-based effect. Storage space had not occurred to me as a reason, but it's plausible.

However, I tend to think that there are more fundamental partitions in place than most realise- with sites not directly competing with each other, but within their partition. Then the partitions get folded in, according to a pre-defined methodology. Finally, there is some minor re-ordering on-page, including query-dependent penalties being applied.

The Yo-Yo would be evident here due to two separate effects. The first is because you straddle two partitions. Filters sweeping the DB might move you from the "Uber-trusted" partition to the "Super-trusted info site" partition. The next sweep (possibly a time-scheduled sweep) moves you back. Observe: a yo-yo, possibly a regular one, but equally possibly an irregular one. Calculations other than a filter could also move you between partitions - particularly the status of upstream and downstream linking sites.

The second partition-based mechanism: you move from 1 to 2 WITHIN YOUR PARTITION. Your partition is eligible for SERP rank 6 if you are top, or SERP rank 15 if in second. It's close, and you change frequently. Observe: an irregular Yo-Yo. Note: once you get above position 5, this Yo-Yo does not tend to apply, because the top 5 is usually NOT made up of algorithmically-folded partitions.

SERPs chaos is generally a sign of either redefining partitions, or changing the methodology for folding in the partitions.
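To make the folding model concrete, here's a toy sketch. The partition names, slot assignments and fold rule are all invented for illustration - Google has published nothing like this:

```python
# Toy sketch of the "partition folding" model: sites rank only WITHIN
# their partition, and each partition's 1st, 2nd, ... entries are folded
# into pre-assigned SERP slots. All names and slots here are invented.

FOLD_SLOTS = {
    "uber-trusted":        [1, 2, 3, 4, 5],
    "super-trusted-info":  [6, 15],   # 1st place -> slot 6, 2nd -> slot 15
    "ecommerce":           [7, 12],
    "promising-newcomer":  [9, 18],
}

def fold(partitions):
    """partitions: {name: [sites ranked within that partition]}.
    Returns {serp_slot: site} after folding the partitions together."""
    serp = {}
    for name, sites in partitions.items():
        slots = FOLD_SLOTS[name]
        for rank, site in enumerate(sites):
            if rank < len(slots):
                serp[slots[rank]] = site
    return dict(sorted(serp.items()))

# Slipping from 1st to 2nd WITHIN a partition yo-yos a site from 6 to 15:
before = fold({"super-trusted-info": ["example-a.com", "example-b.com"]})
after  = fold({"super-trusted-info": ["example-b.com", "example-a.com"]})
# before: {6: 'example-a.com', 15: 'example-b.com'}
# after:  {6: 'example-b.com', 15: 'example-a.com'}
```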

For me, this view explains a lot - with the drawback of predicting very little. The only thing it really predicts is that TO ESCAPE THE YO-YO, YOU MUST CHANGE YOUR PROFILE. By making G put you FIRMLY in the top-flight partition, you escape both the straddle-affected and the fold-affected Yo-Yo.

whitenight




msg:3964511
 9:50 am on Aug 3, 2009 (gmt 0)

Wow, that's an unusually forthcoming post!

lol, it's just a summary of my 50+ posts from other threads, all consolidated into one. Nothing special

TO ESCAPE THE YO-YO, YOU MUST CHANGE YOUR PROFILE. By making G put you FIRMLY in the top-flight partition, you escape both the straddle-affected, and the fold-affected Yo-Yo.

Yes exactly.
(and the above post is why shaddows is my favorite)
<the screen cuts from bunnies, to kittens, to rainbows>...lol

Can I go now? and let shaddows answer all these questions? =P

tedster




msg:3964610
 12:53 pm on Aug 3, 2009 (gmt 0)

Thanks for that summary, Shaddows - you put many pieces together into a bigger picture that does line up with a lot that I've seen.

There's one type of yo-yo that I still ponder - it's one of the first sites I ever saw with a yo-yo. This site showed regular cycling for just one (pretty major) 2-word phrase. The URL was a home page that had been at least PR7 for quite a few years. This site also had solid rankings and great traffic for some other big keywords. And yet for this particular phrase only, there was a yo-yo.

So that experience makes me wonder about a trust partition ALONE being a cause. There also seems to be a query-related factor involved, and the site seems to have plenty of trust otherwise.

Pages nearly-good enough to rank page1 are given an opportunity

And when this type of url cycles off page 1, its position is often (maybe always) replaced by a universal search result. When the url does appear on page 1, the page does not contain any universal results.

The interesting thing about this type of yo-yo is that the webmaster is getting a gift with the high rankings - but might assume instead that there's a penalty and that they "really" deserve page #1 all the time. We always think our own children are the best looking, don't we ;)

So the site that I mentioned at the beginning, the one I'm still pondering, may well have been the experimental "gift" type of ranking. In fact - today the site still oscillates, but it's now between #3 and #6, and it no longer shows a strong time-of-day switch. I'm going to study how universal search affects that switch.

Shaddows




msg:3964649
 1:52 pm on Aug 3, 2009 (gmt 0)

I don't think Trust is the only criterion for partitions; it's a lot more subtle than that.

Ever noticed how "mom & pop sites" (to borrow an Americanism) rank higher than they 'ought' to? Or how ecommerce gets by with fewer links, less information, and little chance of .gov or .edu endorsements, relative to the info sites they outrank? It is my firm belief that these have been profiled and partitioned accordingly. The partitions allow them to fold in WELL above where they should. Similarly, a profile for "promising newcomer" might make sense. Or "venerable resource". This mechanism would allow an automatic balancing of new and old sites, along with the mainstream-established sites that make up most entries.

Further, the folding methodology itself may change by time of day - putting ecom high up during business hours when people are likely to buy, and info when they are researching, for example. So that's another Yo-Yo, and one that may not be easily escaped if the top tier is not semi-permanent (as is the case on less competitive queries).

In the case of your example, it looks like the SERP folding remained the same, but Google was trialling Universal Search. It was therefore forced into second position for its partition- off the main page.

The fact it is now 3/6 is interesting. Let me float this idea.

The fact it gets to 3 suggests the site itself is fortunate enough to belong to the top table, but on the query itself it's not quite good enough to cement its position. It therefore gets moved down into its profile slot. The fact it HAS a profile slot would imply that the top tier is a non-exclusive partition. So, sites belong in a lesser partition but, on some set of criteria, are ALSO available for a top-3 slot. Which isn't really surprising - for any given query, there must be hundreds of top-tier sites that could make the top 3. There must be additional ranking factors to decide who, for what query, for how long.

Where does the site sharing 3 live when your URL is there? And would you think of it as a direct competitor for any reason other than ranking?

tedster




msg:3964667
 2:18 pm on Aug 3, 2009 (gmt 0)

I'm completely aligned with all that, Shaddows. I misunderstood you earlier and thought you were suggesting that there were "pure" trust-level partitions.

Where does the site sharing 3 live when your URL is there? And would you think of it as a direct competitor for any reason other than ranking?

I'll have to watch closer for that - it's a really good question. And yes, every site on page 1 is definitely a direct competitor. These sites are brands with high name recognition and big offline advertising budgets. All the competitors are PR 6, 7 or 8 (and the PR8 is next to the bottom!)

I oversimplified the situation up to now for easier discussion. There are actually two keyword phrases involved - synonyms. Both were yo-yos and both are now page 1, but cycling. One of the two query terms (the lower volume one) seems to be a major experiment for Google. There are 10 regular results, plus News, a Local box, a Video result and then Related Searches at the bottom. It's a real smorgasbord, but also a good SERP to learn from.

tedster




msg:3964675
 2:25 pm on Aug 3, 2009 (gmt 0)

I just checked some historical records and there's no easy answer for where the #3 site goes when this site pops into #3. I see #4, #5, or #6 and often all within a week or two. This is also one of those SERPs that is so volatile that the API results are often far off from user results.

Shaddows




msg:3964700
 3:09 pm on Aug 3, 2009 (gmt 0)

All in, that sounds like it's a folding methodology "yo-yo". As if Google doesn't know what combination of results would most satisfy the searcher, and is trying different resolutions. Presumably metrics such as "% requiring refinement", "no click, new search" and other criteria that we webbies wouldn't be able to record, but that express user dissatisfaction, would be used for this.

If anything, I'm surprised by the 'stability' of your site in the SERPs - it only occupies two positions. Indeed, most yo-yos reported are against otherwise stable results. Yours is against chaotic results, and sounds more like stable churn (forgive the oxymoron) than the standard Yo-Yo.


© Webmaster World 1996-2014 all rights reserved