
Google SEO News and Discussion Forum

How I Make Sense of Google's Complex Algorithm
tedster
msg:4425775 - 1:35 am on Mar 7, 2012 (gmt 0)

In the past, we all knew that search engines were just a bit more complicated than they appeared, but we created mental models that did a pretty good job explaining the SERPs and we let it go at that... for a long long time. I like to think of those old models as the "punch list" approach - here's all the factors we think Google measures and combines into their recipe - let's make sure we hit each one.

Then, slowly but surely, something shifted. Keywords plus backlinks could no longer explain the rankings we started to notice. What on earth is going on? Here's what I've been able to put together.

BIG DATA
We all know Google loves data. I'd guess that they collect at least ten times the number of signals, compared to what they actively use in the algorithm at any time. And they never delete any of it ;) When Panda first crawled out of development, we started hearing a lot more about machine learning - but Google has preferred the machine learning approach from the beginning - and they let their machines loose on the BIG DATA pile just to see what correlates and what doesn't. There's a reason so many of their PhD hires are statisticians.
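To make that concrete, here's a minimal sketch of what a correlation pass over a big signal pile could look like. The signal names and data below are entirely made up, and a real system would work at enormously larger scale:

```python
# Minimal sketch of "let the machines loose on the data": collect many
# per-URL signals, then let a correlation pass reveal which ones move
# together. All signal names and numbers here are hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
n_urls = 1000

signals = pd.DataFrame({
    "backlinks":      rng.lognormal(3.0, 1.0, n_urls),
    "brand_mentions": rng.lognormal(1.5, 0.8, n_urls),
    "word_count":     rng.normal(800, 250, n_urls),
    "page_speed_ms":  rng.normal(1200, 400, n_urls),
})
# In a natural profile, link counts and unlinked mentions tend to move together.
signals["brand_mentions"] += 0.05 * signals["backlinks"]

# Correlate every signal against every other and surface the strong pairs.
corr = signals.corr(method="spearman")
strong = (corr.abs() > 0.3) & (corr.abs() < 1.0)
print(corr.where(strong).stack().drop_duplicates())
```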

STATISTICIANS
Today more than 200 signals are actively used - and I'm betting it's FAR more. They know when any particular signal (say backlink anchor text) is natural, or at least in line with the rest of that market - and when it's been seriously manipulated. Lots of backlinks should correlate with some other mentions here and there. If that correlation is too low (or maybe too high) then the signal might get devalued or even tossed out.
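As a toy illustration of that kind of test - the ratio, the market numbers, and the threshold below are all invented - devaluing a signal that sits too far outside its market's distribution might look something like this:

```python
# Hedged sketch of the "natural vs. manipulated" test described above:
# compare one site's backlink-to-mention ratio against its market's
# distribution and devalue the backlink signal when the site is a
# clear outlier. Ratio, peer data, and cutoff are illustrative only.
import statistics

market_ratios = [2.1, 1.8, 2.5, 2.0, 1.9, 2.3, 2.2, 1.7, 2.4, 2.0]  # peers
mean = statistics.mean(market_ratios)
stdev = statistics.stdev(market_ratios)

def link_weight(site_ratio: float, z_cutoff: float = 3.0) -> float:
    """Multiplier for the backlink signal: 1.0 if the ratio looks like
    the rest of the market, 0.0 if it is a clear outlier."""
    z = abs(site_ratio - mean) / stdev
    if z > z_cutoff:
        return 0.0                       # too far out, high OR low: toss it
    return 1.0 - (z / z_cutoff) * 0.5    # gently devalue as it drifts

print(link_weight(2.1))   # natural profile -> close to full weight
print(link_weight(9.0))   # heavily manipulated -> zeroed out
```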

Read some of the Spam Detection patents - especially the one about Phrase Based Indexing. This statistics thing is really big.
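The spam-detection side of phrase based indexing boils down to a count: natural pages use only a subset of the related phrases for a topic, while spam tries to cram in nearly all of them at once. A loose sketch, with an invented phrase list and threshold:

```python
# Loose sketch of the phrase-based indexing spam idea: a page stuffing
# in far more related phrases than topically natural pages do gets
# flagged. The phrase set and threshold are invented for illustration.
RELATED_PHRASES = {"blue widgets", "widget reviews", "buy widgets",
                   "widget prices", "cheap widgets", "widget store"}

def related_phrase_count(text: str) -> int:
    text = text.lower()
    return sum(phrase in text for phrase in RELATED_PHRASES)

def looks_stuffed(text: str, natural_max: int = 4) -> bool:
    # Real pages use only part of the related-phrase space;
    # spam tries to use all of it at once.
    return related_phrase_count(text) > natural_max

spam = "Buy widgets! Cheap widgets, widget prices, widget reviews, widget store, blue widgets."
real = "Our widget reviews cover build quality and widget prices in depth."
print(looks_stuffed(spam), looks_stuffed(real))  # True False
```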

TAXONOMIES - AUTOMATED!
Google has been automating taxonomy generation for a long time. Query terms are assigned taxonomies, websites are assigned taxonomies. When the statisticians play with their big data, I'm pretty sure that they look at statistical relevance within a given taxonomy - let's say within a marketplace. Clearly signals are used differently for a crafts website than for gambling, for example.

So when two URLs seem to have "the same" signals but one far outranks the other - it's more likely to be the way that signals correlate and interact - as well as signals you're not used to thinking about.
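A crude way to picture that: the same raw signals, combined under different taxonomy weights, produce very different scores. Everything below - taxonomy names, signal names, weights - is invented for illustration:

```python
# Sketch of taxonomy-dependent scoring: the same raw signals are
# combined with different weights depending on the market a query or
# site is assigned to. All names and weights are hypothetical.
SIGNAL_WEIGHTS = {
    "crafts":   {"backlinks": 0.2, "engagement": 0.5, "freshness": 0.3},
    "gambling": {"backlinks": 0.1, "engagement": 0.3, "trust": 0.6},
}

def score(taxonomy: str, signals: dict) -> float:
    weights = SIGNAL_WEIGHTS[taxonomy]
    # Only the signals that matter in this taxonomy contribute.
    return sum(weights[name] * signals.get(name, 0.0) for name in weights)

page = {"backlinks": 0.8, "engagement": 0.6, "trust": 0.4, "freshness": 0.9}
print(score("crafts", page))    # 0.2*0.8 + 0.5*0.6 + 0.3*0.9 = 0.73
print(score("gambling", page))  # 0.1*0.8 + 0.3*0.6 + 0.6*0.4 = 0.50
```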

CORRELATING SIGNALS
Historical signals are a big one. Remember that scary big patent full of possibilities? They've definitely been collecting and testing all those kinds of data.

How about User Engagement signals of many kinds? All the search engines have been looking at that kind of data because it's so danged hard to fake. At the same time, when Matt Cutts says that bounce rate is "too noisy" a signal for them to use - he's not just flapping his gums. He knows, mathematically, exactly how useful or not these signals are in generating good quality rankings.
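Here's one way to read "too noisy" mathematically - a synthetic sketch where a weak, noisy signal barely correlates with the page quality you'd want to predict, while a cleaner signal does. All of the data is invented:

```python
# Sketch of what "too noisy a signal" can mean: if a signal barely
# correlates with the quality you want to predict once noise swamps
# it, it contributes little to ranking. Data is entirely synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 5000
quality = rng.uniform(0, 1, n)            # hidden "true" page quality

# Bounce rate weakly tracks quality but is dominated by noise.
bounce = 0.9 - 0.2 * quality + rng.normal(0, 0.3, n)
# A dwell-style engagement signal tracks quality far more cleanly.
dwell = 30 + 120 * quality + rng.normal(0, 10, n)

print(np.corrcoef(quality, bounce)[0, 1])  # weak, ~ -0.2: "too noisy"
print(np.corrcoef(quality, dwell)[0, 1])   # strong, ~ 0.96: usable
```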

And there are many thousands of correlations to be measured and watched - thousands, I tell you.

[edited by: tedster at 5:28 am (utc) on Mar 8, 2012]

 

JohnRoy
msg:4430677 - 11:43 pm on Mar 18, 2012 (gmt 0)

If it's Tedster who says it's impossible to reverse engineer the G engine, no one shall think differently.

A suggested alternative: throw snowballs, lots of them, in the hope that some will hit the right target.

This is what I saw someone do after being hit by panda:

• Diversify his niche into four sub-niches.
• Revise and gather huge keyword lists for each sub-niche.
• Roll out subject-related content ranging from 150 to 850 words per article. Quality: all of it good (no spun junk etc.), but 25% can be considered excellent.
• SEO-optimize 25% of the pages. Optimization scores range from 60% through 90%, based on the SEOpressor or Yoast WordPress plugins.
• Interlink each sub-niche as in the old days, making sure every page has lots of same-server inbound links (see the sketch after this list).

He didn't provide figures, but claims to have seen excellent results.
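For the interlinking item above, a rough audit sketch that counts same-server inbound links per page - the site map here is a toy stand-in for a real crawl:

```python
# Hedged sketch of auditing "lots of own-server inbounds": count how
# many internal pages link to each page. The structure is hypothetical.
from collections import Counter

# page -> internal pages it links out to
site = {
    "/sub1/index": ["/sub1/a", "/sub1/b"],
    "/sub1/a":     ["/sub1/index", "/sub1/b"],
    "/sub1/b":     ["/sub1/index", "/sub1/a"],
    "/sub2/index": ["/sub1/index"],   # cross-niche link
}

inbound = Counter(target for links in site.values() for target in links)
for page in site:
    print(page, inbound.get(page, 0))  # pages with 0 inbounds are orphans
```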

johnser
msg:4438183 - 11:01 pm on Apr 6, 2012 (gmt 0)
Very good thread, folks.
Previously, when we talked to SEO clients, we told them they needed:

A) Clear site structure (they rarely had this)
B) Great content (often too costly for offline companies to produce)
C) Relevant links (India-type links are a waste of time; relevant links take us 2+ hours each to obtain = very expensive)

From now on, I'm thinking that in addition to the above, the things to watch out for are:

D) Strong social signals
E) Branded links from relevant content
F) Focus on brand rather than exact anchor text
G) Ad layout that works for users rather than the business

(BTW: If anyone knows of a good graphic to illustrate the above, please share it!)

The SEO Opportunity
Knowledge of the issues in this thread & the ability to inform clients about them (with full disclosure of the many things we don't know) is going to keep the top SEOs earning their crust with smart clients.

IMHO, lower-quality / part-time SEOs just aren't going to be able to compete from now on, because they won't have enough experience / confidence to tell clients what the SEO doesn't know. They'll also still be hoping that $300/mth outsourced link building will keep working.

The New Google World Order
I propose we call this thing something, because it actually is a New World Order in SEO terms, considering that what we're seeing is an SEO Industry Apocalypse (& out of crisis comes opportunity!)

"Maya" is my suggestion for a couple of reasons:
> 2012 is the year the Mayans thought the world would end. SEO as we've all known it for 16+ years is dead.

> Mayans were called "The Greeks of the New World" because of their sophistication with maths & technology. A 21st-century AI version of that is what we're dealing with now.

Leosghost
msg:4438184 - 11:10 pm on Apr 6, 2012 (gmt 0)

Maya != Mayan

Beware of Maya...
