crobb305 - 1:57 pm on May 18, 2012 (gmt 0)
That may be true, but Google really should be able to spider the web and incorporate new data/changes much faster than every 4 to 6 weeks. It's not just my site that I'm seeing stale results for. I've had some links removed, incorrect BBB pages removed, changed my business FB page title, all over 3 weeks ago...Google still displays the old stuff.
Just that we may not be working on the right things :(
I know what you're saying though Jez123 :) ...there is much work to be done. Unfortunately, once you do that work, it will take Google a complete moon cycle to detect the changes.
The only two explanations that make any sense (and neither impact recovery strategies) are:
1) There is a different recipe for cooking up SERPs that include excluded terms / domains. As Penguin (and possibly Panda) are resource-intensive, they are not baked into this recipe. Implication: the SERP is generated holistically, rather than by building results for the "include" terms and then subtracting the "exclude" terms.
2) Adding an "exclude" breaks the sequence for generating displayed SERPs, the assumption being that you build "normal" SERPs and then remove "excluded" items. Implication: Penguin (/Panda) is a post-algo modifier, applied to every result after the core rank/score is generated.
Of the two, only the first seems remotely likely.
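The two hypotheses above can be sketched roughly as follows. This is a toy illustration only: the page data, scores, and function names are all invented, and Google's real pipeline is of course not public. The point is structural, showing where exclusion happens in each model, not what the scores would be.

```python
# Toy model of the two hypothesised SERP pipelines. All data is invented.
PAGES = {
    "a.example": {"automobile": 5.0, "car": 3.0},
    "b.example": {"automobile": 2.0},
}

def serp_holistic(include, exclude):
    """Hypothesis 1: one recipe generates the SERP with exclusions folded
    in from the start; costly extra layers (Penguin/Panda) might simply
    be left out of this recipe."""
    results = []
    for page, terms in PAGES.items():
        if any(t in terms for t in exclude):
            continue  # excluded terms filter pages out during generation
        score = sum(terms.get(t, 0.0) for t in include)
        if score > 0:
            results.append((page, score))
    return sorted(results, key=lambda r: -r[1])

def serp_post_filter(include, exclude):
    """Hypothesis 2: build the 'normal' SERP first, then strip excluded
    items as a post-algo modifier applied after core scoring."""
    core = serp_holistic(include, exclude=[])  # the "normal" SERP
    return [(p, s) for p, s in core
            if not any(t in PAGES[p] for t in exclude)]
```

In this toy both models return the same results; the difference that matters in the thread is *when* exclusion runs, and therefore which stage resource-intensive modifiers like Penguin would have to participate in.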
In other words, given that Penguin is not present in the process, this artefact has no explanatory power for investigating that particular ranking module, though it does offer an opportunity to look under the bonnet (hood) of Google's core algo.
Given that [<term> -<term>] gives NO results, it implies that the page-score of the excluded term is wholly subtracted from the page-score of the included terms. This in turn suggests a powerful investigative tool: use related terms to see the relative scoring power of each term; [automobile -car], for example.
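The "whole subtraction" idea can be sketched in a few lines. Again, this is purely hypothetical: the scores are made up, and `net_score` is an invented helper, not anything Google has documented. It just shows why, under this hypothesis, [automobile -car] would drop a page whenever the excluded term outscores the included one.

```python
# Hypothetical "whole subtraction" model: the excluded term's page-score
# is subtracted in full from the included term's. All numbers invented.
def net_score(page_scores, include, exclude):
    gained = sum(page_scores.get(t, 0.0) for t in include)
    lost = sum(page_scores.get(t, 0.0) for t in exclude)
    return gained - lost

# On a page where "car" outscores "automobile", [automobile -car]
# nets below zero, so the page would drop out of the SERP entirely.
page = {"automobile": 4.0, "car": 6.0}
print(net_score(page, ["automobile"], ["car"]))
```

Comparing pairs of related terms this way is what would, in principle, expose each term's relative scoring power.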
Sounds reasonable. Thanks Shaddows.