thegypsy - 1:11 pm on Aug 31, 2012 (gmt 0)
How can there be collateral damage if the (patent's) results are lower than the final destination?
Keep in mind that it doesn't always move down. It would kinda work like this:
1. You're ranking in pos 20
2. You add links, mess with TITLE or other stuff
3. Your changes would naturally take you to 5th
4. Google moves you to a transition rank of 10th OR 30th
It could also move you up, just not as far as expected, prompting you to make further changes to boost the rank.
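As a toy sketch, the mechanic above might look something like this. To be clear: the function name, window size and randomization here are entirely my own assumptions for illustration - the patent doesn't publish its actual math.

```python
import random

def transition_rank(old_rank, expected_new_rank, window=10, seed=None):
    """Toy model of the transition-rank idea: instead of moving a page
    straight to the rank its changes would earn, serve an interim rank
    somewhere unexpected (above OR below) and watch how the site owner
    reacts. All values here are illustrative, not from the patent."""
    rng = random.Random(seed)
    lo = max(1, expected_new_rank - window)
    hi = max(old_rank, expected_new_rank) + window
    # Any rank in the window EXCEPT the one the changes actually earned.
    candidates = [r for r in range(lo, hi + 1) if r != expected_new_rank]
    return rng.choice(candidates)

# Page at 20th makes changes that would naturally earn 5th; during the
# transition period it might be shown at, say, 10th or 30th instead.
interim = transition_rank(old_rank=20, expected_new_rank=5)
```

The point of the sketch is simply that the interim rank is deliberately *not* the rank the changes would have earned, in either direction.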
Again, this really doesn't make sense in the larger context (index-wide) because we'd see a TON more flux than we do. Yes, I know there is a fair amount out there, but imagine if this happened ANY time a page changed or got new links. The SERPs would be even wackier than they are. Bear in mind, this approach was toyed with in the 2003 historical ranking factors patents, and even this patent was 2010 (so in play in '09 at least).
So, to my thinking, this might be something that is triggered once a site has already been identified as potentially spammy in some way. To get a sense of the myriad ways they look at webspam, search out this other recent post over on SNC:
See that? There's a TON of 'em.
So, consider that when a site trips one or more of those, Google might THEN send over the transition rank algorithm to be part of a further investigation of the domain.
In that implementation, it becomes a matter of 'trust' - something we know Google is big on. Consider Matt Cutts's most recent post on paid links, where he says,
"That’s a clear violation of Google’s quality guidelines, and it’s the reason that [website]’s PageRank as well as our trust in the website has declined."
Not an uncommon theme. And let's go the other direction: Wikipedia. Certainly a website that makes changes all the time - to its pages, incoming links and so on. What's the difference? A level of domain trust.
It certainly makes a strong reason to leave a Penguined/Pandaed site alone and go off and start again on a new site.
This was the contention floated by Tom over at ZDNet. I can guarantee this is not the case. I talked to a Googler yesterday, imploring them to chime in on the post; they said they'd tried to rationalize with him in the past, to no avail, so they didn't get involved.
Once more, find the post I just mentioned on webspam/Penguin. THOSE are the kinds of things that would likely put one in the path of the transition rank algo. Cleaning things up isn't going to qualify a site as spam.
Let's look at ONE of them as an example:
"Keywords in page TITLE: a classic boosting technique and research has found spam pages contain far more keywords than non-spammy (classified) pages"
So... consider I have this page TITLE:
Buy blue widgets online
Changing it to:
Buy cheap blue widgets at Widget Barn
Isn't likely to trip a filter, but this might:
Buy cheap blue widgets online, best blue widgets and widget supplies from the Widget Barn
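To make that concrete, here's a crude illustration of the 'too many keywords in the TITLE' signal. The stopword list, the naive plural-stripping and the repeat threshold are all invented for the sketch - nobody outside Google knows the real classifier:

```python
from collections import Counter

def title_looks_stuffed(title, max_repeats=2):
    """Flag a TITLE whose most-repeated meaningful word appears more
    than max_repeats times. Purely illustrative: the stopwords,
    stemming and threshold are made up for this sketch."""
    stopwords = {"buy", "at", "the", "from", "and", "online", "best", "cheap"}
    tokens = [w.strip(",.").lower() for w in title.split()]
    # Naive plural-stripping so 'widget'/'widgets' count as one keyword.
    stems = [w.rstrip("s") for w in tokens if w not in stopwords]
    counts = Counter(stems)
    return bool(counts) and counts.most_common(1)[0][1] > max_repeats

ok_title = title_looks_stuffed(
    "Buy cheap blue widgets at Widget Barn")            # False
spammy_title = title_looks_stuffed(
    "Buy cheap blue widgets online, best blue widgets "
    "and widget supplies from the Widget Barn")          # True
```

The second title trips the toy check because 'widget(s)' shows up four times once you ignore the filler words - the same kind of repetition the research on spam pages describes.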
Then we consider maybe that you also throw some spammy links at it (beyond the thresholds of other listings in the query space) at the same time.
Now you could be in a position where the site loses some trust. In an attempt to further qualify the site as manipulative, they'd then apply the transition rank approach to the site/page.
Obviously I haven't a clue... but this does seem like the more logical approach. I do have 7 years of patent and paper study, and private conversations with Googlers, to base my theories on tho. The broad-based application of the patent doesn't make as much sense.
So, my current advice is not to get worked up about it. This is one of a few hundred patents awarded each year. We (the industry) need to keep some context by looking at the others as well, not just cherry-picking 1-2 per year.
Anyway, there's my 2c, folks. Always up for geeky discussions... keep it goin!