Forum Moderators: Robert Charlton & goodroi


Google's AdSense Farm Update Was a Re-ranking - NOT a Penalty

         

TheMadScientist

5:06 pm on Mar 1, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I know quite a few of us have said the changes in the SERPs related to the AdSense Farm Update are not a penalty, and I've tried to explain it, but there are too many threads to cover them all and not everyone reads every post. So I'm going to post this in its own thread, so people can link to it rather than re-explaining the difference each time.

PENALTY CHANGES - how they work

BEFORE PENALTIES
#1 = will get penalty
#2
#3 = will get penalty
#4
#5

Every result without a penalty just moves up, filling in the gaps that were opened.
They all stay in the same order relative to each other.

AFTER PENALTIES
new #1 = was #2
new #2 = was #4
new #3 = was #5
new #4 = was #6
new #5 = was #7


RE-RANKING CHANGES - how they work

BEFORE RE-RANKING
#1
#2
#3
#4
#5

All the results get shuffled: some move up by different amounts and some move down:

AFTER RE-RANKING
new #1 = was #3 [up 2]
new #2 = was #21 [up 19]
new #3 = was #2 [down 1]
new #4 = was #1 [down 3]
new #5 = was #11 [up 6]

So there's a big difference between a penalty change and a re-ranking.
If you only look at drops, you don't see the bigger picture of what happened.
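
If it helps to see it in code, here's a tiny Python sketch of the same contrast. Every result name, score, and weight below is invented purely for illustration - it's not how Google actually computes anything:

```python
# Toy illustration of penalty vs. re-ranking (all data is made up).

results = ["A", "B", "C", "D", "E", "F", "G"]  # original order: A = #1, B = #2, ...

# PENALTY: flagged results are dropped; everything else keeps its relative
# order and simply slides up to fill the gaps.
penalized = {"A", "C"}
after_penalty = [r for r in results if r not in penalized]
print(after_penalty)  # ['B', 'D', 'E', 'F', 'G']

# RE-RANKING: every result gets a new combined score, so the whole order
# reshuffles - pages can move up or down, and nothing is simply removed.
relevance = {"A": 0.9, "B": 0.8, "C": 0.7, "D": 0.6, "E": 0.5, "F": 0.4, "G": 0.3}
quality   = {"A": 0.2, "B": 0.5, "C": 0.9, "D": 0.4, "E": 0.7, "F": 0.9, "G": 0.6}
combined = {r: 0.6 * relevance[r] + 0.4 * quality[r] for r in results}
after_rerank = sorted(results, key=lambda r: combined[r], reverse=True)
print(after_rerank)  # ['C', 'B', 'A', 'F', 'E', 'D', 'G'] - a reshuffle, not a removal
```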

[edited by: tedster at 7:47 pm (utc) on Mar 1, 2011]

universetoday

3:52 pm on Mar 2, 2011 (gmt 0)

10+ Year Member



@scooterdude - I don't think that's the case. Even though my site is one of the "nuked" sites, and my top keywords have supposedly moved down, my overall traffic has remained roughly the same. That means my long-tail terms have been relatively unchanged. If the site had been hit across the board, I would expect all of my traffic to have been crushed. So far, I'm still not noticing any impact on my traffic, despite the fact that I'm on the Sistrix list.

scooterdude

4:04 pm on Mar 2, 2011 (gmt 0)

10+ Year Member



A digital footprint is not necessarily common across every page of a site.

Furthermore, part of a footprint might well include the keywords targeted by each page using known/expected SEO techniques.

A focus on page-based factors would add a degree of what they have referred to as granularity to their filter/penalty system.

scooterdude

4:16 pm on Mar 2, 2011 (gmt 0)

10+ Year Member



It's reasonably well known that Google can and does personalise results, and the SERPs can change as quickly as Google chooses, so the Sistrix list...

might need reruns and re-verification...

TheMadScientist

4:29 pm on Mar 2, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The first time I heard of re-ranking was in the context of a LocalRank paper from Google - in 2002. The idea presented there was this:

1. Take an initial set of the top 1,000 URLs, scored by the relevance algorithm.

2. Now analyze those 1,000 results by how they inter-link within the set - disregarding any links from outside the set.

You've now got a new score, which you can use to re-rank the original set of 1,000 pages. No new pages will come into the mix, and no results can be thrown out either.
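
For anyone who thinks better in code, here's a rough Python sketch of those two steps. The relevance_scores dictionary and the links_to() helper are hypothetical stand-ins I made up for illustration - they aren't from the paper or any real API:

```python
# Rough sketch of the LocalRank-style re-ranking described above.
# relevance_scores: {url: relevance score}; links_to(a, b): does page a link to page b?

def localrank_rerank(relevance_scores, links_to, n=1000):
    # Step 1: the initial set - the top n URLs by the relevance score alone.
    initial = sorted(relevance_scores, key=relevance_scores.get, reverse=True)[:n]
    candidate_set = set(initial)

    # Step 2: score each URL only by the links it receives from other URLs
    # *inside* the set; links from pages outside the set are disregarded.
    local_score = {
        url: sum(1 for other in candidate_set if other != url and links_to(other, url))
        for url in candidate_set
    }

    # Re-order the same n URLs by the new score: no new pages come into the
    # mix, and no results get thrown out.
    return sorted(initial, key=lambda u: local_score[u], reverse=True)
```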

Personally, I think the 'next logical step' in this type of ranking is to assess penalties based on the factors built into the new scoring - removing pages from the main index, but not adding pages to it. That's why I think it's entirely possible Jane_Doe has a penalty due to the addition of the new system, but a penalty isn't all the system is.

If you think about it, relevance is already determined, so you really can't add a 'quality' page to the results based on 'quality' alone, because it would not be relevant, but you could remove a page for 'lack of quality' even though it was relevant.

So, you install the new system, re-rank and assess penalties, but don't 'include replacements' for the pages penalized, because you already have plenty of results and including a non-relevant page based on 'quality' doesn't make much sense.

Anyway, it's just a hunch, but I could see it being done that way...

TheMadScientist

4:40 pm on Mar 2, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



One reason I think people should not change too much until things 'settle' is that I would guess there's a 'relevance' vs 'quality' struggle going on, and 'blending' the two together across queries / query-types will take some time.

E.g. for a 'shopping' query, where there are probably a larger number of relevant pages, 'quality' could be more important or weighted higher than it would be for, say, an 'informational' query, where there are likely fewer relevant answers.
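
Just to make that hunch concrete, here's a small Python sketch. The query types, weights, and scores are all guesses on my part to illustrate the idea - nothing here is confirmed by Google:

```python
# Hypothetical blending of 'relevance' and 'quality', weighted by query type.

QUALITY_WEIGHT = {"shopping": 0.5, "informational": 0.2}  # assumed weights

def blended_score(relevance, quality, query_type):
    w = QUALITY_WEIGHT.get(query_type, 0.3)  # assumed default for other query types
    return (1 - w) * relevance + w * quality

# The same middling-quality page loses more ground on a 'shopping' query,
# where many relevant pages compete, than on an 'informational' one:
print(blended_score(relevance=0.9, quality=0.3, query_type="shopping"))       # ~0.60
print(blended_score(relevance=0.9, quality=0.3, query_type="informational"))  # ~0.78
```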

ken_b

4:51 pm on Mar 2, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



One reason I think people should not change too much until things 'settle' is...

And another reason might be that Google is apparently reworking this thing a little bit.

Google Is Working on an Algo Fix - to help wrongly demoted sites [webmasterworld.com]

Lapizuli

8:27 pm on Mar 2, 2011 (gmt 0)

10+ Year Member Top Contributors Of The Month



The Farm Update did not find sites that were violating a guideline and give them a penalty. Instead, it tried to measure the quality of page content. It now generates rankings based on folding that additional ingredient into the full recipe.


@tedster, I agree that it's not a penalty. I see the difference, and that's the way I've been thinking of it, too.

But I think Google's a little confused about it, because they painted this update with a much broader brush than they should have. And I'm not seeing pages differentiated by quality at all - just by randomness. The good is not rising, the bad is not falling. I find it interesting that Amit Singhal said:

[wired.com...]

“Therefore any time a good site gets a lower ranking or falsely gets caught by our algorithm — and that does happen once in a while even though all of our testing shows this change was very accurate...”


This doesn't sound to me like Google's targeting pages, but sites. And that's weird to me, because by their nature, "content farms" are broad-based, covering every topic imaginable with a range of talents and expertise involved, and by virtue of this fact - regardless of their quality - they will INEVITABLY get the most criticism because, put simply, everybody lands there. Some find good, others find bad, and collectively, the negative voice is louder. To me, targeting the higher quality "content farms" is like targeting the Web, period.

To use my own experience, since these are the sites I'm familiar with, I'm finding it odd that a site like Suite101, which has a surprisingly large proportion of real journalists and pro writers and smart people writing for it, got hit so hard - specifically, that those writers' articles got hit so hard. High quality articles, as far as I can judge, seem as badly hit as low quality.

I'm also finding it odd that, judging by others' reports and my own articles (some of which are fairly decent), a lot of really good - i.e., not just "original" and "grammatically correct," but well-written, helpful, and exhaustive - articles on HubPages have been downgraded, when pages of that caliber, if published on independent sites, would be considered helpful.

So I guess I'm saying that if there are changes to be made to optimization, they can't happen at the page level. There's something about these sites that's being targeted, and that's effectively a penalty - perhaps a new kind.

Unless I'm thinking of this the wrong way. I hope I am.

TheMadScientist

8:47 pm on Mar 2, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



This doesn't sound to me like Google's targeting pages, but sites.

It could be that the overall quality score factors in the whole 'page footprint', which imo would include the layout (HTML) of the page, and that could explain why on some sites it looks to be site-wide and on others page-by-page.

If 'overall quality' includes 'visual quality' as well as 'textual quality' it would explain why some textual content that's seemingly 'quality' would be hit.

IOW: If they combine a 'visual quality' score with a 'textual quality' score to determine an 'overall quality' score, there could be cases where a page's 'overall quality' did not outweigh that of other relevant pages by enough to retain rankings, except where the 'overall quality' threshold was lower ... e.g. obscure results ... This would mean the loss of rankings could be attributed to a lower 'visual quality' score rather than a low 'textual quality' score.

If that's the case, it could easily affect an entire site with an overall lower 'visual quality' score, but only individual pages on a site with a higher 'visual quality' score, which would explain quite a bit, imo.

One reason I keep thinking of the 'document footprint' as a whole, including the 'visual' part, is that they have had the ability to render a 'view' of the pages they spider for quite some time (the algo can tell where elements are positioned on the page), and I suspect they may have included scoring based on that ability in the new 'quality' piece of the algo.
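
To put that hunch in concrete terms, here's a minimal Python sketch. The weights, thresholds, and the very idea of a per-query threshold are all assumptions of mine, not anything confirmed:

```python
# Sketch of the 'overall quality' hunch: combine a 'textual' and a 'visual'
# quality score, then compare the result to a threshold that varies with how
# competitive the query is. Every name, weight, and number here is made up.

def overall_quality(textual, visual, textual_weight=0.6):
    return textual_weight * textual + (1 - textual_weight) * visual

def retains_ranking(textual, visual, threshold):
    return overall_quality(textual, visual) >= threshold

# Strong text on a poor 'visual quality' template could fall short on a
# competitive query yet survive where the threshold is lower (obscure results):
print(retains_ranking(textual=0.9, visual=0.2, threshold=0.7))  # False (0.62 < 0.7)
print(retains_ranking(textual=0.9, visual=0.2, threshold=0.5))  # True  (0.62 >= 0.5)
```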

dibbern2

8:00 pm on Mar 6, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think I'm seeing a sitewide effect, although it is small.

According to Webmaster Tools, I have lost 0.6 places in ranking for a family of terms (think: red widget eligibility, blue widget eligibility, etc. - the pattern being [blank] widget eligibility).

The page content for these terms is variable. The red widget eligibility page has been remodeled and is excellent content-wise and links-wise. The blue widget eligibility page is old, thin, out of date, and scheduled for remodeling. Yet both, along with the yellow, orange, green, and many other color widget pages, seem to have suffered the same loss of rank.

On the other side, on the same site, pages for widget repairs have gained ground across the same mix of high-, medium-, and low-quality pages for many color widgets.

Oddly, the simple terms like *blue widgets*, without a tail attached, increased in rank the most.

hyperkik

11:46 pm on Mar 6, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



In terms of this being "re-ranking" versus a "penalty", isn't that a matter of semantics?

To put it another way, your comments and the notion that this may be a re-ranking associated with duplicate content issues led me to do some investigation. Is it more accurate to say, "Many pages on my site were re-ranked because a large number of sites, including some high-profile sites, have plagiarized their content," or "Many pages on my site were penalized because a large number of sites, including some high-profile sites, have plagiarized their content"? If their plagiarism has in fact triggered this result, I don't see that it's any less accurate to say "the original content was penalized" than "the original content was re-ranked".

trakkerguy

12:12 am on Mar 7, 2011 (gmt 0)

10+ Year Member



"re-ranking" versus a "penalty", isn't that a matter of semantics


If it's a re-ranking due to algo change, changes to the site/page can bring the rankings back soon after crawl.

A penalty is likely to require much more to recover from - perhaps a manual intervention.

hyperkik

12:28 am on Mar 7, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If it's a re-ranking due to algo change, changes to the site/page can bring the rankings back soon after crawl.

Possibly, but that doesn't mean that a site isn't being penalized (fairly or unfairly) by the algorithm change.

And if the algorithm is responding to the fact that content on your site has been scraped and plagiarized, such that its content appears in various forms on many other sites, it's not as if there's an easy on-site fix.

The one site I've heard about that made a speedy recovery from this algorithm change, Cult of Mac, was reportedly the beneficiary of a manual intervention.

TheMadScientist

12:30 am on Mar 7, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Isn't a dl, ul, and ol just a matter of semantics too, then? No point in using one in one situation and another in a different situation to more accurately describe each, right? They're only semantic differences.

If you've really read the whole thread and can't see a difference, then no big deal - call it a penalty if you like - but a 'penalty' takes weight away; it penalizes ... Go figure ... A re-ranking can increase or decrease the weight given to a page (or site) ... I guess I see a difference even if you can't.

TheMadScientist

12:46 am on Mar 7, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It's funny how the people who keep thinking this new system is only a 'penalty' system keep referring to their page(s), because they lost ground. What I guess they miss out on by thinking it's a penalty is taking a look around and wondering why another page came from nowhere to rank ... If you only look at it as a penalty rather than a re-ranking, you miss half of the picture, because you never bother to look at the new results, specifically the pages that moved up, and ask yourself why ... What made that 'page from nowhere' suddenly rank?

In a penalty you only have to find the reason for the drop, but when they change the algo and some pages get an increased weight, it's probably prudent to look both ways ... I'm fairly certain it's not merely a semantic difference, and I would say the most accurate way to describe the situation hyperkik is talking about regarding a 'penalty' is: "These pages either lost a great deal of 'weight' or may have even been penalized during the re-ranking process."

If you only look at the down side it could be much tougher to move back up...

NixRenewbie

5:24 am on Mar 7, 2011 (gmt 0)

10+ Year Member



per econman:
"Over the long haul, however, by imposing higher editorial standards and encouraging their contributors to write fewer, longer articles that don't simply regurgitate/repurpose/copy content published elsewhere, then perhaps they can improve their situation over time."

I was noticing that a lot of the beaten-down sites seem to specialize in 400-600 word articles and reward frequent submissions. One such is a content mill I formerly wrote for; according to one source, it lost 79% of its search engine traffic. That's a lot. The tinker-toy sites use a handful of words and a couple of quick pics (from the device manufacturer ... so they are duplicates) and call it good. Google seems to have disagreed.

eHow should be the subject of a Pixar film for how they managed to escape the axe. Kudos to the wizards behind the scenes there ... it sure isn't the writing that kept them afloat.

Of late, I haven't cared for the results pages of G ... too much garbage above the fold ... so I've switched to Dogpile the past couple of days. I'm liking it. Fewer results, but better.

NixRenewbie

5:47 am on Mar 7, 2011 (gmt 0)

10+ Year Member



Every once in a while it occurs to me that Google might just be yanking our chains.

Naw ... they wouldn't do that. Would they?

hyperkik

6:00 am on Mar 7, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



What I guess they miss out on by thinking it's a penalty is taking a look around and wondering why another page came from nowhere to rank...

Or why it didn't get penalized? I see a page, now ranking well, that has content that's too low in quality for anybody to plagiarize. I guess that puts it in the territory of lower quality eHow articles - it may be junk, but it's unique junk.

TheMadScientist

6:01 am on Mar 7, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



it may be junk, but it's unique junk.

LMAO! Thanks .... That's so true!

honestman

9:24 pm on Mar 7, 2011 (gmt 0)

10+ Year Member Top Contributors Of The Month



Links, links, links. It seems to me that G wants to decide what is a good link, so any site which links out extensively in the body of its content, or on carefully edited pages of links, is punished now (this was not the case prior to Panda). No intermediaries are allowed. Only G is allowed to determine what is a good outbound link (and I am not talking about link exchanges, which we do not do). The rest of us are now considered inferior as editors of what we determine are good and useful links. So we will now remove all such links from our sites (links designed to help the audience by providing extra context) and let Google exclusively determine which are the "good links". Sad that this new update treats all our editorial choices as inferior, but it seems to be the case.

tedster

10:35 pm on Mar 7, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



any site which links out extensively in the body of its content, or on carefully edited pages of links, is punished now

That's jumping to a conclusion - there are many counterexamples to the idea.

honestman

10:51 pm on Mar 7, 2011 (gmt 0)

10+ Year Member Top Contributors Of The Month



@tedster With all due respect, I have seen more sites punished for outbound links in this update than in any other update ever -- and I mean sites with outbound links that also have high-ranking inbound links as well. I think the idea is to "close the Universe" such that G is the sole judge of what constitutes good links. I have analyzed hundreds of sites in the last few days and see this pattern -- and I do not doubt there are counter-examples, as you say, just as there are many counter-examples of the still not clearly defined "content farms" which have not been demoted in this update.

tedster

11:03 pm on Mar 7, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I realize my comment was a bit blunt, honestman - it was not meant personally. I wanted to stop a new myth from getting started. There may be a correlation of some kind with outbound links - and correlations can be very helpful in analyzing anything.

But to say "any site..." would mean you found a cause-and-effect relationship, and even more, "any site..." means there are no counterexamples. So I'm urging care with our words here. That's how new SEO myths get started.

econman

11:04 pm on Mar 7, 2011 (gmt 0)

10+ Year Member



Links, links, links.


I haven't seen any evidence this update is about links -- except for speculation/theorizing that links from sites like ezines may be worth less today than they were before the update.

Aside from that sort of indirect impact, how would a link-related change explain the other patterns we've seen?

Consider, for example, the study by potpiegirl which noted a key difference between eHow and its competitors: eHow has a much smaller absolute number, and lower proportion, of pages about male enhancement pills and other topics appealing to spammers, compared to the competing sites which suffered in this update.

That data could be an indication that huge numbers of pages on the impacted sites are now being disregarded for internal link analysis purposes, adversely affecting the entire site. If so, it would be logical for the same thing to be happening to links going from those low quality pages to other sites, as well.

Or, it could have little or nothing to do with links, and just be an indication that the publishers of the adversely affected article sites weren't careful enough in vetting content added to their sites -- allowing too many pages with various other low-quality indicators (consistent with the fact that the publishers allowed tens of thousands of redundant articles to be published on their sites about topics that are popular with spammers).

honestman

12:08 am on Mar 8, 2011 (gmt 0)

10+ Year Member Top Contributors Of The Month



@econman It was interesting that several of my webmaster and analyst friends say that when they are looking for information, they now get specific companies with a minimal number of external links (mostly internal links), while they can no longer find the sites they had previously relied on, which were full of links to related information they found useful. These were extremely high-quality sites which were "confident enough" to link out to external sites/evidence. In other words, the value of the pages they found useful did not lie in being self-referential in terms of their own site.

Most of those I know went to other search engines to get the information they needed, as they did not need the information offered by specific service companies. It forced them to sort through multiple pages to get to what they wanted -- overviews by expert sites. This frustrated them and made them feel that the very concept of the Web as a research tool no longer applies with this update, and that results are instead based on undetermined factors. I am convinced that this update is going to frustrate many users -- forget the webmasters for a moment. Just an observation based upon empirical evidence and discussion with "power users."

I hope I am wrong, but every search confirms my views.

Reno

5:29 am on Mar 8, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



and correlations can be very helpful in analyzing anything.

We have little to go on, so correlations jump out and must be examined for their significance, because at least it is something.

But I fear that too many of us are looking for a smoking gun, and I don't think there is one to find. The algorithm is so deeply complex at this point that the impact of any seemingly important site characteristic is going to differ (perhaps wildly) from one site to the next. It will differ because of all the other stuff.

It's like taking medications. If you just take one, you can read about the side effects and watch for anything unusual; if you take three meds, you won't know for sure how the other two are interacting with the first; if you take 20 meds, you have a witches' brew. And so the Google algorithm, with its hundreds of ingredients, will almost certainly have unpredicted side effects on every site it swallows. One site has good text content & little external linkage and does great, whereas the next has minimal text but a ton of external links, and does equally well.

We learn little from that, and that's only 2 aspects ~ now when you mix in the hundreds of others you'd better pray for divine guidance because no mortal will get it.

We are beyond the point of single cause & effect, or double, or triple. We're in uncharted territory ~ some patients will be energized, others will die, and some that die will be otherwise perfectly healthy. This update is proof of that.

......................

dibbern2

5:56 am on Mar 8, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It's NOT about outbound links. My experience this week says that's the wrong alley. Totally wrong.

honestman

7:02 am on Mar 8, 2011 (gmt 0)

10+ Year Member Top Contributors Of The Month



@dibbern2

With all due respect:

I have sites which demonstrate to me that outbound links are a HUGE factor (among many others in this update). G seems to devalue pages with many outbound links even if they have quality inbound links (why?), and does not want any intermediaries in the form of editorial choices and aid to the end user. When I remove quality-related outbound links from pages, the pages immediately shoot up in ranking. Tested over and over and over, and I have yet to see a case where there was not an improvement.

Reno

7:43 am on Mar 8, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



When I remove quality-related outbound links from pages, the pages immediately shoot up in ranking. Tested over and over and over, and I have yet to see a case where there was not an improvement.

If the original purpose of the links was to provide additional quality content/info for your users, then I'd be curious what would happen if you simply added rel="nofollow" to each of them. With the quality links still on the page but purposely devalued, would the ranking go up?

.......................

honestman

7:57 am on Mar 8, 2011 (gmt 0)

10+ Year Member Top Contributors Of The Month



@Reno

Thank you for the suggestion. I will try that on test article pages, though pages with many relevant nofollows (notification of a text advert as required by Google) are the ones that have taken the biggest hit, even when they had previously been among the most popular pages in terms of user time spent, etc.

Reno

8:30 am on Mar 8, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



though pages with many relevant nofollows (notification of a text advert as required by Google) are the ones that have taken the biggest hit

This is not good news, as, like you, I followed their rules and hate to think that doing so is now a punishable offense. Let's hope it is something else about those pages that took the hit.

........................