
Home / Forums Index / Google / Google SEO News and Discussion
Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 245 message thread spans 9 pages.
Google Updates and SERP Changes - February 2011 part 2

 6:39 pm on Feb 26, 2011 (gmt 0)

< continued from [webmasterworld.com...] >

But, I believe that Google now prefers sites with a much higher percentage of "valuable" pages. If you don't meet the percentage value determination, whatever that is, you get whacked.

I tend to agree with Fred. This is what I am finding as well.

[edited by: tedster at 8:00 pm (utc) on Feb 26, 2011]
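To make the "percentage of valuable pages" idea above concrete, here's a minimal Python sketch. The 0.5 cutoffs and the notion of a per-page "value" score are pure guesses for illustration - nothing Google has confirmed.

```python
def site_gets_whacked(page_scores, threshold=0.5):
    """True if the fraction of 'valuable' pages falls below the threshold.

    page_scores: one assumed quality score per page, in [0, 1].
    """
    valuable = sum(1 for s in page_scores if s >= 0.5)  # 0.5 = assumed cutoff
    return valuable / len(page_scores) < threshold

# A site where only 3 of 10 pages clear the bar would get whacked:
scores = [0.9, 0.8, 0.7, 0.2, 0.1, 0.3, 0.2, 0.1, 0.4, 0.3]
```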



 6:45 pm on Feb 26, 2011 (gmt 0)

Let's make sure of this: you search for site:example.com and the first page listed is not www.example.com but www.example.com/other-page.html? Do you rank for 'example' (your domain keyword)? Do you have a fresh cache?
A long time ago there was a link-related penalty where the homepage would be demoted, and by extension everything else too.

The homepage was cached yesterday. It's an old site with plenty of quality links and no paid links. I know about the 350 penalty; this looks like that.

Searching on Google:
site:www.example.com shows results with my internal pages (categories), but the homepage is in 20th place
example.com shows my homepage as the first result, with sitelinks
my brand name shows my homepage as the first result, with sitelinks

I do not see the benefit of this update in the results; it has just filtered out old established sites.


 6:49 pm on Feb 26, 2011 (gmt 0)

Here is what I am seeing:

Large sites (5,000+ pages) that are older and haven't been updated in years are moving up.

Large sites that are newer or have been constantly updated are moving down.

I'm seeing domains in the top 10 I haven't seen in years and they are still showing very outdated information.


 6:51 pm on Feb 26, 2011 (gmt 0)

IMO there's still a significant amount of shuffling to go. Titles and descriptions, even for a site: search, have been very slow to update lately (especially since the rollout, but even the week before) on the sites I watch. To me that says there's still some significant processing going on behind the scenes, and IMO it'll take a few more days for the indexing to catch up.


 6:56 pm on Feb 26, 2011 (gmt 0)

Have heard a theory which sounds plausible to me, but wanted to run it past you lot.

It's this: in order to target content farms and work out their "true" authority, G dialled down the emphasis on internal linking. Therefore sites with good external links to every single page are doing great, but sites that depended on sheer size and internal linking to rank, have tanked.

What makes me think this is plausible is the difference in the hit to Hubpages and Squidoo, which several people have posted. To me, these seem like identical types of site, so I would have thought they would have both dropped by the same proportion. But according to some figures hubpages has dropped by an average 30 positions and Squidoo by just 15 positions, and Quantcast is also showing a 40%+ drop in Hubpages traffic, but circa 15% drop in Squidoo's.

The only real difference between the two sites is the way they are structured. Hubpages was designed by IT folk with brilliant internal linking, and Squidoo was designed by PR people with abysmal internal linking. So Squidoo may have been ranking mainly on external links - and they've dropped less because of this (i.e. they didn't have much of an internal linking advantage to start with).

This might also explain why Mahalo dropped so significantly - nobody links to them naturally, do they, and I doubt they were able to build a bunch of external links themselves either, especially as they scrape and have no users who are looking after their own pages as it were.

It might also explain why Ehow is still ranking - all those writers on ehow, before Demand Media closed it to all comers, used to build links to their own pages, so I imagine each page is pretty well supported externally.

What do people think? Is this way off mark, or is this plausible?
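If that theory is right, a toy scoring model shows why a site leaning on internal linking would fall much further than one that doesn't. The weights and link counts here are invented purely to illustrate the mechanism, not to model anything Google actually does.

```python
def page_score(internal_links, external_links, w_int, w_ext=1.0):
    """Toy link score: a weighted sum of internal and external links."""
    return internal_links * w_int + external_links * w_ext

# "Hubpages-like" page: heavy internal linking, few external links.
before = page_score(internal_links=200, external_links=5, w_int=0.10)   # 25.0
after = page_score(internal_links=200, external_links=5, w_int=0.01)    # 7.0

# "Squidoo-like" page: weak internal linking, same external links.
before2 = page_score(internal_links=20, external_links=5, w_int=0.10)   # 7.0
after2 = page_score(internal_links=20, external_links=5, w_int=0.01)    # 5.2
```

Dialling w_int from 0.10 down to 0.01 costs the internally-dependent page 72% of its score, but the externally-supported page only about 26% - the same asymmetry described above for Hubpages vs Squidoo.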


 7:14 pm on Feb 26, 2011 (gmt 0)

I think there is a solution available for this "source of originality" problem, and it's quite simple; I think Google could implement it.

Let's say you write an article... the first step would be to ping a Google server (something like Pingomatic) to say you have something new for "lunch". Google would then immediately crawl that (so far) secret content and send back a confirmation code, so you could then publish it.

If an API were available for this, then all major CMSs could implement it, including custom-made systems.

This would be the end of the content scrapers who currently rank better. Of course it won't solve rewrite issues, but it would be a great first step in the fight against them.
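A rough sketch of what the CMS side of that ping-before-publish flow might look like. The payload shape and the confirmation step are entirely hypothetical - no such Google API exists.

```python
import hashlib
import json

def registration_payload(title, body):
    """Fingerprint a CMS might send to the (hypothetical) ping server
    before the article goes public."""
    digest = hashlib.sha256(body.encode("utf-8")).hexdigest()
    return json.dumps({"title": title, "sha256": digest})

# In the imagined flow the CMS would POST this payload, wait for the
# crawl and confirmation code to come back, and only then publish.
payload = registration_payload("My article", "The full article text...")
```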


 7:15 pm on Feb 26, 2011 (gmt 0)

I think there's some very good thinking behind those ideas, AlyssaS. It does seem likely that something about a linking footprint is involved in this update, and internal linking could well be part of that. That's a savvy observation on the difference between Squidoo and Hubpages.

However, I doubt that it's only one thing. I also think there's a major component that uses some form of semantic analysis on the content itself, instead of the algo looking only at secondary signals. Matt Cutts mentioned that parts of this change were in development since last year - that's a big project! Coincidentally, Anna Lynne Patterson, the brain behind their phrase-based indexing patents, returned to Google last year.


 7:24 pm on Feb 26, 2011 (gmt 0)

Here's a fun thought....

If Bing Copies Google [webmasterworld.com] on these serps.... well it could get ugly for a lot of sites.

[ok, that's more of a scary thought than a fun thought, at least for some, me included.]


 7:24 pm on Feb 26, 2011 (gmt 0)

Just curious: some are referring to Quantcast data, is there some way to see up to date Quantcast data? I can only see through the end of January.


 7:30 pm on Feb 26, 2011 (gmt 0)

Now only a few incoming links are necessary to get tons of long tail traffic IF a site survives this algo change.


 7:33 pm on Feb 26, 2011 (gmt 0)

Here's another thought. It was little more than three weeks ago that we saw the Scraper Update [webmasterworld.com] that aimed to improve attribution for the original source of content.

I'm wondering if this new Farm update undid some of that progress against scrapers? I'm seeing some data that makes me think it has - at least for now - and the adjustment everflux period we're now in has to repair that damage.


 7:33 pm on Feb 26, 2011 (gmt 0)

dataguy - here's the quantcast link for hubpages:


You need to click on chart settings to pick US, and then set the range to one week, so you can see the daily change over the last week. They've updated it up to Feb 25th.

They've gone from an average 700k to under 400k hits a day.

I looked up Mahalo too, but sadly, they hide their stats. And some of the article directories only have estimates rather than actual stats (because they haven't got the quantcast tracker in their templates).


 7:38 pm on Feb 26, 2011 (gmt 0)

I'm wondering if this new Farm update undid some of that progress against scrapers? I'm seeing some data that makes me think it has - at least for now - and the adjustment everflux period we're now in has to repair that damage.

I think it's been undone too. If you scroll back on this thread to the 8th Feb, I was overjoyed that a competitor of mine had been removed who had copy/paste stuff. But he's back.

He was removed at the end of Jan. Then came back at the start of Feb. Then got removed again on 8th Feb (which was the link algo change, so not sure why that caught him again). And now he's back.


 7:41 pm on Feb 26, 2011 (gmt 0)

Long time lurker here...

I lost about 50% of google traffic when the update hit..

However, I'm now seeing damn near all my positions back to "almost normal" in IE. I'm not logged in, etc.

I'm also seeing Ezine, Hubpages, and ArticlesBase ranking almost where they were before, for terms such as acai berry, reverse phone lookup, etc.

Here are the IPs I'm seeing these results on in IE; Mozilla is still not showing these results, but in my previous experience Mozilla lags behind IE.


 7:49 pm on Feb 26, 2011 (gmt 0)

jdn - I can definitely confirm that those IPs are returning original (pre 24) results. But none of my google.com visits are pointing to those IPs.


 7:50 pm on Feb 26, 2011 (gmt 0)

@jdn123 you can use the Google AdWords Ad Preview Tool (search the words on Google), and select .com and United States.

Many sites are affected, and I don't think it's about a webmaster's recent activity, building links etc. Also, since pages dropped so seriously in the SERPs, it looks like a filter.

Now my thoughts turn to on-page elements: structure of the page, repetitive elements on each page, internal linking, keyword density, external links. But looking at competitors, we usually have similar SEO settings, and some of them are not affected. Even though I have added reviews, images and tutorials, and was ranking better than them before this algo change, they now rank better with no original content.


 7:51 pm on Feb 26, 2011 (gmt 0)


For my sites' rankings I wouldn't say it's back to pre-24th, but it's damn near, and I just started seeing it about 15 minutes ago.


 7:54 pm on Feb 26, 2011 (gmt 0)

Some speculation on how the algo could work?

1. The "content farm" algo is tied to the "duplicate scraper" algo. The duplicate scraper algo cleaned out scrapers but kept legit sites/original sources up. This algo identifies scrapers.

2. The "content farm" algo imposes a site wide penalty on you if you have enough copied/repurposed stuff on your site. THAT is what pushes you down below your scrapers, esp. if you are a big site. There is a trigger on how much of your content has been copied verbatim based on percentage of duped content.

3. They will then run the "duplicate scraper algo" to clear out the crap/garbage. There was collateral damage on the first phase ("content farm" clean out phase) that marks you with a sitewide demotion/penalty. However, your penalty will potentially be lifted once the duplicate scrapers are identified and removed.

Maybe the definition of the "quality" of your site is based on that sitewide parameter. Your authority is not the issue here; the issue is whether you triggered this new penalty that marks you as copying/regurgitating. G could only add that sitewide penalty after testing the duplicate scraper algo and making sure it worked.

There's basically an amount of uniqueness measured here, which identifies "quality".

Just speculation of course.
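Step 2 of that speculation could be sketched like this; the 30% trigger and the 90% "verbatim" cutoff are invented numbers, just to show the shape of the idea.

```python
def sitewide_penalty(pages, trigger=0.30):
    """pages: list of (url, fraction_of_page_found_verbatim_elsewhere).

    Fires a sitewide demotion once enough pages count as duplicated.
    """
    duped = sum(1 for _, frac in pages if frac >= 0.9)  # assumed cutoff
    return duped / len(pages) >= trigger
```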


 7:54 pm on Feb 26, 2011 (gmt 0)

@ rowtc2

I tried the Ad Preview Tool, and it matches what I'm seeing without using it. I just noticed 15 minutes ago, and I'm on the East Coast.


 8:03 pm on Feb 26, 2011 (gmt 0)

I'm wondering if this new Farm update undid some of that progress against scrapers?

For lack of more understandable terminology:
Makes sense; don't they usually roll the algo and then re-install the filters on top of it?

ADDED: I think that was actually stated as the way they do it by the GoogleGuy who used to post here, but I'm not 100% sure.

IMO the scraper update was a filter, but this one is definitely algo...
Yeah, they both affect the algo and rankings, but I can't think of another way to describe what I'm thinking, so let's not split hairs, please? Thanks! lol

[edited by: TheMadScientist at 8:22 pm (utc) on Feb 26, 2011]


 8:13 pm on Feb 26, 2011 (gmt 0)

Again, for those who are thinking of this as a quick fix that Google rolled out - let's revisit an earlier thread from this month:

Moultano [a Google search engineer on HackerNews]:

We've been working on this issue for a long time, and made some progress. These efforts started long before the recent spate of news articles. I've personally been working on it for over a year. The central issue is that it's very difficult to make changes that sacrifice "on-topic-ness" for "good-ness" that don't make the results in general worse. You can expect some big changes here very shortly though.

Big changes promised shortly at Google [webmasterworld.com]

There's the core trade-off Google wrestles with: relevance versus quality. And this particular search engineer has been working on the project that created this update for more than a year.


 8:31 pm on Feb 26, 2011 (gmt 0)

Okay, I had to go WAY back to find a reference, but if you go to page 9 (20 per page) of this thread [webmasterworld.com...] there are comments about the data rolling first, then filters and backlinks being added in.

I'm sure it happens faster now, and with Caffeine I don't know if the process has changed, but imo it stands to reason for changes this big they may have to do something similar.

I'm about positive I've read much more recent 'officially unofficial' comments on the process working this way too.

The algo gets updated.
Some filters and systems have to be 'turned off' during the process.
Those filters and systems get added back on top.


 8:36 pm on Feb 26, 2011 (gmt 0)

Lots of speculation and opinion in these threads, too much of it stated as "facts".

I've been nailed hard in this update, that's a first for me after 10 years of updates.



 8:49 pm on Feb 26, 2011 (gmt 0)

Me too, Ken_b. 10 years of running my site. All content created by my users (much like WebmasterWorld). We have battled scrapers to the hilt over the years, and after this update it seems some of them have beaten us.

Not sure what to do now. Guess the best course of action is to wait and see how it all rolls out. And report any problems we see.


 9:08 pm on Feb 26, 2011 (gmt 0)

The "content farm" algo imposes a site wide penalty on you if you have enough copied/repurposed stuff on your site.

I don't think that's the mechanism. Many big name sites that lost rankings are not using copied content. They're original content, but really poorly written and with no "meat". In other words, it's only content according to the technical definition of the word. But it's really just the thinnest of gruel.


 10:26 pm on Feb 26, 2011 (gmt 0)

In the previous thread, FredOPC's comment caught my eye:
How google makes this determination: who knows... could be links, could be bounce rates, could be a mixture of these.

In an earlier post, someone mentioned a rate of about 1.5 pages per visitor.

Would it be worthwhile to look at *Time on Site* and *Pageviews per Visitor* as potential indicators? Not as averages, but stratified, i.e.

percentile   tos    pages
        20   1.00     1.5
        40   2.00     3.5
        60   8.00    10
        80   9.00    12
       100  10.00    20

I would assume that 'content farms' have both a lower average and a more homogeneous distribution:

percentile   tos    pages
        20   1.00     1.5
        40   1.50     3.0
        60   2.00     4.0
        80   4.00     7.0
       100   6.00     8.0
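A stratified view like the tables above is easy to produce from raw per-visit data with a nearest-rank percentile; the sample values below are made up.

```python
def strata(values, cuts=(0.2, 0.4, 0.6, 0.8, 1.0)):
    """Nearest-rank percentile: the value at each cut of the sorted data."""
    ordered = sorted(values)
    n = len(ordered)
    return [ordered[min(int(round(c * n)) - 1, n - 1)] for c in cuts]

# Made-up time-on-site figures (seconds) for ten visits:
time_on_site = [30, 45, 60, 120, 300, 480, 540, 600, 55, 240]
```

strata(time_on_site) returns one value per quintile, so a flat, homogeneous distribution (the guessed-at content-farm signature) stands out immediately next to a spread-out one.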


 10:57 pm on Feb 26, 2011 (gmt 0)

I have been thinking about time on site and bounce rate, but all my sites that got hit have low bounce rates and high page views - an average of 3 page views or higher per visit.

I also have lots of direct traffic - people bookmarking and coming back to the sites. I provide useful, quality content or people would not be coming back.

I think this has something to do with backlinks. The one site I have not been adding backlinks to stayed strong and increased traffic; the rest took big hits (I've been adding backlinks to them all). The sites I added the most backlinks to took the biggest hits.

There is only one site that is the exception to this, and it has gone up a couple of notches due to the ezine article sites that got wiped off. Google had already put a filter on this site and held it on page 2 due to the amount of backlinks I was pushing at it.

I am having a hard time believing this focused on quality. I am sure there are a lot of variables, but backlinks played a big role in this update.

I am seeing sites that are big-time backlink buyers staying strong; I do not rent backlinks like these sites do. I am thinking Google sees the backlinks they purchase as better quality, so they did not get hit.

So quality of backlinks, amount of backlinks over a given time, and possibly using too much of the same anchor text seem to be part of the problem.

Also, newer sites seem not to be affected at all; all my sites that got hit were older established sites. And yes, people do copy my content, so that could be another factor.

Maybe it's time to start buying higher-powered backlinks? It seems to be what Google likes at the moment.

[edited by: kd454 at 11:03 pm (utc) on Feb 26, 2011]


 10:58 pm on Feb 26, 2011 (gmt 0)

backdraft7, you seem to have strong opinions as to what constitutes a "content farm", so let me present a site to you, and you tell me if it's a content farm. The majority of the site's pages are product pages that give photos and descriptions of widgets. Manufacturer photos, descriptions and specifications are used, as they would be in an ecommerce site, but they are re-written to explain how the features benefit the user.

The product pages are there to direct visitors to widget retailers who pay to be on the site. Users can find the retailers by searching by the brand of widget.

There's other content on the site, including a small ecommerce store, but the bulk is the product pages I mention. Is that a "content farm"?

Getting back to all of the theories as to what's going on, I think it's premature to be arguing about what may or may not be happening. There have been a lot of good theories posted with examples, but for every example I can cite ten sites right off the top of my head to which the theory doesn't apply.

I have rankings for key phrases for several different niches covering many months last year. It will be interesting to examine those to see if there's some commonality between the sites that lost rankings big time, as well as the sites that were largely unaffected.


 11:11 pm on Feb 26, 2011 (gmt 0)

Would it be worthwhile to think from the point of view of the big G? They are trying to hit content farms. What are content farms? Some characteristics:

- large site
- authoritative site
- high PR
- established, maybe old site (e.g. ezinearticles)
- huge link profile for domain (high variety)
- a lot of syndication (ezine and all the other sites encourage this)

When I think about it, my site has the same profile as these so-called content farms, EXCEPT that I DO NOT explicitly allow syndication. I have a copyright notice on my site, but scrapers scrape away anyway. It's possible that over a thousand of my pages have been copied 90% or more by scrapers.

And yes, that would be my bad, for having an RSS feed that allowed for full page view.

Could G think of me as a content farm? By these standards, yes. I got caught in their net. The question is, how do I distinguish myself from ezine, ehow, etc.? A lot of "expert" content may be respun precisely because it is top-quality content. But somehow you are mistaken for a content farm.

The recourse here is to look at other factors to see if you are a quality site vs. a content farm, as Caribguy suggests. OR complain to G about getting reinstated.

Should we wait for all of this to blow over? I wonder how long it will take them to figure out which is the content farm vs. the quality established authority site that is innocent!
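On the full-page RSS point: one defensive tweak is to publish only an excerpt in the feed's description instead of the whole article. A minimal sketch using the standard RSS 2.0 item fields; the truncation length is arbitrary.

```python
import html

def rss_item(title, link, body, max_chars=300):
    """Build an RSS <item> whose description is an excerpt, not the full post."""
    if len(body) > max_chars:
        # Cut at max_chars, then back up to the last whole word.
        excerpt = body[:max_chars].rsplit(" ", 1)[0] + "..."
    else:
        excerpt = body
    return (
        "<item>"
        f"<title>{html.escape(title)}</title>"
        f"<link>{html.escape(link)}</link>"
        f"<description>{html.escape(excerpt)}</description>"
        "</item>"
    )
```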


 11:18 pm on Feb 26, 2011 (gmt 0)

Here are some more numbers to consider. My site has 125,000 pages of UGC. Of these pages:

9,700 pages are showing a 39% average drop in Google referrals from a week ago.
1,800 pages are showing a 59% average increase in Google referrals from a week ago.

So my site has pages which have been hurt by this update, and pages which have benefited from this update. Unfortunately it averages out to a 34% drop in Google referrals overall.

Doesn't this mean that this update does not introduce a site-wide penalty?
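The arithmetic above is consistent with a traffic-weighted mix of winners and losers rather than a sitewide penalty. The referral counts below are invented; only the direction of the percentage changes comes from the post.

```python
def overall_change(groups):
    """groups: (referrals_before, fractional_change) per page group.

    Returns the net fractional change across all groups combined.
    """
    before = sum(b for b, _ in groups)
    after = sum(b * (1 + p) for b, p in groups)
    return (after - before) / before

# Hypothetical: the hit pages carried far more traffic than the gainers.
change = overall_change([(10_000, -0.39), (1_500, +0.59)])
```

With these made-up weights the mix nets out to roughly a 26% drop; skew more of the pre-update traffic onto the losing pages and the combined figure approaches the reported 34% with no sitewide penalty at all.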


All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved