
Forum Moderators: Robert Charlton & andy langton & goodroi


Google Updates and SERP Changes - Sep 2011

9:47 am on Aug 30, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:Mar 9, 2010
votes: 9

< continued from: [webmasterworld.com...] >

There were declines for Hubpages after the initial traffic improvements, but Aug 23 seems to have reversed the downward trend and they are almost back to pre-Panda levels. The subdomain strategy is proving fruitful for this large content farm, and it might help the other large content farms recover as well. But it probably won't help the mid-sized and small sites, which will remain hurt. Here is why I feel that way.

Hubpages is creating subdomains at the user level, not at the topic level. I am noticing that several of their current top-ranking pages are from users who post on a variety of topics, which beats the theory that Panda favors topic-focused sites. But what surprises me is the upward movement of some low-quality pages from poor-quality writers. I have examples of such subdomains which I cannot share here.

So what is really helping them? As I understand it, the move to subdomains is helping them beat Panda's sitewide penalty. This suggests that the sitewide drag enforced by the Google Panda algorithm is very strong.

But why would this subdomain strategy help only the large content farms and not the smaller ones?

The smaller and mid-sized sites can branch off into only a few subdomains, compared to what the bigger content farms can. Hubpages has now created a large number of subdomains, and I am guessing that the Panda algorithm isn't applied to several of them. The small size of these new subdomains is probably helping them keep their distance from the Panda evaluation.

If other big content farms intend to try the subdomain strategy, they should follow the Hubpages model as-is and create subdomains based on user, or on any other unit that will lower the number of pages per subdomain.

I guess this will be relatively difficult for mid-sized and smaller sites, since the number of authors will be lower, and hence the pages per author will be high enough to be trapped by the Panda net.
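The arithmetic behind this argument can be sketched in a few lines. Everything here is hypothetical: nobody outside Google knows whether Panda weighs subdomain size, and the threshold and page counts below are invented purely for illustration.

```python
# Hypothetical sketch of the pages-per-subdomain argument above.
# PANDA_SIZE_THRESHOLD is an invented number; it only illustrates
# the poster's guess that small subdomains escape evaluation.

PANDA_SIZE_THRESHOLD = 1000  # assumed cutoff, purely illustrative


def pages_per_subdomain(total_pages, num_subdomains):
    """Average pages per subdomain after splitting a site."""
    return total_pages / num_subdomains


# A large content farm splitting by author (many authors):
large_farm = pages_per_subdomain(total_pages=2_000_000, num_subdomains=100_000)
# A small site with only a handful of authors:
small_site = pages_per_subdomain(total_pages=50_000, num_subdomains=10)

print(large_farm < PANDA_SIZE_THRESHOLD)  # True  -> may slip under the net
print(small_site < PANDA_SIZE_THRESHOLD)  # False -> still large enough to be evaluated
```

Under these made-up numbers, the large farm ends up with 20 pages per subdomain while the small site still has 5,000 per subdomain, which is the asymmetry the post describes.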

[edited by: tedster at 9:55 pm (utc) on Sep 2, 2011]

8:59 pm on Sept 30, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Aug 31, 2001
posts: 1357
votes: 0


I'm starting to wonder if they just made a mistake. Perhaps there's some bug in the system that just didn't show up until they rolled it out.

They definitely made mistakes in the Florida update, and GoogleGuy and others seemed to think we were just bleating about nothing. Then eventually Matt (I mean GoogleGuy) twigged that something might not be as they expected and invited some of us to send him examples. Once they saw what was happening in those specific examples, they made changes which sorted the problem for some of us.

If anyone from Google is listening perhaps it is time for you to give folks here some way to show you examples of where Panda is making mistakes.
9:35 pm on Sept 30, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:May 22, 2005
votes: 20

The odd thing about all of this - in the "old days", that's 6 months ago, we knew all our competitors; now the sites that rank are sites I have never heard of.

No investment, keyword stuffed, javascript redirects, template pages with no info, obvious auto-generated titles and descriptions, mass links with similar anchor text, no unique content, table structure, terrible colors.

All our competitors before Panda had real staff, spent on investment, and at least made a contribution and an effort. I was even pleased, in some cases, for them to be a competitor.

It's not thousands, Tedster, it's probably tens of thousands...
edit spelling as usual
9:38 pm on Sept 30, 2011 (gmt 0)

Preferred Member

5+ Year Member

joined:Mar 20, 2011
votes: 0

Here's the thing that's really relevant right now...

I think we can all accept that there might be some mystery reason, one we can't understand, for the Panda algorithm not to like our site.

But what doesn't make any sense is to have a huge recovery from that algorithm, thus demonstrating that the algo now likes your site, only to have that suddenly reversed.

I can't even wrap my head around how that could be possible with any logically created algorithm.

Panda hates me
Panda hates me
Panda hates me
Panda likes me!
Panda likes me!
Panda hates me.

It's bizarre. Especially since it's happening over such a relatively short period of time. For Panda to hate a site for 6 months, like it for 3, and then suddenly hate it again defies sense. Especially when for most, the recovery was slow and then the take away of the recovery was instant, like another switch being flipped.

It feels like, as someone suggested, that Google is simply intent on shaping traffic in some areas of the internet and keeps tweaking the algo to make sure that the sites they want kept down are staying down. If those sites fix the things that were keeping them down, then they add some other things into the algo or something.
9:50 pm on Sept 30, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:May 22, 2005
votes: 20

I hope not - because you appear to be saying: if you improve your site, don't bother, as you will just be hit next month.

However, when you get an almost straight-line graph of visitors/page views/impressions (in WMT) every time there is a change, I am inclined to agree with you.

As I posted elsewhere, it's almost as if Google doesn't want certain sites to get beyond a certain traffic level.

For example, Thursday has always been a bad day for us, for over 10 years. Now it's the best day. But Sunday/Monday, normally the best, is now the worst!

All I can see is throttle, throttle, throttle. Stop these sites annoying our advertisers.
9:59 pm on Sept 30, 2011 (gmt 0)

Junior Member

10+ Year Member

joined:July 11, 2006
posts: 104
votes: 0

Especially when for most, the recovery was slow and then the take away of the recovery was instant, like another switch being flipped.

I think what is going on is that once Panda hates you, he hates you forever - if you have been Panda-stomped, you are forsaken to be stomped every time Panda comes around.

BUT this doesn't mean that your site can't steadily increase in rankings due to more people linking, better SEO, and the better content - improvements that have always been at play in Google's algorithms. You are just gaining in the SERPs due to the normal algos. But you are still screwed, because Panda doesn't care about 'the rest' of the 500+ algo changes going on at Google - Panda comes and stomps you back down.
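The pattern being described - steady gains between Panda runs, then a stomp at each data refresh - can be shown with a toy model. All the numbers here (weekly gain, penalty size, refresh interval) are invented for illustration; this is a sketch of the poster's mental model, not of anything Google has confirmed.

```python
# Toy model: normal ranking factors lift a flagged site between Panda
# runs, then each Panda data refresh knocks it back down. Every number
# below is made up purely to illustrate the sawtooth pattern.

def simulate(weeks, weekly_gain=2, panda_penalty=50, panda_every=6):
    """Track a relative 'visibility score' (100 = pre-Panda baseline)."""
    score, history = 100, []
    for week in range(1, weeks + 1):
        score += weekly_gain            # links, content, SEO improvements
        if week % panda_every == 0:     # hypothetical Panda data refresh
            score = max(score - panda_penalty, 0)
        history.append(score)
    return history

print(simulate(12))
# -> [102, 104, 106, 108, 110, 62, 64, 66, 68, 70, 72, 24]
```

The output shows the slow climb and the instant drop the thread keeps describing: the site never escapes, because each refresh wipes out more than it gained.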

Once Panda targets a site, there is nothing that can be done to overcome it - short of manual Google intervention, which Google claims doesn't exist.
10:08 pm on Sept 30, 2011 (gmt 0)

Senior Member

joined:Dec 29, 2003
votes: 0

There's a lot of truth in what you said. Funny that some don't even consider Panda a penalty, simply because Google has said it is not. Let's call it a hammer to the head, not a penalty.

Shortly after the April Panda I realized that content, unless it's cut and paste, does not matter at all. Now I'm 100% sure. Essentially all our sites are in Google's caring hand$ since they can screw your user metrics in a heartbeat with the 'enhancements.'

< continued here: [webmasterworld.com...] >

[edited by: tedster at 10:44 pm (utc) on Oct 2, 2011]

This 186 message thread spans 7 pages.
