
Google SEO News and Discussion Forum

This 186 message thread spans 7 pages; this is page 7.
Google Updates and SERP Changes - Sep 2011
indyank
msg:4356621 - 9:47 am on Aug 30, 2011 (gmt 0)

< continued from: [webmasterworld.com...] >

There were declines for HubPages after the initial traffic improvements, but Aug 23 seems to have reversed the downward trend and they are almost back to pre-Panda levels. The subdomain strategy is proving fruitful for this large content farm, and it might help the other large content farms recover as well. But it probably won't help mid-sized and small sites, which will remain hurt. Here is why I feel that way.

HubPages is creating subdomains at the user level, not at the topic level. I am noticing that several of their current top-ranking pages are from users who post on a variety of topics, which undercuts the theory that Panda favors topic-focused sites. What surprises me is the upward movement of some low-quality pages from poor-quality writers. I have examples of such subdomains which I cannot share here.

So what is really helping them? As I understand it, the move to subdomains is helping them escape Panda's sitewide penalty. This suggests that the sitewide drag enforced by the Google Panda algorithm is very strong.

But why will this subdomain strategy help only the large content farms and not the smaller ones?

The small and mid-sized sites can branch off into only a few subdomains, compared to what the bigger content farms can. HubPages has now created a large number of subdomains, and I am guessing that the Panda algo isn't applied to several of them. The small size of these new subdomains is probably helping them keep a distance from the Panda evaluation.

If other big content farms intend to try the subdomain strategy, they should follow the HubPages model exactly and create subdomains based on user, or any other unit that will lower the number of pages per subdomain.

I guess that this will be relatively difficult for mid-sized and smaller sites, as the number of authors will be lower, and hence pages per author will be high enough to be trapped by the Panda net.
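The arithmetic behind this argument can be sketched out. This is purely illustrative: it assumes a per-subdomain page-count threshold below which a sitewide evaluation does not engage, and the threshold value is invented; Google has never published anything of the sort.

```python
# Sketch of the pages-per-subdomain argument above. The threshold is a
# pure assumption for illustration -- not a real, published number.

PAGES_PER_SUBDOMAIN_THRESHOLD = 500  # hypothetical cutoff


def pages_per_subdomain(total_pages, num_authors):
    """Average pages per subdomain when a site splits by author."""
    return total_pages / num_authors


# A large content farm: many authors, so each subdomain stays small.
large_site = pages_per_subdomain(total_pages=1_000_000, num_authors=100_000)

# A small site: few authors, so each subdomain stays large.
small_site = pages_per_subdomain(total_pages=20_000, num_authors=10)

print(large_site <= PAGES_PER_SUBDOMAIN_THRESHOLD)  # True: slips under
print(small_site <= PAGES_PER_SUBDOMAIN_THRESHOLD)  # False: still "trapped"
```

Under these assumed numbers, the large farm ends up with 10 pages per subdomain while the small site still carries 2,000 per subdomain, which is the asymmetry being described.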

[edited by: tedster at 9:55 pm (utc) on Sep 2, 2011]

 

Hissingsid
msg:4369289 - 8:59 pm on Sep 30, 2011 (gmt 0)

Tedster,

I'm starting to wonder if they just made a mistake. Perhaps there's some bug in the system that just didn't show up until they rolled it out.

They definitely made mistakes in the Florida update and Googleguy and others seemed to think we were just bleating about nothing. Then eventually Matt (I mean Googleguy) twigged that something might not be as they expected and invited some of us to send him examples. Once they saw what was happening in these specific examples they made changes which sorted the problem for some of us.

If anyone from Google is listening perhaps it is time for you to give folks here some way to show you examples of where Panda is making mistakes.

johnhh
msg:4369301 - 9:35 pm on Sep 30, 2011 (gmt 0)

The odd thing about all of this: in the "old days," that's six months ago, we knew all our competitors; now the sites that rank are sites I have never heard of.

No investment, keyword stuffed, javascript redirects, template pages with no info, obvious auto-generated titles and descriptions, mass links with similar anchor text, no unique content, table structure, terrible colors.

All our competitors before Panda had real staff, invested in their sites, and at least made a contribution and an effort. I was even pleased, in some cases, for them to be a competitor.

It's not thousands, Tedster, it's probably tens of thousands...
edit: spelling as usual

Shatner
msg:4369303 - 9:38 pm on Sep 30, 2011 (gmt 0)

Here's the thing that's really relevant right now...

I think we can all accept that there might be some mystery reason, one we can't understand, for the Panda algorithm not to like our site.

But what doesn't make any sense is to have a huge recovery from that algorithm, thus demonstrating that the algo now likes your site, only to have that suddenly reversed.

I can't even wrap my head around how that could be possible with any logically created algorithm.

Panda hates me
Panda hates me
Panda hates me
Panda likes me!
Panda likes me!
Panda hates me.

It's bizarre, especially since it's happening over such a relatively short period of time. For Panda to hate a site for six months, like it for three, and then suddenly hate it again defies sense. Especially when, for most, the recovery was slow and then the take-away of the recovery was instant, like another switch being flipped.

It feels like, as someone suggested, that Google is simply intent on shaping traffic in some areas of the internet and keeps tweaking the algo to make sure that the sites they want kept down are staying down. If those sites fix the things that were keeping them down, then they add some other things into the algo or something.

johnhh
msg:4369311 - 9:50 pm on Sep 30, 2011 (gmt 0)

@Shatner
I hope not, because you appear to be saying that if you improve your site, don't bother, as you will just be hit next month.

However, when you get an almost straight-line graph of visitors/page views/impressions (in WMT) every time there is a change, I am inclined to agree with you.

As I posted elsewhere, it's almost as if Google doesn't want certain sites to get beyond a certain traffic level.

For example, Thursday has always been, for us, for 10+ years, a bad day. Now it's the best day. But Sunday/Monday, normally the best, is now the worst!

All I can see is throttle, throttle, throttle. Stop these sites annoying our advertisers.

ScubaAddict
msg:4369320 - 9:59 pm on Sep 30, 2011 (gmt 0)

Especially when for most, the recovery was slow and then the take away of the recovery was instant, like another switch being flipped.

I think what is going on is that once Panda hates you, he hates you forever: if you have been Panda-stomped, you are forsaken to be stomped every time Panda comes around.

BUT this doesn't mean that your site can't steadily increase in rankings due to more people linking, better SEO, and the content improvements that have always been at play in Google's algorithms. You are just gaining in the SERPs due to the normal algos. But you are still screwed, because Panda doesn't care about the rest of the 500+ algo changes going on at Google; Panda comes and stomps you back down.

Once Panda targets a site, there is nothing that can be done to overcome it, short of manual Google intervention, which Google claims doesn't exist.

walkman
msg:4369325 - 10:08 pm on Sep 30, 2011 (gmt 0)

@ScubaAddict
there's a lot of truth in what you said. Funny that some don't even consider Panda a penalty simply because Google has said it is not. Let's call it a hammer to the head, not a penalty.

Shortly after the April Panda I realized that content, unless it's cut and paste, does not matter at all. Now I'm 100% sure. Essentially all our sites are in Google's caring hand$, since they can screw your user metrics in a heartbeat with the 'enhancements.'

< continued here: [webmasterworld.com...] >

[edited by: tedster at 10:44 pm (utc) on Oct 2, 2011]

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved