
Google SEO News and Discussion Forum

    
Panda & Historic Thin Content
DodgeThis

Msg#: 4534286 posted 1:42 pm on Jan 8, 2013 (gmt 0)

Our site is split across several regional, geo-targeted subdomains. All are built on the same in-house CMS, in the same niche, with different regional content. As of 11 Dec, two of the subs suffered a >90% drop in Google impressions.

The one thing that historically made these two subs different is that they had a lot of thin content before Panda demoted them. Since then, all the thin pages have been removed and the offending subs brought strictly into line with the others. After several months the two subs staged a full recovery (that was Oct 2012, after which they continued to improve until 11 Dec, when the traffic dived to early 2012 Panda levels).

One thought is that we are again being penalized for the historic thin content. Google is still finding the 410s, so perhaps the volume of those yet to be discovered has triggered something with the latest Panda.

Does that sound feasible?
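
For reference, the removed pages have been answered with a hard 410 along these lines (a simplified sketch, not our actual CMS code; Flask stands in for our in-house framework and the slugs are invented):

    # Simplified sketch: removed thin pages answer 410 Gone.
    # Flask stands in for our in-house CMS; the slugs are invented.
    from flask import Flask, abort

    app = Flask(__name__)

    # Purge list of removed thin "search result" pages (hypothetical slugs).
    REMOVED_SLUGS = {"widgets-in-london", "widgets-in-leeds"}

    @app.route("/search/<slug>")
    def search_page(slug):
        if slug in REMOVED_SLUGS:
            abort(410)  # 410 Gone: the page was removed deliberately
        return "live search page for " + slug  # stub for the real renderer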

Beyond that we’re out of ideas and would certainly welcome some. I guess we could start disavowing backlinks, but we’re resisting for the time being.

[edited by: goodroi at 3:00 pm (utc) on Jan 8, 2013]

 

Sand

Msg#: 4534286 posted 3:56 pm on Jan 8, 2013 (gmt 0)

It's really hard to give actionable advice without seeing your site, but if you feel confident that your content is all in order, I would put some focus on the UI / technical elements of the site.

I wouldn't presume to know which user metrics Google monitors, but I do believe that some of them are part of Panda (personally, I doubt bounce rate is important at all, but some kind of long click vs. short click ratio could certainly be a factor). Are people sticking around?
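
Roughly the kind of ratio I mean, in pseudo-numbers (a toy sketch; the threshold and data are invented, since nobody outside Google knows the real signal):

    # Toy long-click vs. short-click ratio. All numbers invented.
    visits = [
        {"url": "/page-a", "dwell_seconds": 4},    # short click: quick return to the SERP
        {"url": "/page-a", "dwell_seconds": 180},  # long click: the visitor stuck around
        {"url": "/page-b", "dwell_seconds": 95},
    ]

    LONG_CLICK_THRESHOLD = 30  # seconds -- an assumption, not a known Google value

    long_clicks = sum(1 for v in visits if v["dwell_seconds"] >= LONG_CLICK_THRESHOLD)
    print("long-click ratio: %.2f" % (long_clicks / float(len(visits))))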

Second, don't confuse Panda with Penguin. If the dates match up and you've definitely been hit by Panda, disavowing links probably won't help you at all.

DodgeThis

Msg#: 4534286 posted 9:00 am on Jan 9, 2013 (gmt 0)

Thanks for your thoughts, Sand. Time on site and bounce rate were fine prior to this drop in traffic. As the majority of our subdomains do okay with the same UI, we are trying to see if the issue lies beyond that, e.g. a historic or off-site factor. That these two subs were previously affected by Panda appears to be the one main difference between them.

Our showing in the serps suggests a penalty has been applied. It happened on Dec 11, a date on which it seems more than a few websites took a dive, but as far as I know that date doesn’t line up with any official Panda or Penguin update, so we’re not presuming to know what ails us.

Despite recovering from Panda over the course of 2012, WMT still shows over 1m pages Not Selected (all thin, on an otherwise relatively small site) that were, along with many more, 410ed at the end of 2011. I guess my question is this: even though we recovered once, is it likely we remain susceptible to Panda until the majority of the 410s are found?

jimbeetle

Msg#: 4534286 posted 5:35 pm on Jan 9, 2013 (gmt 0)

I guess if Google hasn't recrawled the pages yet, then it doesn't know they no longer exist. And if there are that many pages that you 410ed, then depending on the site's crawl budget, it might take some time until Google gets around to all of them.
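
If you want to see how quickly that's happening, a rough burn-down from the access log would do it (a sketch; the log path and combined-log format are assumptions, adjust for your server):

    # Count Googlebot requests that were answered with a 410.
    # Log path and format are assumptions -- adjust for your server.
    import re

    LOG_PATH = "/var/log/apache2/access.log"  # hypothetical path
    line_re = re.compile(r'"\w+ (?P<url>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

    gone_hits = 0
    with open(LOG_PATH) as log:
        for line in log:
            if "Googlebot" not in line:
                continue
            m = line_re.search(line)
            if m and m.group("status") == "410":
                gone_hits += 1

    print("Googlebot requests answered with 410:", gone_hits)

Run that over each day's log and you can watch the number fall as Google works through the backlog.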

speedshopping

Msg#: 4534286 posted 7:27 pm on Jan 9, 2013 (gmt 0)

@DodgeThis, can you tell me if the 2 subdomains on your site are subject to a 950 penalty? Check your rankings and look at the back of the results to see if your page is there. The reason I ask is that we too got hit massively in the early hours of 11 Dec (UK time), and Google took out 80% of our subdomains 950-style. I am pretty sure Panda is not to blame, but I'm keen to see whether it was an algo dial being turned or whether it was exclusive to us, given what we believe might be the cause.

scooterdude

Msg#: 4534286 posted 10:03 pm on Jan 9, 2013 (gmt 0)

1 million pages? I am guessing this is not editorial content, and probably not an e store either.

DodgeThis

Msg#: 4534286 posted 9:49 am on Jan 10, 2013 (gmt 0)

… it might take some time until Google gets around to all of them.

That’s what we’re thinking. Just wondering if anyone else has experienced recovering from Panda, only to fall back into its clutches for a problem fixed the first time around.

@speedshopping

Our main KWs are down 500 spots. In many cases we are easily outranked by partial matches from our other subs.

I am guessing this is not editorial content

You would be right. We added a tool to the site that let the reader search various RSS feeds and display the results on our site. To that we added a myriad of refining / filtering options, inadvertently ensuring any single term could create a mountain of pages. We then stuck a bow on the whole shebang by adding links to the various search result URLs throughout our site (by way of popular and trending lists).
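
To give a sense of the scale (a toy illustration; the filter names are invented, and the real tool had more options than this):

    # Why one search term balloons into a mountain of URLs.
    # Filter names and values are invented for illustration.
    from itertools import product

    filters = {
        "sort":   ["date", "relevance", "popularity"],
        "region": ["north", "south", "east", "west"],
        "period": ["day", "week", "month", "year"],
        "feed":   ["news", "blogs", "forums"],
    }

    # Every combination of filter values is a distinct, crawlable URL.
    variants = list(product(*filters.values()))
    print("URL variants per search term:", len(variants))  # 3*4*4*3 = 144

    # At ~10,000 search terms (a hypothetical figure), that's ~1.4m thin URLs.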

This was three years ago, and it is safe to say we failed to fully consider how many URLs we were generating, or the ramifications of that. Fourteen months ago we addressed the situation. While our intention was never to get these pages into the serps, we also realise intent doesn’t factor highly. In the arena of thin content, we have been caught with our hand in the cookie jar.

… probably not an e store

Information site.

jimbeetle

Msg#: 4534286 posted 3:31 pm on Jan 10, 2013 (gmt 0)

You would be right. We added a tool to the site that let the reader search various RSS feeds and display the results on our site. To that we added a myriad of refining / filtering options, inadvertently ensuring any single term could create a mountain of pages. We then stuck a bow on the whole shebang by adding links to the various search result URLs throughout our site (by way of popular and trending lists).

Wow, well at least you know the cause of the problem. Unfortunately, it's probably going to take Google some time to untangle the mess. The best, though drastic, option might be a complete restart on a different domain.
