Google SEO News and Discussion Forum

Let's Post Our Panda Solutions - Things That Have Worked
flanok
msg:4338919 - 11:13 am on Jul 13, 2011 (gmt 0)

I, like probably many of you, have looked through this forum for answers about what we can do to recover from the Panda 2.2 update.

The truth is I am seeing an awful lot of “we have lost this and lost that” and very little “we did this and got better”.

I thought it might be an idea to restrict one thread to “Things That Have Worked” whilst we all experiment with content, links and everything else.

Let’s leave all the other chat to other posts, and if you have found something that gave you some sort of return from Panda 2.2, then let’s post it here.

I will start with some small gains.

History
My losses consisted of many pages losing 5 to 10 places for key terms, i.e. first-page to second-page rankings, across the board, leading to a 30% traffic reduction - but not the huge ranking losses that other members have reported.

I noticed I had a big issue with existing “SUPPORTING” pages no longer being cached within Google’s index (or appearing not to be), across around a third of my site. I got this message when clicking on the cached link for each page (my key pages, which held most rankings, were still cached):

Your search - cache:Mysitepage - did not match any documents.

At first I thought Google had an issue with my SUPPORTING content, so I moved this section (around 2,000 pages, all handwritten over a long period, though in honesty probably lacking real data) to a subdomain.

These SUPPORTING pages were still not cached after 3 weeks (only a very small quantity were cached).

What I Did
Three days ago, I went into Webmaster Tools and increased the crawl rate, and in those 3 days I have seen a dramatic increase in how many of these supporting pages now show as cached.

As all these pages had important internal links throughout the site to my KEY pages, I believe I am now regaining these internal links to my KEY pages (or, as they now sit on a subdomain, these may now be classed as external links to my KEY pages).

Sure enough, this morning I saw not a full return, but several KEY pages back at first-page status. There are still around 50% of these supporting pages waiting to be cached, so I keep my fingers crossed for further gains.

I had also added links from my home page deeper into key pages that had previously only been linked from internal pages; that accounts for some of the improvements, but not all.

This is not a full return, but it is a big enough indicator for me to think that, if the problem was the quality of the content, moving it to another subdomain has helped.

There is also still a lot of uncached information out there, and I do not think we will see the full picture until all pages within Google are recached under Panda.

Of course this is all just my opinion; even if you disagree with my comments and have your own solutions, please post them here.

synthese
msg:4423295 - 10:08 pm on Feb 29, 2012 (gmt 0)

@marketingguy - thanks so much.

@kenneth2 - ditto. I've applied the same changes as you and MarketingGuy but still no results. As time goes on I'm thinking that overlapping content is one of the main issues (e.g. 2 or more pages/posts/articles that are about a similar thing). This is very hard to address on a larger site that is many years old.

Whitey
msg:4423935 - 1:55 am on Mar 2, 2012 (gmt 0)

I'm questioning whether many of the folks who have applied common remedies, such as blocking pages and adding quality content, are still in Panda, even though they think they are.

With less text to index, of course their traffic would have fallen, giving the impression that they have not cracked Panda. What they added back may not have been enough to compensate for what was culled. Then maybe their link profile was dumbed down as well in the previous year, so pre-existing rankings on core terms may have been hit, and this has been further confused by the many algo changes in between.

Maybe what you did has worked, but not shown results in the manner you were expecting. Are your pages ranking for phrases within the newly added content? What measurements are you relying on?

Anyone with similar or alternate thoughts?

suggy
msg:4424006 - 7:30 am on Mar 2, 2012 (gmt 0)

Whitey

The thought had occurred to me.

I was always under the impression that being a large site helped with rankings. All those internal links, though each contributing only a tiny amount, did, I think, help boost the most-linked pages on your site. Slimming your site down to a tenth of what it was has to have hurt that?

I wonder if that's why I am now stuck at #5, where I was once top dog? I know many of my competitors have better link profiles than me.

This might also explain why small but more focused sites with tighter link profiles (in terms of target terms) do well post-Panda.

synthese
msg:4424829 - 11:01 pm on Mar 4, 2012 (gmt 0)

@Whitey.

There's some good surmising there -- except that the link profile for some older posts is strong. They used to rank and now they don't. To me it still seems like a sitewide penalty sits across the whole site, which is completely at odds with the actual backlinks to the site.

Then again maybe I'm living in dreamland, and you are correct.

garyr_h
msg:4424857 - 1:18 am on Mar 5, 2012 (gmt 0)

@synthese I'm in the same boat, but without knowing more about the algorithm it's hard to tell if that's even the problem for me.

When you have red widgets - say, red widgets for genericgroup1, red widgets for genericgroup2, red widgets for genericgroup3 - and there is some overlap between them because one or two red widgets fit with all the groups, is that a problem with G?

What if the words are similar, but not the same? What if you have classic red widgets but then also famous red widgets? Obviously some of the classic red widgets will also be famous.

These pages are quality content and used to receive quite a bit of traffic, but Panda seemed to kill traffic to many different pages. But are they the cause, or simply part of the effect? These pages still receive traffic from other search engines, just not G.

Edit: I should mention that this site has around 2,500 pages hand-created by me over the 7+ years of operation, plus tens of thousands of community-written pages (hard to describe here, but each one is unique, and many of those tens of thousands have comments).

Marketing Guy
msg:4425015 - 12:13 pm on Mar 5, 2012 (gmt 0)

I've noticed that some examples of Panda-hit sites are ecommerce sites with 1 page per product (which is fairly usual). Could be time to reassess that strategy, particularly if your products are very similar - i.e. have 1 product page with 30 different variations/options on it, rather than 30 individual pages.
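Something along these lines is what I mean (a rough sketch only - the product, URL and option values are invented):

<!-- One consolidated page, e.g. /widgets/red-widget/, instead of 30 near-duplicates -->
<h1>Red Widget</h1>
<form action="/cart/add" method="post">
  <label for="option">Choose a variation:</label>
  <select id="option" name="option">
    <option value="small">Small</option>
    <option value="medium">Medium</option>
    <option value="large">Large</option>
    <!-- ...the other 27 variations... -->
  </select>
  <button type="submit">Add to basket</button>
</form>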

With regards to the larger site issue - sure, there's benefit in having loads of internal links, but if you do consolidate it all into a smaller site, you still have the larger site's PR distributed across fewer pages, so there should be an overall net gain there (in most cases). If there isn't, then that means your deep content wasn't linked to naturally - which is an insight in itself. ;)

It's also worth reassessing what constitutes "quality content". I get the impression that some people think it means writing a paragraph of unique content for each product page. Again, I think the days of this being useful are gone. Unique doesn't mean quality.

A year ago, I would have optimised an ecommerce site by writing unique descriptions for each product and ensuring that main keywords are targeted by hub / champion / landing pages.

Now, I'd reassess that and perhaps look at having fewer product pages that are more functional, and perhaps supplement that with a range of different content - a blog with industry news, product information, infographics on product use, testimonials, UGC, etc. Although I would have done some of this before (depending on the client), I think the key Panda factor that has changed here is that it's becoming more important to provide information rather than just produce copy.

In addition, one of the main moves I've made since recovery is to do what I can to capture and retain that traffic - starting off with social media (building Twitter and Facebook followers) and perhaps moving to a more community-driven model (I have an information site).

It's important to be as critical as possible of your website's model. Just because a thin product page brought in traffic for years doesn't mean it will continue to do so, and as long as your revenue is directly tied to traffic volume, there's something missing. All the extra stuff you'd be doing if you had a B&M business (adding value, customer retention, repeat sales, upselling, product diversification, engaging customers, etc.) should equate to positive growth in sales over time, not just the stagnant return that search traffic offers.

@synthese @garyr_h IMO overlapping and very similar content are one of the key areas to think about. My site has/had that issue, and I went some way to removing the really poor articles, which essentially tackled 2 problems - primarily the thin content (which was more of an issue) and the overlapping content. It was easy for me to do - I wasn't that attached to the content I produced early in my career (going back 10 years in some cases) and I knew fine well some of it was pushing the boat out too far in terms of SEO.

However, making a more objective decision on content I felt was solid would have been much harder.

IMO the UGC isn't the problem - look at how often the same stuff is repeated on forums. It's commonplace and not something Google would actively penalise.

@garyr_h How much of your 2.5k pages could be considered to be overlapping content?

Scott

suggy
msg:4425033 - 12:56 pm on Mar 5, 2012 (gmt 0)

It's important to be as critical as possible of your website's model....


This says it all for me. I knew, years before Panda hit, that we just weren't making enough money, given the surplus of traffic Google was sending us. I just don't think we were different enough to engender any real loyalty. Don't get me wrong, customers love us and we are scoring 4.85/5 on our eKomi feedback, but months later, when they were looking to solve a similar problem again, not enough past customers remembered us or realised we could solve their latest problem too. The result was far too high a reliance on speculative traffic that can disappear in the whiff of a panda's backside.

I am so convinced of this that I am winding up that business, irrespective of Panda recovery, and starting afresh on something that people give a damn about; that they can be passionate about. In particular, I am focusing on ideas where repeat purchase is a real, strong possibility (due to repeated use). Rental/hiring is particularly attractive for this reason.

So, in short, I guess our business model sucked and we knew it, but it took a Panda to force my hand.

wingslevel
msg:4425046 - 1:36 pm on Mar 5, 2012 (gmt 0)

Hey Scott - tricky thing though, individual item pages. I have long suspected that our ecomm sites have suffered some kind of penalty for dup content because we have individual item pages - we have 100s of thousands of them. But we sell food. Imagine, hypothetically of course: corn flakes are available in 22 different sizes and packs. Want them in 18 oz boxes? 24 oz? How about the little self-serve plastic tubs? So we would have a sub-subcategory page called corn flakes, and then links to individual item pages for all 22 varieties. Each variety has a unique image and some unique descriptive info, but lots of non-unique content as well - after all, a corn flake is a corn flake. This approach always worked best for us. We did lots of usability tests and benchmarked conversion rates, etc.

Now, in their ultimate, arrogant wisdom, Google has decided they know more about selling corn flakes than we do.

So, OK, we decided to do it their way - tested about 10k items where we used a drop-down menu on the corn flakes page ("choose size") - and guess what? Bounce rate soared and conversion rate plummeted. People got confused. They wanted the little tub, and they went to a page with a picture of a box and a drop-down - too complicated, we lost them. But guess what again? They loved this page over at the 'plex - they were doing back flips over it - traffic to that whole sub-sub was up.

Marketing Guy
msg:4425051 - 1:55 pm on Mar 5, 2012 (gmt 0)

Yeh that's a tough one. I fear the answer sits somewhere between information provision and usability - perhaps something that's outside the technical capabilities of web technology just now (or at least just search technology).

I suspect what you are describing could be managed better via a mobile app where users can drag and drop / easily tick boxes / select options / etc - something that traditional websites make a little bit difficult. But that still leaves the issue of driving traffic to those pages in the first place.

Can you get the best of both worlds and just block the child option pages from search engines? It may have a similar effect as a Panda solution without the need to redesign the site.
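Something like this on each child option page is what I have in mind (just a sketch - the URL pattern is invented, and whether a meta noindex or a robots.txt disallow works better against Panda is anyone's guess):

<!-- On each child option page, e.g. /corn-flakes/18oz-box/ -
     keep the page for users but keep it out of the index: -->
<meta name="robots" content="noindex,follow">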

That said, how much of your business comes from people searching for specific food products online vs people searching for generic shopping terms & brand searches? Is it necessary to have each product page optimised, indexed and ranking for something?

I've theorised that in certain markets Google might make exceptions to the degree of thin/duplicate content that is acceptable. I've been working with a car dealership who (along with 99% of their competitors nationwide) use the default manufacturer's description for new cars. Those pages ranked just fine (slightly better when we rewrote the copy, but not so much that it makes a difference).

Realistically the problem is down to Google's ability to recognise generic product pages and rank them appropriately - right now they can't really be distinguished from normal content pages, which is why that's the strategy that needs to be applied to them. I can't see that situation lasting though.

jinxed
msg:4425074 - 3:00 pm on Mar 5, 2012 (gmt 0)

I recently decided to run an experiment on one site I own that was affected by Panda. The site was small, only around 40 pages, and never really received much traffic even before there was an issue with Panda.

The theory was that if the experiment went well, then I would try it on a bigger site I have.

Here is what I did:

1. Removed around 15 pages that were 'discussing' best uses of certain products but were originally receiving traffic from people wanting to purchase the product. Status 'Gone' (HTTP 410) for all of these.

2. Domain change (301 redirect) from a .co.uk domain to the .com, plus new extension-less URLs. (Steps 1 and 2 are sketched in the .htaccess after this list.)

3. Moved all ads way below the fold.

4. Removed some nofollow links I had pointing to some 'fluff' pages. Back when I first created the site I must have been trying to sculpt PageRank. Anyway.

5. Created a much more user-friendly design on all pages, especially the home page, by building a JavaScript 'search filter' that narrows down the pages users would find useful depending on their input. This vastly improves navigation.

6. Optimised the usual, i.e. site speed, CSS image sprites, blah blah...

7. I did not change the actual content of the pages that were left.
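For anyone wanting to replicate steps 1 and 2, the Apache rules looked roughly like this (a sketch only - it assumes mod_rewrite in an .htaccess, and the domains and paths are placeholders, not my real ones):

RewriteEngine On

# Step 1: serve 410 Gone for the removed 'best uses' pages
RewriteRule ^best-uses/ - [G]

# Step 2: 301 everything from the old .co.uk to the .com,
# dropping the .html extension at the same time
RewriteCond %{HTTP_HOST} ^(www\.)?example\.co\.uk$ [NC]
RewriteRule ^(.*?)(\.html)?$ http://www.example.com/$1 [R=301,L]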

The result was 4x the traffic (and climbing), a much lower bounce rate, and generally all-round improved metrics.

ecmedia
msg:4425092 - 4:01 pm on Mar 5, 2012 (gmt 0)

My dilemma is that during the Jan 14th update my Pandalized site completely recovered, as if nothing had happened. During the time it was Pandalized I did nothing at all and pretty much abandoned it. I did not change anything because I was so convinced that my site did not deserve to be Pandalized and that Google had got it wrong in punishing me.

This poses a serious question for me (and I am sure others have the same issue): do I do nothing on my other websites that I know are wrongly Pandalized, or, as you folks are doing, try to change things?

Could it be that, for those who have found success with changes, their websites were genuinely problematic - but what about websites that did nothing wrong according to Google's guidelines and were trapped by the faulty algorithm?

For the time being I am going to focus on my recovered website and really hope that Google's engineers are not the idiots they currently appear to be.

jinxed
msg:4425113 - 4:32 pm on Mar 5, 2012 (gmt 0)

The algo is what it is.

The way I see it, if you can see something that can be improved upon, then do it; but change for the sake of change is a big waste of time.

Outside opinions are very revealing in these situations.

garyr_h
msg:4425332 - 1:40 am on Mar 6, 2012 (gmt 0)

@MarketingGuy I'd guess around 10%, but definitely not more than that. Maybe that's something to look at. I hate playing this Google game though, because if I guess and I'm wrong, then I lose Bing and Yahoo traffic, plus whatever Facebook was sending, and don't get any benefit in Google. If it works, the prize is amazing, but if it doesn't, it's not a good place to be. I will most likely take the risk because Google used to send amazing traffic.

I already made a ton of changes throughout late January to mid-February, and I'm hoping that helps whenever Google picks them up.

As for your suggestion on the products, this is an information website, but some content can be seen as "thin" if you go by word length. No real idea what to do about it. If I combine all of those pages, then the resulting page is extremely long. If I don't, then it's thin content. My competitors have the same types of pages, but they are ranking in the top 10 while I'm nowhere near there at the moment.

Whitey
msg:4425336 - 2:07 am on Mar 6, 2012 (gmt 0)

Some of these reported recoveries are possibly not in highly competitive verticals, where Google yawns and says "not another site about xyz, doing it the same boring old way" and perhaps ratchets the quality threshold far higher for that data set.

@MarketingGuy - if your recovery was on a niche/hobby site, respectfully, maybe your job was easier.

Surely when Google built this Panda algo, they set minimal criteria that each site must meet in each of its major verticals, alongside all of its other ranking paraphernalia of "signals", and perhaps a bit of randomisation to deliberately confuse an understanding of how things work.

Has anyone heard or sensed anything to the contrary? Is anyone claiming recovery in a competitive vertical?

Marketing Guy
msg:4425478 - 9:33 am on Mar 6, 2012 (gmt 0)

@garyr_h it could be the 10% that's dragging you down, and the rest is a case of a bunch of SEOs overthinking things. :) Possible to consolidate some of the overlapping pages?

@Whitey in a way it was easier, as I wasn't reliant on the site for a living, but we're still talking about a relatively competitive vertical (though I'm not sure what the cool kids call competitive these days!). Being an information industry, it's much like SEO in many ways: there are loads of blogs, forums, etc. on the subject, as well as lots of newspapers in on the act, and local, national and international companies and government websites.

I'd go a step further and suggest that Panda may be fine-tuned not only to particular verticals, but also at the individual keyword search level. But I don't think that really impacts the recovery approach - the basic idea of analysing whether or not your page deserves to rank in comparison to the competition is still the same.

Marketing Guy
msg:4425501 - 10:51 am on Mar 6, 2012 (gmt 0)

It's also worth noting that the only rankings that haven't returned are the homepage targets. However, this may be less about Panda and more about this month's change to the algo where a link factor was removed (the current theory going around is that it's related to anchor text, which would make sense). The rankings are jumping in and out (the rest, accounting for 3k visits/day, are stable), in a way similar to the traffic throttling effect people have been describing with Google recently. So it could be either of these (or neither, or both).

The point is that it might be difficult for Panda-hit sites to distinguish between what is a Panda penalty, what is a result of now-redundant previous link building efforts, and what is being impacted by the flux/throttling effect. Three juicy new things to worry about (or reasons to focus on other marketing channels, depending on your point of view) - fun times! :)

claaarky
msg:4425642 - 5:52 pm on Mar 6, 2012 (gmt 0)

Ecommerce site, Pandalised since April 2011 (overnight loss of 50% of traffic, so definitely Panda), currently seeing what appears to be a gradual recovery (now up to around 70% of pre-Panda traffic).

We tried moving from the individual product page model to multi-product pages in May 2011, to overcome the repetition of very similar content across numerous product pages in case that was our issue. Traffic dropped a further 10-20% and conversion rate dropped by 30%.

We stuck with it for 7 months, working on the content and trying to improve conversions. Traffic recovered slightly with every Panda update/refresh, but the conversion rate was killing us, so we eventually abandoned it this year and switched back to individual product pages.

However, we haven't gone back to exactly the same format as before; we now canonical all similar products to one main product (the most popular or most representative) and list the other variants as 'related' products. So Google now only indexes the main version of the product, but that seems to rank quite well for all the variants as well.
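In case it helps anyone, the markup is just the standard canonical link in the head of each variant page, pointing at the main version (a sketch - the URLs are made up, not our real ones):

<!-- On /widgets/red-widget-value-pack/ and every other variant: -->
<link rel="canonical" href="http://www.example.com/widgets/red-widget/">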

We made a multitude of other changes to the site at the same time as we switched to this set-up, but traffic started going up literally the next day, which amazed me, as I was expecting a drop in traffic after such a big change; traffic has been going up in leaps and bounds ever since.

I don't know if this is a recovery (I imagined a Panda recovery would happen overnight when a refresh hits, rather than being spread over a number of weeks and months), but it's the most positive thing that's happened in almost a year, and every week seems to bring another boost. At this rate we'll be back to pre-Panda levels within a month (assuming there isn't a big slap just waiting round the corner).

If this is a recovery story I'll post the rest of what we did. We've done a lot but there are some things we haven't done that others have mentioned they did, which might save people some time and effort. Fingers crossed this is it - it's been a heck of a 12 months!

garyr_h
msg:4435568 - 1:41 pm on Mar 31, 2012 (gmt 0)

Any update @claaarky?

suggy
msg:4435575 - 2:22 pm on Mar 31, 2012 (gmt 0)

@claaarky - has it survived the latest update? I'm down again... argh!

Bewenched
msg:4437167 - 6:27 pm on Apr 4, 2012 (gmt 0)

Our ecommerce site was hit really hard in the initial Panda "cleansing" by Google. However, over the last 4 months we have started to see a turnaround. Here are some of the things that we did.

1) Beefed up the content on our product pages. We're in the auto parts field, so originally a lot of the copy was just technical specs and no fluff; over the last year we've added more user-friendly copy to the pages in addition to the technical stuff.

2) Made our thumbnail images bigger. Note these were always clickable through to the larger version, but we had them at 72px square; they are now 100px square.

3) Rewrote some of our product pages to use Product microformats to help other shopping engines. [schema.org...] (A sketch of the markup is below.)

4) Really started aiming our sights at Bing traffic, including starting to use Bing's Webmaster Center.
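For item 3, the schema.org Product microdata looks roughly like this (a cut-down sketch - the part name, image path and price are invented, not from our catalogue):

<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Front Brake Pad Set</span>
  <img itemprop="image" src="/images/brake-pad-set.jpg" alt="Front Brake Pad Set">
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price">49.95</span>
    <meta itemprop="priceCurrency" content="USD">
    <link itemprop="availability" href="http://schema.org/InStock">In stock
  </div>
</div>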

I'm not sure which of these, if any, made a difference with Google in particular; however, we have started seeing a significant recovery in traffic and revenue.

Recovery has been gradual, and we hope to be back to pre-Panda levels within the next 6 months, since we have a very large site with over 200k products. It will take a long time for Google to re-crawl everything.

Whitey
msg:4437257 - 9:43 pm on Apr 4, 2012 (gmt 0)

@Bewenched - can you provide some % traffic metrics of your fall, and where you're at now?

What makes you think you've responded to Panda rather than some other elements of the algo?

claaarky
msg:4437499 - 2:31 pm on Apr 5, 2012 (gmt 0)

Quick update on our progress.

Traffic stabilised after my post on March 6th at around 70% of pre-Panda levels until March 19th, then started falling again until March 31st, then began another modest climb. We're now back at around the 50% level (i.e. where we started a year ago).

Yesterday we updated the site with a much-reduced header (we had a very large, attention-grabbing header), just to address the above-the-fold algo update. I don't expect that to have any impact on Panda though.

So the battle for a Panda recovery goes on for us.

Planet13
msg:4460921 - 1:22 pm on Jun 3, 2012 (gmt 0)

@ Bewenched:

4) Really started aiming our sights on Bing traffic including starting to use Bing's webmaster center.


How exactly does one target Bing traffic?

I have asked this question a few times on the Bing forum and never got an answer :(

How is optimizing for Bing different from optimizing for Google?
