This 173 message thread spans 6 pages.
Let's Post Our Panda Solutions - Things That Have Worked
I, like probably many of you, have looked through this forum for answers on how we can recover from this Panda 2.2 update.
The truth is I am seeing an awful lot of "we have lost this and lost that" and very little in the way of "we did this and got better" results.
I thought it might be an idea to restrict one thread to "Things That Have Worked" whilst we all experiment with content, links and everything else.
Let's leave all the other chat to other posts, and if you have found something that gave you some sort of return from Panda 2.2, then let's post it here.
I will start with some small gains.
My losses consisted of many pages dropping 5 to 10 places for key terms, i.e. first-page to second-page rankings, across the board, leading to a 30% traffic reduction. But not the huge ranking losses that have been reported by other members.
I noticed I had a big issue with existing "SUPPORTING" pages no longer being cached within Google's index (or appearing not to be), affecting around a third of my site. I got this message when clicking on the cached link for each page (my key pages, which held most of the rankings, were still cached):
Your search - cache:mysitepage - did not match any documents.
At first I thought Google had an issue with my SUPPORTING content, so I moved this section (around 2,000 pages) to a subdomain. It was all handwritten over a long period but, in honesty, probably lacks real data.
These SUPPORTING pages were also still not cached after 3 weeks (only a very small quantity were cached).
What I Did
3 days ago, I went into Webmaster Tools and increased the crawl rate for these supporting pages, and in those 3 days I have seen a dramatic increase in how many pages now show as cached.
As all these pages had important internal links throughout the site to my KEY pages, I believe I am now regaining these internal links to my KEY pages (or, as they are now on a subdomain, these may now be classed as external links to my KEY pages).
Sure enough, this morning I saw not a full return, but several KEY pages back to first page status. There are still around 50% of these supporting pages left to be cached, so I keep my fingers crossed for further gains.
I had also added links from my home page deeper into key pages that had previously only been linked from internal pages, but that does not account for all the improvements, just some.
This is not a full return, but a big enough indicator for me to think that, if it was the quality of the content, moving it to another subdomain has helped.
But there is also still a lot of uncached information out there, and I do not think we will see the full status until all pages within Google are recached under Panda.
Of course this is all just my opinion only, even if you disagree with my comments and have your own solutions please post them here.
On top of the above, I re-wrote all of my Meta Descriptions.
Has removing pages (and 301 them) helped anyone? I've been thinking of removing like 60% of "thin" and "bad quality" pages, but I'm not sure whether to do it or not. I am worried about internal link structure...
Here's what we did:
- Noindexed 300k pages
- Focusing on the top 400 highest revenue generating pages only now (everything else is noindexed)
- Improved site speed so pages load in less than .5s
- Moved content closer to the homepage, no more than 2 clicks to get to what you need.
- Rewrote unique titles & descriptions for all pages
- Added at least 300+ words per page
- Removed all boilerplate text on pages
- Asked webmasters to remove any sitewide links (if it's wordpress you can give them some code so the link just shows on the homepage)
- Noindexed tag/category pages on our /blog/
- Removed sidebar links on our blog to the main site
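For anyone wanting to copy the noindexing steps above, the usual mechanism is a robots meta tag; a minimal sketch (my illustration, not gyppo's actual markup):

```html
<!-- Placed in the <head> of every page to be dropped from the index.
     "follow" keeps internal link equity flowing even while the page
     itself is excluded from search results. -->
<meta name="robots" content="noindex, follow">
```

Google also honours the same directive sent as an `X-Robots-Tag: noindex` HTTP header, which is handy for non-HTML resources like PDFs.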
We recovered mid Jan, we're now double our pre-panda levels (which was Feb last year) & still going strong. I'm slowly introducing old pages back in (around 2 per day at the moment). Our revenue is up 400% (however will still take a while to make up for the blood sweat & tears we've endured since Feb 2011).
Glad to hear your changes are paying off for you!
Could you tell us a little more about your site? Is it ecommerce? A social site? Articles? Or some other site type?
Thanks in advance.
We're in the coupon/loyalty space.
Before we got hit by Panda we had 40k merchants with around 3-4 pages per merchant & lots of boilerplate stuff.
Now we've completely scaled back to around 500 pages total, all super high quality, lots of content, providing a lot of value to our visitors. We could scale back out again in an instant but we've learned that quality wins over quantity (unless you have the authority).
|Now we've completely scaled back to around 500 pages total, all super high quality, lots of content, providing a lot of value to our visitors. |
Now THAT sounds like a Panda solution.
Did pretty much the same as gyppo, without removing IBLs; recovered around Dec 8. Still going very strong. Also did some de-optimization for some keywords.
@gyppo, did I read that right, you went from 300,000 indexed pages to 500? That's radical!
@gyppo: your quality solution is radical and interesting at the same time, but it seems to me you are losing long tail traffic. Panda killed most long tail sites for good. Providing mega value for 300k-1M long tail URLs sounds almost impossible and would cost at least one million (premium article writers).
Are large sites dead? Are long tail sites dead?
Probably quality mashups are the solution.
|Are large sites dead? Are long tail sites dead? |
I think that there has been a significant change to longtail results, which really started with the May Day update of 2010. The emphasis given to Big Brands, along with Panda, has pretty much meant that the days of building a page and "optimizing" it for a long tail result are done and gone.
I think if you try to optimize for a specific longtail, that page had better be GREAT, or at least the bulk of the site had better have lots of content geared toward the head of that long tail.
|Here's what we did: |
- Noindexed 300k pages
- Focusing on the top 400 highest revenue generating pages only now (everything else is noindexed)
Which is key in deciding whether to noindex a page: (1) simply the total revenue the page generates, (2) click-through and conversion rates from ads appearing on the page, or (3) earnings per page view?
And how does a search engine know that you're culling low-revenue pages from your site? And why would a search engine equate low revenue with low quality or so-called "authority"? Of course, if your revenue stream is primarily from Google, then the answer is clear enough: make pages that make us money, or we'll move your entire site down in our SERPs. Otherwise, I'm not sure I understand.
Our long tail traffic is stronger now than it was with 300k indexed pages (and much more valuable, our EPV is 400% up), my view is that perhaps we didn't have enough link weight to support that many pages. Whereas now, we can easily support 400 pages & the rankings to those pages have skyrocketed to the point where we're competing with the major players in the space.
In terms of choosing pages, we simply picked 400 merchants that we have either made a sale or had some sort of engagement with over the last 2 years - we're also now focusing completely on merchants that we can earn revenue from. Trust me focusing on 400 merchants has made our life much easier & much more structured, it's like trying to keep 1 room tidy instead of the whole house.
The bottom line is that one person can only realistically be expected to keep a certain number of pages at high quality. It would be very easy for me to scale back out to 300k pages, given how we generate our site, but the lesson here is that Panda looks cautiously at sites that are machine generated and have boilerplate elements or a low ratio of unique content per page.
One rule of thumb I've been trying to abide by is to make sure that no two pages are more than 30% similar (from a code and content perspective). Naturally, it's hard to tell which of the changes we've made has had the most impact. Just try to put yourself in Google's shoes and have a good impartial look at your site.
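The 30% similarity rule of thumb can be checked mechanically. A rough sketch in Python using the standard library's difflib (my own illustration; gyppo doesn't say what tooling he used, and a real audit would probably strip templates first):

```python
from difflib import SequenceMatcher


def page_similarity(html_a: str, html_b: str) -> float:
    """Return a 0-1 similarity ratio between two pages' raw markup.

    Comparing the raw markup covers code *and* content together,
    matching the rule of thumb in the post.
    """
    return SequenceMatcher(None, html_a, html_b).ratio()


def flag_similar_pages(pages: dict, threshold: float = 0.30):
    """Yield pairs of page names whose similarity exceeds the threshold."""
    names = sorted(pages)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if page_similarity(pages[a], pages[b]) > threshold:
                yield (a, b)
```

On a site of any size you would precompute fingerprints rather than compare every pair, but the idea is the same.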
> Just try to put yourself in Google's shoes & have a good impartial look at your site.
I'd code Panda very simply:
If total number of pages on site > 1,000
And % of pages on site without valid backlinks > 30%
Then overall site quality is below minimum;
Therefore flag entire site, Pandalize.
Quality pages on Pandalized sites survived Panda. They were judged as quality based on backlinks. If you remove all the pages on your Pandalized site which don't have quality backlinks, you're moving in the right direction.
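Purely to make the speculation above concrete, the proposed rule fits in one small function (a toy model only; the real classifier is obviously far more complex, and the 1,000-page and 30% thresholds are the poster's guesses):

```python
def would_pandalize(total_pages: int, pages_without_backlinks: int) -> bool:
    """Toy model of the speculated site-wide Panda flag."""
    if total_pages <= 1_000:
        # Small sites never trip the hypothesised filter.
        return False
    pct_unlinked = 100 * pages_without_backlinks / total_pages
    # A large site where over 30% of pages lack valid backlinks
    # gets flagged as a whole ("Pandalized").
    return pct_unlinked > 30
```

By this model, trimming a 300k-page site down to a few hundred well-linked pages flips the flag off, which would match gyppo's experience.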
Very interesting gyppo. Thanks for sharing!
When you noindex'd did you do it just for Googlebot or for all robots? Just curious if Google would make a distinction. Also, can you confirm you did not actually remove pages via 404 or canonical tags or 301 redirects?
I am in the ecommerce comparison shopping space in a pretty specific niche and have a very similar problem: lots of pages of content indexed, lots of it from datafeeds and so "thin". I've tried just about everything possible aside from your approach of noindexing everything but the truly money pages. Some of my changes include a complete redesign, removing all ads nearly a year ago, noindexing and removing as much duplicate content as possible, and adding more social signals; our site speed has always been super fast. We've only seen a slight uptick since we were Pandalized by Panda 1.0 in February last year.
We still rank pretty well for a few things but I am starting to think something more drastic like your approach may be a good idea to test next.
@gyppo- that's great that you climbed out and shared your success with others. A few things :
-What were your timelines on each of the major components, from start to finish?
-Did you add your content gradually onto those remaining pages, and did it reach a threshold that kicked you out of Panda, or did you cover all pages before it kicked in?
-Would you have been confident with larger / smaller amounts of content on each URL? [There was some mention from Google a while back that size didn't matter.]
I guess I'm mindful of the indexing / learning / quality rescore cycle timeline which sits at around 2-3 months as i understand it.
- We started noindexing orphan pages in October.
- We noindexed everything on the 8th December.
- We rolled out a completely new design on 8th December.
- Took 4 weeks to complete the roll out of the new content, then recovered on the 19th Jan.
- Hard to say, we still have pages that are buried which have more content than ones that aren't. At the moment our traffic is still increasing but a page will come out & rank really well for a week, then it'll get dumped again only to get replaced by another.
|We noindexed everything on the 8th December. |
@gyppo I'm just clarifying the intensity of your actions. By everything, do you mean "everything" except the 400 pages, or everything including, e.g., the home page?
6 weeks from the main actions of 8th December is very fast, considering the content wasn't complete until around the 5th Jan. How fast was the reindexing, and did you do anything special to accelerate it?
|High-quality sites algorithm improvements. [launch codenames “PPtl” and “Stitch”, project codename “Panda”] In 2011, we launched the Panda algorithm change, targeted at finding more high-quality sites. We improved how Panda interacts with our indexing and ranking systems, making it more integrated into our pipelines. We also released a minor update to refresh the data for Panda. [insidesearch.blogspot.com.au...] also [webmasterworld.com...] |
It makes me wonder if there is a rolling component to Panda. Thoughts?
Having contributed about two or three of my sites before, I've found another of my sites is out of Panda.
Again it doesn't have many pages - about 30 I guess. It used to be top 3 for "keyword, keyword, websites". It's a site that offers web design for one-man businesses in the keyword keyword niche. It's not a very competitive niche in the SERPs. The domain name is not keyword rich.
Anyway with Panda 2.1 which affected a lot of my sites, it dropped to sometimes position 10, sometimes page two or three.
Last week I made a few changes to the homepage (not with improving things for Panda in mind at all)- nothing you would think would make a difference to Panda - a couple of ten word testimonials and a nice little box highlighting different aspects of my service, changed the background colour of a couple of text boxes and added a Facebook and Twitter button. That's all I changed. Just tried to make it more attractive for visitors. Now I'm back to position 3.
So now that's 4 of my small sites out of Panda - (just hoping the rest will get released soon!)
@nickreynolds - so if you consider your site to be released from Panda, are you seeing its return differently from the others that were released? What makes you think it's Panda related?
I'm questioning this in regard to the timing between updates/data refreshes, and to see if it strengthens the suspicion that Google has integrated some of Panda's features into the flow of the general algo.
My experiences mirror gyppo's and Donna's; nothing to add but my agreement.
Got hit on Oct 14th. Dropped pages from around 17k to 300 using robots.txt or noindex. Consolidated content and focused on uniqueness of pages. Improved database searches and page structure, along with caching. Made numerous other improvements.
Result: No Change!
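A side note on the robots.txt versus noindex choice mentioned above: Disallow only stops crawling, and already-indexed URLs can linger in the index (sometimes as URL-only entries), which is one reason many people prefer the noindex meta tag for removals. A generic sketch with illustrative paths, not the poster's actual file:

```
# robots.txt -- stops crawling of thin sections (example paths).
# Note this does NOT remove URLs already in the index;
# a noindex meta tag on the pages themselves does that.
User-agent: *
Disallow: /tags/
Disallow: /search/
```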
Waited until end of January and still no change so decided to 301 around 300 pages to another old domain with a slight revision of the layout but a much simpler link structure. 301's went live 9 days ago, and despite spidering all the new pages within hours, the number of pages actually in Google's index is still only 131.
This morning I have noticed the first referrals directly to the new domain rather than via redirects. It's too early to tell if this has been effective, but it's the only encouraging sign I've had in 4 months!
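For reference, a cross-domain 301 of that sort is typically a couple of lines of Apache mod_rewrite; a generic sketch with a placeholder hostname, not the poster's actual rules:

```apache
# .htaccess on the old domain: permanently redirect every URL to the
# same path on the new domain (hostname is a placeholder)
RewriteEngine On
RewriteRule ^(.*)$ http://new-domain.example/$1 [R=301,L]
```

Mapping each old URL to its equivalent path (rather than redirecting everything to the new homepage) preserves page-level relevance.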
The quality of referrals from Google is very low, with virtually every one bouncing. Referrals from Bing are of a much higher quality. These actually seem like real searchers!
I'm adding content from the old domain progressively and checking originality as I go.
@whitey - my demotion in serps for this site was on the same day that numerous others of my sites got hit - the day of Panda 2.1. It's been difficult for me as my sites are pretty small and not full of potential Panda targets eg duplicate content, thin content etc. All the evidence date-wise is that they were hit by Panda, although I've not really known what to do to solve the problem. My guess was that maybe one site was affected and that my other sites were guilty by association eg same adsense code or analytics code or some minor relevant interlinking etc.
My biggest site was only hit in a minor way and slowly crept back to its earlier position without me doing anything. One site I moved the hosting for and it came back 2 months later (other sites that I moved hosting for didn't come back). One site I moved hosting for and deleted about 20-30 pages, and it came back almost immediately. The site I'm mentioning in this last post I didn't move hosting for, and just made some pretty minor home page alterations.
@nickreynolds - I may have missed some others' posts, but I think you'd be the first to come out of Panda in between an update or data refresh, which to the best of my understanding is what upgrades the quality score and releases traffic/rankings.
That's why I was curious - your date into the Panda penalty coincides, but your exit may be something else happening that got confused with a Panda solution.
There are reports of lifts in between cycles, but they seem to be within the existing "quality score" of sites.
Official - Panda update / refresh : [insidesearch.blogspot.com...]
Can anyone pinpoint a date and recovery, or is this in refresh data mode as we speak ? The report was a bit ambiguous.
Sunday 26th rolling into Monday 27th for the update. Will post back in a bit with some info about the recovery changes I implemented last month.
It's early days still, but it looks like my April 2011 hit site has made a full recovery. It was a hobby site, so didn't really pay much attention until recently, but decided to take a swing at fixing it at the start of January.
I just checked the top 50 keywords for 2010 - resulting in 300k visits / 1.2 million. 40/50 are back in the top 30 on Google.com, of which 30 are top 10 - aka pre-Panda rankings (some improved).
So here's what I did;
There were two sites involved;
- Site 1 - 10 years old, high traffic (150-250k per month), poor design, blog (200 pages) / forum combo (40k pages), forum reasonably active, some thin content on the blog.
- Site 2 - 3 years old, same subject area, 17k per month, custom WP theme (looks pretty good), 50 or so posts
Both hit on April 2011 with 80% traffic loss, etc.
Tested the ad ratio theory earlier last year on the second site - didn't make the slightest bit of difference. It's all about the content (although I should point out that although the sites run 3 Adsense blocks, they were never too "in your face").
Decided to forget about Panda for a bit and think more from a business point of view. I came to the conclusion;
- Merge the sites - easier to maintain and realistically there was no point having two (other than the original plan to rank for similar phrases, which wasn't that well thought out!).
- Ditch the forums - they weren't Panda hit (takeaway = Panda is folder specific, or at least can be), but tough to maintain and no real return.
- Redesign the main site - it was static HTML pages. Shifted it to a custom WP site.
- Embrace social media - set up Twitter accounts and a Facebook page for the site and integrated social share buttons. More of an alternative solution to Google than a Panda remedy.
- Improve ad positioning, etc.
- End of Dec - transferred content to the new WP install. Decided on a case-by-case basis which articles to remove or rewrite. Ended up removing about 10 articles (out of 200) and rewriting a handful more (a couple were merged together).
- 301'd ALL old URLs to new versions, even for stuff that was removed (or old cases of duplication, but there wasn't much of that). The site had been hacked a couple of weeks previously (changed host as a result), so I also 301'd the subfolder of spammy links that had been installed (WMT showed 30k+ inbound links to those pages from spammy sites).
- Removed forums - 301'd all forum URLs to site homepage.
- Added content from site 2 and 301'd all its URLs. Implemented the site address change via WMT.
- January - tested various changes to the site to improve ad CTR, time on site and social shares.
- Social shares - bigger buttons at the end of posts worked better.
- Adsense - well blended ad below the post performs well - revenue at 40% of what it was pre-Panda, but with much less traffic.
- Stickiness - tested positioning for related posts, popular posts, random article snippets, etc. Dropped bounce rate by 15% and increased time on site by 30%.
- Google+ - created authorship for my posts. Kicked in a week ago.
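For anyone repeating the authorship step: at the time, this was done by linking content to a Google+ profile with rel="author"; a generic sketch (the profile URL below is a placeholder, not Marketing Guy's):

```html
<!-- In the post byline; the Google+ profile ID is a placeholder.
     The profile also has to link back to the site ("Contributor to")
     for authorship to verify. -->
<a href="https://plus.google.com/112345678901234567890" rel="author">Author Name</a>
```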
That's the core changes implemented - the rest was really down to developing a content strategy for the site, which has been going well. Since launch I've added regular content each week and been active on Twitter.
Some notes & observations
- I don't think removing the forum had any impact on this at all - it was just a business decision.
- While social shares have increased during the 2 months the new site has been live, I don't think these have any particular impact on Panda.
- There's been some speculation about how to manage old content - I 301'd everything to an appropriate page. Seemed to be just fine.
- Not particularly convinced bounce rate or time on site is a factor.
- There were some fluctuations in rankings throughout January - old rankings returning and then dropping. Think this might have been minor Google fluxes, etc but could potentially be down to the scale of 301 redirects being factored.
- There were a LOT of visits from Googleplex a few days before the Google+ authorship kicked in. ;)
- The Panda refresh mid-January didn't have any impact on the site - although much of the old content (40k+ pages) was still in the index. Down to 2.5k today, but that's still a bit high.
- Had a floating social share bar on the left hand side of the site. Removed it at end of Jan - no impact on the amount of social shares.
- Site 2 only became "unverified" in WMT in the past week (meaning the rankings transferred to the new site). I can't comment on whether or not this impacted the recovery of site 1, but the rankings did improve with the recovery.
If pressed to theorise, I'd say the combination of some thin content (which was ranking well) and poor design let the site down. Looking at the site then, you wouldn't have thought much of it. But now it looks clean, there are some guest authors on board, regular content is being produced and the social profiles are growing.
It's a weird test though - did lots of different things (mass 301's, authorship, merged sites, cleaned up content, redesigned sites, more active social presence) so it's difficult to pin down what the solution was, but if you take a step back from looking at Panda in those terms, then realistically all those things make good business sense too.
The interesting one is the authorship kicking in a week before the Panda recovery. I'm absolutely not saying they're connected - it's just coincidental timing. But if I was a search engine that developed a qualitative aspect to my ranking algorithm, I might want to counterbalance that with positive signals of quality (WMT, authorship, social, etc).
@Marketing Guy - Thanks, that's a great post with a lot of effort behind it. More inputs like this from others would help everyone strengthen their solutions.
|If pressed to theorise, I'd say the combination of some thin content (which was ranking well) and poor design let the site down. Looking at the site then and you wouldn't think much of it. But now, it looks clean, some guest authors on board, there's regular content being produced and growing social profiles. |
Given the time lag in the cycle for recovery [1-2 months], would this strengthen your belief that it was the December factors that applied the most weight to a release?
Most of my pandalized sites are still at 30% of pre-panda level as of today.
Things that I have done
1) Removed tons of irrelevant or thin content pages (tags, swf, etc.)
2) Increased the content (+100%) of about 10-20% of all pages. Apparently, this is not enough to impress Panda. The content improvement is still ongoing.
3) Total layout change, with zero impact on Panda
Interesting Panda behavior that I noticed:
My site seems to be getting the most traffic on internal pages with the most "Facebook likes", even though the content is thin (less than 30 words). I suspect the traffic is also tied to the country that liked my page. For instance, if the majority of likes gained by a page are from Thailand, then my Google traffic for the keywords associated with that page will most likely come from Thailand.
@Marketing Guy - Thanks for the comprehensive description of your approach.
Point 9: 'Site 2 only became "unverified" in WMT' interests me, since I've just moved some of my content to another domain.
Why would unverification transfer rankings to the new site? Doesn't the 301 do this?
|Given the time lag in the cycle for recovery [ 1-2 months ], would this strengthen your belief that it was the December factors that applied the most weight to a release. |
Not certain, to be honest. I did notice some ranking improvements early in Jan / Feb - they jumped up and down - some would be number 1 for a while, then nowhere, then back to 5th or something (higher now). A few possible explanations:
1. Bog standard Google SERP fluctuations (and / or a combo of new content being discovered).
2. Google factoring in the 301's as they were being indexed, perhaps circumventing Panda, but then rankings dropping as Panda is reapplied.
3. Google testing my site for Panda factors, after detecting it has changed (in preparation for the recent update).
4. None of the above and it was just the case that last weekend was a Panda update.
I also speculated that perhaps Google is using quality raters to determine Panda factors, so that could explain the lag time and subsequent recovery. Complete speculation though.
|My site seems to be getting the most traffic on internal pages with the most "Facebook likes" even though the content is thin. |
That's interesting - one of the main changes for my site was the inclusion of social share buttons - so Google Analytics was tracking social shares for the first time in years. That apparent growth in social shares (from nothing, to quite a regular stream of emails, FB, Twitter, etc) may have had an impact.
|Why would unverification transfer rankings to the new site? Doesn't the 301 do this? |
The 301's were in place when I updated the change of address in WMT. Some rankings crossed over to the new site, but they weren't as good and fluctuated a lot for a few weeks (a similar effect to the one described above). They didn't recover fully until the update on Sunday, which was the same week WMT stopped showing "change of address requested" and started showing "site unverified".
Most likely it's just coincidental timing and nothing to do with the recovery. But you could take from this that;
1) Moving content from one Panda-hit domain to another Panda-hit domain will do nothing.
2) But content / rankings from site 2 did effectively recover when site 1 did.
@marketingguy - thanks so much.
@kenneth2 - ditto. I've applied the same changes as you and MarketingGuy but still no results. As time goes on I'm thinking that overlapping content is one of the main issues (e.g. 2 or more pages/posts/articles that are about a similar thing). This is very hard to address on a larger site that is many years old.