This 89 message thread spans 3 pages.
My Attempt to Recover from Panda
In full disclosure, I am an adult webmaster. Two of my profitable sites got hit. I haven't recovered nearly enough, but I am documenting what I have found.
First site: it has been running for 5 years and is very popular among people who enjoy this, umm... activity. The content is 100% unique; the site runs reviews and contests and serves as a one-stop shop for industry updates. It is heavy with affiliate links, which I am becoming convinced is why it got hit by Farmer.
Solution: I found some JavaScript code to encrypt the links. Google sees only a random string being passed in the onclick event; unless it can execute the script, it won't see the link.
Result: I tested it on the home page text links and saw some very nominal recovery (could be noise). I began the first posting under this method today and will test for a couple more days to make sure everything is fine with the click-through, then roll it out slowly to the rest of the site.
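A minimal sketch of what such an onclick-encoded link could look like. The function names, the Base64 choice, and the markup shape are mine, not the poster's actual script; Base64 is encoding, not real encryption, but it is enough to keep the raw URL out of the HTML.

```javascript
// Sketch of the onclick-encoding idea described above. The real affiliate
// URL travels as an opaque Base64 string; the href holds only a dummy value,
// so a crawler that doesn't execute JavaScript never sees the destination.

// Encode an affiliate URL into an opaque string (Base64 for simplicity).
function encodeLink(url) {
  return Buffer.from(url, "utf8").toString("base64");
}

// Decode at click time.
function decodeLink(encoded) {
  return Buffer.from(encoded, "base64").toString("utf8");
}

// In the page markup you would emit something like:
//   <a href="#" onclick="go('aHR0cHM6...'); return false;">review</a>
function go(encoded) {
  const url = decodeLink(encoded);
  // window.location = url;  // browser-only; returned here so it is testable
  return url;
}
```

In a browser you would use `btoa`/`atob` instead of Node's `Buffer`. Note that a crawler that does execute JavaScript can still follow the link.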
Site #2: a niche site that has ranked 1-3 for its main keyword (a competitive one) for 3 to 4 years. New posts every day, 100% unique, written by me. It lost 60% of its traffic when Panda rolled out; for its main keyword it dropped from position 2 to around the 5th page.
Solution: I did an analysis of the pages hardest hit. One I thought was interesting: it was a parody of made-for-TV commercials, where "widget" was mentioned multiple times in almost every paragraph. More interesting, I was not even trying to sell said "widget" (it appeared in the picture set). So apparently Panda doesn't enjoy sarcasm; any human reader would have thought it was funny. In any case, I rewrote the page and it recovered OK.
Second page: admittedly thin content. I added one paragraph and it recovered.
A number of my pages that got hit had duplicate titles and meta tags. I have been cleaning them up as I go (or deleting the pages). So far no change, but Google has not yet re-crawled them (as confirmed in Webmaster Tools).
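Finding those duplicates can be automated with a small script. This is a sketch; the path-to-HTML input shape is an assumption (a real site might feed it from a crawl or the CMS database), and it only checks `<title>`, though the same grouping works for meta descriptions.

```javascript
// Sketch: find duplicate <title> values across a set of pages so they can
// be rewritten or the pages deleted, as described above.

// Pull the <title> text out of a raw HTML string (simple regex heuristic).
function extractTitle(html) {
  const m = /<title[^>]*>([\s\S]*?)<\/title>/i.exec(html);
  return m ? m[1].trim() : null;
}

// pages: { "/path": "<html>..." } -> [[title, [paths...]], ...] for dupes only
function findDuplicateTitles(pages) {
  const byTitle = new Map();
  for (const [path, html] of Object.entries(pages)) {
    const title = extractTitle(html);
    if (!title) continue;
    if (!byTitle.has(title)) byTitle.set(title, []);
    byTitle.get(title).push(path);
  }
  // Keep only titles shared by more than one page.
  return [...byTitle.entries()].filter(([, paths]) => paths.length > 1);
}
```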
Also, no change for the main keyword; if anything it has gotten worse. I will probably also try the link encryption, since I am convinced this has something to do with affiliate links.
Anyone else have any ideas or findings they want to share?
|In any case I rewrote the page and it recovered OK |
|admittedly thin content. added one paragraph. It recovered. |
Wow - two ranking recoveries. That should give some hope to others. Interesting that you actually changed the content itself rather than some secondary signal.
Unfortunately, those were only two pages. 998 more to go.
You mean 998 pages where you have not yet made changes, or 998 pages changed but unrecovered?
Well, it's less than that, but I have a few hundred pages that got hit where I have not made changes yet. The others seem to be doing fine. Someone on the forum put forth the idea that Google somehow builds a score of what it considers "low quality" pages and dings the site as a whole if it reaches a threshold. So it's kind of hard to figure out which pages are hit because the page itself is poor and which are hit because the other pages are.
It's interesting that you can get individual pages to recover, since Panda is a site-wide penalty. That flies in the face of what we've heard and seen about it.
Also bad news for sites with large quantities of pages.
I think it's clear that Panda applies a two-part filter:
(1) All your #*$!ty pages get hammered.
(2) your main pages, usually the not #*$!ty ones, get hammered too.
You need to fix a great majority of the #*$!ty pages for the site-wide penalty on the non-#*$!ty ones to be removed. Fixing individual pages will undoubtedly see them improve, but you've got to cross that "I've fixed almost all of the #*$!ty pages" threshold to see a site-wide fix.
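The two-part filter described above can be sketched as a toy threshold model. All numbers here (the 0.5 per-page cutoff, the 0.9 site-wide threshold) are invented for illustration; Google's actual scoring is unknown.

```javascript
// Toy model of the two-part filter: per-page quality scores plus a
// site-wide penalty that lifts only once "almost all" pages are fixed.

// pageScores: array of per-page quality scores in [0, 1] (hypothetical).
// Returns 1 while the site-wide penalty is on, 0 once it lifts.
function sitewidePenalty(pageScores, threshold = 0.9) {
  const okShare =
    pageScores.filter((q) => q >= 0.5).length / pageScores.length;
  // Fixing individual pages raises okShare, but the penalty stays on
  // until the share of acceptable pages crosses the threshold.
  return okShare >= threshold ? 0 : 1;
}
```

Under this model, fixing one page out of many changes nothing site-wide, which matches the "racing to that threshold" experience posters describe below.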
I have two sites, one penalized, one not.
Both have lots of articles.
Both have huge business directories, which by any definition are thin.
The difference is that the site not penalized has better articles (read: totally unique, not articles based on slightly changed Wikipedia content), and it does not have lots of random content like "4 featured businesses" on every page.
The large thin business directory seems not to be the major problem. The big problem is the article-farm/duplicate-content side: the lack of content unique to each page, because content that appears on other pages is repeated on every page.
I am rewriting articles as fast as I can, and yes, I see individual fixed pages rise in rankings within 7 days... am racing to that threshold.
If you claim to have all unique articles but still got hit, take a good hard look at them. Are they articles that are only slightly different from what else is out there? Are they, on average... short... or, on average, 500 words? If so, you need to harden up and take action.
Ummon, keep at it. And please keep us updated about your progress. Very encouraging.
|Someone put forth the idea on the forum that somehow Google builds a score of what it considers "low quality" pages and dings the site as a whole if it reaches a threshold. |
That idea was being floated pretty strongly by Google engineers interviewed at the TED conference. So it's not just any "someone".
The pages Panda thinks are really crappy in themselves get the deepest demotions. The rest of the site gets a less severe demotion because it's somehow contaminated and doesn't deserve as much search traffic. From what I've seen it looks like a weighted demotion, depending on how many clicks separate the other pages from the Panda-targeted pages. I'm not 100% on that pattern, but my preliminary look seems to suggest it.
The idea is, I think, that Google wants to send visitors to SITES that give them a good experience, not just to "a page" that is relevant for their query phrase.
My first post here. :) I have one site affected by Panda and have been lurking here looking for tips. I did get it out of the penalty a couple of days ago, though traffic from Google is still only 30% of what it was. It's a celebrity site that lost 80% of its traffic on Feb 24th and was no longer found for its main keyword.
It is a WordPress blog. I deleted about a third of the posts because they were "thin content," installed a new theme, and made some new posts. That was not enough.
I then installed the PubSubHubbub and WP to Twitter plugins and made a couple of posts, and that is what finally pulled the site out of Panda.
Hope this helps someone.
To continue what tedster is saying, the homepage could be devalued, and most likely will be, since the homepage is usually essentially a portal into those other lower-quality pages/sections of content. My advice to everyone affected is to evaluate all the pages that are linked from the homepage and start your evaluation there.
Evaluate your site as a whole and make any quick improvements. Do all your links work? Are there repetitive page titles/meta tags, etc.? Are all the little details updated? Example: "copyright 2007-2011" (have you updated it to 2011 from 2010 yet?). Do not leave out the little things.
Also, if you don't have one, add an About Us page. It could possibly help, and it makes your site look more like an authentic company.
I don't think an about page or up-to-date copyrights will do you any good. My index page was hit and that wiped out the whole site. I reduced the number of posts displayed on the index page so the content was easier to manipulate. Then I pushed out fresh new posts with the plugins I mentioned above.
Welcome to the forums, snackers. I hope everyone has registered the fact that you DID recover traffic for your main keyword after seeing it buried deep by Panda.
Thanks for posting about it - there are very few success stories around right now, but the fact that several people posted today about returned rankings is heartening. The feeling was growing that once demoted you would be stuck in the pit for a long time.
It's good to know that there can be upward movement.
[edited by: tedster at 2:32 am (utc) on Apr 22, 2011]
|It's interesting that you can get individual pages to recover, since Panda is a site-wide penalty. That flies in the face of what we've heard and seen about it. |
People keep ignoring the fact that just because a site has been Pandalized doesn't mean all your top-10 rankings are gone, nor that the site is unable to rank new content.
Certain pages, even pages that are 100% identical to every other page site-wide, can still rank like crazy while the rest of the pages slip-slide into oblivion.
There's something obviously being overlooked, it can't be that hard to figure this out.
Or... what Tedster said.
Thanks for hammering away on that point, Bill. It's an important one.
1. First comes tagging a page (or pages) as what Panda feels is shallow content.
2. Then comes a site-wide calculation of how much those page scores are going to influence scores for other pages on the site.
3. And since Panda 2, the possibility that some pages will be identified, even on a Pandified site, that the algo decides are good enough to give a boost, past the negative scoring.
I've seen real world examples, so I know that a site can lose a lot of rankings and still be strong for other keywords, even "big" keywords.
|2. Then comes a site-wide calculation of how much those page scores are going to influence scores for other pages on the site. |
Is there any data on whether the collateral damage correlates to the flow of page rank?
Meaning, do the pages that are linked directly from a bad page (or a group of bad pages) suffer more than pages that are more distantly linked from the bad page(s)?
Or from what you can see, is it more or less a "blanket" effect that affects all pages more or less equally?
A quarter of our 6-year-old site was untouched by Panda, and that was the blog. The rest of the site kept high positions for non-competitive keywords, even two-word combinations. The problem is only with highly popular searches! A bunch of our top 20 queries suffered, and those are around 75% of the traffic. By the way, check out the Earth Day Google logo; you will find a nice animal hiding there. Guess who?
>>>(1) All your #*$!ty pages get hammered.
>>>(2) your main pages, usually the not #*$!ty ones, get hammered too.
I think this is a really good theory. The problem for me is figuring out which pages Google considers the crappy ones. There's no pattern, that I can see, to the way my pages were deranked. Most of the ones deranked the hardest were actually the thickest: 1000+ word articles composed entirely of original writing.
So the trick there is how to figure out what Google thinks is thin, because it doesn't necessarily penalize the thin pages the most.
>>The pages Panda thinks are really crappy in themselves get the deepest demotions.
@tedster How would you fit this theory with my situation, where many of my heaviest demotions were the thickest, most content heavy, most original pages?
|do the pages that are linked directly from a bad page (or a group of bad pages) suffer more than pages that are more distantly linked from the bad page(s)? |
I'm still studying this area, but the strongest correlation I see so far involves collateral demotion for pages that link TO the "bad" page, not pages that are linked FROM it. If Google's purpose is to keep their users from visiting pages that make for a poor experience, that direction (linking TO the bad page) would also make more sense than flowing the demotion outward FROM the bad page.
The challenge is this: there is no easy way to know if a mildly demoted page is catching the site-wide collateral damage score, or if that page also "earned" that smaller demotion in its own right.
About 4,000 pages on one site and 2,000 on the other! We are going page by page. I personally removed about 20 with thin content already.
@shatner If all or part of your thick, 1000+ word articles is available anywhere else and you are not the #1 rank (to check: google them, sentence by sentence, in quotes), then Google quite possibly considers them #*$!ty pages. It doesn't matter how original they were when you wrote and posted them (or how unfair it may be if someone scraped or copied them); all that matters is: how original are they TODAY, and does Google see them as YOURS?
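The "google them, sentence by sentence, in quotes" check can be semi-automated by generating the exact-match query URLs. The sentence-splitting heuristic and the six-word minimum below are my assumptions; running the queries (or using a search API) is left to the reader.

```javascript
// Sketch of the scraped-content check suggested above: split an article
// into sentences and build quoted, exact-match Google query URLs.
function sentenceQueries(text, maxSentences = 5) {
  const sentences = text
    .split(/(?<=[.!?])\s+/)          // naive sentence split on punctuation
    .map((s) => s.trim())
    .filter((s) => s.split(/\s+/).length >= 6); // short fragments match everywhere
  return sentences.slice(0, maxSentences).map(
    (s) => "https://www.google.com/search?q=" + encodeURIComponent(`"${s}"`)
  );
}
```

If a quoted sentence returns other sites ranking above yours, that page may be suffering from the scraper problem discussed later in this thread.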
|I think it's clear that Panda applies a two-part filter |
(1) All your #*$!ty pages get hammered.
(2) your main pages, usually the not #*$!ty ones, get hammered too.
I think the transmission mechanism from (1) to (2) is internal links.
In the past, if a page got filtered, it lost its rankings but not its link juice. Now it appears that the link juice it passes diminishes too, so if you have too many of these weak pages, they start to topple everything else as the links supporting other pages weaken.
To me this is the reason HubPages got hurt while Squidoo didn't. Squidoo has never had much internal linking, so a bad page is isolated; other pages never really relied on links from it to rank and are therefore unaffected.
Of course all pages on a site link back to the homepage, so if you have a lot of bad pages with weakened link juice, the homepage should start to topple too, as the stuff that previously supported it gets taken out.
Edited to add: according to Quantcast [quantcast.com], Squidoo's traffic is now higher than it was during January and February, even though they attract the same sort of spam that HubPages attracts. The difference must be that their site structure isolates the bad pages so they can't contaminate the good ones.
Think of it as PageRank in reverse, or negative PageRank.
It used to be on my site that if I had PR5 on the home page, a page linked from it had PR4, a page one directory level down from that had PR3, and so on.
What if Google now applies the same PR concept it applied to strong pages to weak pages as well?
I used to get ranking simply from anchor text in internal links.
What if the same type of idea is now applied negatively? The more internal anchor-text links to bad pages, the more likely, Google estimates, users are to have a bad experience on that site.
In theory then you have to remove the weak pages and/or the links to those weak pages.
Google used to lower rankings for sites that linked to "bad neighborhoods" on other sites. What if Google now lowers rankings for links to weak neighborhoods (pages) on your site?
As much as I hate the effect, if I understand the theory correctly, it's very clean, logical, and consistent with the original Google principles.
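A toy version of this "negative PageRank" idea: seed "badness" on known-thin pages and let a damped share of it flow backwards onto the pages that link to them. The function name, the damping factor, and the whole model are illustrative guesses consistent with the theory above, not Google's actual math.

```javascript
// Toy negative-PageRank sketch: a page inherits a damped share of the
// badness of the pages it links TO, matching tedster's observation that
// collateral demotion hits pages linking TO the "bad" page.
// links:    { page: [pagesItLinksTo...] }
// badSeeds: Set of pages independently judged thin/low quality
function propagateBadness(links, badSeeds, damping = 0.5, iterations = 10) {
  const score = {};
  for (const page of Object.keys(links)) {
    score[page] = badSeeds.has(page) ? 1 : 0;
  }
  for (let i = 0; i < iterations; i++) {
    const next = {};
    for (const [page, targets] of Object.entries(links)) {
      // Average badness of the pages this page links to.
      const inherited = targets.length
        ? targets.reduce((sum, t) => sum + (score[t] || 0), 0) / targets.length
        : 0;
      next[page] = (badSeeds.has(page) ? 1 : 0) + damping * inherited;
    }
    Object.assign(score, next);
  }
  return score;
}
```

In this model a homepage linking to many thin pages accumulates badness even if its own content is fine, while a site structure that isolates thin pages (the Squidoo case above) keeps the contamination contained.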
|What if the same type of idea is now applied negatively. The more internal anchor text links to bad pages, the more likely, Google estimates, users will have a bad experience on that site. |
Or it could be a ratio of internal links to external links. If a page only has internal links, it means no one outside thinks it is worth linking to (and links from scrapers don't count, as they link indiscriminately).
A good test of the above would be to find out whether pointing fresh external links at a page revives it.
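One way to quantify that ratio, assuming you already have a list of known inbound link URLs for a page (where that data comes from is up to you; this is just the arithmetic):

```javascript
// Sketch of the internal-vs-external inlink ratio idea above: of the
// inbound links known for a page, what share comes from the same host?
// A share near 1.0 means no outside endorsement at all.
function internalLinkShare(pageHost, inboundLinkUrls) {
  if (inboundLinkUrls.length === 0) return null; // no data, no verdict
  const internal = inboundLinkUrls.filter(
    (u) => new URL(u).hostname === pageHost
  ).length;
  return internal / inboundLinkUrls.length;
}
```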
|tedster wrote: |
I'm still studying this area, but the strongest correlation I see so far involves collateral demotion for pages that link TO the "bad" page, not pages that are linked FROM it.
Have you seen, or even looked, for any correlation between a demoted page and external links to "bad" pages/sites? I imagine not being able to see Panda's effect on that external page/site would make that difficult, at best.
I ask, because I've seen the idea brought up elsewhere in this subforum, but I don't think it received much comment. If internally linking to a poor quality page is bad for the linking page, then it seems reasonable that externally linking to a poor quality page/site may also have an effect. In the same way that you feel Google doesn't want to send people to "bad" sites, they may also not want to send people to sites that associate with "bad" sites.
That's also something that I think can't be easily abused. The webmaster is in almost complete control over external links and in situations where they don't have that control, they almost certainly have enough to "nofollow" those links.
Hey guys, I need your thoughts. I have a network of sites that run AdSense. They were all hit at the same time and traffic dropped 90%. When you search for the domain name or any other keyword we were ranking for, we are on pages 5, 6, 7.
But with a site:domain.com query the site is there, PR is the same, and the number of indexed pages did not change... I am almost sure it's AdSense that caused this. Did any of you see the same penalty?
|it seems reasonable that externally linking to a poor quality page/site may also have an effect. |
I don't think that's part of Panda - because it has already been active in the core algorithm for several years. In fact, saving your site from that kind of demotion was the original reason the rel="nofollow" attribute was invented for links.
@tedster said: >> @shatner If all or part of your thick, 1000+ word articles is available anywhere else and you are not the #1 rank (to check: google them, sentence by sentence, in quotes), then Google quite possibly considers them #*$!ty pages. It doesn't matter how original they were when you wrote and posted them (or how unfair it may be if someone scraped or copied them); all that matters is: how original are they TODAY, and does Google see them as YOURS?
That's what I'm thinking too. Which perhaps suggests that one of the main reasons I am Pandalized is that my content has been so widely scraped, since my most heavily penalized pages have the most content?
If that's the case, what do I do? I can't stop the scrapers. This seems like a question Google really needs to answer.
Does anyone see templated title/description/h1 content as being a big negative? I'm suspecting one of my sites is suffering from this.
|Which perhaps suggests that one of the main reasons I am Pandalized is that my content has been so widely scraped, since my most heavily penalized pages have the most content? |
Same problem here. My pages that rank badly are the ones that were heavily scraped. It looks like the scrapers have literally taken my positions. I know from experience approximately where I can rank for some articles, and in those positions are scrapers with texts taken from me.
We have discussed it here: [webmasterworld.com...] and there's some info on how to stop scrapers.