What I know is, I had 5 sites Pandalized.
Where I have done major content fixes, the sites are recovering.
Where I have done nothing, those sites are as bad or worse.
|I am not even sure how you are guessing it to be a global rollout of Panda, and the information you have given is totally confusing. |
1) Are you talking about stats for different sites?
2) Are you seeing the drop of 40% now for sites in European languages but don't see any changes for sites in Asian languages?
3) Have you made changes to English language sites to recover the loss of 20%?
4) Was the recovery on the same pages where you lost traffic, or do you see a different traffic pattern now? Even among the few who claim to have recovered, it isn't a recovery of the pages that lost ranking/traffic; it is a different traffic pattern that ensured the same overall traffic volume. These few people are those who were minimally affected by Panda.
We are not seeing any big changes to English language sites that suggest an algo change for them, except for a few commerce sites.
1. No, this was on one site (my main focus).
2. Yes, European SE traffic has dropped some 25%. IT, DE, FR, EN-GB. No traffic changes (at least via Google) in Asia.
3. My English language traffic had returned to normal after fixing some dupe issues, canonical URLs and so forth, until Jun 14. Looks like I didn't go far enough.
4. Trafficked pages haven't shifted too much (less than 5%).
But looking at my country stats, it's pretty clear that this is twofold:
1) Expansion of Panda to at least Europe (foreign language) and most other regions (except for Asia). Matt Cutts claims this too:
|Danny Sullivan: Where are we now? Panda 2, Panda 2.1? Are we at Panda 2.2? |
Matt Cutts: There’s another change coming soon. I don’t know when we’ll launch fully internationally, not just in English.
2) A further iteration of the Panda update to fix the scraper issue from the first two rollouts.
Can anyone else corroborate my data?
nippi - can you explain in more depth about what 'major content fixes' have worked for you?
One of my major money sites was hit in Panda "V3" I believe in April (can't keep track anymore). Traffic dropped off a cliff as well as revenue that day. I just noticed late last night that my site has returned to its first page rankings where it was before the drop. Here's hoping it holds!
@martinacastro - I think we're seeing some major statistical sampling by Google and I'm trying to pin down the mechanism they're using. Have you taken notice of the Google IP address that each set of results comes from?
Hi tedster, I have checked the first page several times, and when my site changes position, 2 or 3 sites from the first page go to the second page. I also check Google proxies, and the results are very different.
Thanks for your comments
I can confirm a 50-60% drop in traffic on a previously non-Pandalized and stable site on an AU domain, commencing yesterday. The site had good authority links, strong social signals (FB and Twitter), unique content on key pages, some aggregated content on product pages, and has been around for nearly 8 years.
Pages weakly linked to seem to have held better.
Across Pandalized sites with new unique content added, there are no recoveries, and on one there is a further 30% downward movement yesterday.
Looks like a Panda 2.2 to me on this limited sample.
[edited by: Whitey at 12:04 am (utc) on Jun 19, 2011]
Whitey - Is there a way we could email offline? Your issue sounds similar to mine as we have some aggregated content too.
Anyone else get penalized in the past week which you think is from aggregated content?
Ok. Sites I manage are article business directory shop combination sites.
Articles in the main were not too original; ditto the descriptions of businesses and products. Some businesses and products also had short descriptions, the result being detail pages that were both duplicate and thin.
Using software, I found my 150 most duplicated articles and rewrote them. This was done by a writer of major-newspaper-level skill.
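Nippi doesn't name the software, but a rough sketch of how a "most duplicated" ranking can work, using word shingles and Jaccard similarity (the function names, 5-word shingle size, and scoring are illustrative assumptions, not Nippi's actual tool):

```python
from itertools import combinations

def shingles(text, k=5):
    """Split text into overlapping k-word shingles (a standard dupe-detection trick)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Overlap between two shingle sets: 0.0 (nothing shared) to 1.0 (identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def most_duplicated(articles, top_n=150):
    """Rank article IDs by their highest similarity to any other article."""
    sets = {aid: shingles(text) for aid, text in articles.items()}
    worst = {aid: 0.0 for aid in articles}
    for x, y in combinations(sets, 2):
        sim = jaccard(sets[x], sets[y])
        worst[x] = max(worst[x], sim)
        worst[y] = max(worst[y], sim)
    return sorted(worst, key=worst.get, reverse=True)[:top_n]
```

The pairwise loop is O(n²), which is fine for a few thousand articles; real dedupe tools typically switch to minhashing at larger scales.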
I reprogrammed my directory and shop so that all links to thin detail pages are now hidden from search engines (I have set this for all pages with a description of less than 125 words).
These pages were also noindex, nofollowed.
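A minimal sketch of that 125-word rule as template logic (the function name and markup are illustrative; Nippi's actual implementation isn't shown in the thread):

```python
def robots_meta(description, min_words=125):
    """Apply the rule described above: detail pages whose description is
    under 125 words get noindex,nofollow so thin pages drop out of the index."""
    if len(description.split()) < min_words:
        return '<meta name="robots" content="noindex,nofollow">'
    return '<meta name="robots" content="index,follow">'
```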
Using software, I determined which businesses had duplicate descriptions. I emailed them explaining the problem and offered them a rewrite of their description for $15; in return they would get a featured listing for 6 months (appearing above other listings, randomly appearing on my home page, and links set to follow). 1,400 of 27,000 went for it.
For the rest, I removed links to the dupe-content detail pages and noindex, nofollowed them.
I'm back, baby. Not 100%, but I shouldn't be. Still got lots of articles to redo, and I may never rank for some things I ranked on before. I've regained 80% of lost Panda traffic.
Ranking and traffic improvement is on all pages, but highest on the pages I fixed.
|Ranking and traffic improvement is on all pages but highest on the pages I fixed. |
@ Nippi - Was that in the last 24 hours or so ?
1,400 newly acquired descriptions, 150 rewrites, and dumping referral links to thin content looks like it tipped the balance sitewide, or was it just for those rewritten pages?
[edited by: Whitey at 12:15 am (utc) on Jun 19, 2011]
I added some new content (3/4 new pages) and updated 150 pages of my site (it has a total of 350), and my traffic is down 30%; some strong kws dance between positions 50 and 200.
Yes in the last 24 hours.
It really looks like noindex,follow URLs do contribute to your Panda score.
Some more observations on the limited sample I see:
Snippets of aggregated content correctly identify the original source as the most authoritative.
On this very isolated example, I'd block or rewrite all aggregated text descriptions. Better to have less content, well written, than lots of aggregated content.
The difficulty handling the fixes is the time between updates, and the limited number of reports. But for now the focus should be on firming up these reported returns and implementing remedial work fast.
[edited by: Whitey at 12:27 am (utc) on Jun 19, 2011]
I also see that Google indexes some pages with a noindex tag...
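One way to double-check which directives a page actually serves is to parse its meta robots tag yourself. A stdlib-only sketch (class and function names are illustrative); note too that if robots.txt blocks a URL, Googlebot never fetches the page and never sees the noindex, which is one common reason such pages stay indexed:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the directives from any <meta name="robots"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.extend(
                d.strip().lower() for d in a.get("content", "").split(","))

def is_noindexed(html):
    """True if the page's meta robots tag includes a noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" in parser.directives
```

In practice you'd fetch each suspect URL (e.g. with `urllib.request`) and run the response body through `is_noindexed` to compare what you serve against what shows up in the index.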
When did you notice the comeback? Was it a one shot thing or gradual?
In my mind, Panda is not even worthy of being called a new phenomenon. We were all able to get past Google's existing thin- and dupe-content rules with inbound links. And then we couldn't.
I am convinced any site can recover with improved content, and that it's content, not AdSense or page speed, that is the key. My pages are in the bottom 75% speed-wise and I have 3 AdSense blocks per page, with 2 above the fold.
Oh! That's the other thing. My articles are now all a natural length for the topic. That means some are 450 words long, some are 1,500 words long. Previously I had 650 articles of between 500 and 550 words.
Bam, one shot.
Thanks for the report.
Could you give us a rough timeline on when you started making changes, and how long that took, relative to your recovery date. Thanks.
I'll save Nippi some time since I was looking at just that (and he's probably at the liquor store right now; I know I would be ;)).
Message # 4302252 from Nippi [webmasterworld.com...]
|7:35 pm on Apr 21, 2011 (utc -5) |
I think it's clear that Panda applies a 2-part filter:
(1) All your #*$!ty pages get hammered.
(2) Your main pages, usually the not-#*$!ty ones, get hammered too.
You need to fix a great majority of the #*$!ty pages for the non-#*$!ty penalty to be removed site-wide. Fixing individual pages will undoubtedly see them improve, but you've got to cross that "I've fixed almost all of the #*$!ty pages" threshold to see a sitewide fix.
I have two sites, one penalized, one not.
Both have lots of articles.
Both have huge business directories, which by any definition are thin.
The difference is that the one not penalized has better articles (read: totally unique, not articles based on Wikipedia and slightly changed), and it also does not have lots of random content like "4 featured businesses" on every page.
The large thin business directory seems not to be the major problem. It's the article farm/dupe content/lack of content unique to the page, because you are putting content that appears on other pages on every page, that is the big problem.
I am rewriting articles as fast as I can, and yes, I see individual fixed pages rise in rankings within 7 days... am racing to that threshold.
If you claim to have all unique articles but still got hit, take a good hard look at them. Are they articles that are only slightly different from what else is out there? Are they on average... short... or on average... 500 words? If so, you need to harden up and take action.
[edited by: walkman at 1:06 am (utc) on Jun 19, 2011]
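Nippi's two-part theory quoted above can be sketched as a toy model (entirely speculative; the 90% clean-share threshold and the 0.3/0.6 demotion multipliers are invented for illustration):

```python
def apply_panda(pages, clean_share_needed=0.9):
    """Toy model of the hypothesized 2-part filter: thin pages are always
    demoted, and a sitewide demotion hits the good pages too until almost
    all of the thin pages have been fixed."""
    clean_share = sum(not p["thin"] for p in pages) / len(pages)
    sitewide_penalty = clean_share < clean_share_needed
    result = {}
    for p in pages:
        score = p["score"]
        if p["thin"]:
            score *= 0.3          # part 1: thin pages get hammered
        elif sitewide_penalty:
            score *= 0.6          # part 2: good pages hammered too
        result[p["url"]] = round(score, 2)
    return result
```

Under this model, fixing individual pages helps those pages a little, but the big sitewide jump only happens once `clean_share` crosses the threshold, which would match the "came whizzing back in one shot" reports.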
Nippi, do you have affiliate links or just adsense?
Cool thanks for the link and quote. I love you.
nippi Chug a beer (or 20) for us.
Ok on order
I fixed the business directory and shop 6 weeks ago. It had little effect; in fact, I dropped further shortly after.
Fixed 70 articles at the same time.
Then loaded 50 new articles 6 days ago. I believe these took me over the panda threshold and my site came whizzing back.
50 more going up tonight.
No, I have no affiliate links, but I would not be at all concerned about having them on quality pages, though there is really no reason to ever have affiliate links on a page that search engines can follow.
Thanks for sharing this update, Nippi, and congrats on being the first with the breaking feedback. Hopefully a lot more folks will soon wake to good news, it being late Saturday night / early Sunday morning across Europe / North America. Perfect timing and no accident.
Actually, as I speak there is continuing downward fluctuation on the newly affected site, so I imagine this iteration is going to take a while to settle down.
I suspect we are going to see winners and losers again, plus some insight into what allowed some sites to escape the earlier updates. Hopefully we'll have a lot more reports like Nippi's. Those actions were well anticipated.
|Hey to all, |
What we are all seeing at the moment (and I am only talking about GWT) is not Panda 2.2; it is in fact a pre-Panda-2.2 check (a little unfortunate, but I am not able to mention the name of the Google employee who provided me with this info). Now, I myself was badly Pandalized in March and then in April, and the only thing that came back is my sitelinks, but no traffic is coming back to pre-Panda levels at all.
One other thing that is still not being mentioned anywhere is that, in fact, between 24th February and now Google made 17 other rollouts of Panda, so I am totally lost as to whether it is Panda 0.1, 1.0, 2.0 or 2.2. What do we really know about this?
Bolding mine. That's a pretty big statement, AlexB; thanks for sharing it even though you can't verify the source. I noticed 200 new 404 error pages appearing in my GWT recently, pages that haven't existed since 2005 or so. They had been 301-redirected way back then as part of combining site features, and everything went smoothly.
Certainly, Googlebot picking now to review every URL it knows about suggests you might be right.
I can see on big sites (Google.es, IP from Spain, Spanish language):
-duplicate content and original content are similarly affected (same domain)
-rankings do not change for the most important keyword of each URL on the site
-not ranking for many of the long-tail keywords (I don't have previous data)
-now there are fewer landing pages and fewer keywords for each landing page
This reminds me of the Caffeine update.
Could be a spoonful of caffeine?
Could be pre-panda a spoonful of caffeine?
Could be, but then we'd need to figure out what Panda is.
I think Panda chews up keywords and spits out pages that don't taste good and he runs independently of traditional "big push" updates. It makes sense given recent longtail changes. It makes sense given G's need for speed, they want faster updates too. It makes sense given G's needing to expand beyond "links" as a primary metric since something has to apply what they've learned.
I think Google moved away from treating pages independently, at least a little. I think your overall metrics matter more now than ever before and that's partly why it's harder to float single pages up for longtail searches from a lower quality site.
I'm reading a lot about people having some success with recovering rank on good content by removing duplicate and low quality pages... I think we'll figure Panda out soon enough.
edit: Panda may be a more automated version of the old system that monitored page one results. GWT tells me that many unaffected pages didn't rank top 10 before the updates began anyway. We may not understand EXACTLY what Panda does just yet, but we can surmise what Google's intent and method are. Hopefully that clears up all of the confusion, fear, and loss of mortgage money without a clue as to why.
If Panda has ears... Please don't rank me for what I should be, rank me for what I am!
My site dropped 30% of search traffic from 14th June.
Congrats. Were you hit by February Panda?
OK, let me tell you how I feel this works. They collect data (pages) on their servers and then evaluate each of those pages against all their "quality" factors. This is an intensive process, and Panda is the guy who gave them a breakthrough from a technical perspective. He isn't responsible for laying down those quality rules, but he is the guy responsible for making this evaluation possible.
Considering the method used for evaluation, not all changes are analyzed by Google often. This might explain why sites are not coming back even though many claim to have fixed them. They probably run this evaluation every 4 months (a guess) and generate a Panda score. There are probably two scores: one for the individual pages and the other for the site as a whole. Pages with a quality score above a certain threshold level aren't pushed down harder, and they continue to do well to some extent. Pages that fall below this threshold level are pushed down badly (irrespective of their original ranking as determined by the relevance factors).
But there are several other changes that Google introduces now and then and these are applied in the usual way.
What might be worrying are these two:
1) Panda (= "Quality algo") isn't run often.
2) Its impact on rankings is much greater, as the site score may pull down even the best pages; i.e., while the other factors may assign a good rank to a page, this Panda algo has the ability to push it down by several pages in the SERPs.
If you know how the infamous AdWords Quality Score works, this might make more sense.
[edited by: indyank at 9:05 am (utc) on Jun 19, 2011]
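indyank's two-score theory can be sketched as a toy model (the thresholds and the sizes of the demotions are invented for illustration; nothing here is a confirmed Google mechanism):

```python
def panda_adjusted_rank(relevance_rank, page_quality, site_quality,
                        page_threshold=0.5, site_threshold=0.5):
    """Toy model: relevance factors decide the base rank, but a low
    batch-computed quality score (per page or sitewide) pushes the page
    down regardless of how well it would otherwise rank."""
    demotion = 0
    if page_quality < page_threshold:
        demotion += 20            # roughly "several pages" in the SERPs
    if site_quality < site_threshold:
        demotion += 10            # site score drags down even good pages
    return relevance_rank + demotion
```

Because the quality scores are only recomputed in periodic batch runs in this model, fixing a page would not move it until the next evaluation, matching the observation that recoveries lag the fixes.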
Australian site, so the February Panda hit my US traffic only; it was a noticeable but small hit.
Traffic is back to about 5% below February levels. Rankings are in fact higher on many terms, but also completely gone on some things, as those pages are now 'gone' from my site, so there's no chance to rank for them.
:) at indyank. We're in the same boat, so there's a lot of wishful thinking on our part.
The only reason Google would run it every 4 months is to punish us. Panda, judging by the patent, does take a lot of resources; so much that they had to devise a system to resume where it crashed. But this is Google in 2011: they are translating everything for free, hosting code, giving universities access to test stuff, and so on. So they have the resources, IMO. Probably not enough to run it live, but at least every week or two.
My theory: US sites hit on Feb 24th will suffer for being in the same boat as eHow, Mahalo, etc. Imagine if Mahalo or EzineArticles were back at #1? There's public relations at stake as well; it's not just engineering... just as it was when the press bashed Google pre-February. Hopefully they are gentler with smaller sites, ones small enough to be fixed manually. Even a /cat&3?prod=4 mess with dozens of empty URLs and some bad luck would probably have been enough to trap a site in Panda, IMO. All you need to do is send signals of spam; you don't have to actually be spammy.