They made 3 changes that took a total of about 30 minutes to do and they have made about a 90% recovery.
Yay! ...but, hang on, what were they?!
I have noticed that in my niche, real estate, the sites with the least articles and content are ranking better since the last update. Google seems to have penalized content, since all area content is similar and city information overlaps by 20% or so. The sites with no content, just real estate listings, are doing better. I have sites with 10 pages that do much better than my site with 500 pages of content.
The first 5 sites in my area have almost zero content, just listings. A big change.
I got hit by Panda 2.1 in early April (80% reduction in traffic), but the site returned to near normal 3 weeks later. Around June 22nd my traffic went down 98%, and almost all search terms that ranked well are now on page 6 or 7. Anyone else seeing a reduction like that? All pages are still indexed, and PageRank is still showing. Can I expect this to reverse as the previous Panda hit did, or is this something different that is occurring to my site? Very upset.
I cannot disclose the full details of the ecommerce site. 3 of their competitors were knocked out by Google for doing the same thing, and I am under a strict contract about disclosure, so I do not want to test the waters. I can tell you that they did not delete, noindex, or block any pages/sections, and they did nothing to their current duplicated product descriptions.
@brinked I guess the clue is in what you left out then ;)
However I'm afraid that what works for one site will not work for the next... otherwise your own site would be OK.
Just noticed - me a Full Member now - do I win a t-shirt or coffee mug ?
[edited by: johnhh at 10:38 pm (utc) on Jun 26, 2011]
Anyone else seeing a rise in the SERPs for what I call "1995 sites" - basic table design, no CSS, terrible design and colours - the sort of thing you did with free "design a page" software.
We also see a major rise for boilerplate "please send us info on red widgets" nicely diluted by some menus and boilerplate text with related keywords - reminds me of pre-2005 stuff.
[edited by: tedster at 11:24 pm (utc) on Jun 26, 2011]
When I took on these clients I tried to put panda completely out of my mind. Instead I focused on obvious flaws I saw from a user stand point and pointed them out to them.
We worked hard on improving the general quality of the site. One site had 10px font everywhere and was a bit hard to read so we increased that to 12px, added a customer support ticket system and many other small things like that.
I look at it from the perspective that if I cannot recover their sites' rankings, I can at least make their sites better than they were, which will count for something, especially if it means more conversions and better visibility.
These were 3 very different sites and I do not know if the changes we made even made a difference in panda or not, or if panda just rolled back an update or series of updates that was causing the ranking loss to begin with.
As someone who is providing as much detail as I can about what I've done that worked against panda, it's pretty single eyebrow raising to have someone report success whilst refusing to detail anything they did.
Aw, c'mon. He DID detail what was done on two out of three sites...
That third one was bound in a legal contract matter.
I dedicate a lot of time to researching and trial and error. I share my experiences with this forum out of my own kindness. A lot of people post simple theories and whatever is on the top of their heads at any given moment.
I am posting details about clients websites in which I have strict legal contracts with. I should not be posting anything at all about them, but I do to contribute to this community. I am always VERY transparent about what I do on my own websites, I share more than most do.
I like to post things I find based on research and facts and not on conspiracy theories.
Find it as eyebrow raising as you like; from now on I will post nothing about my findings.
What worked for him might not work for you. That's a fact, as you can see many sites going back and forth.
And if it's a loophole, google closes it.
Lastly, everything here is on trust. He can easily tell you to cut your title to 6 words because it worked. Or no-index your category pages. Or...someone can lie and say he came back or was penalized.
Since content is not working for me, I am testing making visitors go through a loop, not giving them the information right away as many of my competitors do. They are ranking, I'm not, and I'll report if it works. Maybe time on site really matters to Google, even if it's a bit annoying to the visitors. Google can't tell why they stayed an extra 30 seconds on my pages and might see this as a positive thing... the users must be amazed at how I included the other side's POV, or whatever ;). If this means you escape a 70-80% Panda demotion, you adapt. Eventually you find a middle line between annoying them too much and surviving. In the process you make the web better, or whatever Panda's goal is. Whether it makes sense or not means nothing; if Google put it there, it is so.
If you are a popular tech site with tens of thousands of followers and 1+million monthly visits you may want to try tweeting. Another one was un-pandalized after 4-5 days [google.com...] . Makes you go hmmmm...Panda didn't run again last night, did it?
What if Google expanded the size of their main index?
What if Panda were a method by which Google decides whether each URL is placed in the supplemental index or the main index?
I know, I know, the supplemental index was removed (hidden?) a long time ago, but...
-Why do many sites lose only long-tail visits? (very significant!)
-Why are many sites with original content affected? (very significant!) User experience data?
-Doesn't Panda penalize duplicate content? Apparently not; the scrapers are still in the search results.
Is Panda a filter that has been designed only to spot what it believes are low-quality pages?
The Main Index Ratio of all my sites that are affected (3 of 12 affected) is less than 10%.
Calculating your supplemental index ratio (roughly)
Total Pages Indexed = site:www.yoursite.com
Pages in the Main Index = site:www.yoursite.com -inallurl:www.yoursite.com
Main Index Ratio = (Pages in the Main Index x 100)/Total Pages Indexed
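If you're checking this across many sites, the arithmetic above is trivial to script once you've noted the two counts from the site: queries by hand. A minimal sketch (the domains and counts here are made-up placeholders, not real data):

```python
def main_index_ratio(total_indexed, main_index_pages):
    """Rough Main Index Ratio: the percentage of a site's indexed
    pages that sit in the main index, per the formula above."""
    if total_indexed == 0:
        return 0.0
    return main_index_pages * 100.0 / total_indexed

# Counts read off the two site: queries by hand (hypothetical examples)
sites = {
    "www.example-a.com": (500, 253),   # (Total Pages Indexed, Pages in Main Index)
    "www.example-b.com": (1200, 96),
}
for domain, (total, main) in sites.items():
    print(f"{domain}: {main_index_ratio(total, main):.1f}%")
```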
Now there are fewer landing pages and fewer keywords per URL (on my sites); the same thing happened in the Caffeine update. The long tail is distributed among a greater number of competitors. These new sites rank the pages most closely related, semantically/syntactically, to the searched keyword.
I've noticed those changes in the SERPs, which could mean there are changes in some factors of the algorithm, or it could be a recalculation with the new data.
Actually. Let me totally withdraw my comments and apologise. I stuffed up.
I missed the first post you made on this thread... I thought your first post was the one mentioning fonts etc.
I was like... oh come on dude... is that all you are sharing? I understand the need to keep some cards to yourself... but really, is that it?
The answer is no, it's not it, I got it totally wrong.
@Errioxa, you bring up some interesting observations. I've noticed some of these things myself, but I don't think they are all part of Panda, I think some may be changes to the rest of the algorithm.
However, there's no doubt that Panda came in response to the high number of URLs that Google was able to index since Caffeine was installed - Amit Singhal and Matt Cutts said that in an interview.
It's cool Nippi. All makes sense now and just a misunderstanding. Can't say I don't skim over many posts myself as well.
Ted, I have an autoblog with automatically generated original content, but it's very low quality. This low-quality content was created on purpose to test the Panda update (that's the advantage we have: Panda had to learn Spanish); sometimes the text is unreadable. The original content is in English (scraped), then translated with an API (not Google Translate) and published on the site. No link building.
Before the Panda update this blog received many visits; after Panda, traffic is down 80%. Google is able to tell the text isn't good! Semantics? Syntax? G knows it...
A second site, with original content (copied, rewritten, and published in Spanish), hasn't been affected. That content is good quality and original.
Another site is in several languages. The best content is in Spanish. The "es.domain.com" subdomain hasn't been affected; however, the English and Italian subdomains are. That content (Italian and English) is of lower quality than the Spanish content; sometimes, for example, Spanish text and English text are mixed. The domain's IP is Spanish.
Does Google use something like a spell checker? Perhaps G uses the spell checker from Microsoft Word :p
How does Google evaluate the quality of a URL or site? I'm seeing that my Google organic traffic from visitors with one visit or fewer has dropped significantly, but with an advanced filter for "more than 1 visit" the traffic has increased! Could it be that Google evaluates the quality of content with this metric?
When I apply these filters to data from the same month last year, it's the same... I don't understand it.
Big Media starts spinning Panda a great thing: NY Times [opinionator.blogs.nytimes.com ]
I enjoyed that article:-)
|Main Index Ratio of all sites that are affected (3 sites affected of 12) is less than 10%. |
Interesting idea, but in my case at least it doesn't appear to be related:
Pandalized site: 50.5%
Non-pandalized site: 56%
It seems unlikely that a ~6-point difference accounts for a ~60% drop in G referrals. Furthermore, the pages that suffered the worst losses are still in the main index.
Main Index Ratio of my sites:
Pandalized Sites: 5.1%
Non-Pandalized Site: 55%
Does it make sense?
@freejung, on the affected site, what % of the domain is in the main index? 50.5%?
I'm sorry, I don't understand the question. Pretty much the whole site is indexed, my sitemap in WMT claims to be completely indexed. I used the search queries you suggested to calculate what you're calling the Main Index Ratio for the pandalized site, and 50.5% is what I came up with.
I should point out that I'm not 100% certain that my traffic losses were caused by Panda (of course, nobody really is). Right before the Panda 2.0 update, I had about 3 hours of downtime which resulted in a number of pages showing a DNS error in WMT. I've had that sort of problem before and seen it negatively impact rankings -- but not this badly and not for this long. For the first few weeks I had no idea whether I was pandalized or not, but as time goes on I am more and more convinced that in fact I am.
|Does Google use something like a spell checker? perhaps G uses the spell checker of Microsoft Word |
They certainly do check spelling and grammar - they even score reading level, but I'm sure it's their own software, not Microsoft's.
|How does Google evaluate the quality of a url or site? |
That's been the big question here for the last four months. They appear to be using a lot of different signals, and then combining them in very complex ways. So for every person who is sure "it's all about THIS" there's another who says "I fixed that and it did me no good."
|I'm sorry, I don't understand the question |
Google Translator is to blame!
I don't know if it is a cause or a consequence, but it caught my attention. I don't think bounce rate is a factor or signal; I have TLD domains for each country, and some domains are affected while others are not. The bounce rate is similar between them.
On another, with traffic down 15%, the bounce rate is very low (<23%).
I think that semantics and/or syntax is a crucial factor. Google could compare different pieces of your own website and identify redundant content; in addition, Google could use something like a spell checker.
I know that I know nothing!
I have another site with only 500 URLs indexed, 100% in the main index, and 400,000 visits/month.
On this site I don't use pagination; the content is rewritten and of medium quality. This site isn't affected :)
Are there any reports of large sites coming back from Panda?
Most of what I've read indicates that smaller sites are the ones that have come back.
Given the huge cost of writing unique content onto large sites, I wondered if there is any consensus on whether it is best to completely block and/or remove large amounts and build back content, or block/remove a lesser amount and build out content.
There's a big cost consideration to this as large sites simply would struggle to take the risk of a big rewrite.
If anyone answers this in reference to their own successes or observations, could they indicate the size of the sites they are referring to.
I have noticed a lot of shuffling of the SERPs ever since Panda was announced/released worldwide.
The low-quality sites Google pushed into the top ten have ended up falling further and further down over the last few months. The websites that received a boost and are of good quality seem to be sticking around.
I like the fact that Google is not letting these poor-performing websites stick around. I also noticed that links do not seem to matter as much: a very low PR, small-backlink-profile website is performing very well, as it should be.
I guess I'll need to start putting more work into my sites from now on, no more half-arsed Amazon associates sites or AdSense Micro-Site Networks :(
At least hay was made while the sun shone. Time to get serious, either with PPC or hiring top-quality writers, and run it more like a business rather than just something I do in the evenings that pays for stuff.
|Are there any reports of large sites coming back from Panda? |
It's a big world out there. I guess silence means a resounding NO with a large gong sound? Y/N
|Main Index Ratio of all sites that are affected (3 sites affected of 12) is less than 10% |
My Pandalized (substantial, established, commerce) site: 7.5%
Competitor A (similar in size/age to mine, main beneficiary!): 31.4%
Competitor B (most similar business to mine, holding): 9.1%
Competitor C (large site, from out of nowhere, now falling): 5.4%
Competitor D (affiliate minnow, from out of nowhere): 75.4%
Competitor E (minnow, from out of nowhere): 32.3%
Competitor F (one of biggest players, holding): 25.7%
Competitor G (one of biggest players, holding): 17.9%
Competitor H (established but new to big terms): 53.1%
Competitor I (big established player, modest boost): 54.1%
Competitor J (substantial, established, modest boost): 41.4%
Competitor K (substantial, established, back after hiatus): 40.8%
Competitor L (one of biggest players, brand, slipping slightly): 21.3%
So, only Competitor C is lower than my site and all but one are significantly higher!
Of course, these numbers are but a symptom of the real problem, but what is that problem:
a) Shallow (however that's figured out) content?
b) A weight of poorly linked (internally) content?
c) Something else about the make-up of the pages that get excluded?