|Many Weeks since the Panda Update - Any Improvements?|
It has been two weeks now since Google's Farmer update on Feb. 24th. For the sites that were affected, has anyone seen any improvements? On my site, we started removing low-quality content a week ago, but have not seen any ranking improvements so far.
BTW: Google did not admit to having a 'white list'; they call it an exception list. The one site I know of on such a list (via tedster, referring to a situation where John Mu made a manual exception for a site) gets flagged for a hand review every time it trips the filter that would normally remove it from the results. To me that doesn't sound like a white list at all... You don't hand-review sites on a 'white list'; you let them do whatever they want.
I do think we should try to keep the discussion separate though, because right now we have too many topics in too many places to actually discuss anything, so I probably won't be replying to any more 'listing posts' in this thread...
Let's take the listing discussion back where it belongs, here:
Does Google Keep a White List [webmasterworld.com]
You must be skipping a lot of posts if you think Google is using "thin content" as a quality metric. All three of my sites are fat content. The majority of our pages are a couple thousand words with multiple pictures as well.
I'll toss out something on the other end. I have a site that I am currently working on that has a lot of "thin" content - some areas that haven't been fleshed out yet, and others that just by the nature of the content, it'll probably always be pretty thin, because the users have indicated that that's what they want.
However, it's content that is not readily available anywhere else, is shared a lot, generates a fair amount of return visits, and is doing fine in the search engines; growing every day, in fact. Panda had no discernible effect on it whatsoever.
I'm thinking it's a mistake to just focus on the "thin" part.
One paragraph of unique, useful, popular, necessary, and/or shared content is probably worth more than 20 paragraphs of content where most of those adjectives don't apply.
The sad thing is that considering many good sites were affected negatively by this update, and poor sites like eHow gained, we can make all of the assumptions we want on our sites like netmeg and his thin content, or others making changes, but what if we have nothing wrong on our site, and Google struck us down any way?
We could be chasing our tails.
speaking of assumptions, netmeg's thin content is hers, not his
Wrong may not be the best way to think about it, because this is not black or white, right or wrong. It can't be, when there are only ten spots on the front page for eleventy billion search terms (and websites). It's a bunch of things that have to align in certain ways. And of course we're always gonna be chasing our tails, because nobody has the answer, probably not even Google.
But we can look at the signals.
|However, it's content that is not readily available anywhere else, is shared a lot, generates a fair amount of return visits, and is doing fine in the search engines; growing every day, in fact. Panda had no discernible effect on it whatsoever. |
I would deem that "The eHow Effect" - "thin" (or even "wrong") isn't a problem as long as the content is unique. When I've investigated the pages with the biggest traffic drop-offs, I'm finding either (1) huge amounts of scraping and plagiarism, with Google having difficulty figuring out that the content originated on my site (before this update they were much better at figuring that out, or perhaps they simply weren't "punishing" my site by accident), or (2) that by the nature of the content there's similarity between the pages and what you might find in similar articles, and that's dragging down the "uniqueness" quotient.
While some people speak of a site-wide effect, I'm seeing a page-by-page effect. I think this algorithm change was the algorithmic equivalent of a shotgun, not a scalpel, so it's possible for pages and sites to be hit in any number of ways.
[edited by: tedster at 11:52 pm (utc) on Mar 18, 2011]
[edit reason] fixed the quote box [/edit]
|While some people speak of a site-wide effect, I'm seeing a page-by-page effect. |
I see both. On the sites with pages that got hit, those pages do seem to be dragging down the other pages, though many are still on page 1 or 2 for their main terms, just lower than they used to be.
On my main money making site that got hit, one main topic area was specifically hit so it is quite obvious what they didn't like.
I see the same thing also, a few pages moved up, but most moved down. It seems the net effect was to bring my whole site down 50% to 60%. One page that moved up, and still ranks on page 1 for some good terms, is a "thin" affiliate page. It was a page that I linked to most conservatively (only once from the homepage, and in the top navigation of all other pages). Pages that got hit the hardest had 2 or more affiliate links AND were linked most frequently (3+ times) from the homepage with varying anchor text.
I did have one page that suffered the greatest position drop (-300 according to WMT), and that page had a spelling error in the top header tag <H3> and a very short/duplicate title/desc. Since making those corrections, WMT reports a gain of 200 positions for that page.
Most of my pages are error-free, ad-free, informative, and unduplicated. Only 4% of my pages contain any affiliate links whatsoever. So, I am pretty much at a loss. I have revamped some of my content on the thinnest pages (that contain affiliate links), and will eventually file a reincl. request if these changes don't help.
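The short/duplicate title and description problem mentioned above is easy to check for mechanically. Here is a rough sketch, in Python, of a duplicate-title audit; the file paths, HTML snippets, and the 10-character "very short" threshold are all hypothetical, and a real audit would parse HTML properly rather than with a regex:

```python
# Rough sketch: flag pages with duplicate or very short <title> tags.
# Paths, sample HTML, and the length threshold are illustrative only.
import re
from collections import defaultdict

def audit_titles(pages):
    """pages: dict mapping path -> raw HTML. Returns (duplicates, too_short)."""
    seen = defaultdict(list)   # title text -> list of pages using it
    too_short = []
    for path, html in pages.items():
        m = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
        title = m.group(1).strip() if m else ""
        seen[title].append(path)
        if len(title) < 10:    # arbitrary cutoff for "very short"
            too_short.append(path)
    duplicates = {t: paths for t, paths in seen.items() if len(paths) > 1}
    return duplicates, too_short

pages = {
    "/widgets/a.html": "<html><title>Widget A reviews and specs</title></html>",
    "/widgets/b.html": "<html><title>Widgets</title></html>",
    "/widgets/c.html": "<html><title>Widgets</title></html>",
}
dups, short = audit_titles(pages)
print(dups)   # {'Widgets': ['/widgets/b.html', '/widgets/c.html']}
print(short)  # ['/widgets/b.html', '/widgets/c.html']
```

Running something like this across a crawl of your own site surfaces the same kind of issue WMT was hinting at, without waiting for a ranking drop to point it out.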
I was trying to crash two sites which survived this panda mess. Added excessive adsense above the fold, released thousands of thin URLs ... the sites are still fine.
|I was trying to crash two sites which survived this panda mess. Added excessive adsense above the fold, released thousands of thin URLs ... the sites are still fine. |
They have either been vaccinated against Panda Flu or Google needs to calculate their score again (with the bad pages).
Something is going on. I get 6 Google visits in 1 minute and then nothing for 5-10 minutes, while Bing and Yahoo (Bing) are way more evenly spread.
Open the faucet Google. I'll even go to church or something :)
after yet another sleepless week, this thread made me see things through a different perspective.
first of all, I can't really see the point of focusing on this white list. from what I remember from my college years, when we had to analyze stats, the first thing we had to do was remove the top highs and the bottom lows from the data set. This white list seems like a top high to me. Unless you have a real chance of getting on it, why bother? And if you do get on it, then every little alarm you trip will probably result in a manual evaluation. This means that even a small, honest mistake could mean the end.
|I'll toss out something on the other end. I have a site that I am currently working on |
I have a feeling G is now using data from its history books. Your site seems new, so this could be one of the reasons why you're not seeing any effects. On the other hand, because it's new you could be coding it differently. Maybe you threw in a few ideas you picked up along the way without realizing it.
|and poor sites like eHow gained |
after reading a lot of HTML docs lately, ehow seems to be a lot more 'quality' than before...
regarding the scraped content, i can't get it either. intellectual property should be recognized no matter the algorithm or other interests.
|I was trying to crash two sites which survived this panda mess. |
no offense SEOPTI, but for me this seems to be an extreme exercise. if this is happening on a larger scale, it could explain some of the interesting results we see in the SERPs.
|I have a feeling G is now using data from its history books. Your site seems new, so this could be one of the reasons why you're not seeing any effects. On the other hand, because it's new you could be coding it differently. Maybe you threw in a few ideas you picked up along the way without realizing it. |
Nah, the site pre-dates Google by at least three or four years.
|but for me this seems to be an extreme exercise |
If they're test sites with little value or useless traffic, the information on what crashed the sites would be very valuable.
|I was trying to crash two sites which survived this panda mess. Added excessive adsense above the fold, released thousands of thin URLs ... the sites are still fine. |
I'm not sure that should be a surprise. Hasn't there been a post somewhere by a Googler about the recovery process taking some time (a month or more) for collateral-damage sites, or for target sites that rework themselves?
If that's the case, might it not be reasonable to think that they might not move too quickly while that initial process works its way through the system?
I find 2 things interesting about the recent update:
1) The pages of my site that were hit the hardest, and ultimately brought my traffic down 50% through some feedback loop to the homepage, were affiliate pages.
2) Since yesterday, I have had two affiliate managers (from two different companies that I work with) email me to ask why volume was so low, and told me that they have seen a drop in volume across the board since the beginning of March.
[edited by: crobb305 at 3:34 am (utc) on Mar 19, 2011]
wish i had other priorities 3-4 years before google :)
i'm still having a hard time defining 'thin'. there are too many mixed signals. some of them are 'common sense' level, but others don't make any sense at this point in my case.
|If that's the case, might it not be reasonable to think that they might not move too quickly while that initial process works its way through the system? |
If it makes your site better, I don't see a reason to "wait and see" - improving your site and content should help whether or not Google comes through with a fix.
|i'm still having a hard time defining 'thin'. |
At this point I'm going to announce the eHow Principle: "You can never be too rich or too thin". Seriously, I think this is principally about "unique", not about "thinness".
Someone asked whether this was a site-wide effect or a page-by-page effect. I noticed some pages fell while others gained. Is that a site-wide effect? Maybe. Maybe I gained in SERPs that were dominated by lower-quality content. The Farmer/Panda updates might have pushed them down more than I got pushed. So it appears to be on a page-by-page basis, but in reality it was site-wide.
Thin isn't a death sentence; the statistics being thrown around just don't back up that assumption, and it's not thin-site owners out in droves complaining right now.
That doesn't mean thin isn't a problem, but at most it's "thin in the wrong places, like the index page, is a problem". IMO thin includes ads and affiliate offers, so LOTS of those is still thin.
I'm sick of hearing about eHow; my 7-year-old nephew has written better pages than they have on some subjects. He just can't pump out 100,000+ of them a month like eHow can. If he could, we'd have a 7-year-old Wall Street millionaire, which exemplifies the fact that something is wrong.
To me it's like a rich bully forcing poor kids to get handouts at the church for him. Bulk cheap.
I think sites received a cumulative grade which became part of their overall metrics, and their rank now fluctuates within that grade to a larger extent. I'm hoping the test gets run again soon, and frequently; right now it's like a glass ceiling. It doesn't matter what was graded, because the report card hits after study time is finished; there is no going back, at least until next semester, assuming there will be another test. Until then your pay grade is stuck in a rank grade.
I have seen THIN sites doing very well
I have seen THIN AFF sites doing very well.
I have seen horribly designed pages doing very well.
It's a combination of sorts, and I am hoping Google runs the report card again soon. I believe sites haven't come back because Google hasn't re-calculated anything yet. Personally, I had way too many thin tag pages (almost more of those than real pages) and maybe too many pages with links only. My site flows from
Home > Category > Product
and Home > Letter a-z > Product
and on the category and letter pages I have only the links to each product in that category /letter. Frankly if I add anything there, it would be for Google only, because this navigation makes perfect sense and people just want to get to the product, not read nonsense.
Ladies and Gentlemen, this is a private panda illuminati party, the doors are closed now for non-members but you are invited to try your luck later this year. <conspiracy and sarcasm>
How much are tickets for the Panda party and do you accept Paypal?
I noticed another interesting thing here. I found that some of my articles with comments are not doing well after Panda. Some articles without comments, or with few comments, gained rank. I also saw that some of my comments were not filtered for bad words, so I applied a bad-words filter yesterday.
I know that Disqus shows Ajaxed comments; I don't think search engines can read those comments unless you submit a sitemap. We get a minimum of 10 comments a day. In general, is it a good idea to show Ajaxed comments instead of on-page, search-engine-friendly comments?
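For context on the question above: comments injected by JavaScript after page load are invisible to a crawler that doesn't execute scripts, whereas comments rendered into the HTML at page-generation time are part of the source a crawler fetches. A minimal sketch of the server-rendered approach, with hypothetical comment data (this is not Disqus's actual API):

```python
# Minimal sketch: bake comments into the HTML so they appear in the page
# source itself. Author names and texts below are made-up sample data.
from html import escape

def render_comments(comments):
    """Return an HTML fragment with every comment present in the page source."""
    items = "\n".join(
        f'<li><b>{escape(c["author"])}</b>: {escape(c["text"])}</li>'
        for c in comments
    )
    return f'<ul class="comments">\n{items}\n</ul>'

comments = [
    {"author": "visitor1", "text": "Great write-up on the widget models."},
    {"author": "visitor2", "text": "Where can I buy one?"},
]
html_fragment = render_comments(comments)
print(html_fragment)
```

A common compromise is to serve comments this way in the initial HTML and let JavaScript enhance them afterward, so both crawlers and users see the same content.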
Well, since my penalty hit on March 10, I have been taking recovery steps on the Panda trail. However, I am starting to think that my penalty may be more of an OOP (over-optimization penalty).
In analyzing my WMT data, one thing jumped out at me. On the day of my penalty, my ranking for my main 2-word phrase fell 300 positions. As a result, all longtail phrases that incorporate this 2-word phrase fell 20 to 40 positions. Back in January, I added an H1 tag to my homepage, which has that 2-word phrase in it. How it can go two months before a penalty hits is beyond me. I have seen two deep crawls since that change was implemented. For the life of me, I can't understand why G would penalize for a legitimate use of an H1 tag, but maybe the fact that a 6-year-old site never used a header tag suddenly starts to use one raised a red flag. Furthermore, the rapid increase in traffic that I saw in mid to late February (30 to 40 percent) may have caused the algorithm to take a closer look at the new H1. So, at this point, I have no idea which road to take, but I think I have done all I can do. I have taken out the H1, I have added new content, I have cleaned up spelling errors, etc. All bases seem to be covered.
I'm still baffled by the fact that my penalty hit A) 2 months after I added an H1 and B) 2 weeks after Panda was unleashed on Feb 24.
My traffic has dropped 50%, so I suppose it could have been worse. It could have been 100%. I still have 8 sitelinks for my website name (which I think is a good sign).
I'll keep you posted.
I'm curious to know if you guys who have been hit by penalty are seeing any phrase-specific drops like I have, or are the declines in position fairly evenly distributed?
For the sites I've analyzed, Panda ranking drops are very much phrase specific. To be exact, they are drops for a phrase/page pairing. At the same time, other phrase/page pairings can improve, not just maintain.
As far as an OOP, I haven't seen those penalties given to phrases that are extensions of the core phrase - only the core phrase on its own gets nailed. In fact, those longer phrases are sometimes the one slightly bright spot after an OOP hits the main phrase.
Have you guys noticed that there are location-specific drops too? I have a few locations that dropped 99% of all traffic from that page, while others dropped only 30-40%.
Here are some locations with around 90+% drops:
New York: Brooklyn, was 150/day, now 2/day
New York: Bronx, was 70/day, now 2/day
Texas: Irving, was 150/day, now 10/day
Also saw another hit on March 6. I guess their algo is still at work.
It seems to me that they picked the mid-to-large-size sites for this experiment while giving a free ride to others, so I do agree with SEOPTI's statement that this Panda experiment is a VIP one.
Just subscribed to this site, because if Panda has taught me anything, it's that it was a mistake to focus solely on user experience and content and ignore search engines. That's what I did in the past, and I lost heavily in Panda; my site was on the Sistrix list even though we do pretty much nothing but long-form content.
By my reckoning, what Panda really is, is Google telling webmasters that Content doesn't matter and that all that really matters is paying closer attention to what they want.
Still have no idea why we were hit, but have been working frantically to try and find any improvement possible. We've done most of the things people have reported here, and can so far report absolutely no change.
Nor have I heard from anyone who has seen any real change outside of the two sites already listed here.
John Mu's post on the Google forums seems to suggest we won't see any change either. Starting to think this is just the new reality and people should start making plans to go out of business or lower their expectations to stay in business in this new reality ruled by eHow and sites like it.
UniverseToday, is your traffic back to pre-Panda levels? Your site seems to be similar to mine -- good content but heavy ad presence above the fold. Do you think that is why you were initially punished by Panda? Or do you think a human reviewer (an idiot) flagged your site as not being high-quality because of a first impression, and that's why you were a Panda false positive? I saw that you had submitted your site to Wysz's Google thread -- do you think that is what helped you to regain some standing? Did you do anything else that might have helped? Curious to learn as much as possible from somebody who was banished and has now been forgiven. Thanks.
Tedster, was your sense that the "drops for a phrase/page pairing" were justified? Are better results showing for those phrases? Personally, I'm fine with any changes that give better results for a specific phrase. I remember in the early days of Google, somebody would search for "the capital of Minnesota" and end up on my Minnesota venture capital page. I'm glad to see bad phrase-page pairings like that go away, but my limited sampling suggests that the algorithm isn't necessarily giving better quality results for phrases where we've lost ranking.
Something happened yesterday that's never happened before, and it follows a large crawl of my site on Friday.
I have a page that's always, every day, been by far the top performer on my site, drawing far more page views than any other. It's the main page for dozens of sub-pages of widget models. The directory for those sub-pages has also been far at the top of visits by directory. The main page was #2-#3 for years for several different phrases, and the sub-pages obviously ranked very well, too.
Yesterday that page dropped to #3 in my stats, and the directory of sub-pages dropped to #2. That's the first time that's happened since I started this site in 2004.
It's also right after a week in which I added content to that page. I added sections about the various features found on the different models, explaining them in further detail than is done on the individual model pages. I also added some other information.
I re-wrote the text for the individual model pages (although I'm only about 80% finished), doubling the amount of text and, I think, giving a bit more useful information. I did things such as adding small detail photos showing differences in features, and comparing a replica model to an original widget from the 1940's and writing about the similarities.
I also added a comments section to each page, and was able to get some friends who own a couple of the models to write short reviews. I'll have to seek out more people to write reviews for those pages where there are none.
So, that's it. I thought I was doing right, but if yesterday's pattern continues, it's very possible that I did wrong.
This update is fubar.