Don't you guys find this all a bit strange?
Google is saying, "We're clamping down on low quality content."
But scrapers are replacing original content in the serps. (I'm actually seeing more junk in the serps for some queries, maybe it's just me.)
eHow is still untouched.
The best part was the official-sounding announcement on their blog, claiming biblical improvements in search quality.
And about the "junk floating to the top before they can skim it" argument: I see nothing from Google saying the algo needs time to learn whatever it needs to learn.
The message I got was, "We rolled out a super duper algo and the search quality is now better than before."
We didn't get hit by Panda across any of our sites in our network.
Almost every competitor of ours did, and they number in the hundreds. And when I say hit, I mean run over by a freight train loaded with semi trucks.
Dozens of Alexa 300-1000 sites chopped in half. I cannot see a pattern at all; we see high quality original content sites being nailed right next to content spinners and mashup machines.
The top five sites that got hit in our space are owned by publicly traded companies and probably lost a combined 50 million uniques a month.
Now here is what doesn't make sense to me with both Panda and the previous scraper update: I see really good sites getting nailed too. At first, when we got hit in the scraper update, I got very upset. I mean, we are a really good site filled with 100% unique handmade content. Why would we get hit by an update that was supposed to hurt scrapers? Well, a couple of weeks after the scraper update I noticed several of our competitors' traffic rankings falling by the exact same degree/time/percent. Whoa, they got hit also!
Ok, well, here is the thing: yes, they are competitors, but they didn't deserve this any more than we did. They are also the top sites in our space, with what I consider to be the best unique content.
So now you have many sites that got hit in the scraper update that didn't deserve it, and many sites that got hit in the Panda update that didn't deserve it. This isn't collateral damage; this is a screwup.
Nothing more to say. One too many green beers this past week and I'm blabbering, but I think, at least for our space, Google really screwed up many very good sites that I have looked up to and admired for many years.
They should put a toggle switch in GWT that lets webmasters notify them if they think they got screwed by these updates. Real sites with real content will use it instantly; scumbags won't.
Now back to my beer!
Googlebot is very slow today, much slower than normal. Looks like Google got the data for the (deeper) monthly update.
I'm curious, what types of sites replaced the ones that tanked in the serps?
|But scrapers are replacing original content in the serps. (I'm actually seeing more junk in the serps for some queries, maybe it's just me.) |
I think it might be just you. I'm seeing normal commercial players dominating the SERPs.
(Don't forget that with personalised search, if you constantly click on scraper sites to check what they are scraping, G will serve up those sites in your personalised search, even though they aren't serving them up to anyone else, on the grounds that your personal history shows you like clicking on these pages!)
I guess it's query dependent too.
I can't post links, but last week a couple of well known SEO blogs had some examples of very spammy results floating to the top of the serps, for non-ecommerce type queries.
I'd like to point out something I'm noticing here... maybe we can all confirm it one way or another.
It seems like there are two distinct camps in this thread:
People involved in ecommerce sites
People involved in content provider sites
People involved in content sites are seeing crazy, wild disparity in the SERPS, seeing sites with quality content completely deranked in favor of scrapers and content farms.
On the other hand, my impression is that the people involved in ecommerce sites more often than not believe Google is doing the right thing and aren't seeing the wild SERP results where quality sites are being unjustly penalized.
Does this sound accurate to everyone else? Maybe everyone should say what category their experience fits into.
ME: Content providing sites, seeing PANDA trash everything and no idea why.
One of the pages I rewrote has had its Google rankings return. The traffic is actually higher now than pre-Panda. The site as a whole is still down from before Panda but started inching back a little bit today. I don't know if it will stay that way, but I guess it is better than seeing further drops.
I should add that I didn't just rewrite the page but I cleaned up a lot of stuff on the site as a whole and got rid of all the thin pages.
Jane, how are the other pages/sites that you changed? I noticed a change today too, but I'm not sure about total traffic stats, since I deleted many pages and that has undoubtedly lowered the traffic.
We could see something this week; the SERPs for me were different in the a.m. I am obviously hoping for a positive change on my end :)
I'm not sure single page increases mean much. It'll mean something when sites with large numbers of pages which received an overall deranking see some change.
Some sites cannot do any wrong in Google's Panda eyes. I have seen a ONE-line contest link outrank the site that was actually hosting the contest. Like me having an entire page with "WebmasterWorld is having a $10K contest on coding" wrapped in my templates and ranking #1 for the query.
Or a single tweet showing #1 for a major news story on a site that was unloaded to AOL last month. ONE TWEET and gazillions of lines of template and code. How's that for thin? But Panda loves it. Almost as much as it loves bamboo shoots :)
At least eHow often credits the original author of the page that an eHow rewriter has copied and rewritten to create their own "original" content. My site is cited as a source on a ton of eHow articles. It would be very easy to rank the source articles higher than the eHow rewrites. Given that Google doesn't do that, it seems they like the eHow content better. I guess the takeaway is to emulate eHow, dumbing down your content and learning from the eHow site structure, etc.
|Did your rankings fall with the earlier "Scraper" update, or only when Panda was released? |
The hit came on February 24, so it's Panda.
Looks like Demand Media has changed things with regard to links. Did they do this on their own, or did someone ask them to change? No idea. But being big has big advantages as Google's SERPs mature with updates like Panda; Google, being a big business itself, is tilting more and more toward big businesses in the SERPs.
This is what I feel about Panda: big sites rank high and get traffic for most pages on their site, while smaller sites usually have a lot of pages that don't really get any traffic.
Let us assume that Google applies the percentage of low-traffic pages (not necessarily low quality) out of the overall number of pages on the site as a weight while ranking any of the pages on the site. Then smaller sites do get affected. This is also the case with real spam: not all pages on a spam site receive good ranking and traffic. They usually tend to have one or two pages that float to the top.
By applying this weight, it is easy for Google to push them down. But what Google finds difficult to handle right now are those spam sites that have just a few pages (1 to 10) but succeed in ranking most of them high, thereby getting traffic.
But this logic will definitely work against several genuine sites too, and that is what seems to be happening right now.
The reason you see eHow or CNET or Softonic or other top content sites rising dramatically might be this ratio. They seem to have a lot of successful pages on their sites compared to EzineArticles and others.
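To make the theory above concrete, here's a rough sketch of how such a sitewide weight might behave. To be clear, this is pure speculation: the function names, the traffic threshold, and the dampening formula are all my own inventions, not anything Google has disclosed.

```python
# Hypothetical sketch of the "successful page ratio" theory.
# Thresholds and names are invented for illustration only.

def sitewide_weight(page_traffic, success_threshold=100):
    """Fraction of a site's pages that get 'meaningful' traffic."""
    if not page_traffic:
        return 0.0
    successful = sum(1 for hits in page_traffic if hits >= success_threshold)
    return successful / len(page_traffic)

def adjusted_score(page_score, page_traffic):
    """Apply the sitewide ratio as a dampener on every page's score."""
    return page_score * sitewide_weight(page_traffic)

# A big site where most pages pull traffic keeps most of its score;
# a site with many zero-traffic pages sees every page dragged down.
big_site = [500, 300, 250, 120, 90]       # 4 of 5 pages above threshold
small_site = [400, 5, 2, 0, 0, 0, 0, 0]   # one hit page, many duds

print(adjusted_score(10.0, big_site))    # 8.0
print(adjusted_score(10.0, small_site))  # 1.25
```

Note how the same page score of 10.0 ends up very different depending on the rest of the site, which is exactly the "sitewide metric applied to individual pages" behavior people are reporting.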
indyank, that sounds like a reasonable theory, although it's a theory that would make the rich get richer, and that's an ugly prospect. Still, it may hold some water, even if I don't like the thought that it may be so.
Recently the Farmer update penalized content farms with low quality content. And then the Panda update went after the sites that the content farms (and other low quality sites) linked to.
From what I read, Panda doesn't have anything to do with the quality of the content on your site, it has to do with what sites link to you.
All of these modules work together to create a SERP. Publishers can delete low quality content to counter the Farmer update, but that doesn't do anything about the low quality links to their site.
Can good quality content overcome the Panda update?
[edited by: tedster at 1:33 pm (utc) on Mar 22, 2011]
[edit reason] moved from another location [/edit]
|From what I read, Panda doesn't have anything to do with the quality of the content on your site, it has to do with what sites link to you. |
From what you've read where? And how do you reconcile that with Google's long-standing position that your site can't (ordinarily) be sabotaged by what third parties do on their own sites? Do you have any examples of the links you're talking about? The content farms and scrapers that have plagiarized my content have almost never linked back to the original.
dazzlindonna, I don't like this idea either, but that is what it looks like. Most of the sites I know that got hit by Panda seem to conform to this theory.
This is why I said that being bigger and more successful has advantages with this Panda update. As you said, and as I read in one popular news story, Panda makes the rich richer and the poor poorer.
This theory also fits with what some Google employees ask you to do: "Delete or noindex low quality pages."
But they never defined what low quality is. To me, it is those pages that don't get hits as good as other pages on your site. It is not really about thin content; it is more relative to the other pages on your site, in terms of traffic.
I haven't yet made changes based on this theory, but I intend to soon.
I don't think all the shoes have dropped yet.
|Jane, how are other pages /sites that you changed ? |
Just the one page jumped out as having a significant increase. But I also deleted many pages on that site and total traffic was still at its highest since the Panda update. Of course it could all change for the worse today, but I hope it is the start of maybe a little upward trend.
Can large sites with thousands of pages have lots of traffic on each of them? There will always be some pages getting more traffic than others. How much traffic should a page get in order to sit above "thin" content? At some point they will go down in the SERPs. Does that mean they become low quality?
Low quality to me means: very low traffic, bad bounce rate, no PR, no links, thin content (a very small amount of content). I've used this measure successfully to start pruning my stuff.
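A pruning pass along the lines of that checklist might look something like the sketch below. The thresholds and field names are my own guesses (your analytics export will have different columns), and requiring several weak signals at once is just one way to avoid pruning good long-tail pages that happen to be quiet:

```python
# Rough sketch of a "low quality" pruning checklist.
# Thresholds and field names are invented for illustration.

def looks_low_quality(page):
    """Flag a page when several weak signals pile up at once."""
    signals = [
        page.get("monthly_visits", 0) < 10,    # very low traffic
        page.get("bounce_rate", 0.0) > 0.90,   # bad bounce rate
        page.get("inbound_links", 0) == 0,     # no links
        page.get("word_count", 0) < 150,       # thin content
    ]
    # Require most signals to agree before flagging.
    return sum(signals) >= 3

pages = [
    {"url": "/guide", "monthly_visits": 900, "bounce_rate": 0.4,
     "inbound_links": 12, "word_count": 1800},
    {"url": "/stub", "monthly_visits": 2, "bounce_rate": 0.95,
     "inbound_links": 0, "word_count": 60},
]
to_prune = [p["url"] for p in pages if looks_low_quality(p)]
print(to_prune)  # ['/stub']
```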
I've already done quite a bit of an overhaul on my site, but made mistakes along the way that saw a further Google slap that cost me an additional 15% of traffic. I've got a lot of bad pages still floating in cache, but hoping it clears out soon.
Now, with regard to what's in the SERPs right now? They're pretty bad, IMO. For a lot of my own searches, I skip a lot on the first page now and go to the 2nd or 3rd page. It breaks my heart to see really good stuff get buried below the likes of eHow. I've pretty much blocked eHow. It's terrible. It's truly a plagiarist's paradise. I can imagine the writers from Associated Content, HubPages, and EzineArticles now moving to eHow. Talk about rewarding a true content farm!
I agree with all your other metrics, but thin content is a very loose term. What is more important is whether you have the answer to what the user came to your site for. If it is there, it doesn't really matter whether it is 3 lines or 10 lines.
But the key is the ratio of successful pages to the total number of indexed pages on your site.
A page becomes successful if everything is right about it. Bigger sites have a lot going in their favor to get more traffic to most pages on their site. The fact that several of their pages are already on top makes it even easier for them.
|I don't think all the shoes have dropped yet. |
Just as intensive as the webmaster focus is on Panda right now, that's how intensive I think the focus at Google is, too. Panda is not just a tweak to an existing algo - it is a revolution in how Google ranking works and how it WILL work going forward.
No one should underestimate how big a deal it is to add an automated "quality" factor to an algorithm that was designed around "relevance". As far as I know, the entire science of IR (Information Retrieval) was developed only around relevance - and that has a many decades history.
This is new stuff, untried, no track record, being created as we watch. Did anyone notice the recent Microsoft paper about measuring "credibility" [seobythesea.com]? They're on the same scent, so I don't expect Bing to be an escape hatch for this kind of scoring.
Today's Google needs to be approached and analyzed almost as a new search engine. We cannot use only our past experience to guide our expectations of how Google will work.
So yes, I agree with netmeg. All the shoes have not dropped yet... not by a long shot.
If true, then Google & Bing will pretty much wipe out small sites. With "social media coordinators" the larger sites will eat the smaller ones alive.
"Short" introduction, since I'm new here: I work for a small company that owns a little over a dozen domains. Of these websites, only one has been impacted by Google's February 23-24 update (~40% drop in total traffic; >50% drop in Google traffic alone). It also happens to be our most-trafficked site.
I admit that some of the content on the affected site could be considered "thin", but the same could be said for the unaffected sites, too (some actually being worse, in my opinion).
So, naturally, I've been keeping an eye on what others have been going through, too.
I'm also fairly new to SEO, so feel free to correct me on anything I say below that just doesn't align with what's already known.
|hyperkik wrote: |
From what you've read where? And how do you reconcile that with Google's long-standing position that your site can't (ordinarily) be sabotaged by what third parties do on their own sites?
There seems to be a tendency for everyone to look at their drop in traffic as the application of a penalty. What if it isn't a penalty at all? I've seen others allude to this elsewhere on this forum, but what if it really is simply the removal of a previous benefit?
Dan01's comment about incoming links made me take a look at the incoming links on our one affected site and compare it to the unaffected sites. What I noticed was that the affected site, according to GWT, has over 100,000 incoming links from a single, third-party website (~200,000 total). On our other sites, incoming links from a single source don't even break 5,000 (~25,000 total).
Now, the website sending those 100,000 incoming links claims to be a search engine for the market that our affected website falls under. It also appears to be legitimate, although more of a directory than a search engine proper.
I can't seem to locate any public statistics for that site, though. Alexa has no historical data. If that's because the site doesn't receive any significant traffic, that may be telling.
So, I'm wondering... What if the previous love from Google for our affected website was in significant part due to the sheer number of incoming links from this one site? If so, and this is something now taken into account in the Panda update, it seems reasonable that all those "votes of confidence" from a single source no longer weigh as much as they used to.
Admittedly, the above relies on my ignorance about how Google treated "a ton" of links from a single source prior to this update. If it already saw over 100,000 links as "too much" and discounted most or all of them out of suspicion, then my theory is bunk.
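For what it's worth, the kind of concentration I'm describing is easy to measure from a GWT backlink export. This is just an illustrative sketch, with made-up example domains, of computing what share of a site's inbound links come from a single source:

```python
# Quick check of how concentrated inbound links are by source domain.
# Example URLs below are made up for illustration.

from collections import Counter
from urllib.parse import urlparse

def link_concentration(backlink_urls):
    """Return (top_domain, its share of all inbound links)."""
    domains = Counter(urlparse(u).netloc for u in backlink_urls)
    top_domain, count = domains.most_common(1)[0]
    return top_domain, count / len(backlink_urls)

links = (
    ["http://one-directory.example/page%d" % i for i in range(8)]
    + ["http://blog-a.example/post", "http://blog-b.example/post"]
)
domain, share = link_concentration(links)
print(domain, share)  # one-directory.example 0.8
```

In our affected site's case, that share would be over 50% from one domain, which is the kind of lopsided profile I'm speculating no longer carries the weight it used to.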
Also, as others have suggested, the linking site itself may have been impacted significantly by the update.
Being new to this, I'm even more in the dark than most of you, but I thought I'd share what few thoughts I have on the subject. :o)
@indyank, a metric that measures low-traffic pages divided by total number of pages wouldn't make any sense... it would punish sites that have answered very long-tail inquiries well and with good content. A great site could easily have 95% of its pages be low-traffic pages -- and in that scenario 100% of its pages could be high-quality content for those who land on those pages via search.
Maybe I misunderstood your metric, but it doesn't make sense to me.
It could also be the case that those very-very-long-tail pages were never linked to by anybody -- why would you link to an obscure question/answer that very few people are likely to ask -- e.g. what was the price of gas in Atlanta, Georgia in March 2009? -- so I would like to think that Google won't suddenly reassess quality based on deep links. That would fail miserably for many quality sites.
|A great site could easily have 95% of its pages be low-traffic pages |
Are you saying low-traffic yet good-quality pages? But how will Google know that they are quality pages? Is there traffic to them? Do people link to them? Do people share them via likes and tweets? Do people comment on them?
How will Google know that they are of good quality unless people interact with those pages in one way or another?
Google will know quality only through user interactions. This is where bigger sites have an advantage, as people flock to them and interact, even if their content may not be as good as what is on your site.
Unless all the relevant signals are passed to Google, it won't know the quality beyond basic things like spelling errors. If Google doesn't see those signals, those pages will still be low quality to these automated bots.
If you are talking from a traffic perspective, and you have 95% of pages addressing long-tail queries, then this algo may not help you with the 5% of pages that address the queries pulling in the real traffic.
You are right that these are some drawbacks if Google is relying on these kinds of signals applied to the site as a whole.
But this algo definitely seems to be applying some sitewide metrics to individual pages.
|are you saying low traffic yet good quality pages? |
Traffic varies, let's be honest. So unless Google compares that page to exactly similar pages....
indyank, there is absolutely, positively a sitewide thing, like a ball and chain holding it back.