Forum Moderators: Robert Charlton & goodroi


Why Haven't Sites Come Back from Panda? Matt Cutts Tries to Explain


walkman

6:49 am on Jun 8, 2011 (gmt 0)



This is a rush(?) transcript from Danny Sullivan's blog, so probably not everything is 100% correct. The italics and bolding are mine.
[searchengineland.com...]
DS: Talking about Panda, says that he’s getting a ton of emails from people who say that scraper sites are now outranking them after Panda.

MC: A guy on my team is working on that issue. A change has been approved that should help with that issue. We’re continuing to iterate on Panda. The algorithm change originated in search quality, not the web spam team.
....
DS: Has it changed enough that some people have recovered? Or is it too soon?

MC: The general rule is to push stuff out and then find additional signals to help differentiate on the spectrum. We haven’t done any pushes that would directly pull things back. We have recomputed data that might have impacted some sites. There’s one change that might affect sites and pull things back.

DS: You guys made this post with 22 questions, but it sounds like you’re saying even if you’ve done that, it wouldn’t have helped yet?

MC: It could help as we recompute data. Matt goes on to say that Panda 2.2 has been approved but hasn’t rolled out yet.

DS: Reads an audience question – is site usability being considered as more of a factor?

MC: Panda isn’t directly targeted at usability, but it’s a key part of making a site that people like. Pay attention to it because it’s a good practice, not because Google says so.

Matt mentions 'pull back' but that's nonsense and very disingenuous of him. 'Pull back' to me means letting previously labeled bad content rank. We're talking about improved sites and content; there's no need to pull back, just reanalyze it.

So it's clear to me that this is a penalty. Maybe if you got links from every newspaper in the Northern Hemisphere you might escape, but for everyone else it looks like it depends on Google engineers. It took them 3+ months to admit it.

mrguy

5:23 pm on Jun 8, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I am hoping that some of these reports are actually making their way to actual eyeballs in the Google Spam team. I would encourage everyone to keep sending out spam reports where they see fit.


Why should Google have to do their own job? After all, they have people doing it for free. Personally, when they start paying me as a consultant, I'll help them better their results so THEY can make more money.

PPC_Chris

5:38 pm on Jun 8, 2011 (gmt 0)

10+ Year Member



walkman, I'm not saying that everything was an attempt to game Google. In fact, we were hit very hard and weren't trying to game Google. But clearly, that's what Google was going after in Panda... what they considered to be low-quality content that existed for the sake of ranking in organic search. That doesn't mean there weren't a lot of others that got hit.

Planet13

5:45 pm on Jun 8, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



My guess is that Google could run this new Panda algorithm at any time, which would effectively lift penalties (or the penalty-like element of Panda) for sites that have been hit.


maybe I am misunderstanding Panda. Or, maybe I am just misunderstanding the statement above.

I thought it was just a (significant) change to the algorithm. I don't understand the concept of it being "run." I believe Matt said that if you made changes to your site, then they would be noticed the next time your site was crawled by googlebot and the index changes would take place as they normally do.

am I missing something here?

(Maybe the quote meant that when Panda 2.2 is released the changes in the algorithm might reverse some of the effects of the prior changes in indexing?)

supercyberbob

6:14 pm on Jun 8, 2011 (gmt 0)

10+ Year Member



I'm loving it.

Google hasn't been transparent about Panda, and it looks like it's biting them in the butt.

Keep digging a deeper hole, and your stock price will sink more, Googletanic.

dazzlindonna

7:15 pm on Jun 8, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



As far as I know, Planet13, Matt never said this: "that if you made changes to your site, then they would be noticed the next time your site was crawled by googlebot and the index changes would take place as they normally do." JohnMu implied that in a forum thread, though he also mentioned that he was not necessarily referring to any particular update. Matt later said on Twitter that they don't run the Panda algo all the time, so putting all those pieces together went something like this: John said that, in general, changes we make get noticed on the next update, with a possible SERPs return if warranted, so we all assumed that would be the case with Panda as well. We assumed wrong (or John misled us, intentionally or not). Matt's tweet let us know that they only "run" the Panda "stuff" (call it whatever you want) periodically. We don't know how often. Every few days? Weekly? Monthly? Every 3 months? Every year? Who knows... In addition, we know there have been a couple of new Panda rollouts since the first, but they seemed to only trap more sites, rather than re-evaluate the ones already snared. We are all waiting for that "re-evaluation" Panda to be run. Not sure it ever will be.

walkman

7:24 pm on Jun 8, 2011 (gmt 0)



I thought it was just a (significant) change to the algorithm. I don't understand the concept of it being "run." I believe Matt said that if you made changes to your site, then they would be noticed the next time your site was crawled by googlebot and the index changes would take place as they normally do.

They said that initially but it's clearly a lie, 3.5 months later. Nothing we can do about it, but at least let's not fall for their spin. Assuming he's not lying again, he said that there's something that might bring you back, but it's probably mission impossible for average sites.

edit: dazzlindonna, the Google support forums suggested removing 'bad' pages, etc., as that would help. But Matt clearly said that on 'Panda iterations,' or whatever he called them, sites would come back.

freejung

7:47 pm on Jun 8, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I feel for people who have devoted dozens or even hundreds of hours in an effort to restore their websites after the Panda slaughter. It looks like some or even much of that may have been in vain.

That depends on what you did. I spent a couple of days redesigning layout to feature the content more prominently. I think the site is better for it regardless of Google.

I've always had a general rule of thumb about SEO-related modifications, particularly in response to Google algo updates: only make changes to your site that will improve your site in and of itself, regardless of Google's reaction to your changes. Otherwise you may very well be just wasting your time.

brinked

7:52 pm on Jun 8, 2011 (gmt 0)

10+ Year Member



Nothing Matt Cutts has said here is a surprise. They are working on improving Panda and releasing a new update... this is something that should ALWAYS be done when releasing an algo change. Nothing is perfect on an initial release. The backbone of any great piece of software is making improvements and working out the bugs.

I am glad to hear they are working on the issue where scrapers are outranking the original article source. I do not care how low-quality Google considers a given site's content; they have every right to rank first for their content no matter how crappy it may be.

At this point, calling this a penalty or a re-ranking is just semantics. Who cares what anyone calls it as long as you recover from it, correct?

I am not going to praise Google for improving something they released... that is their duty as a quality company. It is what separates a great company from a not-so-good one. Google received a lot of feedback about Panda, and it is their job to look at that feedback and adjust to get it right, which I am confident they will do.

walkman

7:54 pm on Jun 8, 2011 (gmt 0)



@freejung
fine, but think of the people that could have started another site (working on a site essentially banned from Google might not be a good idea), fired some employees and maybe saved the company, gotten a job and maybe saved the family from bankruptcy, moved to a smaller house, moved in with their parents, and a million other things.
Misleading or outright lying to people is not nice at all.

But the message is clear: have dozens of sites, don't trust what Google says, and don't depend on them, even though they have 65%-70% of the market.

rlange

8:01 pm on Jun 8, 2011 (gmt 0)

10+ Year Member



dazzlindonna wrote:
In addition, we know there have been a couple of new Panda rollouts since the first, but they seemed to only trap more sites, rather than re-evaluate the ones already snared. We are all waiting for that "re-evaluation" Panda to be run. Not sure it ever will be.

I've mentioned it before in another thread, but two of my company's websites were hit by Panda 2.0 and recovered with Panda 2.1 with no changes to the sites themselves.

Also, in yet another thread, I pointed out that the website for the company itself lost 50% of its traffic back in March and, starting Monday, seems to have gained back a significant portion of that traffic—numbers-wise, anyway. The new traffic is from the same country as the traffic that was lost, but it's to a different section of the site.

I'd say some re-evaluations have occurred since the original Panda was unleashed, but I can't make any sense of the results over here...

--
Ryan

freejung

8:04 pm on Jun 8, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yeah, I'm in the "get a[nother] job and maybe save the family from bankruptcy" category and I'm not liking it any more than you are. What I'm saying is, you always have a choice. What Google does with their algo is their choice. How you react to it is yours.

Particularly in this case we really don't know what if anything can be done to restore a site's rankings, so it would be particularly wise to react by doing things that are productive in themselves. Maybe getting another job falls into that category, and maybe redesigning your layout does too.

As for people who spent a lot of time spinning their wheels chasing the algo (making changes that do not improve their sites other than the attempt to recover from Panda), yeah, I feel bad for them but Google didn't force them to do that, it was their own choice.

maximillianos

8:33 pm on Jun 8, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Matt is right about one thing: CultofMac.com was not hit by Panda. Just look at Quantcast (direct measure data).

HuskyPup

8:39 pm on Jun 8, 2011 (gmt 0)



I believe Matt said that if you made changes to your site, then they would be noticed the next time your site was crawled by googlebot and the index changes would take place as they normally do.


I can definitely confirm that I have seen improvements after tweaking a site; I didn't change it, simply SEO'd it even tighter. I have not done this on any of my big sites, but on one of my niche-targeted .co.uk's, and will test similar tweaks on one of my .asia sites soon.

Doing the same to my main B&M .com site is a huge task, so I need to know I'm right before considering that!

suggy

9:06 pm on Jun 8, 2011 (gmt 0)

10+ Year Member



It's amazing how people read what they want into stuff!

Then for the 'banging drums' it's a case of "click... whirrr... replay conspiracy theory/ anti-google vitriol...."

As I see it, this is what it means:-

1) "We have recomputed data that might have impacted some sites" -- in other words, they have iterated over the Panda data (you know, the data they said was updated periodically, not constantly) at least once since launch.

If you've made changes to your site, then at least the part Google has managed to index (maybe not all 10,000 pages you binned!) as of the instance(s) they rerun the calcs is reflected in the Panda data.

2) "There’s one change that might affect sites and pull things back" -- he's saying that they have only back-tracked with one change.

Yes, sadly for you, Matt is not talking about your site (why would he be?); he's talking about his algo and a change that softened it a little.

3) "It could help as we recompute data." -- a fuller response might have been: "look, we weren't anticipating such wholesale decimation of websites inflicted by spooked webmasters. Now we're way behind the current reality in terms of indexing all this and recalculating the link graph, etc. Frankly, the index is now a mess of 404s, 410s, 301s, canonicals, noindexes and nofollows, and it's going to take a while for our picture of the web to catch up with the new reality."

?!

johnhh

9:44 pm on Jun 8, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Fast becoming a Suggy fan :) or maybe he thinks the same way as me
Frankly the index is now a mess of 404s, 410s, 301s, canonicals, noindexes and nofollows

I thought about this as I have made loads of changes. However, when I was considering them: is it right to have pages that say "no widgets available", is it right to have old pages still indexed that have been superseded by new pages, is it right to have pages that have duplicates?

Basically, have I been lazy and assumed our rankings will go on for another 6 years (we got hit in 2005) and not sorted all this out?

We have pages returning in one section of our site that show a 1st June cache date, but not all pages in that section, and I have identified why.

dazzlindonna

10:03 pm on Jun 8, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've mentioned it before in another thread, but two of my company's websites were hit by Panda 2.0 and recovered with Panda 2.1 with no changes to the sites themselves.


True, it does seem as though those hit by Panda 2.0 had a chance to recover in 2.1. It seems to be much less likely for those hit by Panda 1.0 for some reason.

superclown2

10:40 pm on Jun 8, 2011 (gmt 0)



: "look, we weren't anticipating such wholesale decimation of websites inflicted by spooked webmasters. Now we're way behind the current reality in terms of indexing all this and recalculating the link graph, etc. Frankly, the index is now a mess of 404s, 410s, 301s, canonicals, noindexes and nofollows, and it's going to take a while for our picture of the web to catch up with the new reality."


With the computing power Google have got they could download the whole web and sort it in an afternoon.

Panda was a completely new ranking system with hundreds of new variables. It was inevitable that there would be bugs.

When Google run tests they use their employees as guinea pigs. Problem is that highly educated geeks from, mainly, wealthy backgrounds don't do searches like the rest of us. This system had to be released onto the public to test it properly and now ironing out those bugs will be a priority but it all takes time and experimentation.

walkman

10:59 pm on Jun 8, 2011 (gmt 0)



If you've made changes to your site, at least the part that Google has managed to index (maybe not all 10,000 pages you binned!) at the instance(s) they rerun the calcs is in the Panda data.

I assure you I didn't have 10,000 pages or even close; even now I have a quarter of the pages I had on Panda day. And Google gets them every 2-3 days, 100% of them. On my non-pandalized sites I have over 100,000 pages in one and 30k+ on another, and traffic is way up there. Whether content is analyzed directly or not, Matt answers it very clearly if you read between the lines:
DS: You guys made this post with 22 questions, but it sounds like you’re saying even if you’ve done that, it wouldn’t have helped yet?

MC: It could help as we recompute data.

He speaks of 'data,' not 'as we re-index and re-process the content of the page.'

"There’s one change that might affect sites and pull things back" -- is saying that they have only back-tracked with one change.

Or if you get 4,800 links from different newspaper sites. We don't know what that signal is. But we can ask around and see if any Feb-pandalized site has come back or not.


3) "It could help as we recompute data." -- a fuller response might have been: "look, we weren't anticipating such wholesale decimation of websites inflicted by spooked webmasters. Now we're way behind the current reality in terms of indexing all this and recalculating the link graph, etc. Frankly the index is now a mess of 404s, 410s, 301s, canonicals, noindexes and nofollows and it's going to take a while for the our picture of the web to catch-up with the new reality."

Google is talking about figuring out why mangoes get hot in boxes, so it's obvious they have more than enough computing power for that. My site:domain.com and all WMT data have been accurate for well over a month.

Rlange, two of my competitors that were hit by Panda 2 are showing a major increase on Alexa for the past month. It corresponds with the loss they suffered. As far as I can tell, no changes were made by those sites, and at least one truly sucks. I know it's Alexa, but presumably they used the same method before and after.

gadget26

12:51 am on Jun 9, 2011 (gmt 0)

10+ Year Member



Gentlefolk, as a newbie just trying to keep my Pandalized 1.0 business from going down the last swirly-gurgle-bit of life's commode, I REALLY NEED (and appreciate enough to become a paid member) the kind of speculation that I find here. Purely negative posts don't really help me. Please keep the speculation going. The truth is in there somewhere.

My 2 cents.

tedster

1:02 am on Jun 9, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm all for intelligent theories, too. But let's keep the editorializing out of it so that the theories have a chance to pop off the page - thanks.
This 238 message thread spans 12 pages: 238