Forum Moderators: Robert Charlton & goodroi
DS: Talking about Panda, says that he’s getting a ton of emails from people who say that scraper sites are now outranking them after Panda.
MC: A guy on my team is working on that issue. A change has been approved that should help with it. We’re continuing to iterate on Panda. The algorithm change originated in search quality, not the web spam team.
....
DS: Has it changed enough that some people have recovered? Or is it too soon?
MC: The general rule is to push stuff out and then find additional signals to help differentiate on the spectrum. We haven’t done any pushes that would directly pull things back. We have recomputed data that might have impacted some sites. There’s one change that might affect sites and pull things back.
DS: You guys made this post with 22 questions, but it sounds like you’re saying even if you’ve done that, it wouldn’t have helped yet?
MC: It could help as we recompute data. Matt goes on to say that Panda 2.2 has been approved but hasn’t rolled out yet.
DS: Reads an audience question – is site usability being considered as more of a factor?
MC: Panda isn’t directly targeted at usability, but it’s a key part of making a site that people like. Pay attention to it because it’s a good practice, not because Google says so.
I am hoping that some of these reports are actually making their way to actual eyeballs in the Google Spam team. I would encourage everyone to keep sending out spam reports where they see fit.
My guess is that Google could run this new Panda algorithm at any time, which would effectively lift penalties (or the penalty-like element of Panda) for sites that have been hit.
I thought it was just a (significant) change to the algorithm. I don't understand the concept of it being "run." I believe Matt said that if you made changes to your site, then they would be noticed the next time your site was crawled by googlebot and the index changes would take place as they normally do.
I feel for people who have devoted dozens or even hundreds of hours in an effort to restore their websites after the Panda slaughter. It looks like some or even much of that may have been in vain.
dazzlindonna wrote:
In addition, we know there have been a couple of new Panda rollouts since the first, but they seemed only to trap more sites rather than re-evaluate the ones already snared. We are all waiting for that "re-evaluation" Panda to be run. Not sure it ever will be.
I've mentioned it before in another thread, but two of my company's websites were hit by Panda 2.0 and recovered with Panda 2.1 with no changes to the sites themselves.
: "look, we weren't anticipating such wholesale decimation of websites inflicted by spooked webmasters. Now we're way behind the current reality in terms of indexing all this and recalculating the link graph, etc. Frankly the index is now a mess of 404s, 410s, 301s, canonicals, noindexes and nofollows and it's going to take a while for the our picture of the web to catch-up with the new reality."
If you've made changes to your site, at least the part that Google has managed to index (maybe not all 10,000 pages you binned!) will be in the Panda data at the instance(s) they rerun the calcs.
DS: You guys made this post with 22 questions, but it sounds like you’re saying even if you’ve done that, it wouldn’t have helped yet?
MC: It could help as we recompute data.
"There’s one change that might affect sites and pull things back" -- this says they have back-tracked with only one change.
3) "It could help as we recompute data." -- a fuller response might have been: "look, we weren't anticipating such wholesale decimation of websites inflicted by spooked webmasters. Now we're way behind the current reality in terms of indexing all this and recalculating the link graph, etc. Frankly the index is now a mess of 404s, 410s, 301s, canonicals, noindexes and nofollows and it's going to take a while for our picture of the web to catch up with the new reality."
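For anyone untangling that mess on their own site, the mechanisms listed above map onto a handful of standard directives. A minimal sketch of the redirect/removal side on Apache (the paths are made up, purely for illustration):

```apache
# .htaccess (Apache mod_alias) -- hypothetical paths, illustrative only

# 301: permanently redirect a merged/thin page to its replacement
Redirect 301 /old-thin-page.html /consolidated-page.html

# 410: tell crawlers a binned page is gone for good (stronger signal than a 404)
Redirect gone /deleted-page.html
```

The noindex and canonical signals live in each page's head instead, e.g. `<meta name="robots" content="noindex, nofollow">` and `<link rel="canonical" href="...">`. Either way, none of it matters until googlebot recrawls the pages and the Panda data is recomputed, which is exactly the lag being described above.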