|Does this mean Panda is updated at relatively long intervals? |
I'm more inclined to believe it's a suggestion that Panda is still in beta, and hasn't been completely let loose on the net to operate under its own intelligence. Which is rather scary.
I think what we're seeing is an algorithm's first bite into its sandwich, paused and then observed in slow motion by engineers as websites travel down the digestive tract and come out the other side.
Another metaphor.... the bear has begun to crawl, but the baby-gate is still in place.
Another metaphor. They've rung the bell and have now taken away the meat powder, but we're all still salivating.
I could continue on with Schrödinger's cat, but I think you get where I'm going with this.
It ain't over till the fat lady sings.
I don't buy that Google can't handle frequent re-calculations. Maybe they can't do it daily or on the fly, but they surely can do it monthly, or it makes no sense to go with that algorithm at all. A very slow-to-calculate algorithm in a fast-changing field is useless.
This is a penalty, and it's deliberate. Unless Google sees your changes and removes the Panda penalty, no amount of tinkering you do is going to help you much, at least not sitewide.
Question to ask: Will noindexing pages (currently endorsed by Google, as tag pages and AdSense blocks once were) come back to bite people for another 6 months?
|Will noindexing pages (currently endorsed by Google, as tag pages and AdSense blocks once were) come back to bite people for another 6 months? |
I know Google engineers recommended it, but there is no evidence that it works at all. For that matter, there is no evidence that anything works!
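For anyone unsure what "noindexing pages" means mechanically: a page can opt out of the index via a robots meta tag or an X-Robots-Tag response header. The sketch below is a purely hypothetical illustration of how a crawler might check for either signal; the function names are invented and nothing here reflects how Google actually implements it.

```python
from html.parser import HTMLParser

class _RobotsMetaParser(HTMLParser):
    """Looks for <meta name="robots" content="noindex, ..."> in a page."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if (d.get("name") or "").lower() == "robots" and \
               "noindex" in (d.get("content") or "").lower():
                self.noindex = True

def is_noindexed(html: str, headers: dict) -> bool:
    """Return True if the page opts out of indexing via either mechanism."""
    # The header form applies to non-HTML resources too (PDFs, images).
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        return True
    parser = _RobotsMetaParser()
    parser.feed(html)
    return parser.noindex
```

The thread's open question, of course, is not whether the mechanism works for indexing (it does), but whether a Panda-style quality score ignores it.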
|@mattcutts I can word the question better in >140 chars :) so far I don't know of any site that's recovered traffic. Would be good to know |
via @tomcritchlow [twitter.com]
@tomcritchlow answered that for NPR, but they didn't use it. Answer not as amenable to 140 chars.
via @mattcutts [twitter.com]
@tomcritchlow short version is that it's not data that's updated daily right now. More like when we re-run the algorithms to regen the data.
via @mattcutts [twitter.com]
I don't understand this conversation.
What did Matt Cutts answer for NPR that they didn't use?
What data isn't updated daily right now?
What does Matt Cutts mean by "re-run the algorithms"?
@aristotle: We don't know the specifics of how Panda works nor what data it relies upon (nor will we find out this information). Hence it's difficult to know *why* the Panda update can't be done (say) daily. In answer to your questions:
1) Not sure.
2) We'll never know. Panda probably collects and uses some fairly sophisticated data, hence it's probably this data that can't be updated daily.
3) It seems, re-run the Panda algorithm. For whatever reason (which again, we'll probably never know), Panda apparently can't be updated daily. Hence instead Google will simply re-run the Panda algorithm(s) periodically instead.
The conversation, as presented, is hard to understand. You actually have to start with this.
|@mattcutts assuming a site completely reworks their site/content after panda, how long before they will regain traffic? |
Then various tweets were interjected, which culminated with @mattcutts saying:
|@tomcritchlow short version is that it's not data that's updated daily right now. More like when we re-run the algorithms to regen the data. |
They are perfectly happy to put millions of low-quality tweets on their front page, updating them every 2 seconds or so, and that doesn't seem to be a problem.
But if they come across an otherwise decent site with just 1% duff pages, it is banished to page 100 for a month.
|1% duff pages then it is banished to page 100 for a month |
In pursuit of perfection they've forgotten how imperfection is perfect. What would the world be if we didn't evolve from an abnormality?
If Google were a king or president, and websites its citizens, we'd erupt onto the street in protest.
What's happening with this still isn't clear to me. Originally I thought that Panda was a revision of the main search algorithm, which is supposedly updated almost continuously.
Now it appears that Panda may involve a separate calculation whose results are fed into the main algorithm. But this separate calculation is only done periodically. Not sure if "periodically" means weekly, monthly, or some irregular interval.
The problem with a long re-run interval is that a website can be completely revamped in the meantime. For example, suppose Panda gives a certain website a "high-quality" rating. But then the next day a new owner takes it over and fills it with a lot of low-quality biased articles full of falsehoods and mis-information about a controversial social issue. Despite the addition of these new articles, the site would continue to get high rankings until the next Panda re-run, whenever that might occur.
So it seems to me that Google would need to do the Panda re-runs fairly often to avoid this kind of problem.
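The staleness problem described above can be sketched as a cached batch score: live ranking reads whatever score the last batch run produced, regardless of what the site looks like now. Everything below (the class, the names, the numbers) is invented for illustration; nobody outside Google knows the real architecture.

```python
import time

class PandaCache:
    """Hypothetical 'periodic re-run' model: quality scores are computed
    in an expensive batch job and cached; live ranking only ever reads
    the cache, so site changes are invisible until the next re-run."""

    def __init__(self):
        self.scores = {}      # domain -> quality score from the last run
        self.last_run = None

    def rerun(self, live_quality: dict):
        """The expensive batch job: snapshot every domain's current quality."""
        self.scores = dict(live_quality)
        self.last_run = time.time()

    def ranking_score(self, domain: str) -> float:
        # Live ranking reads the stale snapshot, never the live site.
        return self.scores.get(domain, 0.0)

live = {"example.com": 0.9}   # high quality at the first run
cache = PandaCache()
cache.rerun(live)

live["example.com"] = 0.2     # new owner fills the site with junk...
stale = cache.ranking_score("example.com")   # ...ranking still sees the old 0.9

cache.rerun(live)             # only the next batch run picks up the change
fresh = cache.ranking_score("example.com")
```

This is exactly the gap the post above worries about: between re-runs, the cached score and the real site can diverge arbitrarily far, in either direction.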
Back in 2003 or so, the index updates came about once a month, with few ill effects. Actually, I recall that when updates gradually became continuous, people complained as if it were somehow a negative thing.
If Panda is run infrequently it makes modifying a site to accommodate it a pretty tricky and probably very frustrating deal.
But thinking back a few years to the old days when Google only took us to the dance once a month or so, this may not be such a new time frame.
On top of that, we have this, (from the G "Quality Guidance" post). [googlewebmastercentral.blogspot.com]
|Some publishers have fixated on our prior Panda algorithm change, but Panda was just one of roughly 500 search improvements we expect to roll out to search this year. |
OK, so exactly what affected any given site? Panda, or one of the other 500 +/- Google changes (AKA improvements)?
A similar question could, and probably should, be asked in reference to any changes we make on our site(s) today. If our rankings change after we make a change, was the re-ranking because of a re-run of Panda or because of one of the other 500 +/- Google changes (AKA improvements)?
|But thinking back a few years to the old days when Google only took us to the dance once a month or so, this may not be such a new time frame. |
It has been over 2 months already, and IMO it is done on purpose to penalize sites, whether content farms or innocent ones.
Remember, it took Google 2+ months after the penalty to say that our sites should be like Apple products and our pages like mini-PhD dissertations. Unless we're Amazon, the Mayo Clinic or HuffPost. And they only commented as a PR move, after getting bad press for hurting small businesses.
[edited by: walkman at 7:07 pm (utc) on May 8, 2011]
imagine if Panda was used in other fields...
If Google had a music shop, they would have one listen to The Beatles' 'White Album' and dismiss the entire thing as a load of old rubbish because of 'Revolution 9'. And then they would refuse to stock it in their music shop for a month.
Or even worse... they'd refuse to stock ANY Beatles music at all... all because they can't stand 'Love Me Do'.
Ever since that tweet exchange was published, I have been wondering about the reason for the (so far) one-off run of Panda. I think this is done deliberately and not because of a lack of technical resources. My thoughts go along these lines:
a) Stop webmasters from figuring out what works and what doesn't by testing small incremental changes in order to recover their pandalised sites
b) Ensure that any new content that comes to the web is as close as possible to G.'s definition of "high quality", in that way perhaps eliminating many future "low content" and spin-off pages
So when (if) the Panda is rerun then:
- If your site comes back, you have no idea which of your actions actually fixed it, so there is less chance of gaming it. Knowing that your only shot at recovery is the next Panda re-run, and that if you miss it you may stay pandalised for another X months until the run after that, should ensure you do your best to follow their quality guidelines.
- If you have not been hit by Panda, be careful what you publish on your website, as you might get hit in the next Panda run; you will therefore think twice before any new page goes live.
And perhaps in this way G. is hoping to reduce the flood of pages put out on the web in a shotgun approach, in the hope that "if I produce 100 pages, maybe one or two will rank..."
Looks to me like a major update of Panda today. Even worse blundering than last month.
The new key to ranking success... don't write anything detailed or authoritative, because people copy it and Google will dump your rankings down the toilet (and then stubbornly refuse to accept that the philosophy behind their algorithm is ridiculous).
|Stop webmasters from figuring out what works and what doesn't by testing small incremental changes in order to recover their pandalised sites |
as if anyone would do that :)
It seems to me that when Panda was run for the first time, it probably flagged a lot more sites than any subsequent re-runs would catch. So if there have been any subsequent re-runs, their effect may have escaped notice, especially if sites previously flagged were given a time penalty.
Wait, why wasn't this time frame to recovery mentioned in Google's latest Webmaster Central blog post?
Agree with aakk9999 that this is probably a "fixed-time" penalty to avoid people working out the algorithm. I think Panda, at a very crude level, is a separate domain classifier, that probably runs in isolation; classifying every domain in Google's index takes time - even for Google.
If you search back through the thread, you'll see I actually suspected it was a fixed-time penalty when no one came back within 30 days.
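The "separate domain classifier" guess above can be made concrete with a deliberately crude sketch: aggregate per-page quality signals, then slap one label on the whole domain. This is pure speculation about the shape of such a system; the cutoffs, tolerance, and function name are all invented.

```python
def classify_domain(page_scores, bad_cutoff=0.3, tolerance=0.01):
    """Hypothetical domain-level classifier: label the whole domain
    'low-quality' if more than `tolerance` of its pages score below
    `bad_cutoff`. Scores are assumed to be in 0..1."""
    if not page_scores:
        return "unknown"
    bad = sum(1 for s in page_scores if s < bad_cutoff)
    return "low-quality" if bad / len(page_scores) > tolerance else "high-quality"

# A sitewide label means a few bad pages drag everything down, which
# matches the "1% duff pages" complaint earlier in the thread:
scores = [0.9] * 98 + [0.1, 0.2]     # 98 good pages, 2 duff ones
label = classify_domain(scores)      # 2% duff -> whole domain penalized
```

A classifier of this shape would also explain why it "runs in isolation" and takes time: it needs every page's signals for a domain before it can emit the domain's label.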
Mighty arrogant and backstabbing of Google. They go from one thing to a drastically different other without notice and keep sites paralyzed for months. With an unproven algo to boot! Way to make enemies, Google; many of the people hurt just followed Google's guidelines on AdSense and tags anyway. A 30-day notice on the new guidelines wouldn't have hurt, would it?
A friend of mine had two ads, but both were <!-- google ad... --> hidden from view or disabled, since he didn't like the looks. Yet it's quite possible that Google just looked for a specific string and ruined his business for at least 2+ months anyway. There's no one he can contact or ask, and if it's a Google bug, he is still screwed and out of money.
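The scenario above (ads hidden inside HTML comments but still counted against the site) would only arise from raw substring matching; a parse that respects comments would never see them. A hypothetical sketch of the difference follows; the marker string and both scanners are illustrative, and nobody knows what Google actually matched on.

```python
from html.parser import HTMLParser

AD_MARKER = "google_ad_client"   # illustrative marker, not a claimed detection rule

def naive_scan(html: str) -> bool:
    """Raw string match: flags ad code even inside <!-- comments -->."""
    return AD_MARKER in html

class RenderedScan(HTMLParser):
    """Only looks at content that would actually render.
    handle_comment is deliberately not overridden, so comments are ignored."""
    def __init__(self):
        super().__init__()
        self.found = False

    def handle_data(self, data):
        # Script bodies are delivered here too, so live ad code is caught.
        if AD_MARKER in data:
            self.found = True

def rendered_scan(html: str) -> bool:
    p = RenderedScan()
    p.feed(html)
    return p.found

commented = "<body><!-- <script>google_ad_client='x';</script> --></body>"
live = "<body><script>google_ad_client='x';</script></body>"
```

If the penalty really did come from something like `naive_scan`, the friend's commented-out ads would trip it while a comment-aware scan would not, which is the bug being speculated about.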
|Looks to me like a major update of Panda today. Even worse blundering than last month. |
Where are you, stevb?
Are you seeing similar garbage that appeared in Google.co.uk last Friday with many clean quality sites pushed way down into the 200/300s?
It seems to be slowly working its way through, but the huge promotion of out-of-date forum boards and MFA sites has been horrible, and the geo-targeting non-existent. I've been having to use Bing, DDG and Entireweb all weekend, with about three-hourly delves into Google to see if anything had improved.
FWIW, for me, images do not seem to have been touched at all, which probably explains why I haven't been hurt as much as some... at the moment!
"Not updated daily" is still pretty vague. I don't know if the following is a clue as to the actual interval, but...
A handful of my company's websites were hit by P2 (April 11). At some point between April 27 and April 29 two of those websites completely recovered, traffic-wise.
The two websites are related—near-duplicate content, but not a 1:1 ratio and in different languages; English and Spanish—and I've made no updates to either website since they were hit.
None of our other, larger impacted websites have seen any noticeable recovery. On one of them I've actually been experimenting with individual pages with no success.
FWIW, I have seen badly hit pages coming back onto page 1 for my site, but they are at 10 instead of 1. These pages have been re-written and massively expanded.
I am working on the premise that the individual pages are now considered fixed, but the site wide demotion of authority is still in effect for the time being. This should change when Panda data is updated?!
Has anyone that was slammed on Feb 24th come back?
If yes, did you make any major changes and when did you come back ?
Apparently Panda was a one-time thing and it sticks for a while; it has to be run separately, and it doesn't update itself unless Google hits the "Update Panda" button.
|Apparently Panda was a one-time thing and it sticks for a while; it has to be run separately, and it doesn't update itself unless Google hits the "Update Panda" button. |
I think they need to re-run it fairly often because new sites are constantly being created and old sites re-vamped. Maybe they put a time penalty on sites that are caught, but keep re-running it just on new sites and sites that haven't been caught yet.
|I think they need to re-run it fairly often because new sites are constantly being created |
Yep, I'm about 10% of the way through constructing and launching a new site, and many of the pages have gone straight into the new SERPs in excellent positions within a few days of being uploaded; even widget images are doing well.
aristotle, yep, you described it better than me.
Those hit originally are screwed, not the others, and that's why we see sites hit after 2/24 come back. Something heavy was attached to those hit on 2/24.
From the ad-supported sites: 3 of my sites have gained heavily since then; one lost more, percentage-wise. The one that lost also makes/made almost 10 times as much money, though ;).
I suspect Panda was a "grade" given after an evaluation of your overall site. I said this last month, and those comments would seem to agree.
He also wrote this, some insight into Panda.
|@PandaProse the objective is to find and return higher-quality sites. We'd done pretty well on webspam, but low-quality needed addressing. |
A NEW site will be tagged with that "low quality" tag for some time, and I vaguely remember Matt saying (before Panda rolled out) that webmasters should probably focus on just one site moving forward. I don't know if his comments were related to Panda, since it hadn't happened yet, but in hindsight his words were a warning.
[edited by: Sgt_Kickaxe at 8:58 pm (utc) on May 9, 2011]
|Yep, I'm about 10% of the way through constructing and launching a new site, and many of the pages have gone straight into the new SERPs in excellent positions within a few days of being uploaded; even widget images are doing well. |
Coincidentally, I launched a Thai version of my website on a separate domain (.th), interlinked at the page level only. I'm already seeing it in the SERPs (and it's bringing traffic), less than 24 hours after going live, in #1 positions for searches in Thai (of course).