Forum Moderators: Robert Charlton & goodroi
When asked if an iteration of Panda was implemented this week, a Google spokesperson told us, “yes.” She also provided the following statement:
“We’re continuing to iterate on our Panda algorithm as part of our commitment to returning high-quality sites to Google users. This most recent update is one of the roughly 500 changes we make to our ranking algorithms each year.”
If you’ve followed the Google Panda update saga throughout the year, you may recall Dani Horowitz’s story. She runs an IT discussion community called DaniWeb, and it was hit hard by the Panda update, but she made a lot of changes and gradually started to build back some Google cred.
Oftentimes, mismatched traffic is the result of an ambiguous search term. In any case, a high bounce rate for one term logically shouldn't affect your rankings and traffic for other terms.
I don't understand the mismatched traffic thing. I mean, you have a title and meta description that adequately describes the URL, right? The traffic isn't being sent directly; the user has to click on something. And granted Google will rewrite elements, which is annoying, but I haven't seen a rewrite that's THAT off the mark. So how is it Google is suddenly sending mismatched traffic?
[edited by: walkman at 4:00 pm (utc) on Oct 5, 2011]
Bing does it that way, but Panda is a sitewide penalty that kicks in once certain criteria are met, so the entire site suffers. I know it firsthand. Maybe if sections are divided clearly, it's different. Maybe.
Could that be why Matt Cutts suggested subdomains as a solution? When you have thousands and thousands of writers, dividing them up means that, simply by chance, some will make it. HubPages is nowhere near their pre-Panda levels.
[edited by: walkman at 4:09 pm (utc) on Oct 5, 2011]
Say I have /nike-shoes and someone searches for "Nike shoes size 13 black color." I have Nike shoes but not size 13. Bounce. Maybe it's because we're still learning, but we haven't figured out how to include all of that in the title or description. You know, "don't click here if you want size 13 because I don't have it." If I do, then I have to leave something else out.
And that's something you expect *Google* to be able to figure out?

I expect all search engines that ranked me for that term to 'solve it,' especially when they are a monopoly, or near it. If I don't have what's in the title and description, then they can/should penalize me. How is it my fault that I mention the words 'size' and '13' on the page, or said "sorry, but we're out of size 13 shoes," and Google puts them together?
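The size-13 mismatch in this exchange can be sketched as a simple term-coverage check. This is a hypothetical illustration only: the function, page text, and scoring are invented for the example and are not how Google actually matches queries to pages.

```python
import re

def query_coverage(query: str, page_text: str) -> float:
    """Fraction of the query's terms that literally appear in the page text."""
    tokenize = lambda s: re.findall(r"[a-z0-9]+", s.lower())
    page_terms = set(tokenize(page_text))
    query_terms = tokenize(query)
    hits = sum(1 for term in query_terms if term in page_terms)
    return hits / len(query_terms)

# Invented example: the page sells Nike shoes, but only up to size 12.
page = "Nike shoes in black, sizes 8 through 12 in stock"
print(query_coverage("Nike shoes size 13 black", page))  # → 0.6
```

The page covers three of the five query terms, but the two it misses ("size", "13") are exactly what decides the bounce.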
No they should not be expected to know that. But, they also shouldn't penalise an entire site because they have sent people for certain key phrases that provided a negative experience.

Ding, ding, ding! We have a winner, but that ruins Google's brand-promotion plans, since brands would be outranked quite often for many keywords. This non-wholesale penalizing would result in a better experience for users, but the money is in cleaning the 'cesspool.' Google engineers can't be super-smart and not know this, so which one is it? Are they dumb or... ?
Sadly, some people will always click on result #1, even if it's clearly off topic. We currently have a number 1 ranking for a way off topic term and people are clicking it.
The Panda shamanism took a mortal blow when Daniweb and others that escaped in July were hit again. All their work apparently didn't do jack.
No they should not be expected to know that. But, they also shouldn't penalise an entire site because they have sent people for certain key phrases that provided a negative experience.
Um, Daniweb has recovered, again.
Seems like my main site is following Daniweb's patterns.

Google discovered some new 'data' as soon as Daniweb hit the twitterverse, causing Google to look stupid and showing how it's near impossible to come out of Panda. Amazing; maybe the 'data' was under someone's desk and Google used their "scientific process" to change the SERPs. Your site will be fine as long as it fits the exception they made for Daniweb. You do realize why you came out in July, right?
We recovered in July, and got creamed again September 27. Some of our keywords lost as much as 70% of their traffic. Our GA was all red after September 27. Worse, our AdSense went down the drain as well, as both impressions and RPM dropped.
My pandalized site appears to be following Daniweb's recent patterns in the opposite direction.
Small recovery on Sept 27 (first since Feb 24) then yesterday...it's all gone. Worse than ever.
I think this Panda update hit a lot of false positives again, and that's why Matt gave the weather report he gave yesterday: to be prepared for changes.
Brand promotion is just a side effect of the Panda medicine. Brand websites have something that Panda wants, but Panda doesn't want brands per se, so go figure what it is that brand websites have that makes Panda boost their rankings. Use AdWords to advertise and see the difference in bounce rates between long-tail terms and that. Bingo, I just gave away the secret!
They don't care about false positives; they care about famous false positives that can give them bad press.
I'm seeing the usual SPIKE in Google crawler activity that comes right before a panda update... anybody else seeing anything? Seems kind of odd since we just had a panda update last week...
Danny Sullivan has been pressuring Google behind the scenes to give us more on what is going on with Panda. I think, though I'm not certain, it led to Google's Matt Cutts tweeting a "weather report" (Yahoo first named search updates weather reports back in the day) on Panda: http://www.seroundtable.com/google-panda-25-tweaks-14127.html
Our site has followed the same pattern as DaniWeb. To summarize: a 50% drop in February, a 100%+ recovery in July, a 43% drop on September 27th, and now what appears to be a full recovery starting October 4th.
One thing I did do was submit a reconsideration request to Google last week, before I read anything about Panda hitting, because I thought we may have been penalized for a developer bug that caused a lot of UGC to be hidden from users but not from the engines.
However, seeing what happened with DaniWeb's recent recovery, and what Alika said above, I'm assuming the developer bug had nothing to do with it.
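A bug like the one described, where user-generated content sits in the markup but is hidden from visitors while crawlers still index it, can be caught with a crude audit. A minimal sketch, assuming the content is hidden with an inline display:none style; the parser class and HTML snippet here are invented for illustration, and a real check would also have to cover CSS classes, visibility rules, and JavaScript:

```python
from html.parser import HTMLParser

class HiddenTextFinder(HTMLParser):
    """Collect text that sits inside an inline display:none subtree."""
    def __init__(self):
        super().__init__()
        self.hidden_depth = 0   # > 0 while inside a hidden subtree
        self.hidden_text = []

    def handle_starttag(self, tag, attrs):
        style = dict(attrs).get("style", "")
        # Enter (or stay in) a hidden subtree.
        if self.hidden_depth or "display:none" in style.replace(" ", ""):
            self.hidden_depth += 1

    def handle_endtag(self, tag):
        if self.hidden_depth:
            self.hidden_depth -= 1

    def handle_data(self, data):
        if self.hidden_depth and data.strip():
            self.hidden_text.append(data.strip())

# Invented page fragment: one visible post, one hidden reply.
html = '<div>visible post</div><div style="display: none">hidden reply</div>'
finder = HiddenTextFinder()
finder.feed(html)
print(finder.hidden_text)  # → ['hidden reply']
```

Anything this turns up is content that users never see but search engines do, which is exactly the kind of mismatch a reconsideration request would need to explain.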
I would not be surprised whatsoever to hear that a site gets nailed by Panda after adding 1 million pages of less than ideal content
[edited by: Whitey at 11:30 pm (utc) on Oct 5, 2011]