|My friend you got it totally wrong, or partially wrong, not everyone tries to game Google to the extent of not having happy users |
I agree. And I think my statement was just not written out as well as it should have been. I don't mean to imply that all SEOs, including myself, seek to game Google and trick unhappy users into converting. And I don't mean to say that we "lost the battle" because we have to have good sites.
What I meant was that we, as SEOs, lost the battle to exert as much control over our rankings with on-page SEO factors and off-page link factors. Stuff like that is much easier for me, as an SEO, than making users happy. Now all of a sudden I have to worry about what was traditionally someone else's job.
But I guess SEO has always been like that. We have to worry about the designers, the developers, the copywriters, the merchandisers, and pretty much babysit anyone who touches the website. Now we'll be consulting on a whole new field of issues. But hey, that's job security right?
To anyone who thinks this isn't about user-feedback quality signals just because they improved bounce rate and didn't recover, or because they have a site with a high bounce rate that did OK: remember that A) that's just one metric, and B) Google has said several times that they don't use "Google Analytics" data in their ranking algorithm. I'm not saying they don't use bounce rate or click-back data from other sources (I think they do, and stuff like that is what this is all about), but one site with a high bounce rate in Analytics that seems to be doing OK doesn't really mean much either way.
Watch that video Rand did above. Listen between the lines when Matt Cutts talks about this update and when it will be "reiterated" (why can't they just reiterate today? Why not continuously; isn't that what Caffeine was all about?). Think about Google's goal and what they've been preaching these last few years in terms of what SEOs should "really" be paying attention to instead of BS like PR sculpting. It's all right there like an open book, IMHO.
Whitey, I explained why I think that is. First, it's not a penalty; it's an algo change. There's no "time element" other than how much time it takes for more user feedback to come in that gives them some insight into how your changes have affected your site's quality in the eyes of your users.
It isn't about "thin pages" as a stand-alone issue. It's about what your visitors think of your thin pages. Google has been doing word-count for a long time. They've been looking at a variety of content types for a long time. If this were about being "thin" you'd have been "slapped" a long time ago. Again, this is just an opinion and an observation. I'm sure I'm wrong on at least some accounts, but I'm convinced that it's all about user feedback.
@balibones if it is all about user feedback, why didn't they deploy Panda globally instead of on "English only" search terms?
Tranquilito, that is a very good question and one that I can't pretend to know. But I could guess that perhaps the data signals are different. Maybe people in different countries interact differently. Maybe they don't have the same feedback mechanisms in place. For instance, maybe the block-site or +1 features aren't global yet (just two examples out of possibly dozens). Then again, it could be as simple as a hardware issue or a coding issue. I don't keep up with international SEO so I can't really say, except that the fact that it was released on US sites, then all English sites, before non-English sites doesn't necessarily mean it wasn't based on user feedback.
|@balibones if it is all about user feedback, why didn't they deploy Panda globally instead of on "English only" search terms? |
Perhaps because the words "user feedback" in Albanian refer to an unnatural act?
|"All those signals are now being used in this Panda..." he knows what he is talking about. |
Our thread on the topic - don't miss it: Panda Metric : Google Usage of User Engagement Metrics [webmasterworld.com]
Just as a clarification - Panda is not "all about" anything at all. It's a complex algorithm leaning on many factors in complex combination.
I've recently been thinking about which site gets top credit for frequently copied content. Do you think user metrics might be playing a role here, too? Widely republished information seems to be an issue for at least some of the original publishers who were Pandalyzed.
|@balibones if it is all about user feedback, why didn't they deploy Panda globally instead of on "English only" search terms? |
My memory might be wrong but I think Matt somewhat addressed that in the video yesterday - something about structures of different languages and still doing testing before they're ready to pull the switch on other languages.
It would be interesting if somebody could decipher the answer from Matt Cutts :) . He was asked by somebody from Poland when Google will start Panda's international rollout.
His answer: "There were some characteristics that were more applicable to English-language sites" .... the link structure of websites in Poland is a lot different from the link structure of sites in other countries.
Delay = consumer feedback loop = I don't think so!
That's pure speculation. And, if it had anything to do with consumer feedback, I suspect that the junk that floated to the top in some niches (and I'm including some of my own websites in that junk), wouldn't be sticking like s#!T to a blanket at the top of the SERPS! Plus, plenty of others have noted that some of their high bounce websites are doing fine in Panda.
Guessing it has more to do with the massive recalculation job, following unprecedented changes to algo + unprecedented changes to the index. Just consider the way many webmasters have taken a machete to their websites. Can you imagine the mess the index and link graph is in?!
Then there's your own website. If you haven't made big changes, you can't grumble about a lack of improvement in rankings; if you have, Google's got quite some job on its hands to figure out what's what among the ashes of 404/noindex devastation!
Have patience; it will turn.
Of course, once you do get your rankings back, you'll be on borrowed time if you can't improve an 80% bounce rate....
Someone said that the algo has been let loose and if, say, a content farm is more likely to have a brown image and you have one too, that's a minus on your side. Not a killer, but still a negative among the many other signals.
I do know this:
- I have changed my content a lot
- I have deleted tons of pages
- I don't have articles or a farm, and G is not specifically targeting my niche.
- There's no way Google can tell with reasonable certainty how good my specific pages are from the users they send. Maybe in 2-3 years, but what a user 'said' a month ago is old anyway.
- By all things we know, my traffic should not have gone down after making these changes.
Do not assume that Google got it right and site X or Y deserved it, or that 'my site is not going to get hit because I am doing good now'. This is a very new algo, and normally children don't drive, or they drive with daddy holding them in his lap.
Tranquilito, I think he was using "link structure" as an example of how things are different, and not necessarily as the exact problem they're dealing with.
Suggy, I wouldn't call it "pure" speculation. It is based on many other things. But yes, speculation is always a part of this job. Regarding bounce rate, I discussed that a few posts back. It's one of many metrics, which is actually part of your point of this being more complicated and unprecedented than other updates.
It will all come out in the wash eventually. I've stated publicly my theory and am, of course, still listening and considering everyone else's. And it is bound to be more complicated than what we can summarize in a forum or a blog post.
Nevertheless, I'm going to continue thinking that the lag-time in Panda recovery isn't down to malice or incompetence, because I don't believe the folks working at Google are either of those things.
It's also possible that unless 'signals' improve Google ignores any other changes.
|Lost ~60% on Panda 2. First move was last Monday 5/16. Moved up 15% after many changes. Mainly: a breakup of large pages, reorganization of large silos, unhyperlinking reference-page external links, a new server, and 50 hours tuning performance (CDN, sprites, minification, all new JS and CSS), plus tons of removed internal links. Also, our blog seeder is now throwing away ~20% of their posts with links to "click here" or other stupid anchor text, to look more organic. I have no clue why we moved up in ranking. And this is exactly what G wants. This thing is a huge black box. |
|We have 1,300 highly tuned pages. Too highly tuned. Sales funnels and links are all tuned for best performance. Sales funnels all lead to the same place. Too perfect. Not natural enough. This is why we lost 60% of our Google traffic on 4/11. I was in denial but now see the light. |
We have too much content that overlaps. Since the beginning, our model has been fairly simple. Get 3K new e-mail opt-ins a day (there is some magic to keeping this list clean, with a superb SenderScore and a low FBL rate). Build a super-quality newsletter every two weeks. Build the newsletter landing page as a content page. Make it evergreen by updating the page at least once per year to keep it current. Point PPC campaigns at the new page with no budgetary limits; just manage to an acceptable cost-per-order. Let the SEO team optimize the page. Use reputable in-house bloggers (members of their blogging community) to write valuable content about our new article on high-quality blogs (i.e. nice inbound links). Make $100-$500 per week from each page by soft-selling products (very soft). And from 1999 to 2011 this worked exceptionally well.
@DirigoDev - Thanks for sharing this - a powerful insight into your team's work.
The thing that stands out to me about your work and recovery is the removal of internal duplicate content and the improved flow of internal PageRank across the rest of the site.
I've seen sites with weaker link juice and duplicate internal content being affected, and sites with the same content on different ccTLDs holding. So go figure.
Does this not point to an issue with sites' internal PageRank and "TrustRank" acting in conjunction with Panda?
Has anyone else removed pages along these lines with success ?
@DirigoDev - Actually, I'm still (respectfully) sceptical of your claim that you *may* have reversed Panda. Sure, you went down on those dates, which was most probably Panda, but your revival doesn't look like a Panda reversal.
Why? Because Matt Cutts is saying that the Panda data is re-run manually at their end, and your site responded more or less immediately.
Maybe some more analysis of key pages would reveal more to us. And I take your point that you "don't know" and that it was a great wake-up call for established practices to be revisited - but a Panda reversal? I don't think so.
Internal duplicate content, internal linking to such pages, and low link juice may be contributors on a different level - just my hunch.
|Lost ~60% on Panda 2. First move was last Monday 5/16. Moved up 15% after many changes. |
Latency might explain the 10-day difference in time. Panda 2.1 was run around May 6th, so if it was Panda, and if Matt Cutts was right / told the truth, only then would you have reversed a Panda hit. You saw results about 10 days later.
We can speculate whether it's Panda or not... we have seen many more post-Panda-1.0 recoveries. Some might have been attributed to Panda when they weren't. Who knows, but a 15% increase is always good.
Hmmm... still sceptical (but would be happy to be wrong). I wonder how often those pages get re-indexed, as it sounds like a robust site that gets cached quickly.
For some more anecdotal evidence, SERoundtable is currently running a poll where webmasters are reporting on Panda recovery [seroundtable.com].
As I write this post, out of 233 replies to the poll so far, 11 are reporting full recovery and 24 are reporting partial recovery. 155 report no recovery and there are a few other options.
|out of 233 replies to the poll so far, 11 are reporting full recovery and 24 are reporting partial recovery. 155 report no recovery |
Really appreciate the poll - it's the closest we've got to some sort of figures. But how can folks be sure the improvements are not down to non-Panda fixes?
From what I understand (and correct me if I'm wrong), Panda fixes are not applied automatically by the algorithm. They require a data re-run from Google. So maybe these reports are skewed towards folks whose fixes had other, non-Panda implications.
That's my concern ( but always happy to be wrong ).
Whitey, if you want certainty, then SEO is not the place. Business is not the place, heck, life is not the place.
Thanks for the reminder Tedster :) ..... still alive , just.
For some, getting out of this Panda slap would make Rasputin's escapes look easy.
It's worth noting that not all Panda-hit pages lose 100% of their traffic. I mean, if you improve various aspects, even a Pandalised page may show improvement. But the whole thing absolutely beats me.
No doubt about it. Panda is one of the toughest nuts to crack that ever landed on site owners.
Tough because there are still too many unknowns.
Something I think we should all be keeping in mind here...
Though we know the Panda algorithm has been run 3 times, we don't know what it has been run FOR.
There's been some speculation in other threads here that those runs were only geared towards further pandalizing sites, and were not in any way considering sites in a way that would lift the Panda penalty.
To me that makes a lot of sense.
I don't believe that there has been a Panda run geared towards improving the rankings of Pandalized sites. Only runs that add more sites to the pandalized list, further pandalize existing ones, or close loopholes that pandalized sites may have found.
|"There's been some speculation in other threads here that those runs were only geared towards further pandalizing sites, and were not in any way considering sites in a way that would lift the Panda penalty." |
IF, and it's a big if, Google needs lots of data to get us out and they don't have it yet, then we stay Pandalized while others are added to the pandalized mix. But who knows. I do, however, know that I have lost traffic as I improved my pages and deleted many 'bad' ones.
But Google may have introduced other factors in the mix, brand recognition for example, how many tweets you got or whatever.
|One site was hit by Panda 1 and 2. Another was hit by Panda 2 only. I blocked pages that, although useful for visitors, could be considered "thin" for searchers (there were quite a few). Robots.txt at first, it was too slow, so I switched to noindex tag. I also did some major redesign in terms of layout and ad placements. Those sites haven't improved one bit with Google. One nice thing, my redesign at least helped improve Adsense performance somewhat. But at this point, I might as well stop working on them and move to others projects. Years of work down the toilet makes me wonder if I should have gone black hat instead, but I don't think I could look at myself in the mirror. This Panda update has put the fear of Google in me. |
koan @ [webmasterworld.com ]
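For anyone weighing the two blocking approaches koan mentions: they behave quite differently, which may explain why robots.txt felt "too slow". This is a generic sketch, not koan's actual files, and the path is hypothetical:

```text
# robots.txt -- blocks crawling only. Already-indexed URLs can linger
# in the results, and Googlebot can never see a noindex tag on a page
# it isn't allowed to crawl:
User-agent: *
Disallow: /thin-pages/

<!-- meta robots noindex -- leave the page crawlable so the tag is seen;
     Google then drops the URL from the index on the next recrawl: -->
<meta name="robots" content="noindex, follow">
```

That interaction (robots.txt hiding the noindex tag from the crawler) is one plausible reason switching from robots.txt to the noindex tag, as koan did, tends to take effect faster.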
This would make common sense. But there's no predictability of outcomes right now with which to risk investment in testing on large sites. Smaller ones, maybe. This is going to be a long wait. IMO
|I do, however, know that I have lost traffic as I improved my pages and deleted many 'bad ones.' |
|But Google may have introduced other factors in the mix |
Same story here. Makes me wonder if I even know what it takes to "improve a page".
|Tough because there are still too many unknowns. |
What we do know:
1) Google P favors big brands (e.g. Wikipedia, Google products, YouTube, Blogspot, etc.).
2) There is more variety in the search results (e.g. Wikipedia, news sites, YouTube, Blogspot, e-commerce on pages 1-2), which sometimes leads to a higher level of irrelevancy.
3) Google P despises duplicate content / content farms (e.g. One Way Furniture digs out of G update [internetretailer.com]).
4) Google P hates slow-loading sites.
What else do we know and not know?
Is there a time / trust element in the release from the Panda's grip?
If there is a "minimum threshold" type of element, you meet it or you don't. I wouldn't be surprised if several factors combined to come up with your final grade.
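As a toy illustration of that "several factors combine into a final grade, pass/fail against a threshold" idea: the factor names, weights, and threshold below are entirely made up, since nobody outside Google knows the real signals or how they're combined.

```python
# Toy sketch of a weighted-score threshold. All signal names, weights,
# and the threshold value are hypothetical illustrations, not Google's.

def quality_grade(signals, weights):
    """Combine several per-site signals (each normalized 0..1) into one weighted grade."""
    return sum(weights[name] * value for name, value in signals.items())

def passes_threshold(signals, weights, threshold=0.5):
    """A site either meets the combined threshold or it doesn't -- no partial credit."""
    return quality_grade(signals, weights) >= threshold

weights = {"content_depth": 0.4, "user_feedback": 0.4, "load_speed": 0.2}
site = {"content_depth": 0.7, "user_feedback": 0.3, "load_speed": 0.9}

# grade = 0.4*0.7 + 0.4*0.3 + 0.2*0.9 = 0.58, which clears the 0.5 threshold
print(passes_threshold(site, weights))  # True
```

The point of the sketch is that a site can be weak on one signal (user_feedback here) and still clear the bar if other signals compensate, which would fit the mixed bounce-rate anecdotes in this thread.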
It looks to be some kind of reservation system that they have introduced for the front page: one slot for Wikipedia, one or two for e-commerce, one or two for YouTube videos, one for news, and so on.
By bringing this variety to the front page for any query, they are hoping to satisfy users with whatever they were looking for. It is a mix of results.
I do feel that it is more of a change on their side to introduce this reservation system, and it has very little to do with the quality of sites. They just pushed sites out to make way for this mix.
[edited by: tedster at 6:34 pm (utc) on May 29, 2011]
From keneth's link:
"Nearly three months after the Panda update, traffic isn't back to where it was before the update, but Lieberman says the site is slowly climbing back up in search result rankings"
In a site that was discussing this story, the owner, Mitch, said that he was misquoted, the traffic hasn't budged at all. Apparently all the links he's got so far from WSJ, NPR and blogs discussing them didn't help him.
Walkman, that would be a useful link for us all to see. Please - share