Forum Moderators: Robert Charlton & goodroi


A Simple Theory about Panda

aristotle

6:48 pm on May 29, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Here is a simple theory about Panda:

-- Panda evaluates the "quality" of a website mainly by analyzing user-behavior.

-- By user-behavior, I DO NOT MEAN BOUNCE RATE. Instead, I'm referring to more reliable indicators of quality, such as
----- User bookmarks a page as a favorite.
----- User saves a copy of the page on their hard drive.
----- User prints out a copy of the page.
----- User returns to the same page later.

-- Google mainly uses the Chrome browser to collect this data on user behavior. Tens of millions of people now use Chrome as their main browser. This is enough to allow Google to collect statistically meaningful data. And Chrome enables Google to collect data for ANY WEBSITE.

-- In order to evaluate a site statistically, Panda needs a minimum number of user-behavior data points. Thus, the data must be collected over a period of time. As new data is collected, the oldest data can be discarded, but enough must be kept to enable a meaningful evaluation.
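The "minimum number of data points" idea can be sketched with a standard confidence-interval calculation. This is only an illustration of the statistics, not anything Google has published; the visit counts and "positive signal" rates below are invented:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """Wilson score interval for a proportion -- a common way to judge
    whether an observed rate is statistically meaningful at sample size n."""
    if n == 0:
        return (0.0, 1.0)
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    margin = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return (center - margin, center + margin)

# With only 20 observed visits, a 25% "positive signal" rate is hard
# to distinguish from noise -- the interval is very wide...
lo, hi = wilson_interval(5, 20)

# ...but with 2000 visits at the same rate, the interval tightens a lot.
lo2, hi2 = wilson_interval(500, 2000)
```

The point of the sketch is just that the uncertainty band shrinks as more visits accumulate, which is consistent with the idea that a low-traffic site would need a longer collection window before any evaluation is meaningful.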

-- At the present time, for some sites Panda could still be using data that was collected as far back as last year, because it is still needed for a statistically-meaningful evaluation. This could explain why people who made big changes to Panda-affected sites still haven't seen any major ranking improvements.

HuskyPup

3:20 pm on Jun 5, 2011 (gmt 0)



But the real problem is the listing is reproduced 3 times in spots 1,2, & 3.


The worst example I have seen like this is positions 1, 2, 3 and 4 PLUS "+ more from example.com" with another 3/4/5 links - amazing!

aristotle

4:55 pm on Jun 5, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



This thread has become hard for me to follow. People mention "signals" that Panda could be using, but don't identify what these "signals" are.

When I started this thread, I identified some specific signals. Maybe Panda uses them or maybe not, but at least I identified them. This thread would be easier to follow if everyone would identify what signals they're talking about.

aristotle

6:19 pm on Jun 5, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



To add to my previous post, I would like to bring everyone's attention to a possibly important signal that is discussed in a recent thread started by Kidder: [webmasterworld.com]

The signal is a relatively large number of direct type-ins of a site's name or URL in Google Search by people who remember the site from a previous visit, or heard about it from a friend, or saw it mentioned somewhere on the web. I think this could be a very important ranking signal in the Panda part of Google's algorithm.
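One toy way to quantify the kind of signal aristotle describes is the share of a site's search clicks that come from queries containing the site's own name. The query log, domains, and the name-matching rule below are all made up for illustration - this is not a known Google formula:

```python
# Hypothetical query log: (query, clicked_domain) pairs.
query_log = [
    ("blue widgets", "example.com"),
    ("example", "example.com"),
    ("example.com", "example.com"),
    ("cheap widgets", "widgetmart.com"),
    ("example reviews", "example.com"),
]

def navigational_share(log, domain):
    """Fraction of a domain's clicks whose query contains the site's
    name itself -- a crude proxy for direct type-ins."""
    name = domain.split(".")[0]
    clicks = [q for q, d in log if d == domain]
    if not clicks:
        return 0.0
    navigational = [q for q in clicks if name in q.lower()]
    return len(navigational) / len(clicks)

print(navigational_share(query_log, "example.com"))  # 0.75
```

A real system would need to handle brand names that are also generic words, but the basic ratio captures the "people search for the site by name" idea.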

tedster

6:51 pm on Jun 5, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm sure that particular signal has been part of the Google algorithm for a long time - not just introduced recently. It was almost definitely part of the Vince or "brand" update, I'd say.

aristotle

7:31 pm on Jun 5, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



tedster - I agree that Google has used it to identify brands for a long time. But I think the increased availability of statistical data has enabled them to begin applying it to lesser-known sites as well. The reliability of a signal should increase as the amount of supporting data increases. As Google continues to get more and more data, they can steadily increase the number of sites they apply a given signal to.

CainIV

3:30 am on Jun 6, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think it is safe to say that Google will continue to push the boundaries on what signals it can and cannot use, and will develop new ways to use previously unusable signals, measuring them against each other to further prove validity, etc.

new_shoes

11:48 pm on Jul 11, 2011 (gmt 0)

10+ Year Member



I don't believe bounce rate is the most important engagement metric G uses. I think it's more important to them if a googler goes back to Google and keeps searching. If somebody keeps searching for the same thing, they obviously didn't find it at the last site.

That's my hunch.

Until I figure out what works with Panda, here are the factors we are working to improve:
1. Get users to stay longer on the site (at least longer than on competitor sites)
2. Give them what they came for, so they don't return to Google and continue their search
3. Do everything to make the site appear more credible, using G's 20 questions as a yardstick (we split test this with user groups - feedbackarmy.com)
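The "goes back to Google and keeps searching" behaviour new_shoes describes is often called pogo-sticking. A toy calculation over a hypothetical session log shows how such a rate could be measured; the events, timings, and 30-second threshold are all invented assumptions:

```python
# Hypothetical session events: (timestamp_seconds, action).
# "click" = user clicked a search result, "return" = user came back to the SERP.
sessions = [
    [(0, "click"), (8, "return")],                    # quick bounce back
    [(0, "click")],                                   # stayed on the site
    [(0, "click"), (240, "return"), (250, "click")],  # long dwell first
]

SHORT_CLICK_SECONDS = 30  # assumed cutoff for a "dissatisfied" visit

def pogo_rate(sessions):
    """Share of result clicks followed by a return to the SERP within
    SHORT_CLICK_SECONDS -- a rough dissatisfaction proxy."""
    clicks = returns = 0
    for events in sessions:
        for i, (t, action) in enumerate(events):
            if action != "click":
                continue
            clicks += 1
            for t2, a2 in events[i + 1:]:
                if a2 == "return":
                    if t2 - t <= SHORT_CLICK_SECONDS:
                        returns += 1
                    break
    return returns / clicks if clicks else 0.0

print(pogo_rate(sessions))  # 0.25
```

Only the first session counts against the site here: the third session's return happens after a long dwell, which is exactly the distinction between a raw bounce rate and the return-to-SERP signal discussed above.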

Atomic

11:59 pm on Jul 11, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@new_shoes

Looks like a good plan. I think all three are keys to success, but this one stands out:
3. Do everything to make the site appear more credible, using G's 20 questions as a yardstick

Most of the sites I see that have been hit do not look credible. They have terrible navigation and are downright unusable. Who cares if they have good content or not! Users have greater expectations than they did even a few years ago. Sites need to change to meet those expectations.

MrSavage

12:01 am on Jul 12, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I think the conundrum is this. A small niche site with less content, but precise writing and points? Result: a quick bounce back to Google. A big site? Heaps of links and articles and a mountain of content to mull over, and therefore perhaps never returning to Google in that session. My worry is that to-the-point information will be read as failure or lack of satisfaction. Different demographics? Different attention spans and different website usage. Perhaps the only way to counter a quick return to Google is to get users to somehow click the +1 to indicate you got the info you wanted and the site was great. Otherwise a quick return to Google is a scary proposition, and a losing one in my area of expertise/websites.

Whitey

12:19 am on Jul 12, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The strange thing is that bounce rates have improved markedly on sites I watch, with Google applying Panda to non-performing pages, even if it has taken a lot of good pages out at the same time. It's as if Google has taken control away from site owners for improving their own performance.

I think folks may be looking too deeply at Panda factors, and the situation is now mature enough for the experts to comment on. This is what I see:

- All the benchmarks for what makes a site perform better in the SERPs are visible and have been discussed.
- The Panda algo is way too complex for anyone to break down in detail. Comments and theory are helpful exercises, but they don't tell the whole picture and likely confuse it.
- All high-level fix factors for an eventual remedy have been discussed, with many acting on them.
- Nobody knows for sure when a site will return (except perhaps Google) - so in this sense it's acting like a penalty (forget the semantic definitions for a moment).

Whilst I believe the main target was spam content farms, a lot of "fringe sites" were caught up. Many of these were not 100% and as such got caught as collateral damage. It is likely that all the fringe sites that took the hit are caught in a kind of holding pattern that Google is unwilling to roll back or update for some considerable time, so the only way out for them is to improve markedly.

The danger is that Google cannot afford to reintroduce spammy sites into the results whilst it perfects its algo, which is likely far from perfect - and that includes fringe dwellers and sites that have made recent improvements.

I think Google knew this when it dinged sites, which is why most fringe dwellers have suffered a partial traffic drop and not a full drop. Those sites should be on notice to make radical changes or completely perish.

Given the sites that have improved and appear in the top results, it's clear that Google has gone over the top, introducing sites with "unique" content but poor user appeal at the expense of sites with sometimes "poor" content but better user appeal. I think that's the inherent danger of an algorithmic assessment of sites - but that's how search engines are made and perfected. Living with it is a different story.

That's my theory.

... oh ... and one last thing: having removed these offending sites, Google has accelerated sites that lived at the bottom of the pond into the results, which really are pretty offensive. But that's the interim collateral damage for introducing this necessary initiative.

At the end of the day, Google may decide it's not worth their while trying to control organic search in a lot of areas, as it looks to phase in its own Google real estate options to fully occupy the first page and take control of content by default, both in the SERPs and, to a degree, from publishers.

That's my other theory.

koan

3:42 am on Jul 12, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



At the end of the day, Google may decide it's not worth their while trying to control organic search in a lot of areas, as it looks to phase in its own Google real estate options to fully occupy the first page and take control of content by default, both in the SERPs and, to a degree, from publishers.


So we're back in 1999 with Yahoo!-type portals... I wonder who the new wonder kids in search are that will deliver us from corporate rule. Maybe they're still university students, some loveable idealist geeks, who have just found a goofy name for their project. They're going for a minimalist homepage, free from overbearing ads, that just works.

lfgoal

10:07 pm on Jul 13, 2011 (gmt 0)

10+ Year Member Top Contributors Of The Month



So, is it one of the implications of the new Google that many websites will be forced to slog it out in the long-tail backwaters until they acquire enough deep signals of quality (time spent on site, bookmarking, users navigating through the same site, etc.) before they merit higher rankings

...which are increasingly less valuable as a result of being under pressure from local results?

tedster

11:20 pm on Jul 13, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



One of the problems with theories about Panda is that, since there's almost no solid knowledge, the commonly repeated theories are starting to be accepted as fact.

There is no solid proof that things like "time spent on site, bookmarking, user navigating through the same site" are in the Panda algorithm. No proof that they aren't either - but you just can't run with a ball until you actually catch it.

MrSavage

12:31 am on Jul 14, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



My theory of Panda is that there is no roadmap to inclusion. It's so general in nature that your well-done page elements are less of an indicator of importance. Do a search today, even put it in quotes. You will find that the all-important page title is rarely highlighted like it once was. If elements like that are not as important, then whatever I've been doing with my site to gain an edge has washed away. SEO is like hand waving: come here Google, I have what you want. It appears that what they want for search now is to rely on trusted sites and more brand recognition, either big or small. Domain keywords were another variable to say, hey, this is what I'm all about. Now, not important. You tell Google what you want, and they essentially seem to flip through their "A list" and decide the best match for you. I simply feel that the indicators most of us used are now so scaled back that the only way to counter this is to get a different niche or somehow get a brand going. I've done a bit more reading about Panda, and this is some of what I'm getting from the insights of people who might know.

A lot of the confusion can be simply because if you didn't have the new Panda trust factors in spades, you somehow have to figure those out or build them into your site.

In closing, if you've spent time on the forum lately you can see, feel and hear the anguish that Panda is providing. Not for all, but for some. I can only relate this to experiencing a massive earthquake. The ground isn't settled, the plates have shifted forever and we don't know when the next aftershock is hitting. People are scrambling for the most part.

Take it for what it is. The more you read as I have from insiders, the more helpless fixes become. Sorry to say that. I'm somewhat insulted by Google that they haven't communicated more to the webmaster community but whatever. That's rant territory.

walkman

12:51 am on Jul 14, 2011 (gmt 0)



There is no solid proof that things like "time spent on site, bookmarking, user navigating through the same site" are in the Panda algorithm. No proof that they aren't either - but you just can't run with a ball until you actually catch it.

And trying to increase them might very well make the user experience worse. Think of an extra click, or "Please Bookmark Us" / newsletter pop-ups - making users do this and do that, when showing all the info on an easy-to-read page is best by everyone's definition.

I could understand maybe a 10% drop, since this MIGHT mean a better site, but a 70-90% drop? That said, if Google did it, it is so.