
Google SEO News and Discussion Forum

    
Panda ranking signals
Hissingsid
msg:4367376 - 3:46 pm on Sep 26, 2011 (gmt 0)

I'm trying to make some sense out of the changes implemented in Panda. I'm starting to think that Panda is a bit like malaria. At first people put malaria down to bad air, hence its name, and it wasn't until they realised it was a disease carried by mosquitoes that they could start to do something about it.

For many of us with big, complicated sites and many keywords that used to generate traffic, it is difficult to hypothesise about what has caused the change. It is difficult to understand what signals your site is giving off, or not giving off, that feed into the new algorithm. We could start from the premise that what is presented as a highly complicated Google algorithm can be simplified into a few key things that drive it.

I'm in the fortunate position of having a small(ish), uncomplicated site, and in my niche one two-word term produces two thirds of all traffic in the niche. Over the last few years there has been very little movement at the top of the SERPs, and Panda has resulted in a shuffling of the top ten rather than a complete step change. I can therefore spot certain signals that emanate from sites that have benefited and that are weaker from sites that have done less well. I'd like to share what I am seeing in the hope that others might chime in, and out of the debate we can all find something we can do.

1. Panda is about sites not pages.

2. Sites that have done well are larger than average in the niche.

3. Sites that have done well have many pages within the site that are directly linked to by other "quality" sites. Sites that have done less well have many links from fewer sites. They have multiple links from a site to the home page and older pages of the site, but fewer deep links to pages on a specific topic and fewer new links to specific pages on a particular subject from pages on other quality sites that are on that subject.

4. Sites that have done well have more NEW quality content than sites that have done less well. The new content on the sites that have done well is picked up by other quality sites that link to it.

5. One site that has done well I would estimate has much more traffic on the general theme of the niche than sites that are focused more on one aspect of the subject. I would also estimate that their bounce rate would be lower.

A couple of other things that I don't think provide signals into the algorithm but are worthy of a mention are:

I've noticed that different terms have entered my long tail but that these are not really appropriate and have a large bounce rate as a result.

The main two-word term has a first word which is the general theme and a second word which is the specific topic. In WMT my site has a full significance bar for the second word and a 60% fill for the first word, the general theme. I would guess that sites that have done best from Panda have this the other way round, i.e. a full bar for the theme word and somewhat less for the specific topic word.

 

deadsea
msg:4367423 - 4:47 pm on Sep 26, 2011 (gmt 0)

The main two-word term has a first word which is the general theme and a second word which is the specific topic. In WMT my site has a full significance bar for the second word and a 60% fill for the first word, the general theme. I would guess that sites that have done best from Panda have this the other way round, i.e. a full bar for the theme word and somewhat less for the specific topic word.


The significance bar in WMT for each term is largely driven by the number of pages on the site that mention the term. We found that it was showing lower significance for some of our main terms but high significance for some generic terms that we used heavily in template copy.
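To make that concrete, a small script along these lines can count how many pages mention each term, which seems to be roughly what drives that bar. It is only a sketch of the idea, not Google's actual calculation; the folder of exported HTML pages and the example terms are hypothetical.

# Rough sketch only: count, for each term, how many pages mention it at least
# once. Boilerplate template copy that appears on every page will dwarf a
# keyword that only appears on a handful of pages. The "site_export" folder
# and the terms below are hypothetical placeholders.
import os
import re
from collections import Counter

def pages_mentioning_terms(folder, terms):
    counts = Counter({term: 0 for term in terms})
    for name in os.listdir(folder):
        if not name.endswith(".html"):
            continue
        with open(os.path.join(folder, name), encoding="utf-8", errors="ignore") as f:
            text = re.sub(r"<[^>]+>", " ", f.read()).lower()  # crude tag strip
        for term in terms:
            if term.lower() in text:
                counts[term] += 1
    return counts

# Example (hypothetical): a generic template word vs. a specific keyword.
# print(pages_mentioning_terms("site_export", ["widget", "blue widget"]))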

We went through our template copy and replaced the generic terms with the specific keywords for which we wanted relevance. The WMT report responded within a couple of weeks. It didn't have any effect on the amount or type of traffic we were getting. Panda rolled out the next month and that site was not hit by Panda. The work we did there could have been the deciding factor, but I really doubt it.

aristotle
msg:4367436 - 5:08 pm on Sep 26, 2011 (gmt 0)

Do you see any characteristics of a "content farm" ? This is frequently mentioned as one of the identifiers for sites affected by Panda.

dazzlindonna
msg:4367469 - 6:44 pm on Sep 26, 2011 (gmt 0)

One thing you didn't touch on was design. How do the sites that fared well compare in terms of design to the ones that fared less well? And yes, I mean the actual look and feel of the site (not template code).

whatson
msg:4367516 - 10:30 pm on Sep 26, 2011 (gmt 0)

I've got to say that most of your points are debunked.

johnhh
msg:4367519 - 10:41 pm on Sep 26, 2011 (gmt 0)

One thing you didn't touch on was design
All we see is sites with zero design features and style (as in pre-2000), and zero investment, doing well. Might as well go back to tables ... then you get those bullet points in the SERPs that take up so much real estate.

Disagree with Hissingsid on just about everything.

walkman
msg:4367521 - 10:55 pm on Sep 26, 2011 (gmt 0)

None of them apply to certain popular sites: they are 99% rehashed stories, complete with entire paragraphs lifted and "NY Times says that....", and they load extremely slowly, yet they have skyrocketed with Panda.

That's the truth. The lies that the Google cooks told after Panda no longer fly now. Once content doesn't matter in a supposed update against 'thin and shallow content' or content farms, what else can you say?

wheel
msg:4367541 - 12:06 am on Sep 27, 2011 (gmt 0)

All we see is sites with zero design features and style (as in pre-2000)

That IS a design. I've got a site, custom designed in 2011. It looks like any other site designed in 2003 :).

Presentation may matter, but if Google's looking at content, I'd guess that it only matters as much as the presentation/design eclipses the content. Design for the sake of design is a detraction, not a benefit.

Once content doesn't matter in a supposed update against 'thin and shallow content' or content farms, what else can you say?

I'd say that 'we' haven't figured out what signals Google is using to wham out Panda.

Reno
msg:4367552 - 12:47 am on Sep 27, 2011 (gmt 0)

As we all know, there are hundreds of facets to the algorithm, and with Panda, user behavior/interaction has a direct impact on ranking. This combination makes it extremely difficult to say with any certainty what the Panda wants, because what it wants is a moving target, and it moves with such regularity that what one person finds works will not work at all for another.

Why? Because the 2 site designs are different; their users are different; their content is different; and in all the accumulated differences hides the mystery.

So everything that Hissingsid says may be absolutely on the mark for him, and simultaneously, absolutely off the mark for whatson & johnhh.

This is exactly what Google set out to accomplish ~ create a digital organism that is continually morphing, which makes it close to impossible for SEO types to quantify, identify, or even vaguely understand.

Thus, go back to square one, to the pre-Google days: Build entirely for your users, not the Googlebot........... and hope for the best.

.........................

walkman
msg:4367561 - 1:10 am on Sep 27, 2011 (gmt 0)

I'd say that 'we' haven't figured out what signals Google is using to wham out Panda.

The most obvious is that Google lied. You could have had a site that was 100% what they said is ideal, and Panda would still have destroyed it.


This is exactly what Google set out to accomplish ~ create a digital organism that is continually morphing, which makes it close to impossible for SEO types to quantify, identify, or even vaguely understand.

Thus, go back to square one, to the pre-Google days: Build entirely for your users, not the Googlebot........... and hope for the best.

Except that they started with "good sites" and then set out to destroy the other ones: not cause them to lose 10% or 25% of traffic, but essentially drive them out of business. And then we have the manual exceptions and the "eHow wasn't hit? Wow, let's change the algo so they get hit next time."

Example: Google can't say that, say, Nordstrom.com is a bad site, or people would blame Google. Now if Nordstrom harasses you with pop-ups to put in your email or take a survey, that's 'a plus' since Nordstrom.com is a 'good site.' If you do not do it, Panda will hurt you, because that pop-up leaves behavior patterns that can be measured over tens of thousands of visitors. So, not hounding your visitors can cause you to go bankrupt thanks to the allegedly brilliant minds at Google. Why? Because they decided to give all to the winners that 'the world's biggest kingmakers' chose, or mostly big sites. Go ahead and emulate them, especially in certain niches. You can't. If a big site is caught by mistake, they will be saved; yours will not. It also serves Google'$ corporate purposes, or Panda would not be live today.

In the end, a monopoly that essentially controls the internet reneged on its previous promises and decided which sites live and which sites will most likely die. All for its own benefit.

Reno
msg:4367574 - 1:59 am on Sep 27, 2011 (gmt 0)

I see some patterns in the way sites are ranked. Here's an example based on the rock sensation Willy Widget:

~ If WW has an official website (WillyWidget.com) that is not penalized due to excessive black hat stuff, it has a good shot at #1;
~ Number 2 will thus be Wikipedia, or some other encyclopedic type site;
~ There will also be social media: Facebook.com/willywidget, Twitter.com/willywidget, MySpace.com/willywidget, etc
~ And other high rankings will be from "Trusted Brands" in the rock music field, such as MTV, Billboard, etc.
~ Mixed in will be YouTube videos and news stories about WW.

These listings will likely take up ALL of page 1, so if you have a site dedicated to WW and, especially, if you are an affiliate of various companies selling posters, CDs, t-shirts, etc, you are probably going to be on page 2 or lower no matter what you do regarding site design and content. Why? Because you are none of the above, and never will be.

That is a simple example of how I'm seeing many site rankings, and while it is admittedly a generalization, it's an example of a pattern that in my experience is fairly common. Because so much of the real estate is ceded to all of the above, the rest of us are dropping off page 1, which is to say, into Death Valley, and once you're in Death Valley, you are essentially dead.

..............................

Shaddows
msg:4367693 - 9:22 am on Sep 27, 2011 (gmt 0)

@Reno
I cannot disagree with any of that (except you missed the block of news results pertaining to WW), assuming your targeted term is "Willy Widget" - and certainly that will have the highest traffic.

However, as a derivative website, you can rank for subjects. "Widget memorabilia" for example. Traffic will be lower, but conversions much, much higher.

If you are a fan site, and you cover off a large proportion of stemmed phrases, you will eventually break the top 10 (and it's quite likely you will go straight into the top 5).

If you are clever in your branding, you could create the go-to resource for WW, with its OWN search traffic - not deriving referrals from the generic WW terms. Widgetera, the home for loyal Widgies and Willy Widget related stuff. Prime affiliate real estate, for a start. Direct selling from a subdomain.

But yes, the days of SEO getting you on page 1 for a media term or celebrity are pretty much done. The solution is a new business model, not a revamped site.

Post-Florida, plenty of business models were broken. But the people affected were pioneers in the industry, happy to try new things, adaptive and flexible by character. Post-Panda, people are treading well-worn paths and have been slower to let go of strategies that worked well up til now.

Hissingsid
msg:4367710 - 10:59 am on Sep 27, 2011 (gmt 0)

I didn't mean to suggest that I have any answers or that my observations are correct; I'm just reporting what I'm seeing in the hope that I might find one or two signals that I can either boost or reduce so that I can move up a place or two.

I find it difficult to believe that Panda is subjective. It must use objective assessment and it makes sense (to me at least) that if Google is using objective measures then we must be able to see them or at least the most important ones.

After Florida there was a massive debate here about what had caused problems for sites and what the solution was. In the end the debate fell into two main camps: the over-optimisation penalty camp and the LSI (semantics) camp. I had some pretty strong evidence that semantics was the issue because of unique language differences between UK and American English. I decided to stick with the semantics hypothesis and implemented changes based on it. I went back to #1 and have stuck there for most of the time since, with a couple of notable exceptions.

What I am looking for now is a hypothesis I can believe in that will drive my activity over the coming months. To be honest, although I would like eventually to be proved right in the strategy I select, it is more important to me to have a clear strategy that I actually believe in.

It would be nice if folks who have evidence that my observations can be ruled out, because they are seeing other things, would share it. Simply saying "you are wrong" doesn't add anything to the debate.

Cheers

Sid

zeus
msg:4367711 - 10:59 am on Sep 27, 2011 (gmt 0)

Before Panda I was on page 1 for my keyword search; now I'm on page 7. But here's the killer: a site on page 2 is also an image site, but with links to po.. sites, and when you click its images you sometimes end up on such sites, plus the content text is only ads. Of course image sites don't have much text, but here there is almost none.
The point is: how can such a site be on page 2 out of 144,000,000 results?

walkman
msg:4367744 - 12:01 pm on Sep 27, 2011 (gmt 0)

I find it difficult to believe that Panda is subjective. It must use objective assessment and it makes sense (to me at least) that if Google is using objective measures then we must be able to see them or at least the most important ones.
It's not, but let's say that it favors certain qualities that small sites are less likely to have. Plus, what Google sends you may make improvement almost impossible: bad keyword referrals = more people leave.


It would be nice if folks who have evidence that my observations can be ruled out, because they are seeing other things, would share it. Simply saying "you are wrong" doesn't add anything to the debate.
I don't want to out any site, but the one I have in mind is a 'new media site'; they essentially fail every single point of the advice given by the Google Gods on how to write. (Maybe the owner follows Google's advice on what to wear or how to floss his teeth with gFloss ;))

AlyssaS
msg:4367762 - 12:56 pm on Sep 27, 2011 (gmt 0)

1. Agree

2. Disagree - lots of small sites have done well under Panda

3. From memory, G did a separate update that dealt with how backlinks were treated (Feb 16th? Feb 9th? something like that, see archives); it was before Panda anyway, and I think people are conflating the two updates when they shouldn't be. (Also, they may have done further backlink updates since which we've missed, because everything is now ascribed to Panda automatically!)

4. Didn't need to add new pages/content, just needed to fix existing content to get out of Panda

5. Not sure about this one either


My feeling about Panda was that they were looking at ratios - ratio of ad space to content, ratio of internal links supporting a page to external links, percentage of good pages on a site - plus a bunch of other metrics; once you improved past a certain threshold, you came back.
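To self-audit along those lines, a per-page check is easy to sketch. To be clear, these ratios are speculation, not confirmed signals; the file name, the "example.com" domain and the beautifulsoup4 dependency in the sketch are assumptions.

# Speculative per-page ratios, for self-auditing only. Assumes pages saved
# locally and the beautifulsoup4 package installed; "example.com" is a
# placeholder domain.
from urllib.parse import urlparse
from bs4 import BeautifulSoup

def page_ratios(html, own_domain):
    soup = BeautifulSoup(html, "html.parser")
    text_len = len(soup.get_text(" ", strip=True))    # visible text, roughly
    html_len = max(len(html), 1)                      # total markup size
    links = [a["href"] for a in soup.find_all("a", href=True)]
    internal = sum(1 for h in links if urlparse(h).netloc in ("", own_domain))
    external = len(links) - internal
    return {
        "text_to_markup": text_len / html_len,        # crude content vs. boilerplate proxy
        "internal_links": internal,
        "external_links": external,
        "internal_to_external": internal / max(external, 1),
    }

# with open("some_page.html", encoding="utf-8") as f:
#     print(page_ratios(f.read(), "example.com"))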

Also Panda seems to me to be keen on "basics" - sloppy coding where you accidentally put the same meta description on all pages was tolerated before, but not after. Pages with a lot of spam in the comments were tolerated before but not after. Tag pages that duplicated category pages were tolerated before, but not after. Other people have mentioned slow loading affecting the site. I don't know why this is - perhaps when their original raters seeded the algo, the good sites they found were not sloppy or careless, and they built that in as a benchmark.
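The duplicate meta description point, at least, is easy to check for yourself. The sketch below flags descriptions shared by more than one page; the folder of saved HTML pages and the beautifulsoup4 dependency are assumptions, and this is housekeeping rather than a confirmed Panda signal.

# Find meta descriptions shared by more than one page (or missing entirely).
# "site_export" is a hypothetical folder of saved HTML pages.
import os
from collections import defaultdict
from bs4 import BeautifulSoup

def duplicate_meta_descriptions(folder):
    by_description = defaultdict(list)
    for name in os.listdir(folder):
        if not name.endswith(".html"):
            continue
        with open(os.path.join(folder, name), encoding="utf-8", errors="ignore") as f:
            soup = BeautifulSoup(f.read(), "html.parser")
        tag = soup.find("meta", attrs={"name": "description"})
        desc = tag.get("content", "").strip() if tag else ""
        by_description[desc].append(name)
    return {d: pages for d, pages in by_description.items() if len(pages) > 1}

# for desc, pages in duplicate_meta_descriptions("site_export").items():
#     print(len(pages), "pages share:", desc[:60] or "<missing description>")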

Also the long click. One thing I did was go back to my analytics to look at what people were searching for before Panda, and then added information in, so that if that search was performed again, they wouldn't backspace because what they were looking for wasn't there. I ended up adding a lot of info to existing pages; they changed substantially.

I don't know if it was this that helped me or sorting out the sloppy stuff, but I was hurt by Panda 2.1 and came back with Panda 2.2 and have been stable since.

suggy
msg:4367779 - 1:35 pm on Sep 27, 2011 (gmt 0)

I'll bite...


1. Panda is about sites not pages.
YES, it seems that Panda contributes to an aggregated domain trust score. This can make it hard for any page to rank, once a site is 'pandalised'. NO, because the initial analysis is at page level and operates on keywords/semantics - I know this because I can rank for generics but then disappear when a niche brand is added, likewise sometimes I cannot rank for the synonym which is illogical (how can I be top 5 results for one and nowhere to be seen for less popular synonym?)

2. Sites that have done well are larger than average in the niche.
Disagree. Lots of smaller sites have slipped under Panda's radar in my niche and lots of bigger sites have been hit (though, agreed, not the giants). Seems to hit the middle ground.

3. Sites that have done well have many pages within the site that are directly linked to by other "quality" sites. Sites that have done less well have many links from fewer sites. They have multiple links from a site to the home page and older pages of the site but less deep links to pages on a specific topic and less new links to specific pages on a particular subject from pages on other quality sites that are on that subject.
Disagree. There are plenty of sites in my niche that have done well off a very limited link portfolio.

4. Sites that have done well have more NEW quality content than sites that have done less well. The new content on the sites that have done well is picked up by other quality sites that link to it.
Agree, except the links part. I think that old content is an issue - ie too much of it will affect your Panda trust score. Why? Well, think about it: isn't the fact that 90% of your content hasn't changed since 2007 a pretty good indication the site is unloved and isn't contemporary? Does Google really want to 'promote' the same content it was promoting 4 years ago (even if it is still good)? Using this as a factor would also encourage webmasters to tend to their sites a little better and trim down. I think it's clear now Google wants us to start editing the junk out (so they have to deal with less of it!).

5. One site that has done well I would estimate has much more traffic on the general theme of the niche than sites that are focused more on one aspect of the subject. I would also estimate that their bounce rate would be lower.
I think it's easier for more focussed sites to rank under Panda. Time on site is probably a factor (as opposed to bounce) in Google's long click objective.

In terms of getting behind a plan, I know where you are coming from. FWIW, the plan I am backing is:-

1. Fix and verify all 404s, canonicals, domain redirects and any other conceivable technical issues (a rough checker sketch follows this list)
2. Refresh the entire site content. This means rewriting all product and page content for us and is proving successful on a page by page basis. Focus writing to omit needless words, not to be too irreverent or vague, not to be repetitive and not to be derivative of another page.
3. Delete all content that is too similar. Gone are the days of writing for Google. If users don't make the distinction between #*$! for man and #*$! for men and #*$! for gentlemen, you shouldn't either.
4. Sign up and get behind every other signal of quality: third-party reviews, Facebook, +1, Twitter, industry assoc. membership, ISIS, VeriSign, McAfee Secure, etc.
5. Add user engagement functionality - widgets, predictive search, comparison pages, video -- anything that keeps them on your site longer
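For point 1, something like the rough checker below is a starting point: it walks a list of URLs and flags 404s, redirect hops, and missing or mismatched rel=canonical tags. The URLs are placeholders, it assumes the requests and beautifulsoup4 packages, and it is a self-audit sketch rather than a full technical audit.

# Flag 404s, redirects and canonical problems for a list of URLs.
# The URLs used in the usage example are placeholders; the canonical
# comparison is crude (no trailing-slash or protocol normalisation).
import requests
from bs4 import BeautifulSoup

def audit_urls(urls):
    problems = []
    for url in urls:
        try:
            resp = requests.get(url, timeout=10, allow_redirects=True)
        except requests.RequestException as exc:
            problems.append((url, "request failed: %s" % exc))
            continue
        if resp.status_code == 404:
            problems.append((url, "404"))
            continue
        if resp.history:  # the request was redirected at least once
            problems.append((url, "redirects to " + resp.url))
        soup = BeautifulSoup(resp.text, "html.parser")
        canonical = None
        for link in soup.find_all("link"):
            if "canonical" in (link.get("rel") or []):
                canonical = link
                break
        if canonical is None:
            problems.append((url, "no rel=canonical"))
        elif canonical.get("href") not in (url, resp.url):
            problems.append((url, "canonical points to " + canonical.get("href", "")))
    return problems

# for url, issue in audit_urls(["https://www.example.com/", "https://www.example.com/widgets"]):
#     print(url, "->", issue)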

walkman
msg:4367781 - 1:47 pm on Sep 27, 2011 (gmt 0)

I know this because I can rank for generics but then disappear when a niche brand is added, likewise sometimes I cannot rank for the synonym which is illogical (how can I be top 5 results for one and nowhere to be seen for less popular synonym?)

I have been pushed way down for the most popular keywords too. Maybe G determined the top sites for X keywords?

Suggy, did your rankings improve as you updated content? I am thinking of going totally without any meaningful text that can be analyzed; it's not stopping others in my niche from ranking, and at least 2-3 sites with several amazing user reviews have been banished from the top after the April update.

suggy
msg:4367782 - 1:55 pm on Sep 27, 2011 (gmt 0)

I have been pushed way down for the most popular keywords too
- you missed my point, which was how can I rank for "wine glass" but not "[Brand] wine glass" when the brand in question is very niche?

I have pages that are creeping back into the top 3 results for some decent keywords. I also have pages I changed and made worse! But I think I understand why now. Philosophically, since Panda, Google seems to be more about SEO than ever before - especially what's going on on your own site. But the old lazy SEO of writing 'some content' optimised for a keyword researched in AdWords and throwing some internals at it seems to be dead. Google seems to really dislike overlapping content written for specific searches now. Precision is everything in your content.

walkman
msg:4367788 - 2:06 pm on Sep 27, 2011 (gmt 0)

I have been pushed way down for the most popular keywords too
- you missed my point, which was how can I rank for "wine glass" but not "[Brand] wine glass" when the brand in question is very niche?

Or maybe they decided that your site is not 'fit' for that brand / set of searches? I ask because I rank very high for, say, brand footwear but not for brand shoes (widgetified). The fact that people may be going after 'footwear' less may be a factor for me, but it still does not explain that much disparity.

nippi
msg:4367885 - 5:03 pm on Sep 27, 2011 (gmt 0)

I agree with everything AlyssaS has said, and it pretty much describes what I did too.
