| 8:37 pm on Jun 26, 2012 (gmt 0)|
The problem with people (including myself in the past) who use low quality links is that they almost always go overboard, because it's easy and cheap to create a lot of those links.
Many blackhats/greyhats who do manipulative link building while smartly keeping a balance of anchor text have avoided Penguin altogether. Even more, as in your example, actually having a good site with good metrics will only help further.
So I'll say they are completely separate, because I've seen great sites with loyal users survive Panda, only to be annihilated when Penguin ran because their link profile was 75% to 100% manipulative and targeted.
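As a rough illustration of what a "75% to 100% manipulative and targeted" profile might look like in practice, here is a hypothetical sketch of an anchor-text concentration check. The function name, the sample profile, and the idea of flagging on a simple share are all my own assumptions, not anything Google has described:

```python
from collections import Counter

def anchor_concentration(anchors, money_terms):
    """Estimate what share of a backlink profile uses
    exact-match 'money' keywords as anchor text."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    targeted = sum(n for text, n in counts.items() if text in money_terms)
    return targeted / total if total else 0.0

# Hypothetical profile: 3 of 4 links use the same money keyword.
profile = ["cheap blue widgets", "cheap blue widgets",
           "cheap blue widgets", "www.example.com"]
share = anchor_concentration(profile, {"cheap blue widgets"})
# share is 0.75, i.e. squarely in the range described above;
# a natural profile is usually dominated by brand and URL anchors.
```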
| 6:37 am on Jun 27, 2012 (gmt 0)|
|great sites with loyal users that survive Panda, only to be annihilated when Penguin ran because their link profile was 75% to 100% manipulative |
If the objective of search engines is to return sites that people like and engage with (positive usage metrics), shouldn't Panda override Penguin? From the examples I have seen, there is room for debate: one can argue that Google lets websites with a not-too-clean backlink profile pass through Penguin where positive Panda signals mattered.
In other words, does the answer to Penguin lie solely in the backlink profile? Or is making a site Panda-friendly a possible solution?
| 7:38 am on Jun 27, 2012 (gmt 0)|
I agree. I think Panda trumps everything as you say.
Google wants to only show sites with quality content. Links are too easily gamed so I think their role in ranking is diminishing. Google is trying to remove the game and get us all back to scrutinising our sites and how people use them to improve the experience, rather than wasting time studying the algo, building links, etc.
Wouldn't it be nice if we didn't have to play these stupid SEO games any more. There's a new business in town... User Metric Optimisation (UMO). This is the most productive use of anyone's time these days, I think. Links are dead or dying. Don't waste your time or money on them.
I'm realising more and more that a good user experience (as determined by user metrics) is THE most important thing for getting your site ranked. It levels the playing field. Why waste time building links when you could be working on improving the user experience of your website?
| 11:56 am on Jun 28, 2012 (gmt 0)|
claaarky, yup, Panda trumps everything as you say. But Google still needs links to discover new content and grade it. A bigger count of links (genuine links, that is) would mean the website is likely to have higher traffic, which makes Panda analysis that much more reliable. Links, whether we like it or not, will be a part of the mix.
| 12:14 pm on Jun 28, 2012 (gmt 0)|
No they won't.
The penny has dropped for me on that one this morning.
Google IS getting some simple, basic, very powerful user metrics via the browser... you don't need links any more, you need advertising and a great user experience.
Have a look at the "Is Panda all about Exit Rates" thread.
You can now start a website and rank in Google by just creating a great user experience and advertising in a local newspaper.
Once people go to their browsers Google discovers your site and collects the user metrics to find out if people like it. If they like it, Google knows it's a good site and works out which keywords to rank it for.
| 12:33 pm on Jun 28, 2012 (gmt 0)|
If Google doesn't need links, then why does Google penalize for unnatural links?
| 1:14 pm on Jun 28, 2012 (gmt 0)|
Google tries to downplay their need for and use of links, but if links didn't work so well they wouldn't be so vocal about penalizing people for them. The simple fact is, if you get the right links you can outrank almost any page in the SERPs. Once you get to the top, though, you'd better have a site users enjoy. If you don't, that's when the Panda comes out to play.
| 1:30 pm on Jun 28, 2012 (gmt 0)|
Google is trying to help webmasters by telling them the links they are getting have no value, and to stop wasting their time and money on them.
| 1:43 pm on Jun 28, 2012 (gmt 0)|
Helping by penalizing - makes no sense.
|Google is trying to help webmasters by telling them the links they are getting have no value and to stop wasting your time and money on them. |
Did it occur to you that Google is penalizing sites for unnatural links when in fact they are natural?
| 1:59 pm on Jun 28, 2012 (gmt 0)|
When you say natural, how were they obtained and what sorts of volumes are we talking about?
I do know someone who was penalised after running a press release with links in it which took off. On the face of it that might seem natural, as in people loved the article, but if those links didn't drive traffic to the site Google might think, hmmm, better let this chap know this is a pointless exercise: you're not getting any traffic and you made it happen. Stop doing it.
| 2:05 pm on Jun 28, 2012 (gmt 0)|
|Google is trying to help webmasters by telling them the links they are getting have no value and to stop wasting your time and money on them. |
+1 Google. You have drunk the Kool-Aid.
| 2:07 pm on Jun 28, 2012 (gmt 0)|
|When you say natural, how were they obtained and what sorts of volumes are we talking about? |
Sure, some might look unnatural, but unsolicited nonetheless.
| 6:50 am on Jun 29, 2012 (gmt 0)|
|you don't need links any more, you need advertising and a great user experience |
It only solves half the riddle. At present, Google's reach in getting reliable data on user metrics is limited, and the challenge only increases for smaller sites with less traffic. That is why Panda mostly affected larger sites. But going forward Google's reach may improve, and with it its reliance on user metrics.
User metrics, in isolation, may not be the answer to everything. There may be a site like answer.com that people spend only a few seconds on, mostly on a single page, looking up a word.
Besides, user metrics will depend on the relevancy of traffic. How is Google to know which keywords to rank a website high on if it doesn't know what the website is about? It might be a great site, but if you send it the wrong traffic, the metrics look pretty ordinary.
| 7:27 am on Jun 29, 2012 (gmt 0)|
Who said Penguin is just about link profiles?
Isn't this just one element of Penguin?
My understanding of Penguin is *any* type of action that is used to artificially increase rankings.
| 8:45 am on Jun 29, 2012 (gmt 0)|
|My understanding of Penguin is *any* type of action that is used to artificially increase rankings. |
Link profile and anchor text variation are still the main factors of Penguin. Why do you think negative SEO is having an impact?
| 8:57 am on Jun 29, 2012 (gmt 0)|
One of many factors, I would say, as well as many of the classic over-optimisation techniques, e.g. keyword stuffing.
| 9:00 am on Jun 29, 2012 (gmt 0)|
jinxed, to my understanding Penguin's main focus was the backlink profile, as on-page factors were Panda's. Yet I agree with you in the sense that it is not just about the backlink profile. Certain on-page factors are needed to confirm a website's attempts at manipulating ranks, such as the presence in the Title and Content of a keyword that also appears in a majority of the inbound links' anchor text.
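The heuristic described above (a keyword in the title and content that also dominates the inbound anchor text) can be sketched as a simple check. This is purely a hypothetical illustration of the idea; the function name, the majority threshold, and the substring matching are my own assumptions:

```python
def looks_over_optimised(keyword, title, content, anchors):
    """Hypothetical heuristic: flag a page when a keyword appears in
    its title, in its body content, AND in the majority of its
    inbound link anchor texts."""
    kw = keyword.lower()
    in_title = kw in title.lower()
    in_content = kw in content.lower()
    anchor_hits = sum(kw in a.lower() for a in anchors)
    majority = bool(anchors) and anchor_hits > len(anchors) / 2
    return in_title and in_content and majority

# Hypothetical example: keyword in title, content, and 2 of 3 anchors.
flagged = looks_over_optimised(
    "blue widgets",
    "Buy Blue Widgets Online",
    "We sell blue widgets at great prices.",
    ["blue widgets", "blue widgets", "home page"],
)
# flagged is True; with diverse, branded anchors it would be False.
```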
| 3:52 pm on Jun 30, 2012 (gmt 0)|
+1 Jinxed. Backlinks may be the main thing with Penguin, but there are sites with very spammy backlinks still ranking, and sites with clean link profiles that got hit.
Penguin and Panda are both more complex than just one factor, and they are both measuring many of the same sites/queries. There's going to be some overlap and conflict.
But how that conflict is getting resolved, I'm not sure. There was a Panda update right before Penguin, as if Google thought Panda would wipe out the weak sites and make it easier for Penguin to catch the spammy sites. But then a Penguin refresh came before the next Panda, which would give Panda a chance to "rescue" some borderline sites from Penguin. At least, that's my understanding of how the computations are happening - if I'm oversimplifying, I'll yield to correction.
I also think it could work the other way - if Panda pegs a site as weak, that might influence Penguin to assume spam when it's unsure. That could explain the false positives with Penguin (I know, not everyone believes there are any). I have a site that I was expecting to get Panda slapped any day. Around April 19 (the Panda update right before Penguin), my traffic slipped a little, but not anything out of the ordinary peaks and valleys. Then to my surprise, I got the Penguin hammer - clean backlinks, no spam tactics. I had SEOs look to see what I had done that got mistaken for spammy, and they couldn't tell me. Google says there's no penalty. Was I really Penguined, or did I get a Panda slap, and then Penguin looked at something like my high number of editorial outbounds and concluded spam based on the Panda slap?
Whatever the case, I'm confident we need to THINK of Panda, Penguin and the regular algo as one big whole, or we'll miss something critical.
| 4:36 pm on Jun 30, 2012 (gmt 0)|
Contrary to what most people believe or say, Google has only enhanced their reliance on backlinks, to the extent that they title pages in the SERPs with what the anchor text and other attributes of those backlinks suggest, rather than relying on the titles given to pages by their authors.
There is definitely a lot of merit in the argument that Penguin is a subset of Panda. Panda is definitely not an update that ignored backlinks in assessing page quality, hence it is not an "on-page factors only" algorithm. Neither is Penguin a "backlink profile only" algorithm. But backlinks and their quality do play an important role in both algorithms.
| 4:52 pm on Jun 30, 2012 (gmt 0)|
I would also say that a lot of human element is involved in both of these "quality" algorithms, in the form of quality assessors or something similar. I believe the human involvement goes beyond what the "experts", or Google themselves, admit to in their use of such a human army.
| 6:45 am on Jul 2, 2012 (gmt 0)|
McMohan, just reading back through this thread and thinking about how relevancy would be determined if links become a thing of the past, the logical conclusion is PPC advertising, such as AdWords.
I personally would only spend money trying to appear in ads that were relevant to my business, that were likely to generate me business. By doing that I would be inadvertently telling the organic side of Google what I see as the most important parts of my business.
If the traffic passed through the ads produces user metrics that indicate that people really interact well with my site (they like it), Google would then know my site is great for that term and could rank me accordingly in the organic results.
| 10:33 am on Jul 2, 2012 (gmt 0)|
|I personally would only spend money trying to appear in ads that were relevant to my business |
That's a great argument to put forward, claaarky. I once advocated legalizing paid links. Why would anyone pay so much money buying links if the site were not relevant for what s/he is trying to rank for? The stronger the signal sent by paid links, the stronger the intent to rank, and the greater the possibility of relevancy. If the user metrics are positive, so much the better; that business will sustain itself. If they aren't, the site will no longer invest in paid links. It could be self-sustaining.
The downside is, per Google, you are still manipulating its algorithm (which is supposed to be shielded from the PPC side) by spending on PPC. The issue has only shifted from links to ads.
| 11:11 am on Jul 2, 2012 (gmt 0)|
That's a great point McMohan. When I first set up PPC ads years ago, I didn't really know which terms to target or which ones would work best for my site, so there was a bit of trial and error.
We got there in the end and now the tools to help with that are much better, but it was the performance of our website, the usability that let us down and made it difficult to make a profit. If we can really fine tune the user experience and improve our conversion rate then PPC is certainly going to be a great tool for us to move our business forward with.
I'm sure if we can create a really great user experience people will talk about us more as well, possibly link to us, and that might help Google understand our site a bit better. I'm just not sure where links are going but I'm sure if I just let them happen naturally and focus on the user experience then we shouldn't have any problems and if they help Google understand our site, all the better.
| 2:03 pm on Jul 2, 2012 (gmt 0)|
When discussing how these algos work together, I always come back to ehow and other content farms that somehow survived the Panda purge. They fell briefly, but then recovered - why? They're still a content farm, still the very thing Panda was supposed to eliminate. Same with About.com. I assume the changes they made after falling were to barrage the algo and other computations with good signals that outweighed the "content farm" signals.
Does anyone know what they did to recover? What did they do that sent SUCH good signals, that Panda rolled over for them?
| 3:31 pm on Jul 6, 2012 (gmt 0)|
A site that I work on was affected by Panda in March 2012 and the Panda runs in April 2012. It was also affected by Penguin in April 2012.
Since the site was affected by all of these, I am thinking that there might be some overlapping factors among the Panda runs and the Penguin run mentioned above.
I have not really been able to figure out what those factors might be.
Can any of you share what you think that they could be?
| 5:21 am on Jul 7, 2012 (gmt 0)|
Here are my observations on some content farms that I visit from time to time:
Read what Brinked said in this [webmasterworld.com...]
They (the content farms that eventually escaped Panda) have done all they can to make their sites look interactive and worth reading.
Meaning, they improved the user experience by adding videos, related content, images, good nav boxes, etc etc etc.
They also deleted hundreds of thousands of zero traffic pages + structured/categorized optimally their best content.
I actually compete with two or three About.com subdomains, and I can tell you that they do a good job: several writers, with top SEOs, making every page optimized to the hilt.
Google's algorithm has a hard time recognizing whether this content is 100% good or not, because visitors continue to dig in and read.
Sometimes it is 'good enough' and sometimes not.
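The pruning step mentioned above (deleting hundreds of thousands of zero-traffic pages) is straightforward to sketch. This is a hypothetical illustration using an invented analytics export; the function name, data shape, and threshold are my own assumptions:

```python
def pages_to_prune(pageviews, min_views=1):
    """Hypothetical clean-up pass: given an analytics export mapping
    URL -> pageviews over some period, list the pages with effectively
    zero traffic as candidates for deletion or noindexing."""
    return sorted(url for url, views in pageviews.items()
                  if views < min_views)

# Invented sample export: two strong pages, two dead stubs.
analytics = {"/guide-1": 5400, "/guide-2": 120,
             "/stub-a": 0, "/stub-b": 0}
candidates = pages_to_prune(analytics)
# candidates -> ["/stub-a", "/stub-b"]
```

Culling the dead weight this way shrinks the share of thin pages in the crawled site, which is one plausible reason the surviving content farms looked healthier to Panda afterwards.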
| 4:12 pm on Jul 7, 2012 (gmt 0)|
This also gets back into the question of a self-perpetuating cycle: do people like eHow and About because they genuinely like them, or because their top placement in the SERPs influences people to assume "This must be a good site"?
I found a totally incorrect "fact" on About.com last night while researching legal issues for a friend who's being sued.