| This 193 message thread spans 7 pages |
|Is Panda all about Exit Rate?|
| 7:30 am on Jun 25, 2012 (gmt 0)|
Commercially, what I am about to do may not be the most sensible thing, but I feel it's right and I want to share what I have discovered about Panda. It may help you understand more about quality and how to escape Panda.
Please note, I have not escaped from Panda yet – I came to these conclusions on June 22nd 2012 and began addressing my issues based on a new understanding of how Panda works. This theory could develop and I could end up with egg on my face massively, but it makes more sense than anything I've ever read anywhere before. Here goes... (apologies in advance for the long post).
Since Panda hit my ecommerce site in April 2011 I've been trying to improve the quality of my site using Amit Singhal's guidelines as a basis, but completely without success.
I always imagined that Panda was a magical formula Google concocted using their human guinea pigs when they sat them down and asked all those questions relating to quality, and that 'Panda' is Google crawling your site looking for the signs of low quality. It's not.
Last week my attention was drawn to a statistic in Google Analytics that for some reason I'd never noticed before – Exit Rate. That's when it dawned on me – Panda is all about user metrics. Google can't 'see' your site and they don't crawl it using a magical formula; they collect signals given off by humans as they use your site to tell them where the bad quality is. If you have too much of it, they demote the rankings of the pages with bad content, and any pages that link closely to those pages, to protect Googlers from hitting your bad content (what we know as Panda).
I compared pages on my site with very high exit rates to those with very low exit rates and immediately it struck me how much better the pages with low exit rates were. It also struck me how many different reasons there were for the high exit rate pages being worse (in many cases it was just a bad product that rarely or never sold, or the price was too high, the description was poor, the image was poor, etc.). The low exit rate pages were our top sellers: good products, good descriptions, nothing bad to say about the product or the content or presentation of the page.
Then I realised this is where Google started. They wondered about Exit Rate, sat people down, asked them to compare web pages, asked them why they liked or didn't like a page, and found that Exit Rate correlated with human feedback. It's obvious really – people leave your site because they've either done what they came there to do or something put them off. This is the ultimate test of quality.
Google doesn't need to 'see' your pages, it just looks at where people leave your site, maybe what they did before leaving your site (how long they were there, how many pages visited, etc.), and if your site has a high proportion of pages with a high exit rate, your users probably don't like the quality of those pages.
Of course, people have to leave your site at some point, and that may be because they've found what they want, so there has to be an allowance for that. And there may be a different model for different types of sites. But I found, when I looked at my high exit rate pages, in most cases it was obvious why people didn't like them. In the case of our product pages, the high exit rate pages were generally non-sellers, cluttering up the site and, as I now realise, turning off customers.
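For anyone who wants to replicate the comparison I did, here's a rough sketch in Python (the session format is made up for illustration - your analytics export will look different, and this is just my reading of how GA defines it): a page's exit rate is sessions that ended on that page divided by total views of that page.

```python
from collections import Counter

def exit_rates(sessions):
    """Per-page exit rate from a list of sessions.

    Each session is an ordered list of page URLs viewed.
    Exit rate for a page = sessions that ended on that page
    divided by total views of that page.
    """
    views = Counter()
    exits = Counter()
    for path in sessions:
        for page in path:
            views[page] += 1
        exits[path[-1]] += 1  # last page viewed is the exit page
    return {page: exits[page] / views[page] for page in views}

# Made-up example data: three visitor sessions
sessions = [
    ["/home", "/widgets", "/checkout"],  # healthy path, exits on checkout
    ["/home", "/widgets"],               # exits on the category page
    ["/widgets"],                        # bounce: lands and leaves
]
rates = exit_rates(sessions)
```

Sorting `rates` from highest to lowest is essentially the list I worked through: the pages at the top are the ones users gave up on.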
To try to disprove my theory I started reading back through Amit Singhal's guidelines and it all made sense (as I knew it would one day!). I also looked back at various discussions about Panda, and things that people did to recover from Panda, and it explained everything.
It's beautifully simple and it deals with a huge range of Google's problems in one hit. Users naturally react differently to webspam, duplicate content, scraped content (or original content if it's been scraped), brands. Sites with a high proportion of high exit rate pages tell Google all they need to know about the quality of your site, from a web user's perspective (which takes into account an unfathomable range of considerations that even Google have struggled to document - what Google needed to say is what I'm saying now: look at your exit rates!).
This theory explains so many of the things we've all noticed about Panda: how it works, its effects on Google's results and our sites. Here's a few:
Why can't they run Panda more regularly?
They need a month's worth of user metrics to be able to make a judgement about your site.
How did I recover from Panda without changing anything?
Scrapers can hurt your exit rate. So can competitors. If you have content people have seen elsewhere it affects their perception of your site. If Google got rid of your scrapers, your user metrics would improve without you doing anything. The user metrics of your site are affected by what's happening on other sites, so even a new competitor doing something similar to you can affect your user metrics.
Why are brands dominating the results?
It's not brands that are dominating the results, it's websites people like and trust that are dominating. Not every brand will always be loved and trusted, and their user metrics will reflect those changes. But generally people trust what they know, so sites people like (let's create a new term to replace brands – SPL, sites people like) can have bad pages but people won't leave their site just because of it, so their user metrics are better. You could set up an identical site with your name at the top; the 'quality' of the content would be identical, but the user metrics would be much worse.
Why did Google suggest merging pages?
I'm guessing user metrics show that users don't like seeing several similar pages on your site, in the same way they don't like seeing similar content on numerous sites.
Do images or photos help improve quality?
Not necessarily. Every page on the web can produce a different response from users. The only way to know is to experiment, check your exit rates, repeat until exit rate is low.
Should I add more content to my site?
See "Do images or photos help improve quality".
Can I escape Panda by improving my brand signals?
If you mean getting backlinks with your website name in them, no. That does not make you a brand. Google doesn't actually care if you're a brand or not, it just knows that people react better to SPLs (sites people like). User metrics prove it.
Can I escape Panda by getting better quality links?
No. Users can't see your links; links make no difference to how users perceive your site alongside the rest of the web. In my experience, while demoted by Panda, links won't get you anywhere. Once you get out of Panda though... well, hold onto your hat.
Why does moving content to subdomains work for some people?
If you correctly identify, fix or remove bad content from your site (using Exit Rate as a guide) you will be left with only good content. It doesn't matter how you do it; what matters is that you get your exit rates down.
Will no-indexing or blocking robots from pages of bad content help?
No. If a user can see it, Google has user metrics on it. This is not about googlebot, it's about your users.
I could go on, but you get the point. User metrics tell Google everything they need to know about how real human beings perceive your site in context with other sites they may have come across. It really is as simple and as complex as that.
What I don't know is where the threshold is, and I'm guessing it may be different for different types of sites (information versus ecommerce, for example) and there will be other factors combined with it, but, put simply, I think Exit Rate is the place to start looking if you want to find and fix your bad content issues. It really opened my eyes. (Note: I think bounce rate impacts exit rate, though I might be wrong on that – removing bounces from the Exit Rate calculation may give you a truer reflection of how people react to your content as they move around it.)
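To illustrate that note about stripping bounces out, here's a small sketch (Python, with a made-up session format - purely illustrative, not how GA actually computes it) that ignores single-page sessions when working out a page's exit rate:

```python
def exit_rate_excluding_bounces(sessions, page):
    """Exit rate for `page`, ignoring single-page (bounce) sessions.

    `sessions` is a list of ordered page-path lists. A bounce session
    (one pageview) is skipped entirely, so it can't inflate the exit
    figure for pages people actually navigated around from.
    """
    views = exits = 0
    for path in sessions:
        if len(path) < 2:  # bounce: landed and left immediately
            continue
        views += path.count(page)
        exits += (path[-1] == page)
    return exits / views if views else 0.0

# Made-up example data
sessions = [
    ["/home", "/widgets", "/checkout"],  # multi-page, exits on /checkout
    ["/home", "/widgets"],               # multi-page, exits on /widgets
    ["/widgets"],                        # bounce - excluded from the calc
]
```

With the bounce excluded, /widgets shows a 0.5 exit rate here instead of the 2/3 the raw figure would give - which is what I mean by a truer reflection of how people behave once they're moving around the site.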
I suspect that some sites may not be able to get below it if they are basically set up with spammy intent (you see how effective this simple method is), but for many of us, understanding that exit rate tells you where your bad content may be could be the answer to your prayers.
I hope I'm right, or at least on the right track. If I am, it's time to end the Panda woe and improve our websites KNOWING what quality content really is. Maybe a discussion here will help test the theory and perhaps help us gain an even greater understanding.
I hope this helps you and me escape Panda, I truly do.
| 12:49 am on Jun 27, 2012 (gmt 0)|
|Now - how to apply that to category pages, eh? |
You are right, for some sites (like mine) category pages are a necessity. They need to be there in order to organize the content so users who are browsing the site can find the goodies. But they too can be an authority and get traffic if you design them right. Our category pages used to be (pre-Panda 1.0) our highest site entrance pages. They ranked well despite not really having content on them other than links with some minor text to the content pages. Some had more text than others, and those were hit less severely. (we still compete very well with a few of them - against some mighty competitors too for some very high traffic terms)
Post-Panda they need to offer something more though. Not just "more", but more than other similar sites offer. I've had to think creatively in this regard and what I've been doing so far has helped (what that particular thing is will vary by site). The category pages need to present the information to the user so they a) know they are in the right place (if they entered from a SE), and b) want to click on one of the content (or product, for ecommerce) links. My hardest hit category pages had large bounce and exit rates, and not coincidentally very little to offer the user. (Also fyi - our content pages were never hit by Panda; they lost only collateral traffic from the decrease in category traffic, so the links to content on the category pages don't seem to have been a factor.)
Also as a side note (and apologies to Claaarky for kinda veering off topic), I stumbled on John Mueller's post about 2 months ago (despite trying to read the Google forums regularly since Panda 1.0) and have used that as my guideline on what to do with under-performing category pages. I wish I had found it when it was posted; we might have had recovery sooner. I'm currently noindexing pages that need work. I've merged/301'd ones where I can. There isn't anything on my site that needs to be deleted outright. I'm convinced that our meager Panda recovery is from noindexing pages. I've improved some, but those changes haven't yet gotten any traction from the search engines, at least not from Google. The redesign might have helped some - it was done in February of this year with Panda demotions continuing in Feb, March, and April. Noindexing started after the last April update but clearly should have started in Feb 2011. And we've now seen nearly a 20% recovery combined with the last 2 Panda updates (3.7 & 3.8). And let me clarify - I'm noindexing pages that are slated for improvements. I'm just getting them out of the index until they can be modified. Everything on my site was built for the user - just maybe not as well as it could have been :)
| 12:52 am on Jun 27, 2012 (gmt 0)|
|Only a small number of visitors to my site [pre-Panda] would follow that path. |
And users don't follow that path on my site either - I was just using it as a hierarchy example of exit rates. The homepage should have the lowest rate IF the user enters from there. Second lowest would be the categories, and so on.
Most of our traffic comes from SEs directly to our content post-Panda. Pre-Panda most of our users came from SEs to our category pages. So post-Panda our pageviews per user have dropped considerably.
| 1:34 am on Jun 27, 2012 (gmt 0)|
Pleased someone agrees.
Someone either here or on a similar topic threw a HUGE spanner in the works which made me sit up and THINK.
The OP was opining about Google Chrome and the data it sends "back home".
I've made my evaluation of my site visitors' behaviour. I have NO other information available to me. Neither do you or others.
That poster "claimed" Chrome can discern who saves pages, who prints out pages etc.
For your site.
My theory might well fail. I'll live with that.
| 6:26 am on Jun 27, 2012 (gmt 0)|
CALLING ALL PANDA SUFFERERS!
Okay people, as I work through my statistics, look at my site and realise how I've neglected my business, I am coming to an understanding of Panda that is blowing my mind.
Everyone suffering from Panda needs to read this and try to get their heads around this.
One thing seems very clear to me now. Panda is mainly, possibly completely, about user metrics, and it's also about how good you are at running your business. Before you can even hope to understand Panda and get back to having a successful website you MUST accept that. You stand a much better chance of getting your traffic back if you accept that to be the case. Switch your brains over now, and read on...
User metrics (exit rate and time on page being the ones I've relied on heavily during my analysis - there are other statistics in play but I haven't figured those out yet) tell Google everything they need to know about how successfully your website delivers what people want from it, compared to your competitors. To do well in Google, your website needs to do a better job at delivering what people want from it than any of your competitors. You cannot hope to rank well with lazy practice or if you just don't really know what you're doing. These are harsh facts, but keep reading.
My stats have revealed I've been doing an appalling job of running my business. I thought I was entrepreneur of the year, every year, but now I realise I am a complete amateur. For the last 9 years my visitors have been telling me something by the way they use my website and I haven't been listening. I left products on the site for years that don't sell, just in case one day someone might buy one. I put pointless blog articles on the site so I had lots of 'content' to appease Google rather than thinking about my users. My users were shouting at me via my statistics "what the hell is all this rubbish". I just didn't realise it. Being 'hit' by Panda has helped me realise that, and I can now see how to make my business much, much better, more efficient, how to deliver what my visitors want.
Here's one simple tip for ecommerce sites that want to get out of Panda - if it doesn't sell, get it off your site pronto, because it will damage your visitors' user experience, damage your user metrics, and damage your rankings in Google. This has implications for your conversion rate as well; it has many, many implications. This concept should hold true for other types of sites as well - it's all about delivering what people want. Understand what your user's goal is and you may understand where you've gone wrong.
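As a concrete (and entirely illustrative) version of that tip, here's how you might flag product pages that attract traffic but never convert. The data format and thresholds are my own guesses for the sake of the example, not anything Google publishes:

```python
def pages_to_cull(product_stats, min_views=100):
    """Flag product pages with real traffic, zero sales, high exit rate.

    `product_stats` maps URL -> (pageviews, orders, exit_rate).
    `min_views` keeps genuinely new/untested products off the cull list.
    """
    return sorted(
        url
        for url, (pageviews, orders, exit_rate) in product_stats.items()
        if pageviews >= min_views and orders == 0 and exit_rate > 0.5
    )

# Made-up example data: (pageviews, orders, exit_rate)
stats = {
    "/p/red-widget": (500, 12, 0.2),  # sells well, low exit rate - keep
    "/p/dud-widget": (300, 0, 0.8),   # traffic but no sales - cull
    "/p/new-widget": (40, 0, 0.9),    # too little traffic to judge yet
}
cull = pages_to_cull(stats)
```

The point of the `min_views` guard is exactly the allowance I mentioned earlier: you can't condemn a page nobody has really seen yet.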
People who really know what they are doing and are running their websites well won't be noticing any problems. They may know about Panda and be worrying about whether it could hit them one day. Well, I have two pieces of news for those people: 1) Congratulations, you really know what you are doing, and 2) There probably isn't a website in the world that isn't affected by Panda in some way, but you shouldn't notice the effects too badly because you know what you're doing.
The harsh reality of this conclusion is, I'm afraid, if your site loses traffic after a Panda update, your users have been telling you they don't like your site so much any more. They don't like it because you're not giving them what they want. By integrating user metrics into the algo via Panda, Google is showing us what our visitors think of our sites. And rightly so. It has hurt me badly, but now I can see the point. If you lose traffic it's a signal that your website is doing a worse job than your competitors websites are. It's natural selection, survival of the fittest.
This is why brands are ranking better. They have many staff, highly qualified professionals, skilled in their particular disciplines, working at making their websites ruthlessly efficient at delivering what people want. Small businesses can't afford all that. In my business I'm the webmaster, accountant, retailer, quality control, I rely on my ability to out think my competition without any chance of having the combined time and ability of my competitors.
But there is some light at the end of the tunnel for us small businesses or people with websites so vast they can't possibly ensure all pages are delivering what people want (I'm confident all Panda sufferers will be one or the other, possibly both). By understanding your statistics you can figure out what the big boys are up to. This statement is so complex to explain I won't even attempt it. You will only understand that once you understand your stats, understand Panda and look at what your competitors' websites do compared to you.
This explains seasonality - why your website gets a rankings boost at your peak season. I always wondered how Google knew I was entering my peak season and traffic started going up. My user metrics were changing. People were spending more time on my site, more people were going to my order form, so the user metrics of all my pages were improving. Google sees that you are doing a better job at delivering what your visitors want, so your rankings improve. You see how this all fits together. It's genius.
So fellow Panda sufferers, my conclusion (and I think I am reaching the conclusion stage) is that our websites are suffering because we're not very good at what we're doing. Sorry, it's the only conclusion. Our visitors play a huge role in determining our rankings now. Do a better job of pleasing them than your competitors' websites do and your traffic will return. That is what quality is.
I always thought of Panda as something you're either in or out of. People here have said repeatedly it's a ranking factor, it's a ranking factor. I now get it. We're all affected by Panda. Nobody escapes. Your visitors are always judging you. Every single page of your website they visit is judged by them and it all matters. You can noindex it, block it, whatever, but if people can get to it from your site it will be judged. If the judgement is it's bad content, it can affect rankings on any pages that link to it.
At this point in time (and my understanding is evolving all the time) I believe that's what it's all about. It's possible Google has thrown some other things into the mix - I won't truly know until I see my user metrics improve and my site recover. But now I know what I was doing wrong I can see how to fix it.
Don't give up hope, people. Change your thinking, be honest with yourself, look hard at your stats, your website and your competitors' websites. If you think your site is suffering from Panda for no apparent reason, you need to look harder. It's all about your users and the feedback they leave in your stats as they use your site. Look at your stats - your users are trying to tell you what you're doing wrong. It's all in there.
| 7:35 am on Jun 27, 2012 (gmt 0)|
After reading the entire conversation in this thread, I really doubt Panda is all about user metrics, exit rate and bounce rate. It all depends on the niche - Google seems to target specific niches when it penalises sites. When Panda first ran, ecommerce sites were affected. Every niche has a different typical bounce rate and exit rate - compare the automobile niche and the tech niche and there's a lot of difference, so I don't think it's possible to compare those metrics across niches. When Panda refreshes, notice which niche got affected most; if you're in that niche, you might be affected. As for how to recover from Panda, there is no answer for that yet.
| 10:38 am on Jun 27, 2012 (gmt 0)|
Hi, first post here but have been following most of the related threads for more than a year, so take it easy. Firstly, sorry for the long and disjointed first post, but I feel some of my thoughts may be useful within this thread.
@claaarky I tend to agree with you in part, that this could be a helpful metric in identifying weak pages but, in a similar manner to 'Rockzer', I very much doubt this is the be-all-and-end-all of the Panda layer to the algorithm.
I've been paying attention and reading between the lines to what various Googlers say or imply and Matt Cutts has said that 'serp blocking' has been used in Panda. I believe this is probably in a confirmation capacity to Panda's findings.
Recently at SMX Advanced, when asked the question about 'bounce back to serp rate', I found it very telling that MC skirted around the issue, tried to change the question around to imply 'GA bounce rate', and then stated they don't use that 'noisy' metric. He was further pressed and told that wasn't the question, and still gave an inconclusive politician's answer. A lot of the transcripts at the popular SEO news sites seem to omit the meat of this exchange.
Early this month (June) a question was posed on twitter to Matt Cutts:
|if you got hit with both penguin and panda should you just give up? |
to which the reply was
|I think the site that set the record most recently had nine completely different things that we flagged on it. |
Whether the reply was regarding Panda, Penguin or both is open to interpretation. I feel those 9 things can't cover all Panda browser metrics and Penguin linking or site metrics. I may be off the mark here, but I take it to read that Panda (can) equal on-site.
Reading the thoughts and findings of certain patent bloggers gives an idea of how complex (or simple) Google could really be.
My view of Panda, in really simplistic terms, is like so.
Metrics were used in the initial data seeding of Panda, with the help of Quality Raters, and sites were put into taxonomies/categories. Only sites with enough of certain metrics were used, and they were put into good and bad piles. Google then looked for differences between these groups - differences a machine could be taught to identify - that were highly likely to apply to one group. Then the algorithm was run across all caches on the relevant data centres. As the machine learns, new factors are possibly being found and identified that correlate strongly to certain groups within certain categories of site. Metrics could then possibly be used to try to keep Panda on track with the different groups in different site categories. Also, as Panda's evolved, groups have been split down into less distinct groups with very fine differences between them - hence turning more into a ranking layer which could move a site up or down a little (should things be 'not too bad'), as opposed to showing more of the penalty effect many initially saw.
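To make that seeding idea concrete, here's a toy sketch of my reading of it (my own illustration, not anything Google has published): hand-label a good pile and a bad pile of sites, average their metrics, then classify a new site by whichever pile's averages it sits closer to - a crude nearest-centroid classifier. The feature names are invented for the example.

```python
def train_centroids(good_sites, bad_sites):
    """Average each feature over the hand-labelled seed piles."""
    def centroid(sites):
        keys = sites[0].keys()
        return {k: sum(s[k] for s in sites) / len(sites) for k in keys}
    return centroid(good_sites), centroid(bad_sites)

def classify(site, good_c, bad_c):
    """Label a site by its nearest centroid (squared distance)."""
    def dist(a, b):
        return sum((a[k] - b[k]) ** 2 for k in a)
    return "good" if dist(site, good_c) <= dist(site, bad_c) else "bad"

# Made-up seed piles with invented feature names
good_seed = [
    {"exit_rate": 0.2, "ad_ratio": 0.10},
    {"exit_rate": 0.3, "ad_ratio": 0.15},
]
bad_seed = [
    {"exit_rate": 0.8, "ad_ratio": 0.60},
    {"exit_rate": 0.9, "ad_ratio": 0.50},
]
good_c, bad_c = train_centroids(good_seed, bad_seed)
```

The interesting property, and why I think smaller sites with thin metrics can still get hit, is that once trained the classifier never needs the original metrics again - it judges any new site purely by where its features fall relative to the seed groups.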
So maybe not so much about the metrics, which explains why smaller sites with inconclusive metrics can get hit, but it's about 'Perceived Metrics'. So I think that claaarky's right but also think those that disagree with him/her are also right.
The thought on keeping users going/navigating in the right areas is an interesting one and quality sites, in certain classifications, would be able to do this.
I have a site that was hit in April 2011 that I believe was showing improvement month-on-month from August last year until Penguin recently started to peck at it. I tinkered too much with it, but everything seemed to lead to perceived user satisfaction, mainly stemming from the fact that a site in this sector that updates nearly all of its pages on an almost daily basis is likely to be a quality one for the user, plus there were a few content/grammar issues.
Another site is an old blog which isn't updated; some of the posts could make a semi-interesting read for the search terms they're found for. This site seemed to be fine until the 'Page Layout' algorithm. It has a large Adsense block floated left of the content, with a large banner ad at the top of the right sidebar and another large Adsense block below. Not that that's necessarily the issue, but I've recently removed the left-floated block to see if this was the case, check time-scales, etc.
On this second site there's a subdomain which fits the same sector as the first site I mentioned. It's really just for testing, is pretty low traffic and has only 7 pages, all of which had sailed through Panda and ranked at an acceptable level until the April 27th update. These pages have a basic layout, OKish text, and the information that users would visit for is populated via xml from the database of the first site mentioned. Wording pulled from the xml feed is unique as I have a double entry system in place.
Mid-April I added 2 links right in the centre of the eye-line on each of those 7 pages. These pointed to related sites offering the same/similar info and I did this because no sites in this sector do. It may sound mad but this was a little test that would get users to a better site quicker, showing lower satisfaction and worse metrics with this site and not looking around as they'd found something better.
April 27th saw Panda 3.6 drop all rankings of those pages like a stone, to well outside the first 100 results. So I removed the links from these pages and left it.
June 8th then saw Panda 3.7 reinstate all rankings of these pages.
Rankings were altered so abruptly on those dates and the links were the only changes. It's a possibility it wasn't the links and that the site sits on the cusp of being in or out, but that seems slim to me. I don't believe Google could have collected enough 'real' user metrics to make this decision at all, they sent <10 visitors during this Panda'd phase - which makes me firmly go along the lines of 'on-page factors that fit in with poor metrics/satisfaction'.
[edited by: danwuk at 10:52 am (utc) on Jun 27, 2012]
| 10:43 am on Jun 27, 2012 (gmt 0)|
|After reading the entire conversation in this thread, I really doubt Panda is all about user metrics, exit rate and bounce rate. It all depends on the niche - Google is specially targeting a niche to penalise all the sites. |
I don't think Panda is able to target specific niches; however, I do believe the algorithm is amplified with increased levels of competition.
I have seen this across my own sites where a single brand (competitive) is punished and the others (not so competitive) survive.
| 11:12 am on Jun 27, 2012 (gmt 0)|
Thank you Tedster for pointing out John Mueller's advice.
Now it's time to go to work.
| 4:57 pm on Jun 27, 2012 (gmt 0)|
Claarky, I've been reading this with great interest and going over all my notes from last year, and cross-checking with my old Analytics account, and I've finally confirmed something that puts a hiccup in your theory: last year I made all the improvements you're currently making, and for a long time things were getting better and better... and then, suddenly, Penguin slaughtered my site. (I know this is a Panda thread, but if a "fix" for Panda gets you killed by Penguin, then we need to know that. That's why I'm sharing this.) Here's how it all happened:
My sites had not been hit by Panda, but I suspected one of them was vulnerable to it - thin content and so on. So I started beefing up content, etc., like so many of us did. But a few months into the fixes, I also decided the heck with Google, I was done with SEO and from now on I would only look at visitor signals to get my clues for improvement.
So I did everything you're doing. I cannot emphasize this enough: I spent the last 10 months or so improving or deleting pages that got bad signals from visitors. This included my pages that got the most exits - I deleted or improved them. I looked at pages that got low conversions, and did the same with them. Traffic just kept shooting up and up.
Then I got thoroughly trashed by Penguin. I had sidestepped Panda by making improvements for users, but Penguin mistook all those changes for SEO attempts. At least, I guess that's what it was. A couple of SEO pros have looked at the site, agreed it's a false positive, but nobody's sure why Penguin hit my site (I was not manually penalized - the algo just wrecked a lot of my rankings).
Here's the bottom line.
--Before Panda came along, I had some high exit pages that were of low quality.
--After the first Panda, those were the first pages I improved or deleted.
--This seemed to help, until Penguin came along and decided I was spamming. Can't win for losing, apparently.
While I know I wasn't spamming, and therefore Penguin seems like a false positive for my site, I can't shake the feeling that Penguin was functioning as intended when it demoted my site, and even when it's raised up total empty EMD MFA sites on many queries. I wonder if maybe Larry Page is so obsessed with those awful webmasters wrecking his precious pagerank that he's influencing engineers to err on the side of making false positives - maybe they think ANY big bunch of changes constitutes "aggressive SEO" because they've forgotten that sometimes changes we make for visitors ARE good for SEO, and that's just how SEO is supposed to work.
I hope someone finds my story helpful. FWIW, I'll keep making my big changes to please visitors. I don't consider Google's 60% of net search traffic worth changing directions for.
| 6:03 pm on Jun 27, 2012 (gmt 0)|
Sorry to hear your site got trashed after sidestepping Panda, that's rough, but it does encourage me that the approach you took, and that I am taking now, helped traffic improve.
I do wonder what might be waiting for me if my site does recover, but I do have faith in this approach to solving Panda. It makes sense to me. I didn't know where my bad content was until I started looking at exit rate, so I'm a step further ahead if nothing else.
| 6:13 pm on Jun 27, 2012 (gmt 0)|
Claarky, I think you've misread - I was never hit by Panda. I was doing what you're doing now, looking at Exit Rate and improving it because that made sense from a user standpoint, and then Penguin killed 80% of my traffic - possibly *because* I was making changes to improve my exit rate.
| 7:34 pm on Jun 27, 2012 (gmt 0)|
|I had sidestepped Panda by making improvements for users, but Penguin mistook all those changes for SEO attempts. At least, I guess that's what it was. |
Without taking this excellent thread off topic... what types of improvements "to please visitors" do you think might have sent false Penguin signals to Google?
I've seen people add 100 contextual links a page in the name of pleasing users. I'm not saying you were so misguided, but... apart from something like that... I'm not seeing how steps to increase user engagement within a site are likely to create signals that would have triggered Penguin.
| 7:59 pm on Jun 27, 2012 (gmt 0)|
Robert, in the case of a website I run, I added a list of compatible models that was vital to the page. But in some cases the number of links was nearing 100 for that list alone. I had to temporarily remove it after Penguin was released, just in case it became a problem. Users complained, so we had to find a workaround.
| 9:03 pm on Jun 27, 2012 (gmt 0)|
@Robert, good question and you're quite right to ask it! I really don't know what sent the wrong signal, and one possibility is that I made so many changes that the collective lot of them triggered something. My best guesses would be:
--I combined some high exit rate pages into single awesome pages and 301 redirected them. This typically mirrored the common post-Panda SEO practice of combining "thin content" pages into longer pages, because, as it happened, most of my high exit rate pages (back then) were rather weak.
--When a high exit rate page couldn't be combined or improved, I deleted it. Again, this mirrored deleting "thin content" pages.
--I narrowed my content focus to those categories that got the best response from visitors. That could have looked like I was targeting certain keyphrases, when in fact I wasn't even allowing myself to do keyword research anymore.
--I started being much more selective about who I linked out to, aiming for lesser-known sites most users wouldn't run into otherwise. Maybe this could've looked like some sort of unnatural linking strategy.
FWIW, my changes came from studying tried and true marketing techniques, particularly "marketing personas", wherein you imagine your target audience, what they like, how they spend time and money, what they care about, and you imagine that person's response to everything you do on your site.
But all this came under the umbrella of looking at high exit rate pages and trying to fix them.
| 9:07 pm on Jun 27, 2012 (gmt 0)|
Diberry, I can't really see how improving user metrics would trigger Penguin. I personally made a few tweaks to be cautious, but I'd be massively surprised if on-site optimisation alone caused such a hard hit.
I appreciate you've made changes x, y and z, but that doesn't mean the hit from Penguin was necessarily a result of those changes.
Have you looked at your back link profile in detail?
[edited by: realmaverick at 9:28 pm (utc) on Jun 27, 2012]
| 9:28 pm on Jun 27, 2012 (gmt 0)|
realmaverick, I don't claim to know what triggered Penguin, and I'm certainly not saying improving user metrics did. I'm saying the things I DID to improve them could have looked spammy to it.
My backlinks were the first thing I looked at back in early May - they're weak, but not spammy. This was confirmed by a couple of SEO pros on this forum who were understandably skeptical. I was actually glad to let them check it, in case I was missing something. I wanted to recover, not win arguments! :)
If Penguin is generating false positives, then it's seeing spam where no spam was intended. When I tell you what I did, of course it doesn't sound spammy. But you have to consider how the end result of what I did might have looked to a series of computations hunting a little too hard for signs that someone's gaming Google. Of course what I did *shouldn't* have triggered anything... but something did all the same.
| 9:41 pm on Jun 27, 2012 (gmt 0)|
Thanks for the info. I've not personally seen a solid case of a website that's been hit by Penguin based on on-site factors. That of course isn't to say it's not possible.
| 4:26 am on Jun 28, 2012 (gmt 0)|
Well, I'm not sure I would know what to look for in my backlink profile. What I do know is that I didn't try to build links because I was scared of just this happening, and people who know better than I say there's nothing wrong with the backlinks but that it's a weak profile. Shrug.
| 7:27 am on Jun 28, 2012 (gmt 0)|
I've been thinking about why people are sceptical about the idea that there is a link between good user metrics and a way to sort out your Panda issues. The fact that I haven't recovered from Panda yet (having only just started fixing my site on this basis a few days ago) is obviously getting in the way.
So here's something to think about.....
When I originally started thinking about this idea I looked through the stats for my category pages, comparing March 2011 (pre-Panda) to March 2012 and I discovered several category pages which are receiving the same traffic from Google now as before Panda 'hit' my site (very slightly more actually). I studied this like mad when Panda first arrived to ravage my site and couldn't make any sense of it until now.
The user metrics for those category pages are almost identical now to before Panda. However, areas of my site hit badly by Panda also have identical user metrics now compared to pre-Panda. So why did Panda hit some areas and not others? My spreadsheet, all ordered by url (my urls contain the category name) with exit rates over 30% highlighted in red, shouted the answer at me. The category pages hit by Panda LINK TO HIGH EXIT RATE PAGES, whereas the category pages not affected do not (or to a lesser degree). The number of high exit rate pages my categories link to, and how high the exit rate of each of those pages is, have a direct correlation with the degree to which Panda hit those category pages.
So what can be concluded from this:-
1) The user metrics of category pages not hit by Panda are within Panda's range of what makes a quality page, and so are those of the pages it links to.
2) I can get traffic back to the category pages hit by Panda by dealing with the high exit rate pages they link to (either remove them or improve them)
3) I also need to deal with the user experience issues of any category pages which have worse user metrics than the category pages Panda didn't hit.
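The spreadsheet exercise above can be sketched in a few lines of Python. This is a hypothetical illustration only: the column names, the URL scheme (category name as the first path segment, as claaarky describes), and the 30% threshold are assumptions taken from this thread, not anything GA exports by default.

```python
from collections import Counter

def high_exit_pages_per_category(rows, threshold=0.30):
    """Count pages whose exit rate exceeds the threshold, grouped by
    category, where the category is the first segment of the URL path
    (e.g. /widgets/blue-widget -> 'widgets')."""
    flagged = Counter()
    for row in rows:
        if float(row["exit_rate"]) > threshold:
            category = row["url"].strip("/").split("/")[0]
            flagged[category] += 1
    return flagged
```

If the correlation holds, the categories with the most flagged pages would be the ones to expect a demotion on.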
As I continue through this process with my site I see a quality site emerging. By physically looking at pages that have user metrics which look out of place (based on my knowledge of my site), I can see what people like and what they don't. And it's all obvious, and it's all in Amit Singhal's guidelines.
There is one other thing I think has created confusion here - how Exit Rate in GA is calculated and what part of it is really of interest to you when diagnosing your bad pages.
Exit Rate in GA includes Bounce Rate. To get an idea of how people use my site, I realised I needed to strip the bounce rate portion out of the Exit Rate figure GA shows, leaving what I'm calling 'Internal Exit Rate' (the rate at which people navigating around my site arrive on a particular page and then leave). This gave me a much better indication of what was really going on within my site. Bounce rate clouds things.
As some people have said, diagnosing your problem and fixing it are two different things. When I look at my high 'Internal Exit Rate' pages and compare them to those with low IER, it doesn't look difficult to me. In most cases the bad pages are worded really badly, have too much text or bad images, the product is rubbish, the price is too high, the product title is misleading, or there are similar products in the same category - there's always a reason once I think the way my users think and keep Amit Singhal's guidelines in mind.
People say, ah but Panda won't be all about user metrics. Well, my evidence says otherwise, for my site at least.
I think it's a simple methodology that's easy to follow for anyone trying to get pages of their site 'out' of Panda as well. This knowledge will benefit people hit by Panda in the future by helping them know how to keep their site in good shape for their users. People hit by Panda will benefit more than those not affected - learning how to interpret your stats will help you create a better user experience, get the Panda away from you and keep him there.
Unless you have a vast site it's not even that time consuming to do and doesn't require specialist knowledge. I have just over 1,000 pages and 300 of them look potentially bad according to my stats. My customer service girl is going through these making comments about why the pages might be bad and I'm fixing them, mostly by removing them if there's no traffic or sales (i.e. they have no value to my users or me). We're about half way through, should have the rest dealt with in a few days, and then I expect to see traffic returning to the areas of the site I've dealt with first at the next Panda update. Then we will monitor these stats like a hawk to make sure we're delivering a good user experience, because that will keep us away from the Panda.
It's natural to think that a difficult problem must have a really complicated solution. In life I've generally found it to be the complete opposite. It doesn't matter how Google are doing it, my stats show a link between bad user metrics and a Panda demotion.
I'll leave it there - I hope you take the trouble to look at it the way I have. It's amazing how dealing with this is changing my site. I can see it getting better every day and it makes sense to me that this is what Google wants us to be doing - making our sites a better user experience. The other stuff (links, social activity, etc.) is just gravy - this is the meat. It fits with everything they've said about Panda, the guidelines they gave, everything. I've not heard a single argument here to counter this.
There is one catch with this. If everyone gets this idea and it's right, there will be a quality war. Google will turn up the heat to encourage us all to make better and better sites in order to rank. There may be a limit to how great anyone could make a site, but I reckon there's a long way to go yet. The internet and websites are still in their infancy really. Eventually the people making sites for the wrong reasons won't be able to stay in the game. User metrics will hound them out. But when that day comes Google's results and our websites will be amazing, absolutely amazing.
That has to be good in the long run, doesn't it?
|Martin Ice Web|
| 8:54 am on Jun 28, 2012 (gmt 0)|
your theory has two breaking points:
1st: it implies that Google always serves the best and most relevant results for the query,
2nd: how can a new website be measured if it's getting no visitors?
AND: since Panda my bounce and exit rates are up! That would imply that Panda is very wrong.
| 9:03 am on Jun 28, 2012 (gmt 0)|
Claarky - How does Google decide whether someone left because they found the answer to what they were looking for and are satisfied, or whether they left because they didn't?
| 9:03 am on Jun 28, 2012 (gmt 0)|
Thank you, Claaarky, for all the time and effort you are putting into this thread. I have been looking at the GA for my "information" site carefully and think you are really on to something important. I'm a real amateur with GA, though, and with over 6,000 pages to sort out I have been struggling a little. At the risk of sounding really stupid, can you possibly share how you "get the bounce rate portion of the statistic GA shows, out of the Exit Rate" ? I agree that your "Internal Exit Rate" would be a more appropriate indicator.
| 9:08 am on Jun 28, 2012 (gmt 0)|
Martin Ice Web
1) It doesn't imply Google always serves the best sites for a particular query, it implies Google serves the best 'good user experience' site for a particular query.
2) If Google is collecting user metrics via the browser and your site can be found by people anywhere (any form of advertising whatsoever) and they visit your site, it can collect the user metrics.
Your bounce rates and exit rates are up since Panda hit because the changes you've made aren't the right ones. Either they didn't improve the user experience of your bad pages, you created more bad pages, or you made good pages worse. That's why it's important to understand how to identify what a bad page is.
If you replace the word Panda with the phrase "User Experience" it makes things easier to understand.
| 9:33 am on Jun 28, 2012 (gmt 0)|
No problem. This thread has helped me challenge my own thoughts and understand this even more.
Just been trying to decipher my spreadsheet but I think this is it.
First you need to know that:
- the Unique Page Views (UPV) figure in GA includes Entrances
- Bounce Rate relates ONLY to Entrances
- Deduct Entrances from UPV and you get the number of people who navigated to the page from within your site (your internal navigators)
- Work out the total number of people who exited from the page (UPV x Exit Rate%)
- Deduct the bounces (Entrances x Bounce Rate%) from that total, and you get the number of internal navigators who left
- Divide the number of internal navigators who left by the total number of internal navigators and you have your Internal Exit Rate%
You need to look at Average Time on Page as well to make sense of what people are doing, and look at your page to see whether that all seems to tie up.
Once you look at the pages with a high Internal Exit Rate where you wouldn't expect people to be leaving left, right and centre, the penny will drop. Then a load more pennies will drop as well. It will get quite noisy.
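For anyone wanting to run this across a whole GA export, here's a rough sketch of the arithmetic above in Python. Note this is my own reading of the steps: "deduct the first figure" is taken to mean subtracting bounces (Entrances x Bounce Rate), since Bounce Rate only relates to Entrances. The function is illustrative, not anything GA provides.

```python
def internal_exit_rate(upv, entrances, exit_rate, bounce_rate):
    """upv: Unique Page Views (this GA figure includes Entrances).
    exit_rate and bounce_rate are fractions, e.g. 0.45 for 45%."""
    internal_views = upv - entrances        # people who navigated here from within the site
    total_exits = upv * exit_rate           # everyone who left the site from this page
    bounces = entrances * bounce_rate       # single-page entrance sessions
    internal_exits = total_exits - bounces  # internal navigators who left
    if internal_views <= 0:
        return 0.0                          # page is only ever reached as a landing page
    return internal_exits / internal_views
```

For example, a page with 1,000 UPV, 400 entrances, a 50% exit rate and a 60% bounce rate works out to 260 internal exits out of 600 internal navigators, an Internal Exit Rate of about 43%.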
| 9:38 am on Jun 28, 2012 (gmt 0)|
I think you need to look at the pages where that is happening and think about whether you're happy to let your visitors shoot straight off again once they've got what they want.
I think this is the key to a good user experience, and how to monetise a site. Once Google sends you a visitor, treat them like gold and they'll make you more money.
| 10:43 am on Jun 28, 2012 (gmt 0)|
My answer to point 2 of Martin Ice Web's question has really made me think about how Google are getting the stats......
Imagine they've done a deal with Microsoft and Apple for user metric stats via their browsers (internal exit rate, time on page, url).
All you need to get a website off the ground is advertising (like a normal business). Links are only valuable if they drive traffic, because it's only visitors to your site that matter now.
Social media is only important if it actually drives people to your site.
Everything about SEO changes.
People will immediately think about how they can spam this, get round it, screw up competitors. By now, Google has enough information on how people use websites of all types all round the world. They know what looks natural. They'll be able to detect unnatural behaviour.
Genius, pure genius.
| 10:58 am on Jun 28, 2012 (gmt 0)|
You can't imagine how many people have been reading this discussion since it started.
Googlers and Bingers among them :-)
| 11:17 am on Jun 28, 2012 (gmt 0)|
It has occurred to me Zivush.
Hello Google - thanks for running Panda on Monday by the way so I have a few weeks to get my site sorted.
Nice people those guys at Google. Always had a lot of time for them.
| 11:18 am on Jun 28, 2012 (gmt 0)|
Thought I'd chip in.
Affiliate links = exit
I mean, all links are exits, but deliberately pushing people off your site might be a worry. Maybe a redirect script like WebmasterWorld uses could help the issue.
True or not, this idea might lead to yet more balkanisation of the internet, as people remove outbounds to protect against exits.
| 11:42 am on Jun 28, 2012 (gmt 0)|
Don't think about how to get round it. Think about how to please your visitor and hang on to them. There may be a way to be an affiliate and still create a great user experience even if you send people off-site.
Open the other site in a new window and give people an incentive to come back to your site once they've finished playing with the other one. Obviously if they get engrossed in the site you sent them to because it's such a great user experience, they might not come back. Give them a reason to come back.
| 11:50 am on Jun 28, 2012 (gmt 0)|
Oh my God, another realisation......
Links are now completely useless to you unless they drive traffic to your site and the visitor then interacts with your site in a way that looks natural, or perhaps reacts well to your site. Any anchor text attribution will only have impact if the link is there for genuine reasons AND drives people to your site who like it.
Blimey, they've nailed this!
Traditional business rules apply. Advertise, drive traffic to your site, make your site a great user experience.
This is like a massive weight off my shoulders. I'm sick of all the SEO rubbish I've done over the years - eye of toad and wing of bat, that'll get you to top of Google. Yeh right.