By "Time on Page", do you mean the AVERAGE time on page? I thought that a bounce means leaving the page quickly, say within 15 seconds, and returning to the SERPs. So that anything longer than 15 seconds, or whatever time period Google uses, doesn't count as a bounce.
Bounce rate (as reported by standard analytics packages) is a notoriously challenging metric. I've seen a high-performing page with a bounce rate of 85% - and this was a page with 750,000 search visits per week from 4 very competitive query terms!
When I filtered that page's report down to Google search traffic only, the bounce rate was still 82%.
So it seems clear to me that, as Matt Cutts and others have said, bounce rate is too noisy a metric to lean on very heavily.
I've seen a lot of analytics over the years, and I'd say both those numbers are very high, but it's hard to make any decision based on such a number. I think the search engines tend to look only at bounces that go back to the SERP within a handful of seconds and then make another choice from the same result set. That is sometimes called a "fast click". And even then, we still have a relatively noisy signal.
Average time-on-page is just a bit less noisy, IMO - and for that reason a higher time-on-page has more appeal to me. That still assumes that the extra time is because the layout is friendlier to the eye and gaining more engagement. It's always possible that a layout is more difficult to cope with, and that might also increase time-on-page.
By time on page, yes, I mean the average time visitors spend on the page. I'm actually using 10 versions of the 2 different templates to form an average for these numbers.
Personally I would prefer to view layout 1. So the time on page metric might be a reflection of that, as tedster pointed out. Thanks, Ted.
Pardon the interruption:
Are you using google analytics to measure time on page and bounce rate?
Isn't there something of a "flaw" to those metrics in terms of time on page?
I thought that if someone comes to your page, they could hang around on that page for 10 minutes, and then if they go to some other site, that 10 minutes of time on page would not actually be counted.
Am I wrong about that?
Yes, I'm using google analytics. I have not heard of a bug like that though. Has anyone else?
That's not a bug... it is an indication that G did not get enough information to return a number for either "bounce" or "time on page" within their parameters for reporting.
It depends on your site. It boils down to "which one satisfies users better?"
On my site users typically view one page and find their answer. In that case I try for a 100% bounce rate, with users finding the answer on the page as quickly as possible. In other words, I aim for high bounce rates and the lowest time on site that can still give the user their answer.
If one of your test templates split articles in two and tried to make users click through to the second page, throw the test out. Analytics isn't measuring it correctly and you are turning your users away.
If your site is an e-commerce site, I would try to measure by "items added to shopping cart" or "checkout completed".
If your site is an entertainment site, I would measure by "time on site".
As for Google Analytics, it can only measure the time between the first page and the last page. If a user only views one page, analytics sees that as "no time", even if the user spends 10 minutes on that one page.
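To make that concrete, here's a rough sketch of the mechanism (a simplified model, not GA's actual code): classic Google Analytics derives time on page from the gap between consecutive pageview hits, so the exit page of any visit, including the only page of a bounce, has nothing to measure against.

```python
# Simplified model of how classic Google Analytics derives "time on page":
# each pageview sends a timestamped hit, and time on a page is the gap
# between that hit and the NEXT hit in the same session. The last page
# of a session (including a one-page "bounce") has no next hit, so it
# contributes no measurable time.

def time_on_pages(pageview_timestamps):
    """Given the timestamps (in seconds) of a session's pageviews,
    return the measurable time on each page; the exit page gets None."""
    times = []
    for i, ts in enumerate(pageview_timestamps):
        if i + 1 < len(pageview_timestamps):
            times.append(pageview_timestamps[i + 1] - ts)
        else:
            times.append(None)  # exit page: no next hit, no time recorded
    return times

# A three-page visit: only the first two pages get a measured time.
print(time_on_pages([0, 60, 150]))  # [60, 90, None]

# A bounce: the visitor may have read for 10 minutes, but GA sees nothing.
print(time_on_pages([0]))           # [None]
```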
|If a user only views one page, the analytics sees that as "no time", even if the user spends 10 minutes on that one page. |
Yes, that is what I heard, but if I remember correctly, that "no time" is NOT calculated in the time on site metric.
So if one visitor spends one minute on the page and then goes to a second page on the site tracked with GA, and then a second visitor views only that one page (and is thus given "no time" on the site), I am FAIRLY sure that the time on page for that page would be one minute (and not 30 seconds).
Hope I am right about this.
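For what it's worth, the arithmetic checks out under GA's documented formula for Avg. Time on Page, which divides total measured time by pageviews minus exits (exit pageviews have no recorded time, so they are dropped from the denominator rather than counted as zero). A quick sanity check of the two-visitor example above:

```python
# Check of the two-visitor example, assuming classic GA's formula:
#   Avg. Time on Page = total time on page / (pageviews - exits)
# Exit pageviews (including one-page bounces) carry no measured time,
# so they are excluded from the denominator instead of counting as zero.

def avg_time_on_page(measured_times):
    """measured_times: one entry per pageview of the page;
    None means the pageview was an exit (no time recorded)."""
    pageviews = len(measured_times)
    exits = sum(1 for t in measured_times if t is None)
    total = sum(t for t in measured_times if t is not None)
    return total / (pageviews - exits)

# Visitor 1 spent 60s on the page before clicking to a second page;
# visitor 2 viewed only this page (an exit, so no time recorded).
print(avg_time_on_page([60, None]))  # 60.0, not 30.0
```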
|As for Google Analytics, it can only measure the time between the first page and the last page. If a user only views one page, the analytics sees that as "no time", even if the user spends 10 minutes on that one page. |
In the new(ish) Google Analytics 'real time' feature, it looks to me as if they are measuring quite accurately both arrival on the page and time spent on the page, as well as subsequent activity on the site, i.e. recording how long someone stays on a page within the site, whether they are looking at just one page or at several.
Is that not the case?
|I think the search engines tend to look only at bounces that go back to the SERP within a handful of seconds and then make another choice from the same result set. That is sometimes called a "fast click". And even then, we still have a relatively noisy signal. |
For certain kinds of non-SEO research, I will often go down a serps page looking at several results. These in fact aren't pages I dislike... they're just the ones that initially attracted me the most... and going back to the serps can mean that I'm following up leads and learning as I go. Perhaps I've even picked up enough additional information to refine my search further.
But this isn't necessarily a problem with the pages... it can reflect the limitations of my knowledge when I started the search.
I'm a little concerned when I see the "do you want to block all results from example.com" message when I do this. Makes me worry that I might be hurting perfectly good sites. I assume that Google is aware just how noisy this signal can be for certain searches, but one never knows.
I think you only have to worry about BR or TOP when it's extremely bad. I believe that's the only time Google would take it as a valid signal. When a signal is inherently noisy, it only makes sense to value it when it's so extreme it's not that noisy.
Your site: BR - 95%
Your competitor's site: BR - 10%
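A toy way to think about "only trust the extremes" (the noise band here is made up purely for illustration, not anything Google has published): a gap between two bounce rates only means something once it clearly exceeds what measurement noise alone could produce.

```python
# Toy illustration of why only extreme values of a noisy metric are usable.
# The noise figure is a made-up assumption for the sake of the example.
NOISE = 15.0  # suppose measurement noise can shift bounce rate by +/-15 points

def is_signal(site_br, competitor_br, noise=NOISE):
    """A difference only counts if it exceeds the noise band on both sides."""
    return abs(site_br - competitor_br) > 2 * noise

print(is_signal(95, 10))  # True  - an 85-point gap dwarfs the noise
print(is_signal(60, 50))  # False - a 10-point gap could easily be noise
```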
P.S. If you want to check your competitors' bounce rate, go to Alexa.
I would agree with @potentialgeek. Google will probably only look intently at these metrics when they are very different from your competitors' in the SERPs. Rather than these metrics, you need to be looking at what it is you want guests to achieve on your pages. As @deadsea mentions, it might be that if your guest can be satisfied in one page you will get a 100% bounce rate; however, if you are running ecommerce templates you need multiple pages and eventually a purchase to satisfy them.
I would also (personally - I'm sure some people will disagree!) recommend not using many templates in a single test. I find it much more useful to conduct smaller tests, but many, many more of them. This way you can get specific information about what guests find helpful or what increased conversion etc., and then add those learnings to the next test. Complex tests run the risk of over-complicating the learning and presenting false information back.
I wouldn't be surprised if the vertical you are in makes a difference. Looking at the "average" for that vertical and seeing how you compare carries some logic to it IMO. A 5-second bounce might turn out to be better than 75% of the sites in that sector for that term... the trouble is you'll never know.
For example, in a lot of verticals, a quick scan of the pictures is enough to make a decision, and that can be almost instant. That doesn't mean the quality of the information is poor, though.
Additionally, G might factor in information about the user and their general trends, or even whether the search term included your brand, which has to be a good sign.