Forum Moderators: DixonJones


How to interpret the result of AB test with contradicting metrics?

worse bounce but better pages/session and session duration

         

adamovic

9:53 am on Dec 1, 2020 (gmt 0)

10+ Year Member



We are running many A/B tests. Sometimes we get results where some metrics are better but others are worse. We are obviously interested in 3 metrics: bounce, pages/session and avg. session duration.
For example:

Variant A - Bounce 62.80%, Pages/Session 2.42, Avg. Session Duration 00:01:53
Variant B - Bounce 63.58%, Pages/Session 2.46, Avg. Session Duration 00:01:55

So who would be the winner? The one with the lower bounce or the one with the higher pages per session and average session duration?
This website doesn't have conversion rates (it doesn't sell many products directly); we simply want to improve the user experience. Sample sizes are big enough (60K+ sessions).
I don't think it's important what we are actually testing, but this is actually for two variants of a mobile menu.
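For the bounce-rate gap in the figures above, a two-proportion z-test shows whether the difference clears sampling noise. A minimal sketch, assuming a hypothetical 60,000 sessions in each variant (the 60K+ figure isn't broken down per variant, so treat the result as illustrative):

```python
# Two-proportion z-test on the bounce rates quoted above.
# ASSUMPTION: 60,000 sessions per variant; the real per-variant split is unknown.
from math import sqrt, erfc

n_a = n_b = 60_000                    # hypothetical per-variant sample sizes
bounce_a, bounce_b = 0.6280, 0.6358   # bounce rates from the post

# Pooled bounce rate under the null hypothesis (no difference between variants)
pooled = (bounce_a * n_a + bounce_b * n_b) / (n_a + n_b)
se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))

z = (bounce_b - bounce_a) / se
p_value = erfc(abs(z) / sqrt(2))      # two-sided p-value

print(f"z = {z:.2f}, p = {p_value:.4f}")
```

At these assumed sizes the 0.78-point bounce gap comes out nominally significant (z ≈ 2.8, p ≈ 0.005), though whether a gap that small is practically meaningful is a separate question; comparing pages/session or duration properly would also need session-level variances, which an aggregate analytics report doesn't give.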

JorgeV

10:27 am on Dec 1, 2020 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month



Hello,

From these figures alone, it doesn't matter. You can see for yourself that these numbers are far too close to draw a conclusion; the differences are below the margin of error.

RhinoFish

10:11 pm on Dec 3, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



How were the variants rotated between A and B? 50/50 random assignment, or sequential switching?

These numbers are very close; I'd say so close that a winner cannot be determined. But I'd need to know the nature of the changes being tested... let's say you were providing a long-form answer spanning 8 pages, and after each chunk of content you made the viewer switch pages. The lesser content on B might turn more people off, but it would make those who are curious go through more gated pages, increasing the page count.

The friction to view a page is low. Do you have something further downstream, like a lead-form completion where they had to cough up an email address or something?

adamovic

11:07 pm on Dec 3, 2020 (gmt 0)

10+ Year Member



It's a look and feel test (usually a CSS change), i.e. whether a dropdown menu will use a separator or not.

tangor

2:43 am on Dec 4, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Some A/B testing on visual/CSS changes might be too close to call. While layout is part of the user experience, it is not a substantial driver of user activity, and the results may be nearly infinitesimal.

topr8

12:59 am on Dec 7, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



>> We are obviously interested in 3 metrics: bounce, pages/session and avg. session duration.

why obviously? i completely disagree with this and believe a whole industry has grown up around BS metrics.

>>I don't think it's important what we are actually testing, but this is actually for two variants of a mobile menu.
>>it's a look and feel test (usually CSS change), i.e. if a dropdown menu will use a separator or not.

i disagree, i think it matters a lot ... if you are testing a mobile menu, then personally i'd think that the number of click-throughs from that actual menu to other pages is the metric to measure (and, more in depth, what users actually click on in the menu, etc.)

eg. you could certainly get data for the number of users that 'revealed' the menu - in whatever way it is revealed, straightforward click or whatever. you could also measure, of those who revealed the menu, how many clicked, what they clicked, or whether they abandoned it

specifically regarding the pages/session metric, i assume that users can view other pages without clicking links in the A/B test menu, in which case, who cares - the pages/session metric is irrelevant. (as an aside: if users can ONLY click through to a page from your menu, IMO that is a serious design fault ... there should always be multiple ways of reaching a destination.)

likewise, in this context, measuring bounce or session time is also pointless - i don't see how either can relate to having a separator in the menu.

...

but maybe i misunderstand what you are trying to do.

...

@tangor ... totally agree!
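the funnel described above (reveal the menu, then click, then land on a destination) can be tabulated directly from event counts. a sketch with made-up numbers - `menu_reveals` and `menu_clicks` are hypothetical event totals, not anything from this thread:

```python
# Compare menu-funnel metrics across two variants.
# ALL counts below are invented for illustration; real values would come
# from your own analytics event tracking.
variants = {
    "A": {"sessions": 30_000, "menu_reveals": 9_000, "menu_clicks": 5_400},
    "B": {"sessions": 30_000, "menu_reveals": 9_600, "menu_clicks": 5_280},
}

for name, v in variants.items():
    reveal_rate = v["menu_reveals"] / v["sessions"]   # share who opened the menu
    ctr = v["menu_clicks"] / v["menu_reveals"]        # share who clicked after opening
    abandoned = 1 - ctr                               # opened but never clicked
    print(f"{name}: reveal {reveal_rate:.1%}, CTR {ctr:.1%}, abandoned {abandoned:.1%}")
```

in this made-up example, B gets the menu opened more often but converts fewer of those opens into clicks - exactly the kind of split that aggregate bounce and pages/session figures hide.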

tangor

2:26 am on Dec 7, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



A/B testing, to me, is about whether ads at the top, at the bottom, or at either side get different results.

As for pages, really, folks, how many people actually come in via the front door (index.html or whatever your default might be)?

SERPs serve interior pages, not the menu or front page...

The best one can do these days is A/B test fixed layout vs. RWD (responsive design).

tangor

12:14 am on Dec 8, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@adamovic ...

Just curious ... how long are your A/B tests run?

Anything less than six weeks has been pretty useless for me. Just asking!
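As a rough cross-check on run length, the normal-approximation sample-size formula says how many sessions per variant a test needs before it can detect a given change at all. A sketch with assumed numbers (baseline bounce rate, minimum detectable effect, and daily traffic are all illustrative, not figures from this thread):

```python
# Per-variant sample size for a two-proportion test at 95% confidence / 80% power,
# then the implied run length. ASSUMPTIONS: baseline, MDE, and traffic are made up.
from math import ceil

Z_ALPHA, Z_BETA = 1.96, 0.84   # two-sided alpha = 0.05, power = 0.80
baseline = 0.63                # assumed baseline bounce rate
mde = 0.01                     # want to detect a 1-point absolute change

n_per_variant = ceil(2 * (Z_ALPHA + Z_BETA) ** 2
                     * baseline * (1 - baseline) / mde ** 2)

daily_sessions = 2_000         # assumed sessions per variant per day
days_needed = ceil(n_per_variant / daily_sessions)
print(n_per_variant, days_needed)
```

With these assumptions the test needs roughly 36.5K sessions per variant, about 19 days at 2,000 sessions/day per arm; slower sites or smaller effects push that well past six weeks, which is consistent with the rule of thumb above.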