
Tuesday 17 October 2017

Quantitative and Qualitative Testing - Just tell me why!

"And so, you see, we achieved a 197% uplift in conversions with Recipe B!"
"Yes, but why?"
"Well, the page exit rate was down 14% and the click-through-rate to cart was up 12%."

"Yes, but WHY?"

If you've ever been on the receiving end of one of these conversations, you'll probably recognise it immediately.  You're presenting test results where your new design has won, and you're sharing the good news with the boss.  Or, worse still, the test lost, and you're having to defend your choice of test recipe.  You're showing slide after slide of test metrics - all the KPIs you could think of, and all the ones in every big book you've read - and still you're just not getting to the heart of the matter.  WHY did your test lose?

No amount of numerical data will fully answer the "why" questions, and this is the significant drawback of quantitative testing.  What you need is qualitative testing.


Quantitative testing - think of "quantity" - numbers - will tell you how many, how often, how much, how expensive, or how large.  It can give you ratios, fractions and percentages.

Qualitative testing - think of "qualities" - will tell you what shape, what colour, good, bad, opinions, views and things that can't be counted.  It will answer the question you're actually asking - if you ask why, you'll get a why.  It won't, however, tell you what the profitability of having a green button instead of a red one would be - it'll just tell you that people prefer green because respondents found it more calming compared to the angry red one.

Neither is easier than the other to implement well, and neither is less important than the other.  In fact, both can easily be done badly.  Online testing and research may have placed the emphasis on A/B testing, with its rigid, reliable, mathematical nature, in contrast to qualitative testing, where it's harder to provide concise, precise summaries - but a good research facility will require practitioners of both types of testing.

In fact, there are cases where one form of testing is more beneficial than the other.  If you're building a business case to get a new design fully developed and implemented, then A/B testing will tell you how much profit it will generate (which can then be offset against full development costs).  User testing won't give you a revenue figure like that.
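
As a back-of-the-envelope sketch (in Python, with entirely made-up traffic, conversion and cost figures - nothing here comes from a real test), this is the kind of arithmetic that turns an A/B result into a business-case number:

# Rough business-case arithmetic from A/B test results.
# All figures below are hypothetical placeholders, not real test data.
annual_visitors = 1_200_000        # visitors per year to the tested page
control_conversion_rate = 0.020    # Recipe A (control) conversion rate
variant_conversion_rate = 0.023    # Recipe B (new design) conversion rate
average_order_value = 45.00        # revenue per converted visitor, in GBP
development_cost = 60_000.00       # cost to fully build and roll out Recipe B

# Extra conversions the new design is projected to generate in a year.
extra_conversions = annual_visitors * (variant_conversion_rate - control_conversion_rate)

# Projected incremental revenue, and the net figure after development costs.
incremental_revenue = extra_conversions * average_order_value
net_benefit = incremental_revenue - development_cost

print(f"Extra conversions per year: {extra_conversions:,.0f}")
print(f"Incremental annual revenue: £{incremental_revenue:,.2f}")
print(f"Net benefit after development: £{net_benefit:,.2f}")

User testing, by contrast, can't produce that kind of figure - but it can tell you whether the number is built on a design people actually understand.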

Going back to my introductory conversation - quantitative testing won't tell you why your new design has failed.  Why didn't people click the big green button?  Was it because they didn't see it, or because the wording was unhelpful, or because they didn't have enough information to progress?  A click-through-rate of 5% may be low, but "5%" isn't going to tell you why.  Even if you segment your data, you'll still not get a decent answer to that either-or question.


Let's suppose that 85% of people prefer green apples to red.  
Why?
There's a difference between men and women: 95% of men prefer green apples, compared to just 75% of women.
Great.  Why?  In fact, in the 30-40 year old age group, nearly 98% of men prefer green apples, compared to just 76% of women in that age range.

See?  All this segmentation is getting us no closer to understanding the difference - is it colour, flavour or texture?
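
To labour the point, here's a toy sketch in Python using those same made-up apple numbers: you can slice the data as finely as you like, and every cut is still just another percentage, with nothing in it about colour, flavour or texture.

# The (made-up) green-apple preference figures from above, sliced by segment.
# Each level of drill-down is still just a number - none of it says why.
green_apple_preference = {
    "everyone": 0.85,
    "men": 0.95,
    "women": 0.75,
    "men aged 30-40": 0.98,
    "women aged 30-40": 0.76,
}

for segment, share in green_apple_preference.items():
    print(f"{segment}: {share:.0%} prefer green apples")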


However, qualitative testing will get you the answer pretty quickly - you could just ask people directly.

You could liken quantitative testing to the black and white outline of a picture (or, if you're really good, a grey-scale picture), with qualitative testing being the colours that fill it in.  One will give you a clear outline, the other will set the hues.  You need both to see the full picture.
