Don’t Waste Time With Confusing CRO Tests – Analyse Then Hypothesise

 

Step away from the fun stuff, roll up your sleeves and get down and dirty with Google Analytics. The hypotheses behind CRO tests are often blurred, and I put that down to either a lack of Analytics analysis or a lack of Analytics understanding. Get studying, get your GA qualification and get stuck in. My CRO tip for 2013 is to be absolutely clear about the hypothesis of your tests and to only ever test usability OR persuasion; never both in a single test.

What to prioritise for testing is often staring you straight in the face, but only if you know what you’re looking for. It’s like watching a quiz show and hearing someone proclaim a question is ‘too easy’: it’s only easy if you know the answer!

What’s that, you have a landing page with a conversion rate below the site average? And you have high stock levels of those products? And your margins on them are good? Well, there’s your priority test. That wasn’t so difficult, was it? OK, this is perhaps a simplistic, no-brainer scenario, but my point is that analysing the data can reap big wins for your online business. It’s vital.
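Here’s a rough sketch of how that prioritisation step might look in practice, assuming you’ve exported landing-page data from Google Analytics and joined it with your own stock and margin figures. The column names, numbers and thresholds below are purely illustrative, not a prescription:

```python
# A minimal prioritisation sketch: find landing pages converting below the
# site average that also have healthy stock and margin. All data is made up.
import pandas as pd

pages = pd.DataFrame({
    "landing_page": ["/widgets", "/gadgets", "/gizmos", "/doodads"],
    "sessions":     [12000, 8500, 4300, 9100],
    "transactions": [180, 240, 30, 95],
    "stock_level":  [950, 120, 800, 40],      # units on hand
    "margin_pct":   [0.42, 0.18, 0.38, 0.25]  # gross margin
})

pages["conversion_rate"] = pages["transactions"] / pages["sessions"]
site_average = pages["transactions"].sum() / pages["sessions"].sum()

# Candidates: converting below the site average, well stocked, good margin
candidates = pages[
    (pages["conversion_rate"] < site_average)
    & (pages["stock_level"] > 500)
    & (pages["margin_pct"] > 0.30)
].sort_values("sessions", ascending=False)

print(f"Site average conversion rate: {site_average:.2%}")
print(candidates[["landing_page", "sessions", "conversion_rate"]])
```

The exact thresholds matter far less than the habit: let the data surface the pages where a win is both likely and worth having.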

Once you know which page(s) you want to improve, the next part of the process is often misinterpreted and can become muddled. You need to be clear on what to test. Is the priority usability OR persuasion? I’ll say it again: only ever test one or the other. You must be decisive here and resist the urge to test everything at once.

I picked the brains of Bryan Eisenberg, co-author of the excellent Always Be Testing: The Complete Guide to Google Website Optimizer and Call to Action: Secret Formulas to Improve Online Results, and he feels the same way:

“The single biggest mistake I see people make with their tests is testing variations before understanding the variables,” he says. “A good test must start with a good hypothesis. Seriously, if not you’re simply throwing s*#t against the wall to see what sticks – that is called Testing at Random! You should never ask: what colour button is going to convert better? Ask instead: why would a visitor click on one button versus the other? Colour only matters in a call-to-action button if there is not enough contrast with the rest of the page for the button to stand out when you’re standing six feet away.

“Amazon, the poster child for A/B testing, performs over 200 concurrent tests at any given time. They have learned how to test for impact first and then test for variations.”

Let’s look at this example below to understand Amazon’s approach:

Control

Hypothesis: Persuasion. Make product price and stock availability more prominent and easier to scan = increase conversion rates?

Test Variation

  • Larger price
  • “In Stock” on own line and in larger green font
  • Orange title

Amazon ran a simple A/B test to learn whether people who can scan key information more quickly are more likely to make a purchase. They didn’t test at random and you shouldn’t either.
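The hypothesis also tells you exactly what to measure once the test has run. As a rough illustration (the visitor and order counts below are invented, not Amazon’s), a simple two-proportion z-test shows whether a variation’s lift is likely to be real before you bank the learning:

```python
# A minimal sketch of checking an A/B result once the test has run.
# The counts below are invented for illustration only.
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return p_a, p_b, z, p_value

# Control vs. the "more prominent price / In Stock" variation
p_a, p_b, z, p_value = two_proportion_z_test(conv_a=320, n_a=10000,
                                              conv_b=385, n_b=10000)
print(f"Control: {p_a:.2%}  Variation: {p_b:.2%}  z = {z:.2f}  p = {p_value:.3f}")
# Only declare a winner (and keep the learning) if p is comfortably small.
```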

Amazon Today

  • Notice that the product title, price and ‘In Stock’ message are now much more prominent following this simple A/B test

Keep the tests simple and always avoid getting ahead of yourself. CRO is a bit like therapy: analysis highlights the necessary changes, and those changes lead to improvement. It’s a continuous process. You should test FOREVER, so tests should be continually considered and never rushed. Analyse, then hypothesise, then test, then learn. Then do it all again. Over, and over, and over again.

I’ll leave the last word to The Grok: “Will you do Random Testing in 2013? Please don’t. Develop a great hypothesis; ask why, not what, and you’ll have greater success in all your tests.”

Glad tidings, and I wish you all Happy Testing in 2013!

 
