Jane Wilson

Posted on • Originally published at waracle.com

What is A/B testing?

To put it simply, A/B testing is comparing one version of something against another. Like getting your eyes tested: does version A (with the lens) work better than version B (without)?

For example, you might change the copy on your sign-up form in the hope of increasing conversions, or change the colour of your CTA button to make it more obvious. Many marketers also A/B test their ads so they can use their budgets more effectively.

No product is perfect at launch (which is also why you should always start with an MVP), and it can sometimes be hard to pinpoint exactly what is preventing users from getting the most out of a website or app. By using A/B testing alongside traditional web and app analytics, you can home in on problem areas (like a certain form) and make small, measurable changes to increase conversion.

Planning your A/B test

Before you dive into running an A/B test, there are a few questions you need to ask yourself.

Are you running the test on-site or off-site?

  • Off-site tests tend to be run on things like ads that bring traffic to your site/app, or email newsletters.

  • On-site tests, by contrast, are carried out on elements of your site/app, and they need to take into account how your site works as a whole.

What variables will you test?

  • Do you want to change when a user is asked to create an account?
  • Make a CTA button larger to make it more obvious?
  • Change the text used on a newsletter subscription button?

What results are you looking for?

  • This is particularly important for digital products. Are you looking for an increase in accounts created, more subscribers to your newsletter, or a higher average basket total? You should already know your current baseline for your site/app, so you can compare the results of your experiment against it (see the quick sketch below).
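
Your baseline is just your current rate for whichever metric you care about. As a quick sketch, using made-up monthly analytics numbers:

```python
# A tiny sketch of establishing a baseline conversion rate from your
# analytics. The visitor and sign-up counts are illustrative, not real data.
visitors = 24_000  # unique visitors last month (made up)
signups = 1_080    # accounts created last month (made up)

baseline = signups / visitors
print(f"Baseline conversion rate: {baseline:.1%}")  # 4.5%
```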

Running your test

You should run all of your different versions at the same time. If you run them on different days, or even at different times of the same day, you can’t make a fair comparison: the audience and traffic patterns will differ. You should also split your traffic evenly between your versions. Whatever platform you’re using to run your test should handle this for you.
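
Under the hood, an even split usually means assigning each user to a variant deterministically, so they see the same version on every visit. A minimal sketch, assuming a hypothetical `user_id` string from your own session handling:

```python
# Hash-based bucketing: hashing the user ID (rather than picking at random
# on every request) keeps each user in the same variant across visits.
import hashlib

def assign_variant(user_id: str, variants: tuple = ("A", "B")) -> str:
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % len(variants)  # roughly even split
    return variants[bucket]

print(assign_variant("user-123"))  # same user always gets the same variant
```

The same helper splits traffic three ways for an A/B/C test if you pass `variants=("A", "B", "C")`.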

Length of test

This is a bit of a ‘how long is a piece of string’ question. In essence, you should run your test for as long as it takes to gather a statistically significant data set. For some tests that could be as little as an hour; for others, you could be looking at leaving the test running for a week or more. You don’t want to leave tests running too long, however, as that increases the chance they’re influenced by external factors.
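
If your platform doesn’t report significance for you, a two-proportion z-test is one common way to check it. A rough sketch, using made-up visitor and conversion counts:

```python
# Two-proportion z-test: is the difference between two conversion rates
# statistically significant? All counts below are illustrative.
from math import erf, sqrt

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

z, p = two_proportion_z_test(conv_a=120, n_a=2400, conv_b=150, n_b=2400)
print(f"z = {z:.2f}, p = {p:.3f}")  # p below 0.05 is a common bar
```

In this made-up example p ≈ 0.06, so the difference isn’t yet significant and the test should keep running.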

A/B testing isn’t the sort of thing that can be done at the last minute. It takes planning, preparation, and patience. For the most accurate results, you’ll only run one test at a time, and you may need to run tests repeatedly (especially if you’ve changed more than one thing at once). If you’re ever feeling doubtful about your results, retest!

Testing multiple variables

Although the most accurate results come from testing one variable at a time, that doesn’t mean you’re limited to two versions of it.
If, for example, you want to test the colour of your CTA button, you can easily run an A/B/C test using three (or more) versions. This is much more time-efficient than testing A/B, A/C, and B/C separately.

Multivariate testing

But let’s say you want to test both the colour of your CTA button and its copy text; this is called multivariate testing. It’s much more complicated to run than an A/B/C test, and you’ll need much more traffic coming to your site/app than traditional A/B testing requires. To work out how many variations you’ll be testing, use the following formula:

[number of variations on element A] * [number of variations on element B] = [total variations being tested]

For example, element A would be our CTA button colour and element B its copy text.
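
As a worked sketch of that formula, with hypothetical values of three button colours and two copy texts, you’d be testing 3 * 2 = 6 variations:

```python
# Enumerating multivariate combinations. The colours and copy texts are
# hypothetical examples, not recommendations.
from itertools import product

colours = ["green", "blue", "orange"]    # variations on element A
copy_texts = ["Sign up", "Get started"]  # variations on element B

variations = list(product(colours, copy_texts))
print(len(variations))  # 6 total variations being tested
for colour, text in variations:
    print(colour, text)
```

This is why multivariate tests need so much more traffic: every extra variation is another slice your visitors get split across.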

Benefits of A/B testing

By testing small changes against each other, you can gradually build towards the best-performing version of your site, ad, or sign-up process. You also get a much better view of what your users like, which may have been missed during your initial user research.
A/B testing can help you lower bounce rates, increase conversion rates (and value), and boost user engagement.
