Prepare for Launch: Lessons from 1,000 A/B Test Launches

Rob Griffiths

In this article, we provide a guide to the A/B test launch process that will help you keep your website safe and keep your colleagues and clients happy.

You’ve spent weeks, maybe months, preparing for this A/B test. You’ve seen it develop from a hypothesis, to a wireframe, through design, build and QA. Your team (or client, if you work agency-side) are excited for it to go live, and all that’s left is to push the big red button. (Or the blue one, if you’re using Optimizely.) Real users are about to interact with your variation and, hopefully, it’ll make them more likely to convert: to buy a product, to register for an account or simply to make that click.

But for all the hours you’ve put into preparing this test, the work is not over yet. At Conversion, we’ve launched thousands of A/B tests for our clients. The vast majority of those launches have gone smoothly, but launching a test can be intense and launching it properly is crucial. While we’re flexible and work with and around our clients, there are some fixed principles we adhere to when we launch an A/B test.

Get the basics right

Let’s start with the simplest step: always check that you’ve set the test up correctly in your testing platform. The vast majority of errors I have witnessed at launch have been minor mistakes in this part of the process. Make sure that you have:

  1. Targeted the right pages or URLs.
  2. Set the audience conditions you intended.
  3. Allocated the right share of traffic to the test.
  4. Attached the goals you’ll need to report on.

Enough said.

Map out the user journey

You and your team might know your business and its website better than anyone, but being too close to a subject can sometimes leave you with blinkered vision. By the end of the development process, you’ll be so close to your build that you might not be able to view it objectively.

Remember that your website will have many different users and use cases. Sure, you’re hoping that your user will find their way from the product page, to the basket page, to the payment page more easily in your variation. But, have you considered how your change will impact users who want to apply a voucher? Do returning users do something new users don’t? Could your change alienate them in some way? How does your test affect users who are logged in as well as logged out? (Getting that last one wrong caused my team a sleepless night earlier this year!)

Make sure you have thought about the different use cases happening on your website, and ask yourself how your change affects each of them.

One of the best ways to catch small errors is to involve colleagues who haven’t been as close to the test during the QA process. Ask them to try to identify use cases that you hadn’t considered. And if they do manage to find new ones, add these to your QA checklist to make sure future tests are checked for their impact on these users.
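
If your team automates any of its QA, those use cases can double as journey-level smoke tests run against both experiences before launch. The sketch below assumes Playwright; the domain, selectors and the force_variation query parameter are invented for illustration, and you’d swap in your testing tool’s own mechanism for forcing a visitor into a given variation.

```typescript
// Hypothetical journey-level smoke tests: one run per variation for each
// use case identified during QA. The domain, selectors and the
// "force_variation" query parameter are placeholders, not real endpoints.
import { test, expect } from '@playwright/test';

const variations = ['control', 'variation_1'];
const journeys = [
  { name: 'guest checkout with a voucher', loggedIn: false, voucher: 'SAVE10' },
  { name: 'returning, logged-in checkout', loggedIn: true, voucher: null },
];

for (const variation of variations) {
  for (const journey of journeys) {
    test(`${journey.name} (${variation})`, async ({ page }) => {
      if (journey.loggedIn) {
        // Log in first so the variation is exercised for authenticated users too.
        await page.goto('https://www.example.com/login');
        await page.locator('#email').fill('qa@example.com');
        await page.locator('#password').fill('qa-password');
        await page.locator('button[type="submit"]').click();
      }

      // Force the visitor into the chosen variation (placeholder mechanism).
      await page.goto(`https://www.example.com/product/123?force_variation=${variation}`);
      await page.locator('#add-to-basket').click();
      await page.goto('https://www.example.com/basket');

      if (journey.voucher) {
        await page.locator('#voucher-code').fill(journey.voucher);
        await page.locator('#apply-voucher').click();
        await expect(page.locator('.voucher-applied')).toBeVisible();
      }

      // Every journey should still be able to reach the payment step.
      await page.locator('#proceed-to-payment').click();
      await expect(page).toHaveURL(/\/payment/);
    });
  }
}
```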

Test your goals

No matter how positively your users receive the changes you’ve made in your variation, your A/B test will only be successful if you can report back to your team or client with confidence. It’s important that you add the right goals to your results page, and that they fire as intended.

At Conversion, shortly before we launch a test, we work our way through both the Control and Variation and deliberately trigger each goal we’ve included: pageviews, clicks and custom goals too. We then check that these goals have been captured in two ways:

  1. We use the goals feature in our Conversion.com Optimizely Chrome Extension to see the goal firing in real time.
  2. A few minutes later, we check to see that the action has been captured against the goal in the testing platform.

This can take a little time (and let’s be honest, it’s not the most interesting task) but it’ll save you a lot of time down the line if you find a goal isn’t firing as intended.
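
If your goals push events into a tag manager’s data layer, a small console sketch can make the “see the goal firing in real time” step easier even without a dedicated extension. The snippet below assumes a Google Tag Manager-style window.dataLayer and an “event” property; adapt both to whatever your own tags actually send.

```typescript
// Log every dataLayer event while you deliberately trigger each goal in the
// Control and the Variation. Assumes a GTM-style window.dataLayer; the
// "event" property name is the common convention, not a guarantee.
const w = window as unknown as { dataLayer?: Record<string, unknown>[] };
const dataLayer = (w.dataLayer = w.dataLayer ?? []);

const seenEvents = new Set<string>();
const originalPush = dataLayer.push.bind(dataLayer);

dataLayer.push = (...entries: Record<string, unknown>[]): number => {
  for (const entry of entries) {
    const name = typeof entry.event === 'string' ? entry.event : '(no event name)';
    seenEvents.add(name);
    console.log('[goal check]', name, entry);
  }
  return originalPush(...entries);
};

// Once you've walked through both experiences, compare what fired with the
// goals you plan to report on.
console.log('[goal check] events seen so far:', [...seenEvents]);
```

Re-run the final console.log after you’ve finished clicking through; the set accumulates as you trigger each goal.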

Know your baseline

From the work you’ve done in preparation, you should know how many people you expect to be included in your experiment, e.g. how many mobile users in Scotland you’re likely to get in a two-week period. In the first few minutes and hours after you’ve launched a test, it’s important to make sure that the numbers you’re seeing in your testing platform are close to what you’d expect them to be.

(If you don’t have a clear notion of how many users you expect to receive into your test, use your analytics platform to define your audience and review the number of visits over a comparable period. Alternatively, you could use your testing platform to run an A/A test, in which you make no changes in the variation. That way, you can get an idea of the traffic levels for that page.)

If you do find that the number of visits to your test is lower than you’d expect, make sure that you have set up the correct traffic allocation in your testing tool. It may also be worth checking that your testing tool’s snippet is implemented correctly on the page. If the number of visits is higher than you’d expect, make sure you’re targeting the right audience and not including any groups you’d planned to exclude. (Handy hint: check you haven’t accidentally used the OR option in the audience builder instead of the AND option. It can catch you out!) Also, make sure that you’re comparing like for like, i.e. that you’re comparing unique visits in your analytics tool to unique visitors in your test.
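
To make the “close to what you’d expect” judgement less subjective, it helps to write the comparison down. The figures below are invented, and the check assumes traffic is spread evenly across the fortnight, which is crude but good enough for a first-hours sanity check.

```typescript
// Back-of-the-envelope traffic check for the first hours after launch.
// Baseline comes from your analytics platform (visits matching the test's
// audience over a comparable two-week period); the live count comes from
// your testing platform's results page. All numbers are illustrative.
const baselineVisitorsPerFortnight = 42_000; // from analytics
const trafficAllocation = 0.5;               // share of eligible traffic in the test
const hoursLive = 6;
const actualVisitorsSoFar = 180;             // from the testing platform

const hoursPerFortnight = 14 * 24;
const expectedSoFar =
  baselineVisitorsPerFortnight * trafficAllocation * (hoursLive / hoursPerFortnight);

const ratio = actualVisitorsSoFar / expectedSoFar;
if (ratio < 0.7) {
  console.warn(
    `Low: ${actualVisitorsSoFar} visitors vs ~${Math.round(expectedSoFar)} expected. ` +
      'Check traffic allocation and that the snippet is on the page.'
  );
} else if (ratio > 1.3) {
  console.warn(
    `High: ${actualVisitorsSoFar} visitors vs ~${Math.round(expectedSoFar)} expected. ` +
      'Check audience conditions (AND vs OR) and that you are comparing unique visitors to unique visitors.'
  );
} else {
  console.log(`Traffic looks about right: ${actualVisitorsSoFar} vs ~${Math.round(expectedSoFar)} expected.`);
}
```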

Keep your team informed

At Conversion, our designers and developers are involved in the QA process, so they know when a test is about to launch. (We’ve recently added a screen above our bank of desks showing the live test results. That way everyone can celebrate [or commiserate over] the fruits of their labour!) When the test has been live for a few minutes, and we’re happy that goals are firing, we let our client know and ask them to keep an eye on it too.

Check the test regularly

So the test is live. Having a test live on a site (especially when you’re managing that for a client) is a big responsibility. Provided you’ve taken all the right steps earlier in the process, you should have nothing to worry about, but you should take precautions nonetheless.

Once you’ve pressed the Play button, go over to the live site and make sure you can see the test. Try to get bucketed into both the Control and the Variation to sense-check that the test is now visible to real users.
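
A quick way to confirm which bucket you’ve landed in is to inspect the testing snippet’s client-side state from the browser console. The sketch below assumes Optimizely Web’s JavaScript API (window.optimizely.get('state')); other platforms expose similar state objects under different names, and the exact shape of the returned data varies by version.

```typescript
// Console sketch: which experiments is this visitor in, and which variations?
// Assumes Optimizely Web's client-side state API; adjust for your platform.
type OptimizelyState = {
  getActiveExperimentIds(): string[];
  getVariationMap(): Record<string, { id: string; name: string }>;
};

const optimizely = (window as unknown as {
  optimizely?: { get(type: 'state'): OptimizelyState };
}).optimizely;

if (!optimizely || typeof optimizely.get !== 'function') {
  console.warn('Testing snippet not found on this page.');
} else {
  const state = optimizely.get('state');
  console.log('Active experiments:', state.getActiveExperimentIds());
  console.log('Bucketed variations:', state.getVariationMap());
}
```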

At Conversion, someone will monitor the test results, refreshing every few minutes, for the first couple of hours the test is live. That person also checks that there’s at least one hit against each goal and that the traffic level is as expected. After that, we’ll check in on the test every day that it runs.

A couple of hours into the running of a test, we’ll make sure that any segments we have set up (e.g. Android users, logged in users, users who interacted with our new element) are firing. You don’t want to run a test for a fortnight and then find that you can’t report back on key goals and segments.
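
If you’d rather not rely on someone remembering to refresh, the monitoring routine can be scripted. Everything below is a placeholder: the results URL, the response shape and the thresholds would need mapping onto whatever results API or export your testing platform actually provides.

```typescript
// Poll (hypothetical) experiment results every 15 minutes for the first two
// hours, flagging goals with no hits and traffic well below expectation.
type ExperimentResults = {
  visitors: number;
  goals: { name: string; conversions: number }[];
};

const RESULTS_URL = 'https://example.com/api/experiments/123/results'; // placeholder
const EXPECTED_VISITORS_PER_HOUR = 60; // from your baseline estimate

async function checkOnce(hoursLive: number): Promise<void> {
  const response = await fetch(RESULTS_URL);
  const results = (await response.json()) as ExperimentResults;

  const silentGoals = results.goals.filter((goal) => goal.conversions === 0);
  if (silentGoals.length > 0) {
    console.warn('Goals with no hits yet:', silentGoals.map((goal) => goal.name));
  }

  const expected = EXPECTED_VISITORS_PER_HOUR * hoursLive;
  if (results.visitors < expected * 0.5) {
    console.warn(`Visitors well below expectation: ${results.visitors} vs ~${Math.round(expected)}.`);
  }
}

let hoursLive = 0;
const timer = setInterval(() => {
  hoursLive += 0.25;
  void checkOnce(hoursLive);
  if (hoursLive >= 2) clearInterval(timer);
}, 15 * 60 * 1000);
```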

(Tip: if you’re integrating analytics tools into your test, make sure they’re switched on, and check inside those tools soon after the test launches to confirm you have heatmap, clickmap or session recording data coming through.)

Make sure you have a way to pause the test if you spot anything amiss, and we’d recommend not launching on a Friday, unless someone can check the results over the weekend.

Finally, don’t be afraid to pause

After all the buildup and excitement of launching, it can feel pretty deflating to have to press the pause button when you suspect something isn’t quite right. Maybe a goal isn’t firing, or you’ve forgotten to add a segment that would come in very handy when it’s time to report on the results. Don’t be afraid to pause the test. In most cases, a small amount of disruption at the start is worth it to have trustworthy numbers at the other end. Hopefully, you’ll spot these issues early on. When this happens, we prefer to reset the results to ensure they’re as accurate as they can be.

Conclusion

Launching an A/B test can be a real thrill. You finally get to find out whether that earworm of an idea for an improvement will actually work. In the few hours either side of that launch, make sure you’ve done what you need to do to preserve confidence in the results to come and to keep your team and client happy: get the basics right, map out the user journey, test your goals, know your baseline, keep your team informed, check the test regularly and don’t be afraid to pause.
