Our optimisation process has increased our clients' market share. We deliver continuous improvements, backed by an average uplift of 39% per test.

"They have delivered great test results and a better understanding of our customer. They are highly skilled, reliable and feel like an extension of our team."

Cormac Folan, eCommerce Director for men’s shirts brand T.M.Lewin

Data + insight

Identify opportunities + personas


First, we find out why users aren't converting and where the website is underperforming. To do this, we combine quantitative and qualitative analysis:

Quantitative analysis: Using tools like web + data analytics, form analysis, session replay and heatmapping, we can pinpoint where users are abandoning.
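As a rough sketch of the kind of quantitative funnel analysis described above, the snippet below computes the abandonment rate at each step of a purchase funnel. The step names and session counts are illustrative, not real client data.

```python
# Illustrative funnel data: (step name, number of sessions reaching the step).
funnel = [
    ("Product page", 10_000),
    ("Basket",        3_200),
    ("Checkout",      1_900),
    ("Payment",       1_400),
    ("Confirmation",  1_150),
]

def drop_off_report(steps):
    """Return (step, abandonment rate relative to the previous step) pairs."""
    report = []
    for (_, prev_n), (name, n) in zip(steps, steps[1:]):
        rate = 1 - n / prev_n          # share of users lost before this step
        report.append((name, round(rate, 3)))
    return report

for step, rate in drop_off_report(funnel):
    print(f"{step}: {rate:.1%} of users abandoned before this step")
```

Steps with the steepest drop-off (here, basket entry) are the first candidates for qualitative investigation.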

Qualitative analysis: Then, we find out why they're abandoning by combining this data with insight from usability testing, surveys and user interviews.

Together, this helps us create user personas: conversion-focused segments that group together goals, motivations and objections and form the basis for our testing.
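A persona of this kind can be represented as a simple mapping from segment to goals, motivations and objections, with each objection feeding the testing backlog as a candidate hypothesis. Everything below is an invented illustration, not an actual client persona.

```python
# Hypothetical personas: each groups goals, motivations and objections.
personas = {
    "bargain hunter": {
        "goals": ["find the lowest price"],
        "motivations": ["discounts", "free delivery"],
        "objections": ["full-price items", "hidden shipping costs"],
    },
    "gift buyer": {
        "goals": ["buy a safe, well-presented gift"],
        "motivations": ["easy returns", "gift wrapping"],
        "objections": ["unclear sizing", "slow delivery"],
    },
}

def objections_to_test(persona_name):
    """Each objection is a candidate hypothesis for the testing roadmap."""
    return personas[persona_name]["objections"]

print(objections_to_test("gift buyer"))
```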



Prioritise and plan testing


In the strategy phase, we prioritise the testing roadmap for the biggest impact on conversion rate. Tests are first ranked by potential value, expected duration and build complexity. Then we plan the roadmap by "stream" and by hypothesis:

A testing stream is an individual page or element within the funnel (e.g. a product page or the header). Tests are queued up in each stream, allowing rapid concurrent testing.

Planning by hypothesis means that we label each test with a theme – namely, the core principle that we're looking to test. This helps us ensure that we're testing multiple hypotheses in each sprint, rather than focusing too heavily on a single hypothesis.

This means we plan the test roadmap so we deliver tests that:

  • Have a high impact on revenue
  • Are quick to build and test
  • Help us refine our testing strategy in the long term.
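One plausible way to sketch the prioritisation above is a simple score that rewards high potential value and penalises long or complex tests. The scoring formula, field names and backlog entries here are all illustrative assumptions, not the actual model used.

```python
from dataclasses import dataclass

@dataclass
class Test:
    name: str
    stream: str        # e.g. "product page", "header"
    hypothesis: str    # the theme / core principle under test
    value: int         # estimated revenue impact, 1-10
    duration: int      # expected weeks to reach significance, 1-10
    complexity: int    # build effort, 1-10

    @property
    def priority(self) -> float:
        # Favour high value; penalise slow or complex builds.
        return self.value / (self.duration + self.complexity)

backlog = [
    Test("Sticky add-to-basket", "product page", "urgency", 8, 2, 3),
    Test("Simplified mega-menu", "header", "findability", 5, 4, 6),
    Test("Trust badges at payment", "checkout", "reassurance", 7, 2, 2),
]

for t in sorted(backlog, key=lambda t: t.priority, reverse=True):
    print(f"{t.priority:.2f}  {t.name}  [{t.stream} / {t.hypothesis}]")
```

Sorting the backlog this way surfaces quick, high-value tests first, while the stream and hypothesis labels keep each sprint varied.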


Create impactful test variations


Our creative team focuses purely on optimisation and testing. They work alongside our consultants to create impactful test variations:

  • Every concept is wireframed by a UX designer and consultant; for complex tests, the designer creates a prototype for initial usability testing.
  • The designer then creates brand-compliant designs based on the wireframe.
  • After sign-off, our developers integrate the tests into the testing platform, modifying the HTML, CSS and JavaScript and setting up the tracking and targeting.
  • Finally, our QA team ensures the test variations work correctly in primary browsers and devices.


Launch and monitor tests


Tests are launched in the testing platform and monitored continually to ensure they run smoothly. We use robust statistical modelling to confirm that the results observed during a test are significant and translate into long-term benefits for your business.
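As a minimal sketch of the kind of significance check described above, the snippet below runs a two-proportion z-test on control and variation conversion counts. This is one standard approach; the actual statistical model used may differ, and the numbers are illustrative.

```python
from math import erf, sqrt

def z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: return (z score, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Illustrative numbers: 400/10,000 control vs 470/10,000 variation.
z, p = z_test(400, 10_000, 470, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # significant at the 5% level if p < 0.05
```

A result is typically only called a winner once p falls below the chosen significance threshold and the test has run long enough to cover normal weekly traffic cycles.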



Analyse results and scale winning tests


Every test teaches us more about the visitors and the principles that affect conversion.

We use web analytics and tools like ClickTale for heatmapping and session replay, which lets us identify changes in user behaviour. We then use this data to adapt the upcoming test strategy.

For unsuccessful tests, we determine whether the hypothesis or the implementation was at fault; if appropriate, we try alternative implementations before rejecting the hypothesis. For successful tests, we scale the impact, potentially by retesting a more aggressive variation and by applying the same hypothesis to other pages in the funnel.