The best way for you to improve the performance of your website is through A/B testing.
What is A/B testing?
A Note from OPTIMIZELY (my A/B testing tool of choice):
A/B testing (also known as split testing or bucket testing) is a method of comparing two versions of a webpage or app against each other to determine which one performs better.
By creating an A and B variant and testing them against each other, you can use data & statistics to validate new design changes and improve your conversion rates.
Running an A/B test that directly compares a variation against a current experience lets you ask focused questions about changes to your website or app, and then collect data about the impact of that change.
Testing takes the guesswork out of website optimization and enables data-informed decisions that shift business conversations from “we think” to “we know.”
By measuring the impact that changes have on your metrics, you can ensure that every change produces positive results.
HOW A/B TESTING WORKS
In an A/B test, you take a webpage or app screen and modify it to create a second version of the same page. This change can be as simple as a single headline or button, or as extensive as a complete redesign of the page.
Then, half of your traffic is shown the original version of the page (known as the control) and half is shown the modified version of the page (the variation).
As visitors are served either the control or variation, their engagement with each experience is measured and collected in an analytics dashboard and analyzed through a statistical engine.
You can then determine whether changing the experience had a positive, negative, or no effect on visitor behavior.
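To make the traffic split above concrete, here's a minimal Python sketch of how a testing tool might bucket visitors into the control or the variation. The function name and visitor IDs are illustrative, not Optimizely's actual API:

```python
import hashlib

def assign_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor into 'control' or 'variation'.

    Hashing the visitor ID (instead of flipping a coin on every page view)
    guarantees the same visitor always sees the same version, which keeps
    the experiment's measurements clean.
    """
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    return "control" if int(digest, 16) % 2 == 0 else "variation"
```

Because the assignment is a pure function of the visitor ID, a returning visitor lands in the same bucket on every visit, while across many visitors the hash spreads traffic roughly 50/50 between the two experiences.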
THE CONVERSION SURGE A/B TESTING PROCESS
The following is the A/B testing framework I’ll use to start running tests on your website:
- Collect Data: Your analytics will often provide insight into where I can begin optimizing. I’ll start with high-traffic areas of your site or app, as that allows me to gather data faster. I look for pages with low conversion rates or high drop-off rates that can be improved.
- Identify Goals: Your conversion goals are the metrics that I’ll use to determine whether or not the variation is more successful than the original version. Goals can be anything from clicking a button or link to product purchases and e-mail signups.
- Generate Hypotheses: Once I’ve identified a goal, I’ll begin generating A/B testing ideas and hypotheses about why I think each change will outperform the current version. I also prioritize the test plans by expected impact and difficulty of implementation and review them with you before I begin testing.
- Create Variations: Next, I’ll make the desired changes to an element of your website or mobile app experience. This might be changing the color of a button, swapping the order of elements on the page, hiding navigation elements, or something entirely custom.
- Run Experiment: Then I’ll kick off the experiment and wait for visitors to participate! At this point, visitors to your site or app will be randomly assigned to either the control or variation of your experience. Their interaction with each experience is measured, counted, and compared to determine how each performs.
- Analyze Results: Once the experiment is complete, it’s time for me to analyze the results. Optimizely will present the data from the experiment, showing the difference between how the two versions of your page performed and whether that difference is statistically significant.
If my variation is a winner, it’s a great feeling for both of us! I’ll then see if I can apply learnings from the experiment to other pages of your site and continue iterating to improve the results.
If the experiment generates a negative result or no result, don’t fret. We’ll use the experiment as a learning experience and generate new hypotheses to test.
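The significance check in the analysis step can be sketched in a few lines of Python. This is a standard two-proportion z-test, not Optimizely's proprietary statistical engine, and the numbers in the usage note are made up for illustration:

```python
import math

def ab_significance(control_conv: int, control_n: int,
                    variation_conv: int, variation_n: int):
    """Two-proportion z-test: does the variation's conversion rate
    differ significantly from the control's?

    Returns (z_score, two_sided_p_value).
    """
    p1 = control_conv / control_n          # control conversion rate
    p2 = variation_conv / variation_n      # variation conversion rate
    # Pooled rate under the null hypothesis that both versions convert equally
    p_pool = (control_conv + variation_conv) / (control_n + variation_n)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / control_n + 1 / variation_n))
    z = (p2 - p1) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value
```

For example, if the control converts 200 of 1,000 visitors and the variation converts 260 of 1,000, `ab_significance(200, 1000, 260, 1000)` yields a p-value well below 0.05, so you'd call the variation a winner; a lift of only 5 extra conversions on the same traffic would not be significant.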
READY TO GET STARTED WITH A/B TESTING TO GET MORE LEADS AND SALES?