When it comes to designing website interfaces or mobile apps, strong interactions and a positive user experience are critical to building your business. In recent years, increased customization capability has given businesses the opportunity to optimize their digital strategies and to test which strategies are most effective. A/B testing, also known as split testing, is becoming more and more common among digital marketers today.
Testing takes the guesswork out of website optimization and enables data-informed decisions that shift business conversations from “we think” to “we know.”
You may be wondering what Progressive’s iconic Flo has to do with A/B testing. Before I explain Progressive’s A/B testing success, let me explain how the tests actually work. Stay tuned.
Going with the flo on digital strategy leaves your level of success uncertain. (See what I did there?) Splitting your traffic between two versions of your website, such as a landing page, gives you the ability to compare the performance of both.
Small details can make all the difference when encouraging your leads to continue down your website’s funnel. Variations such as these can be easily compared with the A/B testing strategy:
- Call to action buttons/links
- Font size
- Background color
Users are chosen at random to be shown one of the variations of the web page or mobile app. Their interactions with the pages generate data that serves as statistical evidence of which version performs better. This allows businesses to make informed decisions about digital design and functionality.
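To make the "chosen at random" part concrete, here is a minimal Python sketch of how a testing tool might bucket visitors. The function name and hashing scheme are my own illustration, not any particular vendor's implementation; the key idea is that hashing the user ID (instead of flipping a coin on every visit) guarantees a returning visitor always sees the same version.

```python
import hashlib

def assign_variant(user_id: str, variants=("A", "B")) -> str:
    """Deterministically bucket a user into one of the variants.

    Hashing the user ID means the same user always lands in the
    same bucket, so their experience stays consistent across visits.
    """
    digest = hashlib.md5(user_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# The same user ID always maps to the same variant.
print(assign_variant("user-1234"))
print(assign_variant("user-1234"))  # identical to the line above
```

Because assignment is deterministic, the tool doesn't even need to store who saw what; it can recompute the bucket from the user ID at any time.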
I know what you’re thinking: building statistics back into your business plan sounds like a tedious task. The brilliance of A/B testing is that it’s much simpler than expected.
Now we can introduce our friend, Flo. Progressive has used A/B testing for several aspects of its Flo-inspired digital media. For those of us who are statistically challenged, here is a simple breakdown of the steps of A/B testing, with Progressive as a guide.
1. Collect Data
The first step is to take a closer look at your analytics to see which aspects of your digital strategies can be improved. Don’t have any analytics to analyze? Check out my previous post on Google Analytics.
For a business like Progressive, any website pages with particularly high drop-off rates or low lead generation metrics would be a good place to start. Progressive targeted its landing page because that page’s lead generation metrics were low.
2. Identify Goals
After you find an aspect of your business you want to improve, come up with a measurable goal to strive for. How do these goals support your business objectives? Make sure the goals are tangible while still serving the bigger picture. For example, Progressive’s aim of increasing lead generation is easy to measure while maintaining a long-term vision.
3. Generate Hypothesis
The next step is to take your goals and consider what type of changes can assist you in reaching these goals. What do you think would be better than your current version? And why? These are the fundamentals that will help you build your hypothesis.
Progressive’s standard landing page used a minimalist style to encourage the user to continue down the sales funnel: a simple headline with a call to action. The hypothesis was that a page explaining how Progressive works might increase the response to the call to action.
4. Create Variations
A/B testing software such as Optimizely allows you to build variations of your mobile application or website. Most of these tools offer a visual editor that makes the changes simple and hassle-free.
Progressive wanted to create an alternative with three easy steps that emphasized the simplicity of its service. The landing page below was generated as the variation for the A/B test. The page still includes the call to action by guiding the user to input their zip code. By also including the arrows for the three easy steps, users discover how easy it is to “save hundreds.”
5. Run Experiment
Now that the variations are created, it’s time to put them to the test. Once the test is initiated and users start to participate, their interactions with the site are recorded and measured. For Progressive, this measurement would include the difference in the number of users entering their zip code on the control landing page vs. the variation.
6. Analyze Results
Once the test has run, your A/B testing software will present the results and show whether there was a statistically significant difference between the control version and the variation.
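If you’re curious what your testing tool is doing behind the curtain, the “statistically significant difference” check often boils down to something like a two-proportion z-test on the conversion rates. Here’s a minimal sketch; the sample numbers are made up for illustration, and real A/B platforms layer on extra safeguards beyond this basic test.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Compare the conversion rates of control (A) and variation (B).

    Returns the z statistic and a two-sided p-value, using the
    pooled proportion as in a standard two-proportion z-test.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical data: 200 of 5,000 control visitors converted,
# vs. 260 of 5,000 who saw the variation.
z, p = two_proportion_z_test(200, 5000, 260, 5000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value (conventionally below 0.05) suggests the difference between the two pages is unlikely to be random noise, which is exactly the “we know” the testing tool reports back to you.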
Progressive saw an increase in traffic and lead generation with the variation of the landing page: far more leads continued down the website funnel, resulting in a greater number of conversions. By testing the different options in advance, businesses can avoid the risk of implementing an unsuccessful variation.
Although Progressive had a positive result from its A/B test, the variation isn’t always successful on the first run-through. Even if the control version produces the better numbers, there is still opportunity for improvement: that information can sharpen your hypotheses for the next round of testing.
Continuing to adjust variations through trial and error will promote a positive outcome. The best way to optimize your website or mobile app is to listen carefully to results and take steps to improve.
Now that we know the basics of A/B testing, it’s important to consider how we can continue to develop our testing practices in the future. A/B testing is still a relatively new concept for the digital marketing industry. Therefore, we are constantly searching for new ways to maximize our results.
A recent study from Forrester highlights a few key takeaways to help your digital strategies reach their full potential. The study found that only 30% of online interactions are currently being tested, which means 70% of interactions aren’t being maximized. We need to expand testing in these four ways in order to reach our customers in effective and meaningful ways.
- Learn from every possible customer interaction
- Test your customers’ interactions across the entire customer lifecycle
- Deploy testing within all digital channels
- Leverage multiple testing techniques
By implementing A/B testing into your future digital strategies, you can better connect with your audience. Small differences in your interfaces and applications can create a more positive user experience. Interactions and connections will become stronger which will have lasting effects on your business. And remember, Flo approves.
Until next time,
Angie, Your Marketing Guru