In our latest SaleCycle Academy video, Acquisition Marketing Manager Bethany McDermott looks at some do’s and don’ts of A/B testing.
A/B testing is all about experimentation. By seeing what effect changes to areas like copy, calls to action and subject lines have, you can see what works best with your audience.
A/B tests are easy to carry out, but it’s important to take the right approach to get the best results.
Here are five do’s and don’ts to help ensure your tests provide results you can trust.
Do: Test One Variable at a Time
It pays to keep tests simple and try out one change at a time. With emails, if you test subject lines, CTAs and copy all together, you’ll never know which change made the difference.
For example, Hunter were running a 10% off promotion online and tested their regular subject line against a version that included the promotional message.
By including the promotional message, email open rates improved by more than 52%.
Do: Test to Groups of the Same Size and Type
Different customer segments vary in behavior, so it’s important to target your tests at groups of the same size and type to deliver reliable and comparable results.
Do: Make Sure Tests are Statistically Significant
It’s easy to get carried away with promising test results, but it’s vital that your results are “statistically significant”, which means you can be confident they reflect true customer behavior rather than random chance.
We’re looking for results that are at least 95% statistically significant, so any changes made as a result are based on solid data. (Our A/B Test Significance Calculator can help you here).
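For readers who want to see how a significance check works under the hood, here is a minimal sketch using a standard two-proportion z-test. This is one common approach to comparing conversion rates; it isn’t necessarily the exact method SaleCycle’s calculator uses, and the sample figures are purely illustrative.

```python
import math

def ab_confidence(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-proportion z-test: returns the confidence (0-1) that the
    difference between variants A and B is real rather than chance."""
    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    # Pooled rate under the null hypothesis that A and B perform the same
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    std_err = math.sqrt(pooled * (1 - pooled)
                        * (1 / visitors_a + 1 / visitors_b))
    z = abs(rate_a - rate_b) / std_err
    # Two-tailed confidence level from the standard normal distribution
    return math.erf(z / math.sqrt(2))

# Illustrative example: 200 opens from 4,000 sends vs 260 from 4,000
confidence = ab_confidence(200, 4000, 260, 4000)
print(f"Confidence: {confidence:.1%}")
print("Significant at 95%" if confidence >= 0.95 else "Keep testing")
```

If the printed confidence clears the 95% bar, the difference is unlikely to be random noise; if it doesn’t, the honest conclusion is to keep the test running and gather more data.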
Don’t: Become Distracted by Results in Other Areas
A/B tests can deliver unexpected results, but it’s important to stay focused on your hypothesis. For example, if you’ve made changes to improve open rate and your click rate increases, it’s probably unrelated, so stick to your goals.
Don’t: Be Afraid to Fail
Not every test will be successful. Even if your first tests don’t deliver the results you hoped for, keep going. A/B testing takes time and patience, and even ‘failed’ tests can provide useful insight.
Graham Charlton is Editor in Chief at SaleCycle. He's been covering ecommerce and digital marketing for more than a decade, having previously written reports and articles for Econsultancy, ClickZ, Search Engine Watch and more.