
Confidence arises from the belief you’ll succeed in your endeavors. Yet if belief is empowering, how much more so is knowledge? The good news: you can move from believing to knowing through A/B testing!
A/B testing is simply the practice of assigning two potential values to a single variable and measuring the impact of each on the outcome of your test. For email, the results can inform everything from the subject lines you use to the offers you present and the calls to action you employ. Here’s a breakdown of how to make the most of your A/B testing efforts:
A: Approach it one step at a time
Deciding what to test can be trickier than it sounds. It’s like showing up hungry at the all-you-can-eat testing buffet: there are many delicious variables to pick from, but in this case, you can only choose one dish.
So what variables might you choose? Consider one (but only one) of these:
- subject line
- call to action
- button size
- button color
- button text
- body text
- headline
- closing text
- text link versus a button
- layout
- message length
- tone
- theme color
- personalization
- images
- offers
- word order
- content
Anything in the list above can impact conversion rates by influencing some stage of the conversion process – most often, but not only, the very next step. The subject line impacts how many people open your email, the content impacts how many will be engaged, and the call to action impacts how many travel further down the conversion path. With so many things to test, give serious thought to what you’ll test first.
B: Brainstorm your testing priorities
Focus your efforts on the things most likely to deliver better performance by looking at historic trends. If open rates are typically low, subject lines are a great place to start. One way to prioritize which variable to test is to weigh how impactful improvements to it are likely to be against how easily it can be tested.
Remember, there’s a reason you’re performing an A/B test on a particular variable, and you should know it before you start. You do so by formulating a hypothesis. A hypothesis is merely a theory you intend to test, such as “adding a recipient’s name to the subject line will increase open rates” or “green buttons are more effective than red ones.”
C: Consider what success looks like
Armed with your hypothesis, you next must determine how you’ll define success. Set a goal that’s higher than the historic norm – but not ridiculously high. At minimum, the difference must be large enough to be statistically significant. You might define success as a meaningful increase in open rate, click-through rate or even conversion rate.
What does a subject line have to do with a conversion rate, you ask? The more people a subject line motivates to open an email, the more who should heed the call to action, and so on. If your open rate increases 50% and your conversion rate doesn’t, ask yourself where customers are abandoning the path to conversion.
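To make the "where are they abandoning" question concrete, here's a minimal Python sketch (with invented funnel numbers, not figures from this article) that computes the pass-through rate at each stage and flags the leakiest one:

```python
# Hypothetical funnel counts for a single email send -- illustrative only.
funnel = {
    "delivered": 10_000,
    "opened": 1_500,
    "clicked": 300,
    "converted": 30,
}

stages = list(funnel)
# Pass-through rate from each stage to the next.
pass_through = {
    (prev, curr): funnel[curr] / funnel[prev]
    for prev, curr in zip(stages, stages[1:])
}
for (prev, curr), rate in pass_through.items():
    print(f"{prev} -> {curr}: {rate:.1%}")

# The leakiest transition is the best candidate for your next test.
weakest = min(pass_through, key=pass_through.get)
print("weakest step:", weakest)
```

With these numbers, only 10% of clickers convert, so the post-click experience, not the subject line, would deserve the next test.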
It’s critically important to test A and B versions of your email simultaneously, as send time factors into performance. It’s also essential to test emails among similar populations or within the same segment to ensure you’re comparing apples to apples.
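How do you know the difference between A and B is statistically significant rather than noise? One standard approach is a two-proportion z-test. A self-contained Python sketch, using made-up send and open counts:

```python
from math import sqrt, erf

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf; p-value is the two-tailed probability.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical test: 200/2000 opens for version A vs 260/2000 for version B.
z, p = two_proportion_z(200, 2000, 260, 2000)
print(f"z = {z:.2f}, p = {p:.4f}")
# Declare a winner only if p is below your chosen alpha (commonly 0.05).
```

In this invented example the p-value falls well below 0.05, so the lift would count as significant; with smaller lists the same percentage gap often wouldn't.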
D: Do that which you learn
It’s very likely that most A/B tests won’t result in a dramatic increase in opens, click-throughs or conversions. Some will result in lower numbers all around, and many more will show no measurable effect. However, seemingly small positive impacts can lead to huge gains.
Remember, increasing open rates from 10% to 12% actually represents a 20% relative performance increase. If that carries through to conversion, you’ve had a nice day. Document the change that led to this improvement, and incorporate it into your ongoing email practices.
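The arithmetic behind that claim, spelled out in a couple of lines of Python:

```python
old_rate, new_rate = 0.10, 0.12          # open rates: 10% before, 12% after
absolute_lift = new_rate - old_rate      # 2 percentage points
relative_lift = absolute_lift / old_rate # relative to the old baseline
print(f"{relative_lift:.0%}")            # prints "20%"
```

This is why small absolute gains compound into meaningful wins over repeated sends.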
It’s also worth noting that modern email campaign management tools provide automatic A/B testing: once a subject line, creative element, offer or other variable exceeds a predetermined performance threshold, it becomes the standard for any emails remaining to be sent. This helps ensure your campaign performs at its best based on actionable test results.
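The winner-selection logic these tools automate can be sketched roughly as follows. This is an illustrative Python sketch, not any particular tool's API; the function name, threshold, and numbers are all invented:

```python
def pick_winner(results, threshold=0.05, min_sends=1000):
    """results maps variant name -> (opens, sends).
    Returns the winning variant once one beats the runner-up by at least
    `threshold` (absolute rate difference) with enough sends per variant;
    returns None while traffic should keep being split."""
    rates = {
        variant: opens / sends
        for variant, (opens, sends) in results.items()
        if sends >= min_sends
    }
    if len(rates) < 2:
        return None  # not enough mature variants to compare
    best, runner_up = sorted(rates, key=rates.get, reverse=True)[:2]
    if rates[best] - rates[runner_up] >= threshold:
        return best
    return None

# 16% vs 10% open rate clears the 5-point threshold, so B wins.
print(pick_winner({"A": (100, 1000), "B": (160, 1000)}))
# 12% vs 10% does not, so the test keeps running.
print(pick_winner({"A": (100, 1000), "B": (120, 1000)}))
```

Real platforms typically combine a threshold like this with a significance check and a time limit before committing the remaining sends to the winner.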
E: Explore multivariate testing
We made a big deal out of testing a single variable at a time with A/B testing. In part that’s because if you test multiple variables simultaneously, you’re not actually conducting an A/B test: you’re engaging in multivariate testing. This type of testing attempts to show how a combination of factors can work together to improve performance. It’s another important tool, but it raises the bar on complexity. For many marketers, A/B testing is a simpler approach and a natural first step toward more effective emails.
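The reason multivariate testing raises the bar is combinatorial: every variable you add multiplies the number of test cells, and each cell needs enough recipients to yield a significant read. A quick Python illustration with invented variants:

```python
from itertools import product

# Hypothetical variants -- each combination becomes one test cell.
subject_lines = ["Save 20% today", "Your exclusive offer"]
button_colors = ["green", "red"]
ctas = ["Shop now", "Claim your discount"]

cells = list(product(subject_lines, button_colors, ctas))
print(len(cells))  # 2 * 2 * 2 = 8 cells to fill with recipients
for cell in cells:
    print(cell)
```

Eight cells means your list is split eight ways instead of two, which is why multivariate testing generally demands much larger audiences than a simple A/B test.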

The ABC’s of A/B Testing
was written by me, Greg Norton – also known as webzenkai. I’ve got more than two decades’ experience building effective websites and powerful email campaigns that yield results. Feel free to contact me regarding this article or anything else you find on this website.