How to A/B Test Emails: A Step-by-Step Guide
A/B testing removes guesswork from your newsletter strategy by letting data guide your decisions. This guide covers how to set up, run, and learn from email A/B tests — from subject lines and send times to content formats and CTA placement.
Step-by-Step Instructions
Choose one variable to test
The golden rule of A/B testing: test one thing at a time. If you change both the subject line and the CTA, you won't know which change caused the difference. Start with subject lines — they're the easiest to test and have the biggest impact on open rates. Other testable elements include send time, preheader text, CTA button design, content format, and email length.
Define your success metric
Decide what you're measuring before running the test. For subject line tests, the metric is open rate. For content and CTA tests, it's click-through rate. For send time tests, it might be a combination of opens and clicks. Having a clear metric prevents post-hoc rationalisation and ensures you're learning something actionable.
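As a concrete reference, the two core metrics can be written out as simple ratios. Note this is a sketch, not any specific platform's formula; tools differ (some report click-to-open rate, clicks divided by opens, rather than clicks divided by delivered):

```python
def open_rate(unique_opens, delivered):
    """Unique opens as a share of delivered emails (bounces excluded)."""
    return unique_opens / delivered

def click_through_rate(unique_clicks, delivered):
    """Unique clicks as a share of delivered emails. Some platforms
    instead report click-to-open rate: unique_clicks / unique_opens."""
    return unique_clicks / delivered

rate_a = open_rate(250, 1000)            # 0.25 -> a 25% open rate
ctr_a = click_through_rate(30, 1000)     # 0.03 -> a 3% click-through rate
```

Whichever definition your platform uses, pick it before the test and stick to it for both variants.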
Split your audience properly
Split your test group randomly so no selection bias creeps in; never segment by signup date, engagement level, or any other attribute. Most platforms do this automatically. Send each variant to at least 1,000 subscribers so that realistic differences can reach statistical significance. A common approach is a 20% test group (split evenly between A and B) with the remaining 80% receiving the winner.
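The split described above can be sketched in a few lines. The `split_audience` helper and its 20% default are illustrative assumptions, not any platform's API:

```python
import random

def split_audience(subscribers, test_fraction=0.2, seed=42):
    """Randomly split a subscriber list into variant A, variant B,
    and a holdout that later receives the winning variant.
    (Hypothetical helper; real platforms do this for you.)"""
    rng = random.Random(seed)          # fixed seed only for reproducibility
    shuffled = subscribers[:]          # copy so the original list is untouched
    rng.shuffle(shuffled)
    test_size = int(len(shuffled) * test_fraction)
    half = test_size // 2
    variant_a = shuffled[:half]
    variant_b = shuffled[half:test_size]
    holdout = shuffled[test_size:]     # the 80% who get the winner
    return variant_a, variant_b, holdout

subs = [f"user{i}@example.com" for i in range(10_000)]
a, b, rest = split_audience(subs)
# a and b each hold 1,000 subscribers; rest holds the remaining 8,000
```

The key property is that membership in A or B is determined by the shuffle alone, so any difference in results can only come from the variant itself.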
Wait long enough for results
Give your test adequate time before declaring a winner. For subject line tests, 2-3 hours is usually enough. For content or send time tests, wait 24 hours to account for different reading patterns. Calling a winner early, on too little data, produces false conclusions.
Analyse results and apply learnings
Look beyond the winning variant to understand why it won. If a subject line with a number outperformed one without, that's a pattern to test further. Document every test result in a spreadsheet: what you tested, the variants, the metric, the winner, and your hypothesis about why. Over time, this becomes an invaluable reference.
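The testing spreadsheet can just as easily be a CSV file a script appends to. This hypothetical `log_test` helper (name, fields, and file path are all assumptions) is one way to sketch it:

```python
import csv
import os
from datetime import date

LOG_FIELDS = ["date", "variable", "variant_a", "variant_b",
              "metric", "winner", "hypothesis"]

def log_test(path, **entry):
    """Append one test result to a CSV log, writing a header row
    if the file does not exist yet."""
    new_file = not os.path.exists(path)
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=LOG_FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow(entry)

log_test("ab_test_log.csv",
         date=str(date.today()),
         variable="subject line",
         variant_a="Newsletter #47",
         variant_b="5 tools that will change your workflow",
         metric="open rate",
         winner="B",
         hypothesis="Concrete benefit beats generic numbering")
```

Whatever the storage, the habit matters more than the tooling: every test gets a hypothesis before and a documented result after.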
Test continuously
A/B testing isn't a one-time activity — it's an ongoing practice. What works changes as your audience grows and evolves. Run a test with every send or at least weekly. Over months, the compounding effect of many small improvements significantly boosts your overall performance.
Pro Tips
- Keep a testing log with hypotheses, results, and insights — patterns emerge over time
- Test bold differences, not minor tweaks — 'Newsletter #47' vs '5 tools that will change your workflow' is a real test; 'your' vs 'the' is noise
- Don't stop testing something that works — preferences change over time
- If results are too close to call, the difference probably doesn't matter — move on to testing something else
- Share test results with your audience occasionally — they find it interesting and it builds transparency
Common Mistakes to Avoid
- Testing multiple variables at once — you can't attribute results to a specific change
- Drawing conclusions from tests with too small a sample size
- Not waiting long enough before picking a winner
- Only testing subject lines and ignoring content, CTAs, and send time
- Running tests without a clear hypothesis or learning goal
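The sample-size and "too close to call" pitfalls above both come down to statistical significance. One standard way to check it is a two-proportion z-test; this stdlib-only sketch returns a two-sided p-value (the 0.05 threshold is a common convention, not a rule your audience cares about):

```python
from math import sqrt, erf

def two_proportion_z_test(successes_a, sent_a, successes_b, sent_b):
    """Two-sided z-test for the difference between two open (or click)
    rates. Returns a p-value; below ~0.05 is conventionally 'significant'."""
    p_a = successes_a / sent_a
    p_b = successes_b / sent_b
    pooled = (successes_a + successes_b) / (sent_a + sent_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    if se == 0:
        return 1.0
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# 1,000 recipients per variant: a 25% vs 29% open rate
p = two_proportion_z_test(250, 1000, 290, 1000)
# p is roughly 0.044, just under 0.05: the 4-point gap is (barely) significant
```

With only 100 recipients per variant, the same 4-point gap would be far from significant, which is why small samples only give directional results.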
How Aldus Makes This Easier
Aldus automates A/B testing on every single send. The AI generates multiple subject line variants, tests them on a portion of your audience, and automatically sends the winner to the rest — no manual setup required. Over time, the AI learns which patterns perform best for your specific audience.
Frequently Asked Questions
How many subscribers do I need for A/B testing?
You need at least 1,000 subscribers per variant for statistically meaningful results. With a 20/80 split (20% test group), that means at least 10,000 total subscribers for reliable testing. Smaller lists can still test but should treat results as directional rather than definitive.
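The arithmetic behind the 10,000 figure can be made explicit. The function name here is illustrative, not a standard formula:

```python
def min_list_size(per_variant=1000, test_fraction=0.2):
    """Total list size needed so each of the two variants in the
    test group reaches `per_variant` subscribers."""
    per_variant_fraction = test_fraction / 2   # test group split evenly A/B
    return int(per_variant / per_variant_fraction)

total = min_list_size()   # 1,000 / 0.10 = 10,000 total subscribers
```

A 20% test group split evenly means each variant sees only 10% of the list, which is why the total has to be ten times the per-variant minimum.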
What should I A/B test first?
Start with subject lines — they're the easiest to test, have the biggest impact on opens, and generate clear results quickly. Once you've optimised subject lines, move to send time, CTA placement, and content format.
How often should I run A/B tests?
Ideally, every send. Subject line A/B testing can run automatically with every issue. For more complex tests (content format, CTA design), plan one focused test per week or month depending on your send frequency.