Strategy & Tactics

A/B Testing

A/B testing (or split testing) is the practice of sending two or more variations of an email to different portions of your audience to determine which performs better.

What Is A/B Testing?

In email A/B testing, you create two versions of a single element — typically the subject line, but also the sender name, preview text, content layout, or send time. A test segment of your list (usually 20-30%) is split between the variants; after a set period, the winning version is sent to the remaining subscribers. The key is testing one variable at a time so you can attribute any difference in performance to that specific change.

Effective A/B testing also requires statistical significance — with small lists, differences may be due to chance rather than genuine preference. As a rule of thumb, many platforms recommend at least 1,000 subscribers per variant for reliable results.
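
The split described above can be sketched in a few lines of Python. This is a minimal illustration, not any platform's actual implementation; the function name and the 20% default are assumptions for the example.

```python
import random

def split_for_ab_test(subscribers, test_fraction=0.2, seed=42):
    """Split a subscriber list into two equal test groups and a holdout.

    test_fraction is the share of the list used for testing
    (0.2 = 20%, divided evenly between variants A and B).
    The holdout later receives whichever variant wins.
    """
    rng = random.Random(seed)          # fixed seed so the split is reproducible
    shuffled = subscribers[:]
    rng.shuffle(shuffled)
    test_size = int(len(shuffled) * test_fraction)
    half = test_size // 2
    group_a = shuffled[:half]
    group_b = shuffled[half:half * 2]
    holdout = shuffled[half * 2:]      # gets the winning variant after the test
    return group_a, group_b, holdout
```

With a 100-subscriber list and the 20% default, each variant group gets 10 subscribers and the remaining 80 wait for the winner.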

Why It Matters for Newsletters

A/B testing removes guesswork from your newsletter strategy. Over time, consistent testing compounds — a 5% improvement in open rate from better subject lines, combined with a 10% improvement in clicks from better content placement, can significantly increase overall engagement and growth.
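
The compounding effect is easy to verify with arithmetic. The baseline rates below are hypothetical, chosen only to make the multiplication concrete:

```python
baseline_open = 0.40        # hypothetical 40% open rate
baseline_ctr = 0.05         # hypothetical 5% click rate among openers

improved_open = baseline_open * 1.05   # 5% relative lift from subject lines
improved_ctr = baseline_ctr * 1.10     # 10% relative lift from content placement

# Clicks per send = open rate x click rate among openers,
# so relative lifts multiply: 1.05 * 1.10 = 1.155.
baseline_clicks = baseline_open * baseline_ctr
improved_clicks = improved_open * improved_ctr
lift = improved_clicks / baseline_clicks - 1
print(f"{lift:.1%}")  # prints "15.5%"
```

Two modest wins on separate variables stack to a 15.5% lift in clicks per send, which is why small, repeated tests matter.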

Best Practices

  1. Test one variable at a time for clear results
  2. Ensure your test groups are large enough for statistical significance (1,000+ per variant)
  3. Let tests run long enough to account for different time zones and reading habits
  4. Document your tests and results to build a knowledge base over time
  5. Test continuously — what works changes as your audience grows and evolves
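
The significance check in step 2 can be done with a standard two-proportion z-test. This sketch uses only the standard library; the function name and thresholds are assumptions, and most email platforms run an equivalent check for you:

```python
from math import sqrt, erf

def ab_significant(opens_a, n_a, opens_b, n_b, alpha=0.05):
    """Two-proportion z-test: is the difference between two
    open rates statistically significant at level alpha?
    Returns (p_value, significant)."""
    p_a, p_b = opens_a / n_a, opens_b / n_b
    p_pool = (opens_a + opens_b) / (n_a + n_b)   # pooled open rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_value, p_value < alpha
```

For example, 450 opens vs. 400 opens on two 1,000-subscriber groups (45% vs. 40%) is significant at the 5% level, while 420 vs. 400 is not — the smaller gap could easily be chance, which is why step 2 sets a floor on group size.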

How Aldus Handles This

Aldus automatically A/B tests subject lines on every send. The AI generates multiple subject line variants, tests them on a portion of your list, and sends the winner to the rest. No manual setup required — it happens automatically with every issue.

Try Aldus free

AI writes your newsletter. You just approve and send.

Get started →