How to set up an A/B test


So you planned your test, you have a hypothesis, your variants, and you’re ready to go…

In this lesson, we’ll focus on the technical part of setting up the test.

Before we start, there are a few things to take into consideration. We're going to need tools (mainly your ESP), so you need to know what the options and limitations are.


As I mentioned at the beginning of this module, most ESPs are very limited in what they offer as testing options. You need to be aware of this, because there is a chance the test you intend to run simply isn't possible.

To give you a few examples:

  • GetResponse is not limited to A/B testing; you can test more than two variants
  • Mailchimp and most other ESPs, on the other hand, only offer A/B tests
  • Emma only includes testing for subject lines
  • Aweber doesn’t have the option to send a test to a fraction of your list; you are forced to split 50% – 50%

How to set up the A/B test

In this video, I show you the basic steps for configuring your A/B test.

In some cases, you’ll want to select the winner yourself instead of letting the ESP do it for you.

Mailchimp gives you this report at the end of your test; once you’ve decided which variant should win, you simply select it at the bottom.

Manual winner
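Before declaring a manual winner, it's worth checking that the gap between variants is real and not just noise. Here's a minimal sketch in Python of a two-proportion z-test on open (or click) counts. The function name and the example numbers are my own illustration, not something your ESP provides:

```python
from math import sqrt, erf

def z_test_two_proportions(conv_a, n_a, conv_b, n_b):
    """Compare two conversion rates; returns (z score, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)        # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value via the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical example: variant A got 100 opens out of 1,000 sends,
# variant B got 150 opens out of 1,000 sends.
z, p = z_test_two_proportions(100, 1000, 150, 1000)
if p < 0.05:
    print("Difference is statistically significant")
else:
    print("Not significant yet -- consider letting the test run longer")
```

A common rule of thumb is to only pick a winner when the p-value drops below 0.05; otherwise the "winner" may just be random variation.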

Testing content on Mailchimp

There is one more thing we should cover: testing items in the campaign content.

In most ESPs, you’ll find the option to create two separate campaigns to run your test. As you can see in the image below, GetResponse gives you the option to duplicate and edit your “control” or create a new campaign directly. In fact, you have the option to run five variants of the content in your campaign.

Test content

Makes sense, right?

Well, unfortunately Mailchimp complicates things: instead of creating two separate campaigns, you have to use “merge tags” inside the same email.

Like this…

Content merge tags

You are basically including both versions of your testing item, for example your CTA button, in the same design. Mailchimp then splits the variants based on those merge tags.

Sounds complicated? I know.


Aweber split test

Okay, one more… this one for Aweber users.

Let’s just say it: A/B testing is not one of Aweber’s strong points.

Basically, you are running things manually. While you can set up a test and specify the percentage of your list you want to send to each variant, you do NOT have these options:

  • You can’t set a duration for the test
  • The winner is not automatically sent to the rest of the list
  • You don’t set a metric to determine the winner; you’ll have to decide that on your own
  • You’ll have to create a new campaign for the winning variant
  • And you’ll have to send it to the whole list, excluding the groups that already received the test
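To make that manual workflow concrete, here's a short Python sketch of the splitting step: randomly carve a test fraction off the list for the variants, and keep everyone else as a holdout that will later receive the winner. The function name, fractions, and addresses are all hypothetical illustration, not an Aweber feature:

```python
import random

def split_for_test(subscribers, test_fraction=0.2, variants=("A", "B"), seed=42):
    """Randomly assign a fraction of the list to test variants.

    Returns (groups, holdout): `groups` maps each variant name to its
    sample of subscribers; `holdout` is everyone else, who should later
    receive only the winning variant.
    """
    rng = random.Random(seed)          # fixed seed so the split is reproducible
    shuffled = subscribers[:]          # copy so the original list is untouched
    rng.shuffle(shuffled)
    test_size = int(len(shuffled) * test_fraction)
    per_variant = test_size // len(variants)
    groups = {
        name: shuffled[i * per_variant:(i + 1) * per_variant]
        for i, name in enumerate(variants)
    }
    holdout = shuffled[len(variants) * per_variant:]
    return groups, holdout

# Example: test on 20% of a 1,000-subscriber list, split between two variants
subscribers = [f"user{i}@example.com" for i in range(1000)]
groups, holdout = split_for_test(subscribers)
print(len(groups["A"]), len(groups["B"]), len(holdout))  # 100 100 800
```

The key point the sketch captures is the exclusion step: because the test groups are carved out up front, the holdout is exactly the segment that hasn't seen either variant yet.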

But let’s not be too hard on poor Aweber; there are a couple of *hidden* qualities to it. I don’t know if they see it this way, because I’d be using this as a marketing differentiator.

  • Since you don’t set a goal or a metric for the test, you can manually run any kind of test you want (the platform just doesn’t know what you’re trying to achieve)
  • You can send up to 4 variants in a test, so it’s not limited to A/B


The homework is simple:

  1. Head over to your ESP
  2. Explore what options are offered (can you only run A/B tests, or more variants? Which items can you test?)
  3. Now that you know what the possibilities are, you can plan your next test