A/B Testing

Not sure which subject line will lead to the best click-through rate? Trying to decide on the most compelling CTA copy? Need some data to back up a discussion about the best hero image?

A/B testing takes the guesswork out of sending campaigns.

How it works

Once you've decided what you want to test, and how you'll measure the success of your campaign, you'll add your content and establish the parameters of the test.

Winning metric

The winning metric is what we measure to determine the 'winning' variant. You'll select this during the configuration stage. The options available to you will depend on the channel you're using to run your test.

  • Click-through rate: The percentage of recipients who open your message and click at least one link inside.

  • Read rate (WhatsApp only): The percentage of recipients who open and read your message.

  • Open rate (Email only): The percentage of recipients who open your message.

The 'winning' variant will be the variant with the best result for whichever metric you select.
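
To make the selection concrete, here's a minimal sketch of how a winner might be determined from per-variant results. The counts, metric keys, and variant names are all hypothetical; this is not the platform's actual implementation.

```python
# A minimal sketch, not the platform's implementation: choosing a winner
# from hypothetical per-variant results.

def rate(numerator: int, denominator: int) -> float:
    """Return a percentage, guarding against an empty test group."""
    return 100 * numerator / denominator if denominator else 0.0

# Hypothetical test results for two variants.
variants = {
    "A": {"delivered": 500, "opened": 210, "clicked": 42},
    "B": {"delivered": 500, "opened": 260, "clicked": 31},
}

winning_metric = "open_rate"  # or "click_through_rate" / "read_rate"

def score(stats: dict) -> float:
    if winning_metric == "click_through_rate":
        return rate(stats["clicked"], stats["delivered"])
    return rate(stats["opened"], stats["delivered"])  # open rate / read rate

winner = max(variants, key=lambda name: score(variants[name]))
print(f"Winner: {winner} ({score(variants[winner]):.1f}%)")  # -> Winner: B (52.0%)
```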

Test size

You must have at least 100 recipients in a list or audience to send them an A/B tested campaign.

During the configuration stage, you'll choose what percentage of your recipients you want to test your variants on.

Remember that the larger your test size, the more accurate your results will be.

This is especially important if you're comparing more than two variants. Eight variants tested on 10% of a 100-recipient list leaves only one or two recipients per variant, which won't produce statistically significant results!

We recommend that you test your campaign variants on a minimum of 10% of your recipients.
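
If it helps to see the arithmetic, here's a quick back-of-the-envelope check. The function and the numbers are purely illustrative:

```python
# Back-of-the-envelope arithmetic for test group sizes (illustrative only).

def recipients_per_variant(list_size: int, test_pct: float, n_variants: int) -> int:
    """Roughly how many recipients each variant is tested on."""
    return int(list_size * test_pct / 100) // n_variants

# The example above: eight variants on 10% of a 100-recipient list.
print(recipients_per_variant(100, 10, 8))   # -> 1 recipient per variant: far too few
# A bigger list and test size gives each variant a meaningful sample.
print(recipients_per_variant(5000, 20, 2))  # -> 500 recipients per variant
```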

Test duration

By default, the test duration is set to two hours. You can increase this as required.

The longer you test your variants for, the more accurate your A/B test results will be. However, remember that the length of the test window affects when the winning variant is sent out, so factor this in when scheduling your campaign.

For example, if you know your best sending time is 11 AM on Thursdays, you might decide to run a 24-hour A/B test starting at 11 AM on Wednesday, so that the winning variant is sent at the optimal time.
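
One way to sanity-check your timing is to work backwards from the target send time. A small illustrative sketch (the date is hypothetical):

```python
# Working backwards from the target send time (dates are hypothetical).
from datetime import datetime, timedelta

target_send = datetime(2024, 5, 16, 11, 0)  # a Thursday, 11 AM
test_hours = 24                             # chosen test duration

test_start = target_send - timedelta(hours=test_hours)
print(test_start.strftime("%A %I %p"))      # -> Wednesday 11 AM
```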

Winner fallback

Sometimes A/B tests simply aren't conclusive. This can happen for a few reasons:

  • There's not enough data for the test to be conclusive. For example, if you selected the 'Click-through rate' metric, but nobody clicked through on any of your variants. This can happen if the test duration was too short, or the testing sample size was too small.

  • Negligible differences (less than 1 percentage point) in the results. For example, if you selected the 'Click-through rate' metric, and your first variant achieved an 8.1% click-through rate while your second variant achieved an 8.7% click-through rate.

During the configuration stage, you'll choose what you want to happen if your test is inconclusive:

  • Manually select the fallback winner: This option allows you to choose which variant you'd prefer to send.

  • Send all test variations equally to remainder recipients: This option distributes the test variants equally among the remaining recipients. (Both fallback paths are illustrated in the sketch below.)
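
To show how the two inconclusive conditions and the fallback options fit together, here's a minimal sketch. The threshold check mirrors the description above, but the names and structure are hypothetical:

```python
# A minimal sketch of the inconclusive-test logic described above
# (hypothetical names; not the platform's actual implementation).

def is_inconclusive(results: dict) -> bool:
    """No data at all, or less than 1 percentage point between
    the best and second-best variant."""
    scores = sorted(results.values(), reverse=True)
    if scores[0] == 0:                  # e.g. nobody clicked any variant
        return True
    return (scores[0] - scores[1]) < 1.0

results = {"A": 8.1, "B": 8.7}          # the click-through rates from the example
if is_inconclusive(results):
    fallback = "manual"                 # or "split_equally", as configured
    if fallback == "manual":
        print("Inconclusive: sending the manually chosen fallback variant")
    else:
        print("Inconclusive: splitting variants equally among remaining recipients")
else:
    print("Winner:", max(results, key=results.get))
```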

How to A/B test a campaign

Follow these steps to set up and send an A/B test campaign.

What you'll need

  • An installed Email or WhatsApp channel.

  • At least two (and up to eight) message templates with different content.

  • A list or audience of opted-in contacts.

Step one: Create your A/B test campaign and add content

  1. Go to Engage customers > Campaigns.

  2. Click Create new campaign.

  3. Name your campaign.

  4. Add any tags to your campaign.

  5. Under 'Channels', select the channel, then the specific channel instance you want to send your campaign from.

  6. Configure the sender.

  7. Under 'Recipients', select the list or audience that you want to send your campaign to.

  8. Under 'Content', select your first message template from the drop-down.

  9. Under 'Language', select the language that you want to send that template in.

  10. If the message template contains variables, select the attributes that will be used to populate them. For each variable, enter a default value to use if no attribute is found (see the sketch after these steps).

  11. Below the campaign preview, click Add variation for A/B testing.

  12. Select your second template from the drop-down.

  13. Continue to add template variations to your campaign. When you're done, click Next.
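
To picture what step 10 is doing, here's a minimal sketch of variable substitution with per-variable defaults. The {{variable}} syntax and the render function are hypothetical, purely for illustration:

```python
# A minimal sketch of step 10: filling template variables from contact
# attributes, with a default when no attribute is found. The {{...}}
# syntax here is hypothetical.

def render(template: str, attributes: dict, defaults: dict) -> str:
    out = template
    for var, default in defaults.items():
        value = attributes.get(var) or default  # default when attribute is missing
        out = out.replace("{{" + var + "}}", value)
    return out

template = "Hi {{first_name}}, your {{plan}} plan renews soon."
print(render(template, {"plan": "Pro"}, {"first_name": "there", "plan": "current"}))
# -> Hi there, your Pro plan renews soon.
```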

Step two: Configure your A/B test settings

Now that you've created your A/B test campaign and added the templates that you want to test, it's time to configure the rest of your settings.

  1. Under 'Winning metric', select either Click-through rate, Read rate (WhatsApp only), or Open rate (Email only).

  2. Under 'Test size', drag the bar to set the split between the test versions and the 'winner', determining the percentage of recipients who will receive a test version of the campaign.

  3. Under 'Test duration', enter the number of hours you want to run your test for before the winning version is sent.

  4. Under 'Winner fallback', either select Manually select the fallback winner, then choose the variation you want to send in the event of an inconclusive test result, or select Send all test variations equally to remainder recipients.

  5. Click Preview and test.

Step three: Preview and test your A/B test campaign

The final step before you can start testing your campaign on your audience is to preview your campaign content, and send test variations to yourself or your team.

  1. From the 'Preview' screen, go to the 'Preview and test' panel on the right-hand side.

  2. From the Variation drop-down, select the template you want to preview and test.

  3. From the Simulated contact drop-down, select the contact you want to send a test campaign message to. If you're sending an email, pick the contact's email address. If you're sending a WhatsApp message, pick the contact's phone number.

  4. Click Send a test message.

  5. The recipient will receive a test message. Check that all the templates look correct, that any variables have been replaced, and that any media and buttons display correctly, then give everything a final review for spelling and grammar.

Step four: Send your A/B test campaign

Ready to send your A/B test campaign? Click Send campaign in the top right-hand corner of your screen, then choose either:

  • Send now to immediately send your test campaign variations to a percentage of your recipients.

  • Schedule campaign to wait until a set date and time to send your test campaign variations to a percentage of your recipients.

Sending phases

  • Once your A/B test campaign has started to send, the status will change to In progress.

  • Once the A/B test is complete, the status will change to Waiting while the results are analyzed.

You've just launched your A/B test! Time to sit back, relax, and watch the results roll in.

A/B test results

Once your A/B test has been completed, you can analyze the test results in the Campaign performance report.

The winning variant will have a 'Winner' flag displayed next to its name.

  1. Go to Engage customers > Campaigns.

  2. Find the A/B tested campaign that you want to analyze.

  3. Click View report.

  4. In the overview section, switch between variations to see how each variation performed during the testing phase.

  5. To see information about the test, click the A/B Test tab. Here, you'll be able to see:

    • A/B Test Details: Variations, Test size, Winning metric, Test duration (with exact timestamps), and result status.

    • A/B Test Results: Variation type, Subject (if testing emails), Template language, and Variables.

FAQs

What channels can I A/B test on?

WhatsApp and Email.

Can I run an A/B test on multiple channels?

No, you can only run an A/B test on a single channel at a time. You can't combine multiple channels within a single A/B test.

What is the minimum audience/list size for running an A/B test?

Your audience or list size must include at least 100 contacts. The minimum test size is 10%.

How are variants split between test recipients?

Test recipients are chosen and split at random.
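
For intuition, a random, even split might look something like the sketch below. This is illustrative only; it's one common way to do it, not necessarily how the platform assigns recipients:

```python
# Illustrative only: one way to split test recipients randomly and evenly.
import random

def split_recipients(recipients: list, n_variants: int) -> list:
    shuffled = recipients[:]       # copy so the original list isn't mutated
    random.shuffle(shuffled)       # random assignment
    return [shuffled[i::n_variants] for i in range(n_variants)]

test_group = [f"contact_{i}" for i in range(10)]
groups = split_recipients(test_group, 2)
print([len(g) for g in groups])    # -> [5, 5]
```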

What happens when an A/B test is inconclusive?

If an A/B test is inconclusive, the system will either send the fallback variant you selected during the configuration stage, or distribute all test variations equally among the remaining recipients, depending on the winner fallback option you chose.

How many message template variations can I test in a single campaign?

You can test a maximum of eight message template variations in a single A/B test.
