Unlock Your Email Data with A/B Testing

A/B testing is vital to any email marketing strategy. It may seem like a lot of work, but a good A/B test can improve your email data immensely, and it doesn’t have to be a labor-intensive project.

A/B testing, also referred to as split testing, means comparing two versions of something to see which one performs better. In digital marketing, you’ll mainly use this tactic for email.

So why should you A/B test, and how do you do it?

Why Am I Doing This?

Using data to make decisions will continually improve your bottom-line results.

Even though you may think your beautiful graphic or clever slogan is a winner, your users’ actual reaction may tell a different story. A/B testing will improve your sales and conversions while also saving you time and effort.

How to Gather Email Data with an A/B Test

If you’ve never done an A/B test, here’s a quick rundown.

Basically, it’s as simple as changing one variable and sending two different versions of the same email to see which performs better. This works best if you have a sample size of at least 1,000 people, but it can be done with fewer — just know that the smaller the list, the noisier the results.

Once you’ve finished producing and editing your email, split your send list in half. Name one version “Email A” and the other “Email B.” Now, change one (and only one) variable for the B version.
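If you’re splitting the list yourself, a random split is safer than simply cutting it down the middle, since any ordering (alphabetical, sign-up date) can bias the halves. Here’s a minimal Python sketch of a random 50/50 split; the `split_send_list` helper and the sample addresses are illustrative, not tied to any particular email tool:

```python
import random

def split_send_list(recipients, seed=42):
    """Randomly split a send list into two halves for an A/B test."""
    shuffled = list(recipients)            # copy so the original list is untouched
    random.Random(seed).shuffle(shuffled)  # fixed seed keeps the split reproducible
    midpoint = len(shuffled) // 2
    return shuffled[:midpoint], shuffled[midpoint:]

addresses = ["ann@example.com", "ben@example.com", "cam@example.com", "dee@example.com"]
email_a, email_b = split_send_list(addresses)
print("Email A:", email_a)
print("Email B:", email_b)
```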

Here are some potential variables to choose from:

  • Subject line
  • Call-to-action (CTA)
  • “From” line (name of email sender)
  • Time of send (day of week or time of day)

Many email marketing programs, like MailChimp, have built-in A/B testing features. If yours does, use that instead of manually splitting your list.

How Do I Test Those Variables?

These variables are important pieces of your email that can potentially make or break your open and click rates. We’ll go over some A/B testing measures as well as some general best practices.

Subject Line

Testing the subject line is pretty self-explanatory: write two versions and see which one earns more opens.

Maybe one version has a personalization field (an automatically generated first name pulled from your send list), while the other is more generic. We suggest using personalization because, statistically speaking, it will yield better open rates.

But you can even test different types of personalization. For example:

  • (Friend), did you hear about our sale?
  • Did you hear about our sale, (Friend)?
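In practice, your email platform’s merge tags would fill in the name for you; the sketch below just shows the idea, with a hypothetical `subject_for` helper and a generic fallback for recipients who have no name on file:

```python
def subject_for(first_name, variant):
    """Build one of the two personalized subject-line variants."""
    if not first_name:                     # no name on file: fall back to a generic subject
        return "Did you hear about our sale?"
    if variant == "A":
        return f"{first_name}, did you hear about our sale?"
    return f"Did you hear about our sale, {first_name}?"

print(subject_for("Pat", "A"))  # Pat, did you hear about our sale?
print(subject_for("Pat", "B"))  # Did you hear about our sale, Pat?
print(subject_for(None, "A"))   # Did you hear about our sale?
```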

Call-to-Action (CTA)

The call-to-action (CTA) button lends itself to many different tests. You can test the size, color, shape, and text of your CTA while keeping these CTA best practices in mind:

  • Does it look clickable?
  • Is there a clear value to the click?
  • Is it the logical next step on your page?

If you’d like to go more in depth about how to create clickable CTAs, read this post.

“From” Line

You can go with quite a few different options for your “from” line.

Personalization is good here, too. Statistics show that people are much more likely to open an email from a person than from a company. If you had a latex store called Vandelay Industries, you wouldn’t want to use the company name in the “from” field. Instead, you’d want to use “Art Vandelay” or “Art at Vandelay Industries.”

That’s a great A/B test right there — trying both of those “from” lines.

Time of Send

The time of day and day of the week that you send are both crucial to your email strategy. Early in the morning (6 a.m.-ish) tends to work the best overall, but you could easily get better results at night, depending on your customers.

Try different times, then different days. If your customers work during the day and you’re sending to their work email, an early-morning send is great. Personal email addresses, on the other hand, are checked at all hours, so the best send time will depend more on your product and audience.

Why Test Only One Variable?

Ideally, you could test everything at once and have the best results at your fingertips.

Unfortunately, it doesn’t work that way. If you change more than one variable, you won’t know why your open or click rates changed. Was it the different-colored CTA? Or the personalized subject line?

How Can I Use the Results of My Email Testing?

Compile the results of your tests into a spreadsheet. Measure these stats for each test you run:

  • Open rate
  • Bounce rate
  • Unsubscribe rate
  • Click rate
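If you’d rather compute the rates yourself, the sketch below shows one common convention: bounce rate is measured against everything sent, while opens, clicks, and unsubscribes are measured against delivered mail (sent minus bounced). Platforms differ on this, so treat the denominators as an assumption; the counts here are made up for illustration:

```python
def variant_rates(sent, bounced, opened, clicked, unsubscribed):
    """Compute the four headline metrics for one variant of a test."""
    delivered = sent - bounced  # opens/clicks/unsubscribes judged against delivered mail
    return {
        "open rate":        opened / delivered,
        "bounce rate":      bounced / sent,
        "unsubscribe rate": unsubscribed / delivered,
        "click rate":       clicked / delivered,
    }

# Illustrative numbers only
a = variant_rates(sent=500, bounced=10, opened=120, clicked=30, unsubscribed=2)
b = variant_rates(sent=500, bounced=8, opened=150, clicked=45, unsubscribed=3)
for metric in a:
    print(f"{metric}: A = {a[metric]:.1%}, B = {b[metric]:.1%}")
```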

The more testing, the better. But you can get a pretty good idea of what your customers prefer from even one test. If you have time to do several tests on a single variable, you’ll be even further ahead.
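One test can mislead you, though, if the difference between A and B is small. If you want a quick sanity check that a gap in open rates isn’t just noise, a two-proportion z-test is one standard option; this sketch uses only Python’s standard library, and the counts are invented for illustration:

```python
from statistics import NormalDist

def two_proportion_z(opens_a, n_a, opens_b, n_b):
    """Two-sided two-proportion z-test on open rates."""
    p_a, p_b = opens_a / n_a, opens_b / n_b
    pooled = (opens_a + opens_b) / (n_a + n_b)    # pooled rate under the null hypothesis
    se = (pooled * (1 - pooled) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided p-value
    return z, p_value

# Illustrative: 120 opens out of 490 delivered for A vs. 150 out of 492 for B
z, p = two_proportion_z(120, 490, 150, 492)
print(f"z = {z:.2f}, p = {p:.3f}")  # p below ~0.05 suggests the gap isn't just noise
```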

Nail down the best option for each variable, and use that going forward in your emails. If you find that your customers are more likely to open an email at 6 a.m. on a Tuesday than at 7 p.m. on a Thursday, send more emails at 6 a.m. on Tuesdays.

A/B Testing Disclaimer

Ultimately, human beings aren’t necessarily rational in their decision-making. So, no matter how many statistics you measure, there will always be a factor that you can’t account for.

Maybe some of your recipients open all their emails to zero out their inbox. Or maybe they didn’t even read the subject line. We can never know exactly what drives people to act in a certain way.

However, these A/B testing measures are scientific enough to help us discern what is and isn’t working.

Optimize Your Email Marketing

Want to go beyond A/B testing and optimize your email marketing even more? Click below to check out our Complete Guide to Email Marketing.