A/B Test Results: Which Email Sender Name Is Better?

I have a hard time imagining how I would do certain things in life without technology. It feels very millennial of me to admit that, but oh well, I’m not perfect.

I’m miserable at reading maps, and I’m positive I’d never get from Point A to Point B without my trusty app. As I’m training for my first half marathon, I’m super grateful for my smartwatch.

I can track my distance and share it with others, so I feel safer running alone. I can compete with friends to motivate myself on those days when the couch is calling. And I can visualize my progress through mountains of data.

Running the ROI

Everyone at 9 Clouds digs into data in some way.

I geek out over a 30-second improvement in my running pace. Others follow the weather and its effect on their gardens. And one of us uses our powers of analysis to be really awesome at Dungeons and Dragons. (Don’t judge.)

This last month, while I’ve been slogging through miles and miles in the South Dakota heat, our team has been poring over email analysis.

This stems from one of our core values as a company: We believe in transparent services based on data. Data is in the 9 Clouds manifesto. We just can’t stop thinking about return on investment (ROI), whether it’s for a client’s email or for our own fitness endeavors.

Who Should Send Your Email?

This summer, we set out to answer one question: which email sender name performs best? Should your marketing message come from someone’s first and last name (“Dave Dealer”) or just their first name plus your company name (“Dave at Your Dealership”)?

The two most important rules in A/B testing are to change only one variable at a time and to send to enough people that the results are statistically significant (which is why it took us all summer to figure this out).
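To make “statistically significant” concrete, here is a minimal sketch of a two-proportion z-test for an open-rate difference. The send and open counts below are hypothetical; the post does not publish raw numbers, so treat this as an illustration of the method, not our actual data.

```python
from math import sqrt, erf

def two_proportion_z(opens_a, sent_a, opens_b, sent_b):
    """Two-proportion z-test; returns (z statistic, two-sided p-value)."""
    p_a, p_b = opens_a / sent_a, opens_b / sent_b
    # Pool the two samples to estimate a common open rate under the
    # null hypothesis that the sender name makes no difference.
    p_pool = (opens_a + opens_b) / (sent_a + sent_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf; two-sided p-value.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: 228 opens of 1,000 sends vs. 260 of 1,000.
z, p = two_proportion_z(228, 1000, 260, 1000)
```

If the resulting p-value is below your chosen threshold (0.05 is conventional), the gap is unlikely to be chance alone; otherwise, keep the test running and collect more sends.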

After sending emails to thousands of recipients, we had our answer! We found that Version B (“Dave at Your Dealership”) had a higher open rate in six out of 10 tests and a higher click-through rate (CTR) in eight out of 10 tests.

Version A (“Dave Dealer”)

  • Avg. open rate: 22.8%
  • Avg. CTR: 11.7%

Version B (“Dave at Your Dealership”)

  • Avg. open rate: 23.13%
  • Avg. CTR: 14.3%
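As a quick sanity check on the arithmetic, the gap between the two versions can be computed directly from the average rates listed above (note the difference is in percentage points, not relative percent):

```python
# Average rates reported above, expressed as fractions.
version_a = {"open_rate": 0.228, "ctr": 0.117}    # "Dave Dealer"
version_b = {"open_rate": 0.2313, "ctr": 0.143}   # "Dave at Your Dealership"

# Gap in percentage points between the two versions.
ctr_gain_pp = round((version_b["ctr"] - version_a["ctr"]) * 100, 1)        # 2.6
open_gain_pp = round((version_b["open_rate"] - version_a["open_rate"]) * 100, 2)  # 0.33
```

So Version B's edge is modest on opens but meaningful on clicks, which is what the eight-out-of-10 CTR wins reflect.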

“Dave at Your Dealership” reigned supreme in the Summer of 2018 Email A/B Test Fest, which prompted an agency-wide adjustment to our sends.

This might sound like a ridiculous amount of time spent on such a small nuance, but for an improvement of up to 2.6 percentage points in an email’s click-through rate, we think it’s worth all that time and effort.

See the results of our most recent email A/B tests: “We Analyzed Our Best and Worst Automotive Emails, and Here’s What We Found.”

Solve Problems with Data

No matter what data we’re working with, we’re a team of people who love to solve a problem. I love that about the folks I work with (it means they won’t make fun of me for tracking my runs 🏃🏻‍♀️ in such meticulous detail).

If you want a team that will handle your marketing with as much thought and dedication, get in touch. We’d love to have another data set to compare.