We Analyzed Our Best and Worst Automotive Emails, and Here’s What We Found
All right, nerds, this one’s for you. We’re going deep into the weeds with some hardcore email data talk.
If you’re a cool cat who’s been following our blog for a while, you’ve likely heard us talk about A/B testing, identifying benchmarks, and email strategy for automotive dealers — among a slew of other digital marketing-related topics relevant to the auto world.
Identifying marketing strategies and benchmarks doesn’t happen overnight. In fact, we recently dedicated three months to performing an in-depth analysis of our auto clients’ marketing emails and also created our annual automotive benchmarks report.
We analyzed the best- and worst-performing emails from 10 different 9 Clouds automotive clients sent over a two-year period. This created a launching point to take our email testing and improvements even further than before.
Here’s what we learned.
We didn’t just do this email analysis for fun (although it was pretty fun).
The goal of this project was to identify trends among the best and worst emails, create hypotheses based on those, and A/B test those hypotheses in order to improve our year-to-year benchmarks across all auto clients.
While each client and its target audience is nuanced, we wanted to create a high-level picture of what generally is and isn’t working for automotive email. Once we had an idea of those factors, we could then refine our findings as needed.
Before we could begin, we needed to outline the project basics:
- 60 Emails: We selected the three best-performing and the three worst-performing emails for each client.
- 10 Auto Clients: We picked 10 different 9 Clouds automotive clients to use for this email analysis.
- 6 Key Metrics: We considered six different variables when examining the emails.
- 2 Years of Data: We gathered information within a two-year time frame.
We selected the “best” and “worst” emails based on open rate and click-through rate (CTR), which were simple to determine thanks to HubSpot’s email analysis tools. We made sure to exclude automated emails, such as workflow emails, form follow-up emails, and internal lead notification emails, since those tend to have drastically different results from regular “one-off” emails.
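For the curious, that selection step can be sketched in a few lines of pandas. Everything here is hypothetical — the column names, clients, and rates are illustrative, not HubSpot’s actual export schema:

```python
# Hypothetical sketch of picking best/worst emails per client.
# Columns and values are illustrative, not HubSpot's export format.
import pandas as pd

emails = pd.DataFrame({
    "client":     ["A", "A", "A", "A", "B", "B", "B", "B"],
    "email_type": ["one-off"] * 7 + ["workflow"],
    "open_rate":  [0.31, 0.18, 0.25, 0.12, 0.40, 0.22, 0.09, 0.55],
})

# Exclude automated sends (workflow, follow-up, lead-notification emails).
one_offs = emails[emails["email_type"] == "one-off"]

# Top three and bottom three performers per client by open rate.
best = one_offs.sort_values("open_rate", ascending=False).groupby("client").head(3)
worst = one_offs.sort_values("open_rate").groupby("client").head(3)
```

In practice, you’d run the same ranking on CTR as well and compare the two lists.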
Once the process was outlined, we compared the following six variables in each “best” or “worst” email:
- Subject Line
- Preview Text
- From Name
- Open Rate
- Click-Through Rate (CTR)
- Top-Clicked Link
In the weeds of this work, there were some sweet surprises and helpful takeaways. More than anything, we identified several opportunities for A/B testing.
The most glaringly evident finding from this study is that 9 Clouds client emails are steadily improving over time.
Of the 30 emails in the “best” group, 83% were sent in the second half of the two-year period. That’s an encouraging starting point! We also found that our average email open rates are improving, though our average email CTRs decreased slightly over the two-year period.
Here’s just a glimpse of some of the other key findings from our automotive email study.
1. Subject Line Options Are Endless
In this 9 Clouds study, many of the subject lines included personalization tokens, emojis, and a specific dollar amount.
We took an in-depth look at which variables most frequently appeared in which group (“best” vs. “worst”), made hypotheses based on that data, and implemented them into our A/B testing strategy.
2. Sender Name Impacts Open Rate
This analysis hinted that emails sent from “Dave at Car Dealership” earned a higher open rate than emails from just “Dave Thompson” or the ever-generic “Car Dealership.”
We’ve put this hypothesis to the test already, and our suspicions were confirmed. Run on over to this blog post to read all about it, and try out this practice for yourself!
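If you want to try a sender-name test on your own list, a two-proportion z-test is one simple way to judge whether an open-rate difference is real or just noise. Here’s a minimal sketch; the send and open counts are made up for illustration:

```python
# Minimal two-proportion z-test for an open-rate A/B test.
# All numbers below are illustrative; plug in your own counts.
from math import sqrt, erf

def z_test_open_rates(opens_a, sends_a, opens_b, sends_b):
    """Return (z, two-sided p-value) for the difference in open rates."""
    p_a, p_b = opens_a / sends_a, opens_b / sends_b
    p_pool = (opens_a + opens_b) / (sends_a + sends_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided
    return z, p_value

# "Dave at Car Dealership" vs. plain "Car Dealership" (made-up numbers)
z, p = z_test_open_rates(240, 1000, 190, 1000)
```

A p-value under 0.05 is the usual (if rough) threshold for calling the difference significant rather than chance.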
3. More Links Result in a Higher CTR (Maybe)
Not counting email header logos or text links in disclaimers, emails in the “best” group had more clickable links in the email on average than those in the “worst” group.
This is interesting: the paradox of choice argues that recipients respond best to emails with fewer options and that too many calls-to-action (CTAs) kill CTR, yet these findings suggest that people like more choices in their auto emails.
What’s that sound? Oh, it’s the sound of an A/B testing opportunity!
4. Customers Click on CTA Buttons More than Other Link Types
What do people most often click: CTA buttons, text links, or images?
We found that CTA buttons are overwhelmingly the most-clicked type of link: they were the top-clicked link in 59% of the 60 emails we studied. Text links were the top-clicked link in 23% of the emails, while images were the top-clicked link in 16%.
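Tallying a stat like this from your own sends is straightforward: label each email with the link type that won the most clicks, then count. A tiny sketch — the labels and counts here are made up, not the study’s raw data:

```python
# Illustrative tally of the top-clicked link type per email.
# The winners list is hypothetical, not the study's raw data.
from collections import Counter

# One label per analyzed email: which link type got the most clicks.
winners = ["cta_button"] * 7 + ["text_link"] * 2 + ["image"] * 1

counts = Counter(winners)
shares = {k: round(100 * v / len(winners)) for k, v in counts.items()}
# e.g. {"cta_button": 70, "text_link": 20, "image": 10}
```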
Considerations and Mitigations
This project wasn’t foolproof. But remember, the purpose of it was to paint a picture of overall trends.
Here are a few considerations and mitigations we’ve kept in mind throughout this auto email analysis.
1. Recipients Aren’t Taken into Account
Sending the right message to the right person is perhaps the most important part of inbound marketing. The person who receives an email plays a major part in the email’s performance — possibly more than any other factor.
However, for this analysis, we did not include the recipient list as a contributing factor to the metrics (simply because there was no efficient way to do so).
2. All Clients Are Different
All auto dealers are not the same, so people won’t respond to their emails in the same way. The types of vehicles that dealers sell, their buyer personas, their local economy — these all vary, so automotive marketing emails can’t all be done in the exact same way.
Unbounce reminds us not to assume that “wins” apply across all customer segments. That’s something 9 Clouds always keeps in mind as we A/B test, solidify, and refine our best practices.
3. High Open and Click-Through Rates Are Not the Ultimate Goal
While they are certainly indicators of success, high open rates and CTRs are not the only goal of automotive email marketing.
In each email you send, can you identify a clear, quantifiable action for the recipient to take that will contribute to your higher goal of a vehicle detail page (VDP) view or a vehicle sale?
If you haven’t identified a clear goal in sending an email, don’t send it. Unless it has a helpful purpose, it’s not valuable to your customer or worth your time.
What You Can Learn from This Study
Are we far enough into the weeds? I think so.
Listen, take our findings from this project with a grain of salt for your own email efforts. The whole point of this was for us to learn.
What you can take from this is how important, enjoyable, and manageable it is to analyze your own content, identify areas of testing and improvement, and create benchmarks and standards based on your findings.
Why? So you can become better, of course — just as we strive to do for ourselves and our automotive clients.
If you stuck with us on this excursion but found yourself a little lost in the scratchy weeds of email analysis, don’t be discouraged. You’re in the right place to keep learning! Learn more about email marketing with us, or request a free digital audit.