February 09, 2021   |   Vol. 20   |   Issue 2
Jana Darling, MGI Managing Director, Account Services

Better Results Through Email Testing

There is no such thing as too much testing when it comes to email marketing. Opportunities for testing abound, but one of the easiest ways to check and optimize the performance of your email campaigns is to conduct A/B tests. Not only do they provide quick, actionable insight, but they are also vital to running a successful email program.

A/B testing consists of changing one variable in an email, sending each version to a limited audience segment, measuring the response, and then sending the best-performing version to the wider audience.
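The mechanics of that workflow can be sketched in a few lines of Python. This is a minimal illustration, not any particular email platform's API; the segment sizes, send counts, and open counts are hypothetical.

```python
import random

def ab_split(recipients, test_fraction=0.2, seed=42):
    """Randomly split a recipient list into two equal test segments
    (A and B) plus a holdout that later receives the winning version."""
    rng = random.Random(seed)
    shuffled = recipients[:]
    rng.shuffle(shuffled)
    n_test = int(len(shuffled) * test_fraction)
    half = n_test // 2
    segment_a = shuffled[:half]
    segment_b = shuffled[half:2 * half]
    holdout = shuffled[2 * half:]
    return segment_a, segment_b, holdout

def open_rate(opens, sends):
    """Open rate as a percentage of emails sent."""
    return 100.0 * opens / sends

# Hypothetical test results: version B wins, so the holdout gets B.
rate_a = open_rate(opens=1475, sends=10000)   # 14.75%
rate_b = open_rate(opens=1618, sends=10000)   # 16.18%
winner = "B" if rate_b > rate_a else "A"
```

The random shuffle matters: splitting alphabetically or by signup date can bias the segments and invalidate the comparison.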

The first step in building your testing strategy is to decide which metric you want to improve.

Let’s say you want to improve open rates. A/B tests could include:

  1. Day of week or time of day deployment. In a recent test for a client, an identical email was deployed at two different times of the day:
    1. 10:00 AM: 14.75%
    2. 7:00 PM: 16.18%
    3. The clear winner was the 7:00 PM segment, with an open rate nearly 1.5 percentage points higher.
  2. “From” line. A test using an individual’s name versus the association’s name in the "from" line resulted in an open rate more than 3 percentage points higher. Notably, it helps if the individual’s name is recognizable to your members.
    1. Association name: 13.78%
    2. Association program director’s name: 17.07%
  3. Subject line or preheader text. Creating a sense of urgency is a proven marketing tactic, and with good reason—it works! In this example, using urgent language in the subject line increased the open rate by just over 1 percentage point.
    1. Last Chance for Exam Before Tax Law Changes!: 10.96%
    2. Register Now for the July Exam: 9.94%

If you have strong open rates but want to improve click rates, try these A/B tests:

  1. Design, layout, or message formats. An email using only the association’s logo and text was tested against a more complex design using multiple graphic elements. Both contained the same copy and call to action.
    1. Simple format email: 7.53%
    2. Complex design email: 11.30%
    3. The more complex design produced a click rate nearly 3.8 percentage points higher.
  2. Images, including placement in headers. An email using the image of a person in the header was tested against a text-only header.
    1. Header graphic that included a photo of a person: 1.76%
    2. Header graphic that included only text: 1.07%
    3. The header with the image of a person outperformed the other version by approximately 0.7 percentage points.
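A 0.7-point gap can look small, but when the baseline click rate is low, the relative improvement is substantial. A quick way to express both views of a result (using the header-test numbers above) is a sketch like this; the `lift` helper is illustrative, not a standard library function.

```python
def lift(control_rate, variant_rate):
    """Return the absolute difference in percentage points and the
    relative lift (%) of the variant over the control."""
    absolute = variant_rate - control_rate
    relative = 100.0 * absolute / control_rate
    return absolute, relative

# Header test: text-only (1.07%) vs. photo of a person (1.76%).
abs_pts, rel_pct = lift(1.07, 1.76)
# About 0.69 points absolute, but roughly a 64% relative improvement.
```

Reporting relative lift alongside the raw difference keeps low-rate metrics like clicks from being dismissed as noise.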

Whatever you choose to test, it’s critical that you follow these best practices:

  • Test only one element at a time.
  • Use a sample size large enough to yield statistically significant results.
  • Allow ample time to determine the A/B test winner.
  • Use the test results to make informed decisions.
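One way to check whether a winner is statistically meaningful, rather than random variation, is a two-proportion z-test, which needs only the send and open counts for each segment. The sketch below uses Python's standard library; the 10,000-per-segment send counts plugged in for the time-of-day example are hypothetical.

```python
from math import sqrt, erf

def two_proportion_z(opens_a, n_a, opens_b, n_b):
    """Two-proportion z-test for the difference between two open rates.
    Returns the z statistic and the two-sided p-value."""
    p_a, p_b = opens_a / n_a, opens_b / n_b
    pooled = (opens_a + opens_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical send counts for the 10:00 AM vs. 7:00 PM test above.
z, p = two_proportion_z(1475, 10000, 1618, 10000)
significant = p < 0.05
```

With segments this large, a 1.4-point difference is comfortably significant; with only a few hundred recipients per segment, the same gap could easily be chance, which is why the sample-size and timing best practices above matter.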

Testing should be an ongoing and integral part of any email campaign. Even if you think your email engagement levels are strong, you won’t know whether they could be better unless you continually test different options. You may find that what works for some emails and audiences does not work for others.

A/B testing is an easy and cost-effective tool to ensure your organization’s emails are performing at optimum levels. Try any or all of the recommendations provided above; you truly have nothing to lose and everything to gain.

Need help developing your email testing or overall marketing strategies? Contact Jana Darling, Managing Director, Account Services, at jdarling@marketinggeneral.com.

Request a Membership Consultation