ABA Banking Journal
Retail and Marketing

5 Crucial Email Testing Steps

January 3, 2018
Reading Time: 6 mins read

By Meg Goodman

You’ve developed your email creative, agonized over the copy, and had your team run quality control tests to make sure the email is rendering correctly. All that’s left is scheduling and hitting send, right?

Wrong.

It’s time to set up an A/B test.

Modern marketing is synonymous with data-driven marketing, and the best method for getting quick, actionable data to drive more impactful email is A/B testing.

In a survey conducted by Econsultancy and RedEye, 74% of respondents who used a structured approach to conversion testing reported improved sales.

A/B testing email allows you to identify which combination of creative, tech and timing lifts response. With email marketing doing much of the heavy lifting for win-back and cross-selling initiatives, it’s crucial to understand which mixture of elements makes up the secret sauce, so you can make more informed decisions.

How to set up a statistically sound testing plan in 5 steps.

A/B testing requires less data to find statistical significance than other testing methodologies and it’s often supported by email marketing automation and sender platforms. However, not properly setting up an A/B test can cost you by leading you to make decisions based on faulty data.

One way to reduce risk is by starting your testing plan with an A/A test to ensure your A/B testing tool is calling the right winner with confidence.

Step 1: Check your tools with A/A testing.

A/A tests use A/B testing tools to pit two identical versions of an email against one another. Why would you want to run a test where email A and email B are identical?

In most cases, A/A tests are recommended to double-check the effectiveness and accuracy of your A/B testing software. By sending two identical emails using an A/B testing tool, you’re able to judge whether your testing tool is accounting for natural variances in behavior.

For example, if you ran an email test and email A had a 3% higher conversion rate than email B, your A/B testing tool might call email A the winner. However, is a 3% difference statistically sound? That depends on whether the A/B testing tool is requiring a confidence score of 95% or greater before calling a winner.

Definition: A confidence score, or statistical significance, is a mathematical measure of how reliable a result is. A 95% confidence score is widely considered a best practice, leaving only a one-in-20 chance that the difference you see is random chance being misattributed as a real effect.

When running an A/A test, your results should come back as statistically inconclusive—and your email sender should not call a winner—if the program is set to require a 95% confidence score.

If your A/A test calls a winner, you most likely have one of two situations:

  1. Your testing software is using statistical inference without requiring a sample size large enough to reach statistical significance. If this is the case, you should select new software, configure the software to require a 95% significance score before calling a winner, or run the test long enough to reach the sample size needed to confidently call a winner.
  2. Your A/A test fell into the one-in-20 chance that a false positive occurred. Even a 95% confidence score can leave a very small open window that a false winner was called. Run another A/A test to confirm that the first situation mentioned is not occurring.
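The false-positive behavior described above can be sketched with a quick simulation. This Python sketch is illustrative only: it assumes the testing tool uses a two-proportion z-test (a common approach, though the article doesn't name one), and the 3% conversion rate, list size and trial count are made-up numbers. It runs repeated A/A tests on two identical "versions" and counts how often a spurious winner is called:

```python
import random

def z_test_winner(conv_a, n_a, conv_b, n_b, z_crit=1.96):
    """Two-proportion z-test: return True only if the difference in
    conversion rates is significant at roughly 95% confidence."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    return se > 0 and abs(p_a - p_b) / se > z_crit

random.seed(42)
n, true_rate, trials = 2000, 0.03, 500
false_positives = 0
for _ in range(trials):
    # Both "versions" are identical: the same underlying 3% conversion rate.
    conv_a = sum(random.random() < true_rate for _ in range(n))
    conv_b = sum(random.random() < true_rate for _ in range(n))
    if z_test_winner(conv_a, n, conv_b, n):
        false_positives += 1

print(f"False winners called in {trials} A/A tests: {false_positives}")
# Expect roughly 5% of runs to call a spurious winner.
```

Run enough A/A tests and about one in 20 will call a "winner" at the 95% level, which is exactly the one-in-20 window the confidence score leaves open.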

Step 2: Create a hypothesis for A/B tests.

Strengthen the performance of your email campaign by identifying which elements help or hinder conversions and by testing a hypothesis.

When you’re brainstorming hypotheses, remember to only test variables that you believe will move the dial and increase performance. A/B testing for the sake of testing—and without considering your business goals—can lead to inefficient testing and lost time.

Will switching the color of the button from red to green really increase response? Or, is it more likely that a new call to action or messaging strategy will have a greater impact?

You won’t know the answer until you test.

TIP: Only test one variable at a time so that you’re able to better identify what caused a lift in response.

Possible variables to test include:

  • Subject line
  • Subject line character length
  • Special characters or emojis in the subject line
  • Sender name
  • Day of week sent
  • Time of day sent
  • Links vs. buttons
  • Image-based CTAs vs. HTML CTAs
  • Social sharing icons
  • Preheader text
  • Personalization
  • Header image height
  • Using lists, bullets and/or numbers
  • P.S. note
  • CTA placement, design or color
  • Short copy vs. long copy
  • Promotional copy vs. straightforward copy
  • Headlines or subheads
  • Design
  • Images vs. plain text
  • Offer
  • Audience segment

Step 3: Choose the distribution size.

Email senders that support A/B testing will allow you to select a distribution size for your test groups. A 50/50 distribution provides the greatest data integrity for an A/B test.

Depending on your email sender, you may need to select a random draw so that individuals with similar characteristics are not lumped together when the A/B test runs.
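If your sender doesn't offer a random draw, the split is easy to do yourself. A minimal Python sketch, assuming you can export and re-import your recipient list (the addresses and seed below are illustrative):

```python
import random

def random_split(recipients, seed=None):
    """Randomly assign each recipient to group A or B (a 50/50 draw),
    so similar subscribers aren't lumped into the same test cell."""
    rng = random.Random(seed)
    shuffled = recipients[:]          # copy so the original order is untouched
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

emails = [f"subscriber{i}@example.com" for i in range(10)]
group_a, group_b = random_split(emails, seed=7)
print(len(group_a), len(group_b))
```

Shuffling before splitting is what breaks up any ordering in the export (by signup date, region and so on) that would otherwise bias one group.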

Step 4: Gather data.                         

While A/B testing requires less data than other testing methodologies to draw statistically sound conclusions, the population size still needs to reach a certain threshold.

Most email senders do not have a fixed horizon (a set point in time) to call statistical significance, which means the results can and often will waver between significant and insignificant at any point during an email campaign if the emails are automated.

As a result, you will need to predetermine the sample size of your test so that once you have enough data for statistical significance you can close the test and begin analysis on the entire group of data instead of just one moment.

To determine the sample size you need for statistical significance, you can do the math, or you can use an online sample-size calculator to tell you how large your test sample needs to be to draw a sound conclusion.
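"Doing the math" usually means the standard two-proportion sample-size formula. A hedged Python sketch: the 3% baseline and one-point lift are illustrative, and the z_beta value assumes the conventional 80% power target, which the article doesn't specify:

```python
from math import sqrt, ceil

def sample_size_per_group(baseline, mde, z_alpha=1.96, z_beta=0.84):
    """Approximate recipients needed per variant for a two-proportion test.
    baseline: current conversion rate (e.g. 0.03 for 3%)
    mde: minimum detectable effect, absolute (e.g. 0.01 for +1 point)
    z_alpha=1.96 -> 95% confidence; z_beta=0.84 -> 80% power."""
    p1, p2 = baseline, baseline + mde
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / mde ** 2)

# e.g. to detect a lift from a 3% to a 4% conversion rate
print(sample_size_per_group(0.03, 0.01))
```

Note how quickly the requirement grows as the detectable lift shrinks: halving the effect you want to detect roughly quadruples the sample you need per group.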

Step 5: Analyze.

Once your email has been sent and you’ve collected enough data to determine statistically significant outcomes, it’s time to start analyzing the results.

Collect the following data from your A/B test:

  • Open rate
  • Click-through rate
  • Unique clicks
  • Unique open rate
  • Click-to-open rate

Note: While other email metrics like bounce rate and spam score are important, they do not indicate conversion performance and shouldn’t be used to evaluate the winner in an A/B test.
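A minimal sketch of how those rates are derived from raw counts. The numbers are made up for illustration, and using delivered (rather than sent) as the denominator is a common convention, not something the article prescribes:

```python
def email_metrics(delivered, opens, unique_opens, clicks, unique_clicks):
    """Compute the core A/B comparison metrics from raw counts.
    All rates are returned as percentages, rounded to two decimals."""
    def pct(num, den):
        return round(100 * num / den, 2) if den else 0.0
    return {
        "open_rate": pct(opens, delivered),
        "unique_open_rate": pct(unique_opens, delivered),
        "click_through_rate": pct(clicks, delivered),
        "unique_click_rate": pct(unique_clicks, delivered),
        # Click-to-open isolates how well the content converted the openers.
        "click_to_open_rate": pct(unique_clicks, unique_opens),
    }

print(email_metrics(delivered=9800, opens=2500, unique_opens=2100,
                    clicks=450, unique_clicks=380))
```

Click-to-open rate is the one worth watching in subject-line tests: a variant can win on opens yet lose on click-to-open, meaning the subject line over-promised relative to the content.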

 

When declaring a winner, your email sender will evaluate the conversion action, which is usually defined as clicking a button or hyperlink in the email. Make sure your email sender is evaluating the data against the confidence score you defined when picking your sample size. Some email senders do not require a 95% confidence score and may hand you a false winner based on too little data.

To elevate your analysis and reporting, track more than the initial conversion reported. For example, did email A have a higher conversion rate (defined as the number of people who clicked the call-to-action button) but fewer overall sales when you track people’s actions through your site?

That may indicate a bottleneck in the user experience that needs to be adjusted.

Only by creating a structured approach to conversion testing can you improve product adoption and customer retention by driving better email performance. By using a five-step A/B testing plan like this one, you’ll see better results over time. You’ll also have the ability to better articulate what is working—as well as opportunities for constant improvement that will move the needle forward.

Meg Goodman is the Managing Director of relationship marketing agency Jacobs & Clevenger. She has brought measurable, data-driven results to a variety of major financial institutions. When she’s not riding her motorcycle, you can connect with Meg on LinkedIn or Email: [email protected].

Tags: Click metrics, Data analysis, Email marketing, Email testing
© 2025 American Bankers Association. All rights reserved.