Email Marketing Woes: Email A/B Testing to the Rescue
Let the emails compete!
Published: Mar 13, 2023
Email marketers have a hard job - their marketing emails must be opened, clicked through, and then lead the user to conversion. It's not just typing out a few words, pretending to be the prospect's buddy, and asking them to buy something.
This is a process of trial and error. You have to write and rewrite emails to get them right. Therefore, there can't be a one-size-fits-all arrangement.
That's where email A/B testing comes in.
59% of organizations use A/B testing on their emails - and there are plenty more who should perform these split tests to make their email funnels convert better and make their email campaigns more lucrative.
In this article, we will explore what A/B testing is, why it is important, how to perform an A/B test, what you can test, and give you some great examples of A/B tests.
Let's say you have one variation of an email. You send it and see how your audience responds. Simple.
However, what if you have more than one variation? What if you have several ideas for what you want to say in each of the emails in your email campaign? What if you aren't sure whether to focus more on the price of your product or its features or where to add the CTA button?
You don't necessarily have to be unsure; sometimes you simply have more than one idea and want to know which will work best.
This is where you can test customer behavior with email split testing or A/B testing to find the winning version. A split test takes two versions of an email with one element different in both variations and tests it to see which one performs better.
When you find the winning version, you can use emails like that more often or mass email that version since it is performing better.
This is how you optimize your email marketing campaigns.
How do you figure out which email is performing better?
You look at the metric you are measuring and see which of the emails shows a statistically significant result. For example, if you are testing email subject lines, you compare open rates; if you are testing the CTA, you compare click-through rates.
In a nutshell, A/B testing is trying out two different versions of an email on a small sample of your email list. Whichever email scores higher in the test is then used for the mass email campaign and sent to the whole email list.
You can also test emails and use your winning email as an example for the future. Then, use an email similar to that in subsequent campaigns.
Sometimes, you may want to create a few variations and either test them all against each other or select two versions worth testing and test those.
Let's look at the example below:
This shows an A/B test for a very minor element - the emoji in the subject line. However, small things like this can affect the open rate and it is important to ensure you are using the version of an email that is most preferred by your audience.
In this email, recipients preferred the subject line without the emoji.
Email A/B testing is important because you won't always have the answers to everything.
You need to learn more about your target audience's behavior and this is the only way you will find out. A/B testing gets rid of the guesswork and helps you make data-driven decisions instead of just relying on your gut feeling.
Once you start testing your emails and different elements in your emails, it will help you design more effective marketing campaigns in the future.
Here is what A/B testing does.
Whether you are sending marketing emails or explaining a new feature, you need to ensure your emails are engaging and are adding value to your buyer's journey.
However, if a small element is placed incorrectly, a minor change to your email can make it more engaging and keep your audience's attention.
For example, your CTA may have been placed as a hyperlink when your prospects would respond better to a button. Or you may have been using very dry, to-the-point copy when your audience would prefer a bit of humor.
You will find out what your audience prefers once you A/B test your emails.
A/B tests help you enhance user engagement and consequently develop better campaigns.
If you test your subject line, as 70% of marketers do, you can increase your open rate by finding the winning formula. Perhaps you need to add an emoji, include the recipient's name, change a single word, or use a funnier tone - the possibilities are endless.
For example, one subject line may be "The most luxurious matte makeup" while the other version may say, "Feel like a queen, it's all in the sheen!" They both have very different tones and while one is direct, the other has a bit of a creative edge.
If you test both on the same email via a split test, you will find out which one your audience is more likely to click on.
What if you sent one email without even considering a test to a large list of 25,000 prospects only to find out that your campaign failed miserably? Isn't it better if you A/B test emails on a small sample and then use the better-performing email to send to your whole email list?
Testing your emails on a small sample minimizes the risk of losing out on business and engagement - a risk you take every time you send an untested email to your entire list.
This will increase your conversions and make your campaigns more successful.
As mentioned above, testing your emails will increase conversions as you will be able to send an email campaign after properly determining what your audience prefers.
Let's say you are testing an offer via email - you send one email offering a 20% discount and another offering a free gift with a purchase. After a few days, you discover that the 20% discount is working much better than the free gift.
You can then alter the remainder of your campaign and mass email the 20% discount to the email list. This will give you a higher rate of conversions than if you offered most of your list a free gift with their purchase.
Now, let's talk about the metrics that you measure with A/B tests.
When you conduct an A/B test, how do you determine whether your test is successful? You need to look at the metric you are measuring and see which version shows a meaningfully higher result.
Here are the metrics that you can measure with A/B testing:
If you are testing the sender name, subject line, or preheader text, you will be using open rates as the measuring metric. For example, you may test a sender name that includes the company name against one that only includes the sender's first name.
For example, if I get an email from "Diana from LinkedIn Premium", it tells me who is sending the email and may increase my chances of opening it, as opposed to a sender name that only says "Diana".
To calculate the open rate, divide the number of emails recipients opened by the total number sent and express it as a percentage.
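As a quick illustration of that calculation (the counts below are made up, and the sender-name variants are hypothetical), the comparison looks like this:

```python
def open_rate(opened: int, delivered: int) -> float:
    """Open rate = emails opened / emails delivered, as a percentage."""
    return 100.0 * opened / delivered

# Hypothetical test results for two sender-name variants
rate_a = open_rate(opened=210, delivered=1000)  # sender name: "Diana"
rate_b = open_rate(opened=265, delivered=1000)  # sender name: "Diana from LinkedIn Premium"

winner = "B" if rate_b > rate_a else "A"
print(f"A: {rate_a:.1f}%  B: {rate_b:.1f}%  winner: {winner}")
```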
Whichever email has the higher open rate is the winning email and you should use that version for the rest of your email list.
You can also take the winning email as an example for future emails - so if you added your company's name to the sender name, you may want to use that as a regular practice.
If you are testing internal elements in your email such as the copy, placement of the CTA, tone, or anything else, the measuring metric will be the click-through rate. You will want to see how many people click on the links in your email and actually go through to the next step.
If the internal elements of your email are effective, your click-through rate will likely be higher.
With the conversion rate, you want to see how many people actually clicked on your CTA and made a purchase or went through with completing the call-to-action. It is arguable that the conversion rate depends on the landing page that your email leads to - as that is where prospects convert.
However, A/B testing is part of conversion optimization, so you should also see if your winning email is leading to better and higher conversions. If your email is taking people to the landing page, it is doing its job.
For example, let's say your email has a high open rate and a high click-through rate, but has a low conversion rate. This way you can pinpoint that there may be something missing on the landing page that your traffic is landing on.
However, the reverse can also happen: the conversion rate from your email traffic is high, but compared to the number of people you emailed, the click-through rate is low. That means the few people landing on your page are converting, but the email itself isn't bringing enough of them there - so the email is probably lacking in some aspect.
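To make this funnel diagnosis concrete, here is a small sketch (all counts are hypothetical) that computes each stage's rate and flags the weakest stage:

```python
def funnel_rates(sent: int, opened: int, clicked: int, converted: int) -> dict:
    """Compute open, click-through, and conversion rates as percentages.
    Click-through is measured against opens; conversion against clicks."""
    return {
        "open_rate": 100.0 * opened / sent,
        "click_through_rate": 100.0 * clicked / opened,
        "conversion_rate": 100.0 * converted / clicked,
    }

# Hypothetical campaign: strong opens and clicks, weak conversions,
# which points at the landing page rather than the email itself.
rates = funnel_rates(sent=10_000, opened=4_000, clicked=1_200, converted=24)
weakest = min(rates, key=rates.get)
print(rates, "-> investigate:", weakest)
```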
To conduct an A/B test, you need to follow these steps:
Make a hypothesis regarding your A/B test. For example, you can anticipate that Email A will get a higher click-through rate than Email B because your audience is likely to prefer a more light-hearted conversational tone.
Write the first version of your email as the control version (this will remain constant and will be tested against all the other versions).
Change one element in the control version and create a second testing version. You will test this version against the control version. You can create numerous testing versions, but they will be tested one by one against the control version.
Set up the split test on your email marketing software by selecting a small portion of your email list and specifying the duration of this test.
Send the emails.
Analyze the results. Your winning email must show a statistically significant difference in the measuring metric for the test to be considered successful.
Retest your hypothesis. For example, let's say you had a control version A and you tested both Email B and C against the control version. Email C showed the most significant results. Rerun the test with only version A and version C to see if you get the same results. This eliminates the likelihood of external factors affecting your outcome.
If the test is successful and one email does show a statistically significant preference, send the winning version to the rest of your email list.
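The sampling and analysis steps above can be sketched end to end. This is an illustrative outline, not a substitute for your email platform's built-in test setup: the 10% sample size and the open counts are made up, and the significance check is a standard two-proportion z-test.

```python
import math
import random

def split_sample(email_list: list, sample_fraction: float = 0.1):
    """Randomly carve out two equal test groups; the rest waits for the winner."""
    sample_size = int(len(email_list) * sample_fraction)
    shuffled = random.sample(email_list, len(email_list))  # shuffled copy
    group_a = shuffled[:sample_size]
    group_b = shuffled[sample_size:2 * sample_size]
    holdout = shuffled[2 * sample_size:]
    return group_a, group_b, holdout

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Z statistic comparing two open/click rates; |z| > 1.96 is significant at 95%."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical results: 210/1000 opens for version A vs 265/1000 for version B
z = two_proportion_z(210, 1000, 265, 1000)
significant = abs(z) > 1.96
print(f"z = {z:.2f}, significant: {significant}")
```

Here the z statistic comes out near 2.9, so the difference would count as statistically significant; a gap of only a few opens on the same sample sizes would not.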
However, there are things that you need to look out for in an A/B test.
A/B tests can give you important insights into consumer behavior, but there are a few things you need to be wary of.
1. You will need to use email marketing software to successfully conduct an A/B test. The software will simultaneously send your emails to small portions of your email list. This prevents duplication and the effects of other variables, such as sending time, on your results.
2. You can only test for one variable at a time. Your emails must not be totally different from one another, but must only have one element different. For example, it can be the placement of the CTA, the CTA itself, the subject line, the tone of the copy, the discount offered, etc. So, if you choose to test the subject line, both of your emails will be exactly the same except for the subject line. This way you can tell which element caused a change in the measuring metric.
3. You must send the email to the same type of target audience. If the characteristics of your sample group differ for each email, you will not be able to determine whether one email has done better than the other. Perhaps one email was sent to a group of 50+ executive-class people while the other was sent to 20-34 year old travelers. You won't be able to determine whether your email caused the difference in results or it's just because you are sending the emails to a different target audience.
4. You must conduct the test for a few days in order to allow your audience to respond. Too short of a test will not accurately tell you whether changing the chosen element has made a difference.
5. Also, conduct the tests simultaneously, because conducting them at different times can show you skewed results based on factors such as sending time.
6. Make sure that your landing pages are great because email testing is a part of conversion optimization. If you have great emails, but your landing pages are not optimized to convert, you will be losing out on a lot of business.
There are a number of things you can change in a test version, but as mentioned above - remember, only one variable at a time.
You can test different versions of the subject line to see which one drives up open rates more. For example, you create two subject lines for a single email; everything else in the email must remain the same. Your A/B test results will show you which subject line is working better and is more appealing to your recipients.
You might want to test a serious subject line with a funny one or one that mentions one pain point/benefit versus another pain point/benefit. You can also try including the company name in one subject line while the other can be without it.
This can show you whether mentioning the company causes a higher number of recipients to open due to brand image, goodwill, or other positive emotions or whether the situation is vice versa.
However, to keep the test fair, it would be a good idea to keep the subject line length short enough to fit onto mobile devices.
If one subject line fits on mobile screens and the other doesn't, you aren't really A/B testing them properly: most people check email on their phones, and they will see the full subject line for one email but not for the other.
In this test, the subject line and the email body copy remain the same, but you want to find out whether your audience prefers plain-text emails or emails with images. One email will be plain text and the other may include static images. You can also test a single image against several images.
Version A will be compared to Version B to see which version inspires higher click-through. Do remember that including a lot of images in your emails can also affect email deliverability and many email clients may consider this to be spam.
This time, you won't be sending the exact same copy to your recipients. You will keep the email subject lines and the purpose of the email the same. For example, you may be writing to explain a new feature. You can write different emails and then see which one has statistically significant results in terms of click-through rates.
Your body copy can have two different tones, different information about the same product/service or the two versions may have different word order.
You can also try to add social proof in the form of customer testimonials to one while another might be without it. However, remember that only one element - such as the body copy - should be different while everything else needs to remain consistent.
For example, you can't change the copy and the CTA because then you won't be able to test which element has caused the change in metrics.
The preview text is the two- or three-line snippet that appears under the subject line when an email lands in the inbox.
This part, just like the subject line, also influences the open rate. However, the text is displayed differently on different devices: desktop users may not see all of it unless they receive mail notifications, while mobile users usually see it whenever a new-email notification arrives.
You can A/B test which text entices your audience more and has the higher open rate.
You can change the call-to-action in each of your emails by wording it differently. A good example is testing the traditional "Buy Now" against "I really need this!" or "Let's do this!"
This will tell you whether a certain CTA has a better click-through rate and eventually conversion rate than the other.
A/B testing can also involve testing visual elements such as the layout. You can try columns, vertical alignment, a horizontal build, etc. Determine how much of your email a person views through the click-through rate to figure out which email layout seems more attractive to your audience.
You can also try placing your CTA in different areas of your email to see where it seems to be more visible to your audience.
You can try offering a different discount code in each email to see whether what the code says has an impact on conversion. For example, try 10%OFF against LoveChips10.
You need to test this for a few weeks as people may not be ready to buy immediately but may eventually get around to it.
You can also try offering an exclusive code in your email while in the other email, you offer people 50% off (or another percentage) automatically at checkout.
Email marketing is about introducing your audience to a great offer and then enabling them to avail of it. However, what if you are not sure which offer would have a positive effect on most subscribers? This can occur if you have two offers in mind but aren't sure which one to entice recipients with.
Here you will use split testing in your email marketing campaigns to see which offer is most preferred. As explained above, you can send one email with a percentage off the original price of a product and another one offering a free gift with the product.
Create two emails and test them against one another to see which offer has the highest conversion rate. However, you need to make sure that everything else is the same in your email campaigns including the subject lines for both of the emails.
This will give you a clear depiction of which offer is most suitable without the influence of other factors.
Let's look at a few popular A/B testing examples.
The image below shows an email campaign that was A/B testing subject lines. The first version simply mentioned the title of the blog post while the second version included the words "Check out my recent post". The version with just the blog title got more opens, a higher CTR, and better conversions.
This example by Microsoft had the hypothesis that changing the color scheme from this drab and dull scheme to a brighter one would increase click-throughs.
Though Version B looks pretty, Version A won the test because the white background with the purple button draws the audience into the information. Version B crops the image of the laptop, while everything is clearer and better placed in Version A.
This third example shows an email that requires the recipient to purchase pendants in order to help save seals (a donation would be made from the proceeds). The first email shows copy with an emphasis on the story of the seals but the second version shows the products and centers around the product instead of the story of the seals.
Version B won the split test as it was important to show the products in order to get customers to click through.
Let's look at what we learned about A/B testing for email campaigns.
While there were several important points covered in this article, let's look over a few of the main points that signify how A/B testing helps optimize email campaigns.
1. A/B tests are an important part of email marketing because businesses may be missing out on conversions by not testing different elements in their email campaigns and determining the better version.
2. Just like a control group, you need a control email that you test against all test versions.
3. You can test many elements of your email, but only one element at a time. For example, you cannot test the CTA and the internal copy at once - you would be unable to pinpoint which element caused the change in the metric.
4. Email subject lines are one of the most popularly tested elements. When you test two different subject lines, the better version will have a higher open rate.
5. Make sure you send your test emails and control version at the same time to a random sample from your email list to prevent the influence of external factors such as sending time or target audience.
6. Your A/B test is successful if it shows a statistically significant difference in results. A minor difference can be a coincidence. For example, in the image above, the difference is only 1%. You may need to rerun the test to see whether a second run shows a larger difference.
A/B testing is a powerful tool that can help craft an excellent email marketing strategy. Whether you are trying to build a strong personal brand or figuring out what to do best for a company as an email marketer, successful tests can help you decipher a lot about customer behavior and psychology.
Email marketing becomes much simpler when you have a great email marketing service that can help you set up A/B tests and analyze the results within minutes. Pribox offers a great service with all the core email marketing features along with crucial add-ons that make the process even easier: email verification tools and an AI content creator.
You can check out what Pribox offers and sign up for free today!