Silverpop - Are You Testing Apples and Oranges?

Are You Testing Apples and Oranges?

by: Loren McDonald (@LorenMcDonald)
02 August 2010

Testing, especially A/B testing of subject lines, is one of the most popular topics among email marketers. Subject-line testing is also probably the easiest and most popular test to execute. Perhaps because of that, the results are the most misinterpreted too.

A perfect example is a recent test submitted to the conversion-optimization website Which Test Won. The test set out to compare the open-rate performance of single-topic versus multiple-topic subject lines.

Whether you should focus a subject line on one topic or include more than one is a common question among content publishers, but it's also relevant for retailers wondering whether to promote single or multiple products in a subject line.

Setting up the test correctly is critical, however, if the results are going to be meaningful and dependable for future use.

Below are the two subject lines tested by the UK-based company SmartFocus. Which one do you think won?

A. Changing mobile behaviour survey results - 5 key CRM questions, customer engagement tips

B. 1 in 5 emails will be opened on mobile - our survey results predict user behaviour

In this test, B won with a 21 percent lift in opens and a 45 percent increase in unique clicks. On the Which Test Won site, 81 percent of visitors voted for subject line B, while 19 percent picked A.

So, why do I question the results? Because, as several commenters on the site noted, subject line B was simply the more interesting subject line, as the 81 percent vote suggests.

Here are two areas where I think this test got off track:

1. Problem: The test didn't actually test what it set out to test - single versus multiple topics. In this case, A and B differed in far more than the number of topics: the wording, lead benefit and structure all changed at once.

Solution: Make sure you are comparing the same variables. In this case, better subject line variables might have been these:

- A) 1 in 5 emails will be opened on mobile | 5 key CRM questions

- B) 1 in 5 emails will be opened on mobile

- C) 5 key CRM questions

- D) 5 key CRM questions | 1 in 5 emails will be opened on mobile

An A-B-C-D test like this would have controlled for bias in both the order of the content and the type of content used. For example, if A and D consistently performed best, and if the test were repeated multiple times with different content, then a multiple-topic subject line could be considered the winner.

2. Problem: It was a one-time test.

Solution: As mentioned above, any test should be conducted multiple times to minimize variables that could affect the outcome. If the test results are similar across multiple tests, your confidence will be significantly higher than after just a single test.
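One way to put a number on that confidence is a two-proportion z-test, which asks whether the gap between two open rates is bigger than chance alone would produce. The sketch below is illustrative only - the article reports percentage lifts, not raw send and open counts, so the counts here are invented for the example:

```python
import math

def two_proportion_z(opens_a, sends_a, opens_b, sends_b):
    """Two-proportion z-test: is the difference in open rates
    between variants A and B larger than chance alone explains?"""
    p_a = opens_a / sends_a
    p_b = opens_b / sends_b
    # Pooled open rate under the null hypothesis (no real difference)
    p_pool = (opens_a + opens_b) / (sends_a + sends_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    return (p_b - p_a) / se

# Hypothetical counts: 10% open rate for A, ~12.1% for B (a 21% lift)
z = two_proportion_z(opens_a=1000, sends_a=10000,
                     opens_b=1210, sends_b=10000)
significant = abs(z) > 1.96  # roughly the 95 percent confidence threshold
```

Note that significance in one send still doesn't rule out the bias problems above; it only tells you the difference is unlikely to be random noise, which is why repeating the test with varied content matters.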

Remember: Testing subject lines is a fairly easy process, but make sure that your test is truly comparing apples to apples.

