Commentary

Seeing What Flies: How To Test Email Creative And Lists

Testing is a time-honored direct marketing practice. To this day, marketers test ideas, both online and offline, using “math developed a century ago,” Stefan Thomke and Jim Manzi wrote a few years ago in the Harvard Business Review.

But a shocking number of marketers don’t. For example, half do not test email, GetResponse reported in a study earlier this month.

Hard to believe, but it’s in line with other recent surveys. In effect, it means that email blasts go out with little insight into whether subject lines, offers, calls to action, headlines, body copy, landing pages and email lists will work.

Yes, you can rationalize that wasted emails don’t cost much (as opposed to misdirected direct mail packages). But what about the annoyance factor and the failure to engage with the customer and build a relationship?

You can’t afford to buy into this folly. Above all, you have to test creative and other elements to determine whether they work on a mobile device, said Jay Schwedelson, CEO of Worldata, in a recent Data & Marketing Association webinar. But he advises against a straight 50-50 split.

"On the surface, A/B testing for email sounds like a great idea," he says in an email. "The problem is that when you A/B test, you wind up sending 50% of the audience you are emailing the lesser-performing version of your campaign. I prefer to recommend a different setup. Take 20% of your overall audience and do an A/B test with that group. Then wait 24 hours and see the open and initial click rates on those drops. You will see statistically relevant data that should provide a clear path on which version is best. Then roll out to the remaining 80% with the 'winner.' This scenario allows you to send your best version of your campaign to 90% of your audience instead of just 50% of your audience."
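
To make the arithmetic concrete, here is a minimal Python sketch of that 20/80 setup. The send_campaign and get_open_rate functions are hypothetical stand-ins for whatever your email service provider exposes, and the open rates are dummy values:

    import random

    def send_campaign(version, recipients):
        """Hypothetical stand-in for your ESP's send call."""
        print(f"sending version {version} to {len(recipients)} recipients")

    def get_open_rate(version):
        """Hypothetical stand-in for pulling 24-hour open/click stats."""
        return {"A": 0.21, "B": 0.18}[version]  # dummy numbers

    def test_then_rollout(audience, test_share=0.2):
        random.shuffle(audience)                 # random, not hand-picked
        cutoff = int(len(audience) * test_share)
        test_group, holdout = audience[:cutoff], audience[cutoff:]
        half = len(test_group) // 2
        send_campaign("A", test_group[:half])    # 10% of the full list
        send_campaign("B", test_group[half:])    # the other 10%
        # ...wait 24 hours for opens and initial clicks to come in...
        winner = "A" if get_open_rate("A") >= get_open_rate("B") else "B"
        send_campaign(winner, holdout)           # the winner goes to the remaining 80%

    test_then_rollout([f"user{i}@example.com" for i in range(1000)])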

Each element is important. For example, “your call to action is obviously going to have a direct effect on how many people buy your product or click through to your landing page,” Kissmetrics writes in an evergreen blog post titled “A Beginner’s Guide to A/B Testing.” “Your subject line, on the other hand, will directly affect how many people open your email in the first place."

“Think about this when you’re deciding which things to test first,” Kissmetrics continues. “If not many people are opening your emails, then you’ll likely want to start with your subject line. You’ll likely want to test the more important parts first. Your headline and call to action will likely have a greater impact on conversions than the images you use or your body text. Test those things first, and then test the others in order of greatest to least importance.”

And lists? “The larger your test sample, the more accurate your results will be,” Kissmetrics reports. “Make sure that the split is done randomly, too. Hand-picking recipients (or even using two lists from different sources) is a great way to skew your results. The goal here is to gather empirical data to figure out which version of your A/B test material really works best.”
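
One simple way to honor that advice, offered as a sketch rather than anything Kissmetrics prescribes, is to assign each address to a cell by hashing it. The split is effectively random, reproducible across runs, and immune to alphabetical or source-order bias:

    import hashlib

    def assign_cell(email, cells=("A", "B")):
        """Hash the address and take the digest modulo the number of cells."""
        digest = hashlib.sha256(email.strip().lower().encode()).hexdigest()
        return cells[int(digest, 16) % len(cells)]

    print(assign_cell("reader@example.com"))  # same cell for this address, every run

Because the assignment depends only on the address itself, neither list order nor list source can skew which cell a recipient lands in.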

Comm100 adds that marketers should follow these best practices when conducting an A/B test (a rough sketch of these hygiene steps appears after the list). We quote:

  • Remove inactive users.
  • Remove highly active users.
  • Sort alphabetically.
  • Do not sort by join date.
  • Split your highly active or highly inactive users separately.
  • Check the percentage of email service provider addresses. 
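
As promised, here is a rough Python sketch of that hygiene pass. The activity thresholds and field names are illustrative assumptions, not Comm100’s numbers; users is assumed to be a list of dicts with email and opens_90d fields:

    from collections import Counter

    def prepare_test_list(users, min_opens=1, max_opens=50):
        # Drop inactive and highly active users (test those groups separately).
        kept = [u for u in users if min_opens <= u["opens_90d"] <= max_opens]
        # Sort alphabetically, never by join date, per the list above.
        kept.sort(key=lambda u: u["email"])
        # Check the mix of email service provider domains before splitting.
        domains = Counter(u["email"].split("@")[1] for u in kept)
        for domain, count in domains.most_common(5):
            print(f"{domain}: {count / len(kept):.0%}")
        return kept

    sample = [{"email": f"user{i}@example.com", "opens_90d": i % 60} for i in range(200)]
    prepare_test_list(sample)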

Comm100 also recommends that you test send date and time, “from” addresses — and images.

“As with any email test, make sure that you're testing a pure A/B split of your list and don't make any other variable changes other than the change in the image,” it writes. “You're looking to see if making a change in how you display and choose images can improve your email performance. You can't do that if you change other factors at the same time.”

Now let’s look at newsletters — emails driven by content instead of offers. Testing is almost impossible if you’re doing hard news, because news is random and not always popular. Let’s say you post an article on a regulatory issue: not everyone will want to read it, but some will need to read it.

Still, over time, you’ll get an idea of what types of headlines and blurbs will get people to click through to your stories, and what kind of news you should pursue. And in the case of how-to content, you surely can test subject lines, headlines, blurbs and "from" lines. As you go, you may even use those insights to create versioned newsletters.

Granted, testing is now more difficult, thanks to channel overload. “Business experimentation in such environments suffers from a variety of analytical complexities, the most important of which is that sample sizes are typically too small to yield valid results,” Thomke and Manzi write.
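
For a back-of-the-envelope sense of why, here is the standard two-proportion sample-size calculation, the kind of century-old math Thomke and Manzi allude to, in plain Python. The 20% and 22% open rates are illustrative, not figures from their article:

    from statistics import NormalDist

    def sample_size_per_cell(p1, p2, alpha=0.05, power=0.8):
        """Recipients needed in EACH cell to reliably detect p1 vs. p2."""
        z_a = NormalDist().inv_cdf(1 - alpha / 2)  # 1.96 at 95% confidence
        z_b = NormalDist().inv_cdf(power)          # 0.84 at 80% power
        p_bar = (p1 + p2) / 2
        numerator = (z_a * (2 * p_bar * (1 - p_bar)) ** 0.5
                     + z_b * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
        return int(numerator / (p1 - p2) ** 2) + 1

    # Detecting a lift from a 20% to a 22% open rate:
    print(sample_size_per_cell(0.20, 0.22))  # roughly 6,500 recipients per cell

If your list can’t fill cells that large, small observed differences may just be noise, which is exactly the complexity the authors describe.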

But that’s no reason to avoid it. Testing remains a critical activity for retailers. When done correctly, it can save you from bad ideas — and pinpoint the good ones.