Commentary

Getting Creative With The Scientific Method: Secrets To Strong A/B Testing

September 27, 2011

Creative folks who find themselves in the email industry are spoiled by incomparable access to performance metrics offering immediate feedback on creative choices. Sending just one campaign a week offers the creative team 52 opportunities a year to test, learn and refine marketing and program strategies.   

For programs in their infancy, it's easy for team members to run tests on a headline or image here and there. We see many organizations engage in sporadic testing, and more often than not, the organizations aren't appropriately staffed to review the results or send them to their creative team. Testing is only valuable when email teams can use the results to inform subsequent campaigns.

So, what can organizations do to ensure meaningful testing? Get scientific with it! 

As it turns out, what we learned in fifth-grade science can apply to our adult professions in marketing. Following the scientific method in creative testing can lead to meaningful results and, ultimately, big boosts in campaign performance.


Step 1. Define the question. We see a lot of tests go out with two very different subject lines or two (or sometimes three) completely different creative treatments. The trouble with a test like this is that it's hard to identify the variable that caused a certain version to win. Always identify a testing target -- and hold other variables constant -- in order to understand your results. 

Here are a few examples of tests to conduct:

- Long subject line vs. short subject line

- Order of content in subject line

- Items in the navigation

- Buttons vs. underlined text links

- Primary messages (When testing the primary message, the hero can feature different creative treatments, but the rest of the test email should match the control as closely as possible.)

Step 2. Form a hypothesis. Before you send, predict what the test will show. By assessing what you're testing and reflecting on results you've seen in the past, you'll develop an expectation for each test you run. Make this a habit -- testing hypotheses on a regular basis will refine your understanding of the way your customers interact with your messaging.

Step 3. Conduct a controlled test. There are a couple of ways to run a test. The easiest is a 50/50 split of your list. Other teams send the test to a smaller subset of the list, then, a day later, send the winning version to the remainder. When you start segmenting your list in more complex ways, always remember to keep a control group.
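If your email platform doesn't handle the split for you, randomizing is easy to script. Here's a minimal Python sketch -- assuming your list is just a sequence of email addresses -- covering both the 50/50 split and the smaller-cells-plus-holdout approach:

```python
import random

def split_list(subscribers, test_fraction=0.5, seed=42):
    """Randomly split subscribers into two groups for an A/B test."""
    shuffled = list(subscribers)            # copy; leave the original untouched
    random.Random(seed).shuffle(shuffled)   # fixed seed keeps the split reproducible
    cut = int(len(shuffled) * test_fraction)
    return shuffled[:cut], shuffled[cut:]

# Straight 50/50 split: half sees version A, half sees version B
subscribers = [f"user{i}@example.com" for i in range(10_000)]
group_a, group_b = split_list(subscribers)

# 10/10/80 variant: two small test cells now, winner goes to the 80% holdout later
cell_a, rest = split_list(subscribers, test_fraction=0.10)
cell_b, holdout = split_list(rest, test_fraction=0.10 / 0.90)
```

The reason to randomize, rather than split alphabetically or by signup date, is that random groups stay statistically comparable -- any difference in results can then be credited to the creative, not the audience.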

Step 4. Analyze the data. The first step toward measuring progress is creating benchmarks for your program. If it's a relatively new program, testing during the first few sends is still beneficial because it helps you develop a sense of what "normal" means for your program. Your program can continue to improve when you make time to review the results on a regular cadence. At a minimum, look at performance metrics monthly. Regardless of whether your results are examined by an analyst, strategist or account manager at your email service provider, make sure that the new learning gets passed along to copywriters and designers who can do something with it on the next campaign.
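How do you know a winner is really a winner? One common yardstick (not specific to any particular tool) is a two-proportion z-test on open or click rates. Here's a minimal Python sketch with hypothetical numbers:

```python
from math import sqrt
from statistics import NormalDist

def ab_significance(opens_a, sends_a, opens_b, sends_b):
    """Two-proportion z-test: is a gap in open rates real, or just noise?"""
    p_a, p_b = opens_a / sends_a, opens_b / sends_b
    pooled = (opens_a + opens_b) / (sends_a + sends_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))   # two-sided test
    return p_a, p_b, p_value

# Hypothetical results: A opened by 1,100 of 10,000; B by 1,000 of 10,000
rate_a, rate_b, p = ab_significance(1100, 10000, 1000, 10000)
print(f"A: {rate_a:.1%}  B: {rate_b:.1%}  p-value: {p:.3f}")
# A p-value under 0.05 suggests the winner is unlikely to be a fluke
```

Plug in your own sends and opens; the same pattern works just as well for click rates.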

Step 5. Retest. Flukes happen -- in science and in email. Errors happen, too. Before rethinking your entire email creative strategy because of one testing result, run the same test again. As always, keep your results reliable with a control.
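Fluke-proofing also starts before the send, with sample size. Here's a quick back-of-the-envelope power calculation, sketched in Python with hypothetical rates, that estimates how many subscribers each test cell needs before a modest lift can be trusted:

```python
from statistics import NormalDist

def sample_size_per_group(p_base, lift, alpha=0.05, power=0.80):
    """Approximate subscribers needed per group to detect a given lift
    in open rate (standard two-proportion sample-size formula)."""
    p1, p2 = p_base, p_base + lift
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96 for 95% confidence
    z_b = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return int((z_a + z_b) ** 2 * variance / (p2 - p1) ** 2) + 1

# Hypothetical: 20% baseline open rate, hoping to detect a 2-point lift
print(sample_size_per_group(0.20, 0.02))   # roughly 6,500 per group
```

If your list can't support cells that large, test bigger, bolder differences -- small lifts simply get lost in the noise.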

Step 6. Enjoy the learning! After designing email campaigns for a while, it's easy to get complacent -- to think that our "gut feeling" about what subscribers like is always correct, or to believe that a design that follows best practices will always win. After years of working with email clients, however, our team members continue to find value in testing, and some results still surprise us and push us to rethink our efforts.

Those moments of reconsideration are where real creative growth happens. And to think it can all come from science!  

Please share some surprising or successful A/B tests that your organization has run!
