Many thanks to everyone who responded to my informal survey about what you'd like to see in the Email Insider. Your feedback was very helpful and will be the basis for these columns for a long time to come.
One common theme among the responses was a desire to see case studies and results. The reason you don't see more of them in this and other columns is that clients are not eager to share their hard-won knowledge with the outside world, particularly with their competitors. But fear not: there are ways around this obstacle.
Assuming your goal is to see how your program stacks up against others and to find ideas for improving its quality and results, I recommend three things. The first is to build a simple snapshot that compares each e-mail's results to your program baseline:
| Metric | This E-mail | YTD Average (or Previous Year / Previous Campaign Average) |
| --- | --- | --- |
| Delivered | | |
| Opens | | |
| Response (CTR) | | |
| Unsubs (Unsubs/Clicks) | | |
| STAF (STAF/Clicks) | | |
| Links Clicked/Responder | | |
The elements of your snapshot will vary depending on your program; for example, you should include purchase or conversion data if applicable. It's a simple exercise, but comparing a single effort against the program overall forces you to ask why this e-mail performed better or worse than average. That, in turn, should inspire theories to test in the future. As you review your overall results for the year (the same data for all e-mails in one table), the star performers and the dogs will yield additional insight.
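To make the comparison concrete, here is a minimal sketch in Python, assuming you can export raw per-campaign counts from your e-mail platform. Every field and function name is hypothetical, and "STAF" here counts send-to-a-friend forwards:

```python
# A minimal sketch of the snapshot comparison. All field and function
# names are hypothetical; adapt them to whatever your platform exports.

def rates(counts):
    """Turn raw counts for one campaign (or a pooled total) into snapshot metrics."""
    return {
        "Opens/Delivered":          counts["opens"] / counts["delivered"],
        "Response (CTR)":           counts["clicks"] / counts["delivered"],
        "Unsubs/Clicks":            counts["unsubs"] / counts["clicks"],
        "STAF/Clicks":              counts["staf"] / counts["clicks"],
        "Links Clicked/Responder":  counts["links_clicked"] / counts["responders"],
    }

def snapshot(this_email, prior_campaigns):
    """Print one e-mail's metrics next to the average of prior campaigns."""
    # Pool raw counts across campaigns so big sends weigh more than small ones.
    totals = {k: sum(c[k] for c in prior_campaigns) for k in this_email}
    current, baseline = rates(this_email), rates(totals)
    for metric, value in current.items():
        avg = baseline[metric]
        print(f"{metric:<25} {value:8.3f}  avg {avg:8.3f}  ({(value - avg) / avg:+.0%})")

# Made-up numbers, purely for illustration:
campaign = {"delivered": 50_000, "opens": 11_000, "clicks": 2_100, "unsubs": 60,
            "staf": 85, "links_clicked": 3_400, "responders": 2_100}
history = [
    {"delivered": 48_000, "opens": 9_600, "clicks": 1_700, "unsubs": 70,
     "staf": 60, "links_clicked": 2_500, "responders": 1_700},
    {"delivered": 52_000, "opens": 10_400, "clicks": 1_900, "unsubs": 65,
     "staf": 75, "links_clicked": 2_900, "responders": 1_900},
]
snapshot(campaign, history)
```

The design choice worth noting: pooling raw counts before computing the baseline, rather than averaging the rates of each campaign, keeps one small send from skewing your average.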
Some test ideas will surface in the creative process: the client or manager doesn't like your approach and suggests another; you believe yours is best. The direct marketer's knee-jerk reaction in this situation is "Let's test it," and it should become your mantra as well. Another way to ensure you are learning from your efforts is to keep a running list of things you'd like to test and work your way down it. Remember to test big, and you'll be on your way to major program improvement.
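When you do test, make sure the split is large enough to read. The sketch below uses a standard two-proportion z-test, my illustration rather than anything prescribed in this column, to judge whether a challenger cell's response rate really beat the control's:

```python
# A minimal sketch of reading an A/B split, assuming you mail two randomly
# split cells and count responders in each. The two-proportion z-test is a
# standard approach; the numbers below are invented for illustration.
from math import sqrt, erf

def ab_test(clicks_a, sent_a, clicks_b, sent_b):
    """Return both response rates and the two-sided p-value for their difference."""
    p_a, p_b = clicks_a / sent_a, clicks_b / sent_b
    pooled = (clicks_a + clicks_b) / (sent_a + sent_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (p_b - p_a) / se
    # Two-sided tail probability under the normal approximation.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, p_value

pa, pb, p = ab_test(clicks_a=900, sent_a=25_000, clicks_b=1_050, sent_b=25_000)
print(f"Control: {pa:.2%}  Challenger: {pb:.2%}  p-value: {p:.4f}")
# A low p-value (say, below 0.05) suggests the difference is real.
# This is why "test big" matters: small cells rarely reach significance.
```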