Dear Email Diva,
Our company has two newsletters we've been sending for 11 months. The first few months we tested/rearranged the layout, reevaluated content, etc. Now,
I'm at the point where I need to start analyzing the numbers to see if these newsletters benefit our audience and our company. Since I've been seeing a decline in opens and click rates, I've been
diligently researching ways to improve our newsletters. Should I be comparing our performance to industry standards? Our distribution list is on the smaller end and I don't know if we fall into the
"average" category, so would it even make sense to compare to industry standards/metrics? I guess it wouldn't hurt to know this info. Do you know if there's a Web site/resource that publishes these metrics?
This is a question that causes all email marketing pundits to get up on their soap boxes. The
problem is twofold: whether your program is comparable to "the average," and whether the sources behind those averages calculated their stats the same way you calculate yours.
I feel I've cited this study a million times,
but of course I forgive you if you don't read every Email Diva column. (Every other is just fine.) StrongMail and the EEC published an Email Metrics and Bounce Management report (available in the Email Experience Council Whitepaper Room) that shows how differently marketers and email applications calculate the numbers. For the "open" metric, 50% of those surveyed calculate it as Unique Opens / Delivered, 8% calculate it as Total
Opens / Delivered, 4% calculate it as Total Opens / Mailed, 17% calculate it as Unique Opens / Mailed and 21% have another method altogether. These differences (combined with differences in what is
considered "delivered") mean that the open rate for the same mailing can range from 12% to 35%!
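To see how much the calculation method matters, here is a minimal sketch comparing the four formulas from the survey. The campaign numbers are hypothetical, invented purely for illustration:

```python
# Hypothetical campaign numbers -- illustrative only.
mailed = 10_000
delivered = 9_200          # mailed minus bounces
unique_opens = 2_100       # distinct recipients who opened
total_opens = 3_400        # counts repeat opens by the same recipient

# The four calculation methods reported in the StrongMail/EEC survey.
rates = {
    "Unique Opens / Delivered": unique_opens / delivered,
    "Total Opens / Delivered":  total_opens / delivered,
    "Total Opens / Mailed":     total_opens / mailed,
    "Unique Opens / Mailed":    unique_opens / mailed,
}

for method, rate in rates.items():
    print(f"{method}: {rate:.1%}")
```

Run it and you'll see the same mailing produce open rates roughly from 21% up to 37%, depending solely on which denominator and numerator you pick.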
One possible workaround for this issue is to get the stats from your Email Service Provider.
They may collect aggregated data for all clients and, if so, different calculation methods won't be a concern. You still have the issue, however, of whether your program is comparable to others.
In the direct marketing industry, we were always fighting the assumption that a 2% response is the industry standard. If one is mailing a fabulous offer to the house file (active
members/donors/buyers), a 30% response might be reasonable. For an acquisition mailing to new donors with an unknown company/product, a .5% response could be perfectly acceptable.
Nevertheless, we have this burning desire to see how we measure up. One excellent source is the Marketing Sherpa Email Marketing Benchmark
[link to http://www.sherpastore.com/email-benchmark.html]. Rather than an aggregation of raw data, it is a survey of marketers, and it has many granular breakdowns that may get you closer to
comparing your apples to their apples.
Good for you for testing in the early stages and seeking to know whether "these newsletters benefit our audience and our company." That information
cannot be found through comparisons; it comes from looking at your own data and getting feedback from your audience. Aggregate your stats from all campaigns on a spreadsheet and sort by click-through rate (or
your most relevant statistic) to determine which were the most and least successful campaigns. Look at the percentage of clicks on each link to see which types of content are generating the most and
least interest. Read the "reply to" emails that typically end up in the bit bucket. Ask employees who have the most client contact what they think of the newsletter and for anecdotal customer
feedback. Send a survey to customers. (Keep it short, include requests for free-form as well as specific responses, and offer a small incentive.)
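The aggregate-and-sort step above can also be scripted. Here is a small sketch, assuming you've exported per-campaign stats from your ESP; the campaign names, column names, and numbers are all hypothetical:

```python
import csv
import io

# Hypothetical CSV export from an ESP -- column names are assumptions.
raw = """campaign,delivered,unique_clicks
March product news,4800,310
April how-to issue,4750,402
May promo blast,4900,198
"""

rows = list(csv.DictReader(io.StringIO(raw)))

# Compute click-through rate (unique clicks / delivered) per campaign.
for r in rows:
    r["ctr"] = int(r["unique_clicks"]) / int(r["delivered"])

# Rank campaigns from most to least successful by CTR.
for r in sorted(rows, key=lambda r: r["ctr"], reverse=True):
    print(f'{r["campaign"]}: {r["ctr"]:.1%}')
```

A spreadsheet does the same job, of course; the point is simply to get every campaign's numbers side by side so the best and worst performers stand out.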
Comparing your results to industry
standards will never tell you whether the effort is worthwhile for your company. Direct marketers obsess about response rates but don't concern themselves with industry standards. The only standard
that matters is: did I make money/was I able to acquire new customers at an acceptable cost?
Email can be a brand vehicle as well, so these bottom-line metrics may not apply. Determine
the measures of success for your program, identify the data needed to track them, and work for continuous improvement against your own benchmarks.

Good Luck!

The Email Diva
Send your questions or submit your email
for critique to Melinda Krueger, the Email Diva, at firstname.lastname@example.org. All submissions may be published; please indicate if you would like your name or company name withheld.