I have found that email marketers have a tendency toward black-and-white beliefs where there is a lot of grey. The classic example is "What is the best day to send email?" But the list goes on. Should you use double opt-in or single opt-in? Should you segment by demographics, psychographics, or behavioral data? Should those extra fields on your registration page be required or not? What is the best way to build your list?
Or there is the hot topic over on the EEC blog around the introduction of the render rate. Some suggest we should just fix the open rate by standardizing it -- which was discussed extensively within the roundtable before we agreed on the proposed metrics. The problem is that depending on the industry and the company looking at open rates, everyone wants to look at these numbers in different ways. There is no "best practice."
The term "best practice" asserts that a particular technique, method, or process has proven itself superior to alternative techniques, methods, or processes for delivering the intended outcome time and time again. It is proven. The results are consistent. Ninety-nine times out of 100, if you follow the best practice you will be in better shape than if you hadn't.
Too often in email marketing, these assertions are made based on the results of a single case study. Worse still, they are made based on case studies where no alternatives were sufficiently tested. These aren't best practices, they are "good ideas" that may or may not apply. Just because something worked for Amazon.com does not mean it will work for your company.
Teaching People to Ask Good Questions
So, who is at fault? The people answering these questions or the people asking them?
Any good professor will tell you that the most challenging part of his or her work is asking good questions. I was reminded of this watching the video of Dr. Michael Wesch of Kansas State University accepting his award for Professor of the Year in 2008. You may already be familiar with his work as the creator of "Web 2.0 ... The Machine is Us/ing Us," which has been viewed on YouTube nearly 8 million times.
We would be remiss to blame it all on stupid questions. As someone who has conducted research and written whitepapers on the best day to send email, I confess that every time I hear someone ask, "What is the best day to send email?" part of me wants to reply, "That is a STUPID question!" But it's not stupid; it's naïve. And there is a difference. Naïveté is a lack of experience that leads the person asking the question to assume there is a simple answer where there isn't one.
Whether we ignore, dismiss, or provide simplistic answers to naïve questions, we perpetuate the problem. Stop! It is hurting our industry. Ask a question about the finer points of sender authentication and you will struggle to find someone without an opinion. All the while, we struggle to convince the C-suite of the value of email. We are all adept at quoting studies from the DMA showing that email marketing has phenomenal ROI, but we cannot explain why sending "just one more email" is a bad idea. Who's naïve? The CFO who just twisted your arm into sending another email, or the person who gave that CFO the DMA study and expected them to understand the nuances of how customers might perceive that one additional message?
When my kids have a fight and one comes pleading his innocence, I always ask the same question, "How many people does it take to fight?" Whether they believe it or not, they know the only answer that will get them out of their bind is "two."
Take a minute and read Tim Walker's blog on "The Power of Naïve Questions." He explains how, if we are honest, naïve questions force us to reassess our assumptions and the insider language we use to answer questions. People asking naïve questions are trying to understand the basic concepts of our area of expertise. Our job is to teach them to ask good questions. If we want to improve the image of email marketing, this is imperative. It not only takes time, but it is a heck of a lot harder than simply providing the "right answer."