I'll be honest. I've gone back and forth over the years on the value of marketing mix models, always seeking to understand where, when, and how they add the most value to marketing managers looking for insight into more effective and efficient marketing. I think I've seen all the warts. So you'd think that I'd have given up by now and moved on to something more innovative that might address some of these issues, like artificial intelligence, agent-based modeling, or systems dynamics. Nope. Call me a MMMM (Marketing Mix Model Masochist) if you will, but all this criticism …
Traditionally, "time lag to conversion" has been defined as the amount of time between the last online display ad the consumer interacted with and the first conversion activity that was performed. And most ad servers have delivered a "Publisher Time Lag Report" to their clients detailing the number of users that convert within an hour, a day, a week, a month, etc., by publisher. This was very meaningful when consumers clicked on online ads and in so doing clearly "marked" their intent, and it allowed marketers to draw meaningful conclusions about which publishers refer users who convert and/or convert quickly.
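To make the definition concrete, here is a minimal sketch of how such a report could be assembled. The event records, field names, and bucket boundaries are all illustrative assumptions, not any particular ad server's schema: for each user we take the last ad interaction and the first conversion, then bucket the lag between them by publisher.

```python
from datetime import datetime, timedelta
from collections import defaultdict

# Hypothetical event logs: (user, publisher, timestamp) and (user, timestamp).
# These names and values are invented for illustration.
ad_interactions = [
    ("u1", "pub_a", datetime(2024, 1, 1, 9, 0)),
    ("u1", "pub_a", datetime(2024, 1, 2, 10, 0)),
    ("u2", "pub_b", datetime(2024, 1, 1, 8, 0)),
]
conversions = [
    ("u1", datetime(2024, 1, 2, 10, 30)),
    ("u2", datetime(2024, 1, 9, 8, 0)),
]

def bucket(lag: timedelta) -> str:
    """Assign a lag to a coarse reporting bucket (boundaries are assumptions)."""
    if lag <= timedelta(hours=1):
        return "within an hour"
    if lag <= timedelta(days=1):
        return "within a day"
    if lag <= timedelta(weeks=1):
        return "within a week"
    return "longer"

def time_lag_report(ad_interactions, conversions):
    # Last ad interaction per user, remembering which publisher served it.
    last_touch = {}
    for user, pub, ts in sorted(ad_interactions, key=lambda r: r[2]):
        last_touch[user] = (pub, ts)
    # First conversion per user.
    first_conv = {}
    for user, ts in sorted(conversions, key=lambda r: r[1]):
        first_conv.setdefault(user, ts)
    # Count converters per publisher per lag bucket.
    report = defaultdict(lambda: defaultdict(int))
    for user, conv_ts in first_conv.items():
        if user in last_touch:
            pub, touch_ts = last_touch[user]
            if conv_ts >= touch_ts:
                report[pub][bucket(conv_ts - touch_ts)] += 1
    return {pub: dict(buckets) for pub, buckets in report.items()}

print(time_lag_report(ad_interactions, conversions))
# → {'pub_a': {'within an hour': 1}, 'pub_b': {'longer': 1}}
```

In this toy data, pub_a referred a user who converted 30 minutes after the last ad, while pub_b's user took eight days, which is exactly the publisher-level contrast such reports are meant to surface.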
It wasn't that long ago that direct marketers were a breed apart: data-obsessed spreadsheet jockeys who were constantly tweaking the knobs and dials of campaigns to yield incrementally better results. Now, of course, everybody's data-obsessed; everybody's busy tweaking the knobs and dials. If you're marketing digitally, you're a direct marketer -- period.
I recently spoke to an audience of search marketers on the topic of operationalizing the valuable results and insights that marketers and their agencies can discover through the attribution management process, and how frustrating it can be trying to translate these into repeatable, ongoing processes if you don't know where to start. Here are a few highlights from the advice I shared with them.
In the digital space we have something that no other medium has ever had before: the ability to collect empirical census data on the behavior of machines. Since consumers access the Internet by using one of these machines, the ability to empirically track machine behavior has been one of the lynchpins of the success and growth of the online medium. But I have always believed that this capability was both a blessing and, in some ways, a curse.
At a conference I recently attended, each session began with a two-minute "rant" by someone who was willing to say, "I've had enough of this and I'm not going to keep quiet anymore." If I were to rant, it would be about online survey cheaters. Since I don't rant well, my energy is better used sharing what we can do to keep these people out. I like to call this "Quality BlackJack" because there are 21 areas where you should implement quality controls. If you do so, and improve those controls continually, you can win the Quality Jackpot. You may …