Last week I asked the question, “Will Big Data Replace Strategic Thinking?” Many of you answered, with the responses splitting roughly two to one on the side of thinking. But, said fellow Search Insider Ryan DeShazer, “Not so fast! Go beyond the rebuttal!”
I agree with my friend Ryan. This is not a simple either/or question. We (or at least 66% of us) may agree that models and datasets, no matter how good they are, can’t replace thinking. But we can’t dismiss their importance, either. Strategy will change, and data will be a massive driver of that change.
Both the Harvard Business Review and the New York Times have recent posts on the subject. In HBR, Justin Fox tells of a presentation by Vivek Ranadive, who said, “I believe that math is trumping science. What I mean by that is you don't really have to know why, you just have to know that if a and b happen, c will happen.”
He further speculates that U.S. monetary policy might do better being guided by an algorithm rather than bankers: “The fact is, you can look at information in real time, and you can make minute adjustments, and you can build a closed-loop system, where you continuously change and adjust, and you make no mistakes, because you're picking up signals all the time, and you can adjust.”
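The “closed-loop” idea Ranadive describes can be sketched as a simple feedback controller: measure a signal, compare it to a target, nudge the lever, repeat. The sketch below is a toy illustration, not monetary policy; the model of how the lever moves the signal, the gain, and all the numbers are hypothetical. Notice that the loop converges only because the toy model never changes — which is exactly the assumption the column questions below.

```python
# A minimal closed-loop (proportional control) sketch of the
# "continuously measure and adjust" idea. Toy values throughout.
target = 2.0   # the desired signal, e.g. an inflation target
setting = 0.0  # the policy lever we can adjust
gain = 0.5     # how aggressively we correct each period

signal = 0.0
for _ in range(20):
    signal = 0.8 * setting + 0.2   # toy model of how the lever moves the signal
    error = target - signal
    setting += gain * error        # feedback: adjust toward the target

print(round(signal, 2))  # converges near the 2.0 target
```

The loop “makes no mistakes” only as long as the line `signal = 0.8 * setting + 0.2` keeps describing reality. Change that relationship mid-run and the controller happily keeps adjusting toward a world that no longer exists.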
The Times’ Steve Lohr also talks about the recent enthusiasm for a quantitative approach to management, evangelized by Erik Brynjolfsson, Director of the MIT Center for Digital Business, who says Big Data will “replace ideas, paradigms, organizations and ways of thinking about the world.”
However, Lohr and Fox (who wrote the excellent book, “The Myth of the Rational Market”) caution about the oversimplifications inherent in modeling. Take, for example, some of the potentially flawed assumptions in Ranadive’s version of an algorithmically driven monetary policy:
- Something as complex as monetary policy can be contained in a closed loop system
- The past can reliably predict the future
- If it doesn’t -- and things head into uncharted territory -- you’ll be able to “tweak” things into place as new information becomes available.
Fox uses the analogy of a landing-page A/B (or multivariate) test as an example of the new quantitative approach to the world. In theory, page design could be left to a totally automated and testable process, where real-time feedback from users eventually decides the optimal layout. It sounds elegant, but here’s the problem with this approach to marketing: You can’t test what you don’t think of. The efficacy of testing depends on the variables you choose to test. And that requires some thinking. Without a solid hypothesis based on a strategic view of the situation, you can quickly go down a rabbit hole of optimizing for the wrong things.
For example, most heavily tested landing pages I’ve seen reach the same eventual destination: a page optimized for one definition of a conversion. Typically this would be the placement of an order or the submission of a form. There will be reams of data showing why this is the optimal variation. But what about all the prospects who hit that page for whom the one offered conversion wasn’t the right choice? How do they get captured in the data? Did anyone even think to include them in the things to test for?
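The mechanics of such a test are easy enough to sketch -- a standard two-proportion z-test comparing conversion rates between two variants. All the traffic numbers below are hypothetical; the point is that the math only ever sees the variants and the single conversion definition someone thought to build. Visitors who wanted a different outcome entirely never appear in these counts.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical traffic split between two page variants:
z = two_proportion_z(conv_a=120, n_a=2400, conv_b=150, n_b=2400)
print(round(z, 2))  # ≈ 1.88, short of the conventional 1.96 cutoff
```

The test will dutifully rank whatever variants it is given; it has no way to flag that the best page for many visitors was one nobody thought to test.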
Fox offers a hybrid view of strategic management that more closely aligns with where I see this all going -- call it Bayesian strategic management. Traditional qualitative strategic thinking is required to set the hypothetical view of possible outcomes, but then we apply a quantitative rigor to measure, test and adjust based on the data we collect. This treads the line between the polarities of responses gathered by last week’s column -- it puts the “strategic” horse before the “big data” cart. More importantly, it holds our strategic view accountable to the data. A strategy becomes a hypothesis to be tested.
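That “strategy as hypothesis” idea has a natural mathematical form: a Bayesian update, where the strategist’s qualitative judgment supplies the prior and the data revises it. The sketch below uses a Beta-Binomial model for a conversion rate; the prior and the observed numbers are hypothetical, chosen only to show the mechanism.

```python
# Strategy-as-hypothesis sketch: a Beta prior encodes the strategist's
# qualitative belief about a conversion rate; observed data updates it.
# All numbers are hypothetical.
prior_alpha, prior_beta = 2, 38      # prior belief centered near a 5% rate

conversions, visitors = 30, 400      # observed data from the market

# Conjugate Beta-Binomial update: add successes and failures to the prior.
post_alpha = prior_alpha + conversions
post_beta = prior_beta + (visitors - conversions)

posterior_mean = post_alpha / (post_alpha + post_beta)
print(round(posterior_mean, 3))  # belief revised upward from 5% to ~7.3%
```

The strategic view sets the starting point; the data is then free to confirm it, refine it, or overturn it -- which is exactly the accountability the hybrid approach demands.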
One final thought. Whether we’re talking about Ranadive’s utopian (or dystopian?) vision of a data-driven world or any of the other Big Data evangelists, there seems to be one assumption that I believe is fundamentally flawed, or at least overly optimistic: that human behaviors can be adequately contained in a predictable, rational, controlled closed-loop system. When it comes to understanding human behavior, the capabilities of our own brain far outstrip any algorithmically driven model ever created -- yet we still get it wrong all the time.
If Big Data could really reliably predict human behaviors, do you think we’d be in the financial situation we’re in now?
Gord -- great column (as always). It's great to have someone championing Big Thinking as an overarching control over Big Data.
A book you might want to add to your reading list is Nassim Taleb's latest, "Antifragile." He continues to hone a very compelling case about the limitations of models and historical data.
Great column. Predictive modeling based on "big data" can be hugely valuable. But despite some definitions of the latter that imply an absence of hypothesis, that is definitely not the case. 100% agree with the notion that big data can't stand on its own. Done correctly (which is an important qualifier!), it is the combination of data and strategic thinking that generates the best results. As you say, it's not either/or.
Closed loop designs have been around since WWII and back then "operations research" was the darling of behavioral science.
Today, computational models built on big data have the means to discover predictive patterns in transactional data.
However, if none of your customers buy a product, stay at a hotel, take a flight, etc., simply because the number 4 was somehow involved, then your model needs to know that this is an unlucky number in Chinese culture -- and that's over 1 billion consumers.
Show me a big data model that takes into account this kind of behavioral attribute.
In my previous post, I forgot to mention that this actually occurred during a big data study of the transaction behavior of a bank's customers, a large percentage of whom were Asian.
If the world were truly as rational as some would believe, retailers would have long ago shifted to a world where they don't rely on local employees and everything can be controlled remotely. Sure, retailers have tried, but the results have been dismal. Instead, retail successes are found in places where technology is used to enhance, not replace, individual productivity and initiative. Like the modern supply chain, most big data applications work for a broad spectrum of applications if and only if environmental conditions remain exactly the same...and the more tightly managers try to exert control, the more black swans appear, as if by magic. As customer service professionals know, a single thing done wrong can undo a thousand things done right.
You will always need a sniff test from a human with experience.
There has to be a qualitative approach to this question. Both data and strategy are tools for companies seeking to improve results. Methodologists have long recognized that the theory one uses shapes the kinds of questions it generates and is able to answer well. Crudely put, religious theory is designed to answer questions about God, while Newtonian physics, a scientific theory, is designed to answer questions about mass, velocity and the like. They don't work well if one tries to use them to generate useful results in areas in which they are poorly equipped to do so.

The assumptions built into theories are what make them effective, or not. Closed-loop theories can work well in biology, but not always in other areas, and for social phenomena they have been debunked as teleology. Societies are simply not necessarily self-correcting. For strategy and data to work together effectively, the project leader needs some self-awareness of the assumptions being used, as those will need to be considered when reviewing the results.

This goes beyond simply running a series of different questions in parallel. Rather, it touches on issues and biases that can be illustrated by words like non-white or female, where cultural expectations are hidden in definitions of 'otherness'. Without a qualitative approach, perfectly logical results (a + b = c) can be obtained that are flawed in that they contain correlations with explanations external to the results the questions are obtaining. Using the Newtonian analogy again, the shortest distance in space is a curve.
Thanks all for your very thoughtful comments. And thanks for the reading suggestion Scott. It's been added to my reading list!