There were many innovative tech companies at the event that seemed capable of improving how we measure media and consumer behavior today, using artificial intelligence, machine learning, data blending and data discovery tools. Many of these companies can merge platform, device, app, software and site data into a single user interface and run it all in, or close to, real time. Today's tech environment facilitates storytelling and data visualization to better leverage business intelligence, which is exactly what we in the media space are seeking.
Needless to say, I was impressed by just about everyone I spoke to. Check out what execs from many of the companies represented at Gigaom Structure Data had to say in this video.
Let’s begin in the area of data blending. According to a recent Gigaom Research study, “Data blending is the term used to describe the performance of analytics on a collection of data sets, each emanating from a different data source.” The study notes that data discovery products currently on the market are now capable of real-time or near-real-time data blending, allowing users to pull in data, quickly mash it up and analyze it. Some companies, like Cloudera, offer the ability to serve many different types of user workloads with access to the same data set within a single interface. This interface, according to Cloudera CEO Tom Reilly, “can manage all types of analytics and identify trends in customer experience.”
I wonder: can we take television data (Nielsen, STB or other), online, mobile and tablet data (name your sources), and maybe even print, retail or transactional data, place it all on one interface and blend it? I know some companies in our industry offer this capability now, but with limited datasets and outputs, the space still seems fairly siloed.
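To make the data-blending idea concrete, here is a minimal sketch of joining two cross-platform datasets on a shared key. The datasets, column names and markets are entirely hypothetical (invented for illustration, not drawn from Nielsen or any vendor), and real blending products handle this at far greater scale and in near real time; the core operation, though, is a join like this one.

```python
import pandas as pd

# Hypothetical TV ratings by market (a Nielsen/STB-style source)
tv = pd.DataFrame({
    "market": ["NY", "LA", "CHI"],
    "tv_rating": [4.2, 3.8, 3.1],
})

# Hypothetical digital engagement from a separate source (site/app analytics)
digital = pd.DataFrame({
    "market": ["NY", "LA", "CHI"],
    "site_visits": [120_000, 95_000, 70_000],
})

# "Blend": join the two sources on a shared key so both can be
# analyzed together in one table
blended = tv.merge(digital, on="market", how="outer")

# A simple cross-source metric: digital visits per TV ratings point
blended["visits_per_point"] = blended["site_visits"] / blended["tv_rating"]
print(blended)
```

An outer join is used so that markets present in only one source still survive the blend, which matters when sources have uneven coverage.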
All of this still seems very futuristic to me, but it is in fact fairly common in applications today. Some firms, such as AlchemyAPI, are in the business of trying to “make computers more human,” according to CEO Elliot Turner, while others, like Watson Solutions’ Stephen Gold, offer “cognition as a service.”
But what does this really mean for media? I think the possibilities are endless. We may finally be able to pinpoint how a viewer fully interacts with a piece of content, including the soft measurements of sentiment and engagement. We may even be able to use AI to predict which pieces of content are most effective at driving human behavior, whether tune-in, reaction, affection or a call to action.
Combined with a level of AI, machines can be programmed to learn from experience. In this way data can be mined more efficiently and with greater precision to create software applications. Tim Tuttle of Expect Labs explained, “We built MindMeld to listen to conversations and find information for you. Now we want to take that technology and apply it to any data you have.” Think of Siri, which seems to retain knowledge with each interaction. Siri is just the tip of the iceberg in machine learning capabilities. At some point we may rely solely on the machine to map out data insights. I hope I am retired by that time.
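"Learning from experience" can be illustrated with a toy model. The sketch below is not how MindMeld or Siri actually work (those systems are far more sophisticated); it just shows the basic idea that a program's predictions improve as it observes more data, with no rules hand-coded in advance.

```python
from collections import Counter

class NextWordPredictor:
    """Toy 'learn from experience' model: predicts the word that has
    most often followed a given word in everything it has seen."""

    def __init__(self):
        self.follows = {}  # word -> Counter of the words seen after it

    def learn(self, sentence):
        # Record every adjacent word pair in the sentence
        words = sentence.lower().split()
        for cur, nxt in zip(words, words[1:]):
            self.follows.setdefault(cur, Counter())[nxt] += 1

    def predict(self, word):
        # Return the most frequent follower, or None if never seen
        counts = self.follows.get(word.lower())
        return counts.most_common(1)[0][0] if counts else None

model = NextWordPredictor()
model.learn("viewers watch live sports")
model.learn("viewers watch streaming video")
model.learn("viewers watch live events")
print(model.predict("watch"))  # "live": seen twice after "watch"
```

Each call to `learn` is new "experience": the model's answer for a word can change as more text accumulates, which is the kernel of what larger machine learning systems do at scale.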
Data Discovery Tools
“When it comes to big data understanding, we are in the Dark Ages,” said Donald Farmer, vice president of product management at Qlik. This is exacerbated by unnatural interfaces that once had a purpose under old media but no longer apply. Farmer gave the example of the QWERTY keyboard layout, which was originally designed to keep typewriter keys from jamming. We still use this unnatural interface even though our devices today have no typewriter keys to jam. How do we advance beyond old legacy systems and processes?
This question, and many others like it, arise from a huge data surge that impacts many businesses, including ours in media. Attending the Gigaom event, I realized that we in media are not the only ones grappling with processing, measurement, analysis and data-wrangling issues.