Digesting 'Big Data' Shouldn't Produce Heartburn

Globally, businesses created 1.8 zettabytes of data in 2011, according to research firm IDC. Data gives marketers the ability to better understand the relationship between brands and human behavior, but many just can't fathom the enormity of the bits and bytes generated by emerging online media. Two white papers published this month try to put the concept into context.

The first, published this week by Winterberry Group, the Interactive Advertising Bureau (IAB) and IBM, calls the movement "big data." It examines data-driven applications such as audience optimization, channel optimization, advertising yield management and targeted media buying. The ability to generate mounds of data isn't exactly new: other industries, such as the electronics supply chain or inventory and automation systems powered by radio frequency identification technology, have long done the same. Online marketers have only just begun to experience this phenomenon.

One of the biggest hindrances to these new uses of data in the online advertising industry remains old business processes, along with ad sales reps who lack the technology experience to sell the inventory, according to the white paper.

Companies continue to suffer from an antiquated culture focused on "traditional media management" even when they use new digital channels, according to the white paper, and marketers must learn how to combine, sift and sort through the garbage. Traditional methods force data into silos separate from other marketing channels and resources, when the data should be combined to give marketers a complete view of the consumer.

BCG Perspectives, published by the Boston Consulting Group this month, also highlights the importance of online data. The white paper, titled "The Evolution of Online-User Data," looks at shifting campaign strategies, rich media use, changes in the use of ad exchanges and demand-side platforms, and mounting financial pressure on publishers.

BCG also points to concerns about accuracy, such as different providers inferring different genders from data collected by the same cookie, a common occurrence when computers are shared. In fact, the white paper lists concerns marketers should be aware of before launching campaigns, along with data classifications and descriptions of what it calls the next generation of data-collection strategies, expected to emerge during the next three years.

2 comments about "Digesting 'Big Data' Shouldn't Produce Heartburn".
  1. Andre Szykier from maps capital management, January 18, 2012 at 6:24 p.m.

    Zetta-, exa- and gigabytes of big data are not the problem or the headache.

    You have to determine, in each case, the "half-life" of the data. In other words, at what point does the value of the information decay by 50 percent?
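    The half-life idea can be made concrete with a simple exponential-decay model. This is an illustrative sketch only; the one-hour and 90-day half-lives below are hypothetical values, not figures from the comment:

```python
# Exponential decay of information value: v(t) = v0 * 0.5 ** (t / half_life)
def value_remaining(v0, t_hours, half_life_hours):
    """Fraction of a datum's original value left after t_hours."""
    return v0 * 0.5 ** (t_hours / half_life_hours)

# A real-time grid reading (assumed 1-hour half-life) vs. a demographic
# profile (assumed 90-day half-life), one day after collection:
grid = value_remaining(1.0, 24, 1)           # ~6e-8 of its value: nearly worthless
profile = value_remaining(1.0, 24, 90 * 24)  # ~0.99: still valuable
```

    The point of the model: data with a short half-life must be acted on in-stream, while slow-decaying data can wait for batch analysis.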

    A few examples:

    Smart grid data (machine to machine) has no human intermediary beyond someone, or some system, observing aggregates and, in some cases, the state of health of a specific hardware asset. A transformer on a pole needs to be monitored across 10 metrics in real time, perhaps once a minute; a transformer vault, once every 5 seconds; and a grid tie, once a second.

    Capturing this data means you need to analyze it during transmission (data in motion), not after it goes into a data repository. As a result, petabytes of real-time data per unit of time reduce to gigabytes of storage for forecasting in a data warehouse (DW). So there is no need for big data analytics here.
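    The sampling rates above can be turned into rough raw-volume estimates. This is a back-of-the-envelope sketch; the per-reading size and the fleet counts are assumptions for illustration, not figures from the comment:

```python
# Rough raw data rates for the grid assets described above.
# Assumption: each reading is ~100 bytes (timestamp + asset id + value).
BYTES_PER_READING = 100

def daily_bytes(metrics, readings_per_second, asset_count):
    """Raw bytes generated per day across a fleet of identical assets."""
    return metrics * readings_per_second * 86_400 * BYTES_PER_READING * asset_count

# Hypothetical fleet sizes for a mid-size utility:
pole  = daily_bytes(10, 1 / 60, 100_000)  # 10 metrics, once a minute
vault = daily_bytes(10, 1 / 5, 10_000)    # 10 metrics, every 5 seconds
tie   = daily_bytes(10, 1, 1_000)         # 10 metrics, once a second

total_tb = (pole + vault + tie) / 1e12    # ~0.4 TB of raw readings per day
```

    Even at these rates the raw stream is only hundreds of gigabytes a day, and after in-stream aggregation only a small fraction of it needs to reach the warehouse.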

    Gaming: Online and mobile game vendors want to know how players use the game, whom they play against, at which level, and other factors that may stimulate purchases of virtual goods offered in the game. Real-time data analytics over 50 million players, of which 10 percent are active, is not even in the tens of gigabytes, and for historical storage one could use a simple data appliance like Vertica's for the gaming world.

    Financial data has even lower volumes (even if we include stock transactions per day). Again, no big data here.

    My point is that analysts think they need to capture and analyze zettabytes of data. Wrong.

    What you need is to capture data in motion, using in-memory analytics on the transactions and an aggregate DW for the results.

    As an example, check out

    to see what this means.

  2. Sarah Federman from Telmar, January 19, 2012 at 9:49 a.m.

    What a good white paper. Telmar really believes in using data to help agencies find the right target at the right time. With TelmarMatterhornROI and Media360, all agencies have the ability to integrate the best of what's out there; as more data comes, we'll just open up the funnel.
