Improving Online Ad Campaign Measurement

October 22, 2001
The Interactive Advertising Bureau (IAB) has reached agreement with ABCi, an industry leader in online third-party verification, for ABCi to create and maintain the ABCi/IAB master industry list of spiders and robots. The list will be updated monthly and is available to IAB members and ABCi clients free of charge. It is the third step in a chain of recommendations, all geared toward improving the quality of online ad campaign measurement, which will culminate in guidelines the IAB will issue by the end of the year.

"The use of the ABCi/IAB master list of spiders and robots will go a long way toward eliminating inconsistent counts. The primary benefits of the list will be to reduce the time, effort and cost associated with report reconciliation for publishers, third-party ad servers and agencies, and to provide advertisers with accurate measurement metrics," noted Robin Webster, President & CEO of the IAB.

"By utilizing this approved and constantly updated master list, publishers will be able to filter out robot traffic from their reporting to advertisers, instilling greater confidence in the medium."

Bruce Smith, president and CEO of ABCi, added that spider and robot activity is believed to be the single largest contributor to inaccurate measurement. "ABCi is also in the process of quantifying the percentage of spider and robot activity on the Web in an IAB-sponsored study."

ABCi has maintained a robot list since May 1995, compiled from over 8,000 audits and 200 trillion transactions. In addition to its own ongoing work in the area, the company invites other organizations to suggest updates to the list, provided that proper documentation is supplied so that ABCi's auditors can analyze the activity and confirm that the user agent is, in fact, robotic. Challenges to the list are also welcome, again accompanied by proper documentation. As part of the IAB/ABCi relationship, ABCi will compile information and identify the impact of robotic activity based on ongoing audits.

Spiders and robots automatically search Web content for information. They are used for a number of purposes, including retrieving content for commercial search engines, gathering pages for personal offline viewing, tracking advertising activity by Web site and by advertiser, and identifying "what's new" information. At any given time there may be a multitude of these agents on a site. Without this master list, sites may have difficulty distinguishing non-human spider and robot activity from consumer interactions, thus skewing the metrics reported to advertisers.
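To illustrate the kind of filtering the master list enables, the sketch below shows one way a publisher might exclude robot traffic from impression counts by matching user-agent strings against a maintained list. The file names, log layout, and list format here are assumptions for illustration only; the article does not describe the actual format of the ABCi/IAB list or of any publisher's ad-serving logs.

```python
# A minimal sketch of user-agent filtering against a robot list.
# The file names, log format, and list format are hypothetical.

import csv


def load_robot_agents(path):
    """Load known robot/spider user-agent substrings, one per line."""
    with open(path, encoding="utf-8") as f:
        return [line.strip().lower() for line in f if line.strip()]


def is_robot(user_agent, robot_agents):
    """Treat a hit as robotic if its user-agent matches any known entry."""
    ua = user_agent.lower()
    return any(robot in ua for robot in robot_agents)


def count_human_impressions(log_path, robot_agents):
    """Count ad impressions after excluding robot/spider traffic."""
    human, robotic = 0, 0
    with open(log_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):  # expects a "user_agent" column
            if is_robot(row["user_agent"], robot_agents):
                robotic += 1
            else:
                human += 1
    return human, robotic


if __name__ == "__main__":
    robots = load_robot_agents("robot_list.txt")  # hypothetical list file
    human, robotic = count_human_impressions("ad_log.csv", robots)
    print(f"Reportable impressions: {human} (filtered {robotic} robotic hits)")
```

In practice the matching rules would follow whatever documentation accompanies the master list, and the filtered counts would feed the reports that publishers deliver to advertisers.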

The IAB's latest effort to improve the quality of online campaign measurement has been a multi-step process. After assembling a task force of 11 major companies consisting of third-party ad servers and publishers, the IAB commissioned PricewaterhouseCoopers to conduct a study of the process each of these companies uses to measure ad impressions, unique users, clicks, total visits and page impressions.

A key finding of this initial study is that the participating companies define these terms in widely differing ways. In response, the IAB last week issued a comprehensive glossary of hundreds of terms in common use. Next steps in the process include additional analysis and research by ABCi on the effect of spiders and robots on the two methods of ad serving, client-initiated and server-initiated, followed by the release of measurement guidelines by year-end.