Just an Online Minute... Spiders and Robots
Spiders and robots automatically search Web content for information. They are used for a number of tasks, including retrieving content for commercial search engines, gathering pages for offline viewing, tracking advertising activity, and identifying "what's new" information.
At any given time, a multitude of these agents may be active on a site. Without this master list, sites may have trouble distinguishing non-human spider and robot activity from consumer interactions, skewing the metrics reported to advertisers.
The ABCi/IAB list will be updated monthly and is available to IAB members and ABCi clients free of charge. It is the third step in a chain of recommendations that will culminate in guidelines the IAB will issue by the end of the year, all geared toward improving the quality of online ad campaign measurement.
IAB President/CEO Robin Webster said the primary benefits of the list will be to reduce the time, effort and cost to publishers, third party ad servers and agencies associated with report reconciliation, and to provide advertisers with accurate measurement metrics. "Publishers will be able to filter out robot traffic from their reporting to advertisers, instilling greater confidence in the medium."
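In practice, filtering robot traffic from ad reports amounts to matching each logged request's user-agent against the known-robots list before counting impressions. The sketch below illustrates the idea in Python; the signature list and log format are hypothetical stand-ins, not the actual ABCi/IAB file.

```python
# Hypothetical excerpt of a spiders-and-robots master list.
# A real list would be far longer and updated monthly.
KNOWN_BOT_SIGNATURES = [
    "googlebot",
    "slurp",
    "offline-reader",
]

def is_robot(user_agent: str) -> bool:
    """Return True if the request's user-agent matches a known robot signature."""
    ua = user_agent.lower()
    return any(sig in ua for sig in KNOWN_BOT_SIGNATURES)

def filter_human_hits(log_entries):
    """Keep only log entries that do not match the robot list,
    so only human impressions are reported to advertisers."""
    return [e for e in log_entries if not is_robot(e["user_agent"])]

hits = [
    {"page": "/ad1", "user_agent": "Mozilla/5.0 (Windows NT 10.0)"},
    {"page": "/ad1", "user_agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"},
]
human = filter_human_hits(hits)
# The Googlebot request is dropped; one billable impression remains.
```

Because the master list is shared by publishers, ad servers, and agencies alike, everyone filters against the same signatures, which is what reduces discrepancies during report reconciliation.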
Next steps in the process will include additional analysis and research by ABCi on the effect of spiders and robots on the two different methods of ad serving, client-initiated and server-initiated, followed by the release of measurement guidelines by year-end.