Over 60% of the top 1,000 programmatic publishers and nearly 170,000 publishers overall have implemented ads.txt so far, making it one of the most widely adopted industry-wide initiatives in digital advertising.
The rapid adoption of ads.txt, introduced by the IAB in May 2017, highlights the benefits it brings to the digital advertising ecosystem.
But ads.txt in its current form still has limitations: It does not apply to connected TV (CTV) or mobile in-app inventory, the information it exposes lacks granularity, and maintaining it is a manual process prone to errors.
So what should “ads.txt 2.0” look like?
Getting To The Specifics
Ads.txt has proven popular mainly because it makes detailed insights publicly available in an ecosystem where transparency was not a standard practice a few years ago — all with just a simple text file.
Few industries currently match the level of visibility generated by ads.txt.
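That simplicity is easy to see in practice. Per the IAB spec, each ads.txt record is a comma-separated line: the ad system's domain, the publisher's account ID on that system, the relationship (DIRECT or RESELLER), and an optional certification authority ID. A minimal parsing sketch (the sample records below are illustrative, not taken from any real publisher's file):

```python
# Illustrative sample of an ads.txt file; the entries are invented.
SAMPLE = """
# ads.txt for example-publisher.com
google.com, pub-1234567890123456, DIRECT, f08c47fec0942fa0
openx.com, 537100188, RESELLER
"""

def parse_ads_txt(text):
    """Parse ads.txt records into dicts, skipping comments and variables."""
    records = []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # strip comments and blanks
        if not line or "=" in line.split(",")[0]:
            continue  # ignore variables like CONTACT=... in this sketch
        fields = [f.strip() for f in line.split(",")]
        if len(fields) >= 3:
            records.append({
                "domain": fields[0].lower(),
                "seller_id": fields[1],
                "relationship": fields[2].upper(),
                "cert_id": fields[3] if len(fields) > 3 else None,
            })
    return records

for rec in parse_ads_txt(SAMPLE):
    print(rec["domain"], rec["relationship"])
```

A few dozen lines of comma-separated text, fetched from a well-known path on the publisher's domain, is the entire mechanism — which is exactly why adoption was so fast.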
While making information publicly available is valuable, was it the primary goal of ads.txt? In other words: is the objective of ads.txt to provide a global map of relationships between publishers and SSPs, or to provide buyers with comprehensive information on the journey of each ad request they receive?
Right now, ads.txt is closer to the former.
A new iteration of ads.txt would prove even more valuable if it could provide data specific to each ad request, detailing the list of partner(s) involved in the transaction. This would further cut down fraud.
For this to be possible, all of this information would need to be passed in the bid request itself, instead of living on public pages that lack specificity. The future IAB standard OpenRTB 3.0 is laying the groundwork for this.
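To make the idea concrete, here is a sketch of how a chain of sellers might be carried inside a bid request and checked by a buyer. The field names and structure are assumptions for illustration only, not a published OpenRTB layout:

```python
import json

# Hypothetical per-request supply path, embedded in a bid request extension.
# Field names ("supply_chain", "seller_domain", etc.) are invented here.
bid_request_ext = {
    "supply_chain": {
        "complete": True,  # every hop in the path is accounted for
        "nodes": [         # ordered from the publisher's side outward
            {"seller_domain": "publisher-ssp.example", "seller_id": "pub-001"},
            {"seller_domain": "reseller.example", "seller_id": "r-4821"},
        ],
    }
}

def chain_is_transparent(ext):
    """True if the declared path is complete and every hop is identified."""
    chain = ext.get("supply_chain", {})
    return bool(chain.get("complete")) and all(
        n.get("seller_domain") and n.get("seller_id")
        for n in chain.get("nodes", [])
    )

print(json.dumps(bid_request_ext["supply_chain"], indent=2))
print("transparent:", chain_is_transparent(bid_request_ext))
```

A buyer receiving this could decline any request whose path is incomplete or contains an unrecognized hop — per-request enforcement rather than a static public map.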
Going Beyond Self-Declaration
Ads.txt is currently self-declared. The assumption is that publishers want to list only the trusted sellers they work with in order to minimize fraud. But we have seen that this is not always the case: less premium publishers may be tempted to list as many partners as possible to increase the chances that a DSP bids on their inventory.
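The buy-side check that makes over-listing attractive is a simple lookup: a DSP verifies that the seller claimed in a bid request appears in the publisher's ads.txt entries, so every extra listed partner is another path that passes the check. A minimal sketch, with invented entries:

```python
# (ad system domain, seller account ID) pairs as parsed from a publisher's
# ads.txt file. These entries are illustrative.
authorized = {
    ("ssp.example", "1001"),
    ("reseller.example", "9042"),
}

def is_authorized(exchange_domain, seller_id):
    """Return True if the claimed seller is listed in the publisher's ads.txt."""
    return (exchange_domain.lower(), seller_id) in authorized

print(is_authorized("ssp.example", "1001"))    # listed seller
print(is_authorized("rogue.example", "7777"))  # unlisted: likely spoofed
```

Nothing in this check asks whether a listed relationship is genuine — which is exactly the gap that vetting would close.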
A future version of ads.txt could include some form of control or vetting to deter this practice.
One way to do this: Have a single repository owned by a committee that would review the information provided by publishers and verify it (e.g. actual contracts could prove business relationships). Another option many are discussing would be a global and decentralized repository, based on blockchain technology. The vetting process would then be the responsibility of a community of users.
Whether it is centralized or decentralized, having a single location for all this information would facilitate its access and improve version history, which is lacking under the current standard.
We would need to ensure that the controlling body is fully independent, with no relationship to any publisher, advertiser or intermediary, and that the vetting process is not dominated by a handful of influential individuals who would control all decisions.
Gaining Additional Insights
The information contained in the current standard is limited to a list of SSPs or publishers authorized to sell and their associated “publisher code.” A future standard could go further and add more detail.
For instance, it could include all the possible paths and intermediaries involved.
In addition, it could be used to bring full visibility on the fees taken by each partner involved in the transaction. Ads.txt can bring greater clarity to many opaque corners in the ecosystem by building on its strong foundation.
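If such a future standard disclosed each intermediary's fee, a buyer could compute how much of a bid actually reaches the publisher. A purely hypothetical sketch — the partners and fee values below are invented for illustration:

```python
# Hypothetical disclosed fee schedule for one supply path.
path = [
    {"partner": "dsp.example", "fee_pct": 10.0},
    {"partner": "exchange.example", "fee_pct": 15.0},
]

def working_media_share(path):
    """Fraction of gross spend left after each intermediary takes its fee."""
    share = 1.0
    for hop in path:
        share *= 1.0 - hop["fee_pct"] / 100.0
    return share

print(f"{working_media_share(path):.2%} of spend reaches the publisher")
```

Even this toy calculation shows why fee disclosure matters: stacked percentages compound, and today buyers usually cannot see where the difference between gross and net spend goes.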
Identifying the main objectives of ads.txt 2.0 and prioritizing them will determine what shape the standard takes in the future. The range of possibilities is large, and creating the best possible standard will go a long way toward bringing additional layers of transparency to the industry.