Last week, Adweek reported on a study conducted by IPG Mediabrands’ research arm IPG Media Lab and CHEQ, a brand safety company focused on cyber security. The study found that consumers blame brands whose ads appear next to inappropriate or offensive video content, assuming the placement is intentional.
The results reveal how important it’s becoming for advertisers to know and control where their brands are showing up across the internet.
Now, a new aggregation project housed at the Tow-Knight Center for Entrepreneurial Journalism at CUNY’s Craig Newmark Graduate School of Journalism, run with the aid of Trust Metrics, hopes to make the process of connecting with trusted, quality publishers easier for advertisers.
The project will aggregate signals collected from websites through Trust Metrics’ crawler-based technology to compose a more complete picture of which outlets are most reliable in terms of truthfulness of content, lack of bias and overall quality of the publication.
In a Medium post introducing the project, the organizers wrote that neutrality is no longer an option: “Every major player in these ecosystems is forced to make judgments about sources, because a small but impactful number of those sources is attempting to manipulate technology and ad companies and, ultimately, the public conversation.”
The group stated its first task “will be to analyze the methodology, standards and output of the many initiatives that are generating signals and standards of quality in news.”
Following that, it will consider the signals currently collected by Trust Metrics and decide which should be used in the new project. Finally, the team will decide how new signals can be created based on what is missing.
Websites will be analyzed for the following criteria: “overall site quality, ad environment, safety (hate speech, violence, profanity, pornography, and an overall safety measurement), user-generated content and context (site category and a number of news and politics-specific subcategories).”
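To make the criteria above concrete, here is a minimal sketch of how per-site signals in those categories might be assembled into a single profile. The function names, score ranges and aggregation rule are invented for illustration; they are not Trust Metrics’ actual methodology.

```python
# Hypothetical sketch: combining crawler signals into a per-site profile.
# Category names follow the criteria quoted above; the 0-to-1 scores and
# the min-based "overall safety measurement" are assumptions, not the
# project's real scoring scheme.

SAFETY_SIGNALS = ["hate_speech", "violence", "profanity", "pornography"]

def overall_safety(safety_scores):
    """Fold the per-category safety scores (0 = unsafe, 1 = safe) into an
    overall measurement -- here simply the minimum, so one weak category
    flags the whole site."""
    return min(safety_scores[s] for s in SAFETY_SIGNALS)

def site_profile(site_quality, ad_environment, safety_scores,
                 ugc_score, site_category):
    """Assemble the picture of a site described in the article."""
    return {
        "overall_site_quality": site_quality,
        "ad_environment": ad_environment,
        "safety": {**safety_scores, "overall": overall_safety(safety_scores)},
        "user_generated_content": ugc_score,
        "context": {"site_category": site_category},
    }

profile = site_profile(
    site_quality=0.9,
    ad_environment=0.8,
    safety_scores={"hate_speech": 1.0, "violence": 0.95,
                   "profanity": 0.7, "pornography": 1.0},
    ugc_score=0.6,
    site_category="news/politics",
)
print(profile["safety"]["overall"])  # 0.7 -- profanity is the weakest signal
```

The choice of the minimum as the overall safety measure reflects the brand-safety framing: an advertiser cares about the worst content a crawler finds, not the average.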
A human team will then analyze the results, training the crawlers to look for contradictory content within the same publication.
Ultimately, the team hopes the final grouping of information will be another useful tool for advertisers hoping to gain more control over how their brands are represented online. They write, “For perspective, this project tackles only a very small slice of a much bigger problem around credibility, trust, and quality that is being addressed in many ways by journalists, academics, technology companies, publishers, government, civil society and citizens themselves.”