
IBM Watson Advertising plans to
embark on a six-month research project to explore whether artificial intelligence (AI) can detect and mitigate bias in advertising.
The research will rely on IBM’s open-source AI
technology and use campaign performance data from the Ad Council, IBM’s initial partner.
Data will be sourced from the Ad Council’s “It’s Up to You”
COVID-19 vaccine education campaign, as well as the “Love Has No Labels” campaign, a movement to promote acceptance and inclusion of all people.
Through the use of
advanced technology, IBM plans to research areas such as audience segmentation, creative messaging, performance optimization and campaign impact.
Bias can occur when an AI training
dataset includes many more data points for one group than for another. Algorithms exist to detect and mitigate bias in AI systems, but they are complex, and different data types, such as
images and text, require different detection techniques.
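The imbalance described above can be quantified before a model is ever trained. As an illustration only, here is a minimal sketch in plain Python of two common checks: per-group representation counts and the disparate-impact ratio. The data, group names, and thresholds are invented for the example; this is not IBM's actual pipeline.

```python
# Minimal sketch (hypothetical data, not IBM's method) of two pre-training
# bias checks: group representation counts and the disparate-impact ratio.
from collections import Counter

# Hypothetical ad-response records: (audience_group, clicked)
records = [
    ("group_a", 1), ("group_a", 1), ("group_a", 0), ("group_a", 1),
    ("group_a", 0), ("group_a", 1), ("group_b", 0), ("group_b", 1),
]

def group_sizes(rows):
    """Count data points per group; a skewed ratio signals representation bias."""
    return Counter(group for group, _ in rows)

def positive_rate(rows, group):
    """Fraction of positive outcomes (e.g., clicks) within one group."""
    outcomes = [y for g, y in rows if g == group]
    return sum(outcomes) / len(outcomes)

def disparate_impact(rows, protected, reference):
    """Ratio of positive rates; values far below 1.0 suggest outcome bias."""
    return positive_rate(rows, protected) / positive_rate(rows, reference)

sizes = group_sizes(records)  # group_a appears 3x as often as group_b
di = disparate_impact(records, "group_b", "group_a")  # 0.5 / (4/6) = 0.75
```

A common rule of thumb treats a disparate-impact ratio below 0.8 as a red flag, though real mitigation pipelines use far more sophisticated metrics.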
Industry experts believe bias in advertising has been prevalent since the industry's inception.
In 2020, IBM released survey data at a conference estimating that 85% of AI
professionals believe the industry has become more diverse during the past few years.
At the same time, 74% of AI professionals believe diversity has not improved, and say the industry must
become more diverse to reach its potential. And two in five AI professionals said they face hurdles in implementing AI, reporting challenges in building the technology or getting their organization
to adopt it.
The research is intended to achieve a better understanding of how to potentially prevent bias with the use of AI, and to ultimately help organizations more effectively create and
execute campaigns.
IBM has studied bias in advertising for years, as have many other organizations. Countless research papers focus on bias in advertising, and
many explore the role algorithms play.
One, published in May 2021, examines gender bias in social-media advertising in the context of STEM careers. Its results suggest that advertising
algorithms themselves are not gender-biased, but that economic forces can lead to unintended and uneven outcomes.
"A gender-neutral strategy is less likely to reach women because women are more likely to
react to advertising," the authors write. "Targeting them is more expensive and economic forces unintentionally favor men."
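The mechanism the authors describe can be illustrated with simple arithmetic: under a gender-neutral strategy, a fixed budget buys fewer impressions for whichever group costs more to reach. The sketch below uses invented budget and cost figures purely to show the dynamic; it is not drawn from the study's data.

```python
# Illustration of the economic mechanism (invented numbers): with equal
# budget shares, the group that is more expensive to reach receives
# fewer impressions, even though no algorithm "targets" either group.
BUDGET = 1000.0                      # total ad spend, arbitrary units
CPM = {"men": 5.0, "women": 8.0}     # assumed cost per 1,000 impressions

def impressions(budget_share, cpm):
    """Impressions bought for one group given its share of the budget."""
    return budget_share / cpm * 1000

# Even a "neutral" 50/50 budget split delivers unevenly:
men_reached = impressions(BUDGET / 2, CPM["men"])      # 100,000 impressions
women_reached = impressions(BUDGET / 2, CPM["women"])  # 62,500 impressions
```

In a real auction the skew is typically worse, because optimization systems actively shift spend toward the cheaper impressions.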
While one potential solution is to run separate campaigns for men and women, the authors
write, anti-discrimination legislation in many countries does not allow companies to target employment ads to only one gender. Ironically, laws designed to prevent discrimination can
rule out a fairly simple way to correct bias in online targeting on Facebook and other platforms.
IBM plans to release its initial research findings this fall.