Microsoft Bing Suggests Child Porn, Study Finds

Microsoft's Bing search engine served child pornography and suggested additional search terms for illegal images, according to research commissioned by TechCrunch.

The research, conducted by AntiToxin Technologies, found that Bing served photos of nude children and recommended related keywords, both in response to searches for child pornography and as suggested terms when those images were clicked.

“Clearly these results were unacceptable under our standards and policies, and we appreciate TechCrunch making us aware,” wrote Jordi Ribas, CVP Bing and AI products, in an email to Search Marketing Daily. “We acted immediately to remove them, but we also want to prevent any other similar violations in the future. We’re focused on learning from this so we can make any other improvements needed.”

Since being made aware of the incident, Bing has cleaned up and fixed the identified issues through its usual takedown processes, and is now reviewing similar queries that may also be a problem.

Bing also took steps to eliminate related search suggestions, and made changes so users can report problematic image and video content.

Because internet content continually changes, Bing also uses a combination of automated techniques, such as PhotoDNA, and human moderation.

TechCrunch commissioned AntiToxin to investigate how illegal child content gets served on Bing. The desktop search research, conducted from December 30, 2018, through January 7, 2019, used the version of Bing with “Safe Search” turned off. Safe Search aims to filter out adult images and videos.

During that time, AntiToxin’s research found that terms like “porn kids,” “porn CP” and “nude family kids” all served illegal child exploitation imagery. It also found that Bing led people to this type of content even when not searching on these specific search terms.

For example, when researchers searched for “Omegle Kids,” referring to a video chat app popular with teens, Bing’s auto-complete suggestions included “Omegle Kids Girls 13,” which surfaced extensive child pornography when searched. When someone clicked on those images, Bing served more illegal child abuse content in its Similar Images feature.
