Google And The Path Of Objective Opinions

Guillaume Chaslot, a computer programmer who spent time working on recommendations at YouTube, began tracking them in 2016. He believes the algorithm leads viewers down a specific path -- and not necessarily the one a person searching for answers intends to take.

For some phrases, like “vaccine facts” or “global warming,” the algorithm pushes those searching for information toward conspiracy theories or anti-media videos, reports the MIT Technology Review. Chaslot believes the algorithms tend to favor videos of “more divisive politicians, who talk in an aggressive, bullying manner,” per the report.

This theory suggests that even when people search for factual answers, they are led down the path chosen by the algorithm -- not necessarily one filled with objective opinions.

Google might dispute that theory -- at least for now. In an effort to provide an unbiased view, both Google and Bing said they would provide more than one answer to search queries in featured snippets on their respective engines. On Google, the answers can appear as text or video, but all come from third-party websites such as publishers.
Chaslot still believes it’s a problem, so he built a website called AlgoTransparency that lets users see where YouTube’s algorithm takes viewers who follow its recommendations.

Starting from search terms, AlgoTransparency’s robot crawls the top recommended videos and keeps track of the ones that are recommended most often, according to the website -- whether someone searches for videos about mass shootings, science or other select topics. The report suggests that Chaslot “cherry-picked” a few terms to scrape during the initial development phase.
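The crawl-and-count approach described above can be sketched roughly as follows. This is a minimal illustration, not AlgoTransparency's actual code (which isn't described in detail in the report): the `get_recommended` function, the toy recommendation graph, and all parameters here are hypothetical stand-ins for a real scraper hitting YouTube's recommendation sidebar.

```python
from collections import Counter

def crawl_recommendations(seed_videos, get_recommended, depth=2, per_video=3):
    """Follow recommendation chains from seed videos for `depth` hops,
    counting how often each video is recommended along the way.

    `get_recommended` is a hypothetical stand-in for whatever call a
    real crawler would make to fetch a video's recommended list.
    """
    counts = Counter()
    frontier = list(seed_videos)
    for _ in range(depth):
        next_frontier = []
        for video in frontier:
            recs = get_recommended(video)[:per_video]
            counts.update(recs)       # tally every recommendation seen
            next_frontier.extend(recs)
        frontier = next_frontier      # descend one more hop
    return counts

# Toy recommendation graph standing in for live YouTube data.
GRAPH = {
    "seed1": ["a", "b"],
    "seed2": ["b", "c"],
    "a": ["b"],
    "b": ["c"],
    "c": ["b"],
}

counts = crawl_recommendations(["seed1", "seed2"],
                               lambda v: GRAPH.get(v, []))
print(counts.most_common(1))  # "b" surfaces most often in this toy graph
```

The key idea matching the article: videos that the algorithm keeps routing traffic toward accumulate the highest counts, regardless of which seed query started the crawl.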

For example, YouTube’s most recommended videos for the query “is the earth flat or round” include NASA’s Live Earth From Space feed. The most frequent terms mentioned in the titles of these videos include “flat,” “NASA,” “Proof,” and “Secret,” according to AlgoTransparency’s website.

Chaslot worked at YouTube in 2011 and then at Google until 2013. In a tweet, he claims he was fired for trying to give users more control over the algorithms that recommend content.

There don't seem to be ill feelings between Chaslot and Google, according to reports -- he simply wants to make YouTube viewers think more about how recommendations are used to increase views and lead those searching for answers down a specific path.
