Google is attempting to clarify its policies and prevent extremist videos from surfacing on YouTube, its video hub, with new guidelines. In a blog post on Sunday, Google said it would identify and remove videos promoting terrorism and extremism on YouTube that violate its community guidelines.
For offensive videos that don’t meet its standard for removal, Google said the videos would carry warnings and would not be eligible for advertising, recommendations, endorsements, or comments. Such videos are already barred from carrying advertising, but until now they faced no other restrictions.
Google has already pledged more resources to root out offensive videos, but the company now says it will also recruit experts from non-governmental organizations to help it eliminate content involving hate speech, self-harm, and terrorism. The company will also work with counter-extremist groups to help identify content aimed at radicalizing or recruiting extremists.