Tumblr Bans Sexually Explicit Content

In the wake of losing its place in the App Store over child porn, Tumblr has decided to ban any sexually explicit content from its platform.

“Starting Dec. 17, adult content will not be allowed on Tumblr, regardless of how old you are,” the social network stated Monday. Last month, the Oath-owned app was removed from Apple’s App Store.

It wasn’t until CNET’s Download.com reported the reason for the expulsion that Tumblr copped to its child-porn predicament. To help rid its network of sexually explicit material, Tumblr is now encouraging users to flag it as such.

The network defines “adult content” as photos, videos, or GIFs that show real-life human genitals or female-presenting nipples, as well as any content depicting sex acts.

Some exceptions to Tumblr's new rule include images of female-presenting nipples in connection with breastfeeding, birth or after-birth moments, and health-related situations, such as post-mastectomy or gender confirmation surgery.

Written content such as erotica, nudity related to political or newsworthy speech, and nudity found in art (such as sculptures and illustrations) will remain fair game.

After Dec. 17, blogs that were previously self-flagged or flagged by Tumblr as “explicit” will still be overlaid with a content filter. Although some of the content on such blogs might now violate Tumblr’s policies, and be flagged accordingly, these publishers will still be able to post content that falls within the platform’s rules.

Following its removal from the App Store last month, Tumblr insisted that every image uploaded to its network is scanned against an “industry database” of known child sexual abuse material.

The problem arose when “a routine audit discovered content on our platform that had not yet been included in the industry database,” Tumblr said at the time.
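Tumblr hasn’t said which scanning technology it uses, but matching uploads against a shared database of known material generally works along the lines sketched below. Everything in this sketch is an illustrative assumption: the function name, the sample hash set, and the use of plain SHA-256 (real industry systems typically rely on perceptual hashes such as Microsoft’s PhotoDNA, which still match resized or re-encoded copies). The sketch also shows why content “not yet been included in the industry database” slips through: a lookup can only catch material that has already been hashed.

```python
import hashlib

# Hypothetical sketch only: Tumblr has not disclosed its matching technology.
# A plain SHA-256 lookup stands in for a perceptual-hash database here.

# Assumed stand-in for the shared database of known-material hashes.
# (This sample entry is just the SHA-256 of an empty byte string.)
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def upload_is_known_material(image_bytes: bytes) -> bool:
    """Return True if the uploaded image's hash appears in the database."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

# The failure mode Tumblr described: brand-new material has no database
# entry yet, so a pure hash lookup lets it through.
print(upload_is_known_material(b""))           # True: hash is in the set
print(upload_is_known_material(b"new image"))  # False: not yet in the database
```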

Tumblr isn’t the only platform that struggles with child pornography.

Facebook recently reported removing nearly 9 million user images of child nudity over the past quarter. To do so, the social giant relies on a machine-learning tool, rolled out over the past year, that identifies images containing both nudity and a child.
