Bluesky Increased Content Moderation By 17x In 2024

While major social-media platforms like X and Meta have slackened content-moderation efforts and slashed third-party fact-checking, up-and-coming microblogging app Bluesky amplified its efforts in 2024.

Bluesky processed almost 6.5 million content-violation reports -- a 17x increase compared to 2023.

According to Bluesky's new moderation report, the company's Trust and Safety team -- led by former Twitter Trust and Safety head Aaron Rodericks -- grew to about 100 moderators to meet the rapid growth of the platform's user base, which increased by 23 million users in 2024.

As the decentralized app grew, the proportion of users submitting reports remained fairly stable from 2023 to 2024, with 5.6% of active users creating one or more reports in 2023 and 4.57% of active users creating reports in 2024 -- roughly 1.2 million users.


Bluesky says 3.5 million reports dealt with individual posts, while 47,000 stemmed from violative profile pictures or banner photos on accounts.

Lists also drew a high number of reports, at 45,000 -- followed by DMs (17,700), feeds (5,300) and Starter Packs (1,900).

The types of content being reported included anti-social behavior, harassment, trolling and intolerance as well as spam (excessive mentions, replies, or repetitive content), misinformation and impersonation, clear violations of the law, and nudity and/or improperly labeled adult content.

Bluesky moderators took down 66,308 accounts, while automated systems removed another 35,842.

In addition, the company fielded 238 requests from law enforcement, governments, and legal firms across Germany, the U.S., Brazil and Japan.

Bluesky responded to 182 of these requests and complied with 146. The company also submitted 1,154 confirmed CSAM reports to the National Center for Missing & Exploited Children (NCMEC).

Due to X's unwillingness to comply with an order from Brazil's Supreme Court, the platform was banned in the country in late August, triggering a massive migration of X users to Bluesky. As a result, Bluesky received up to 50,000 reports per day, leading to the company's first backlog of moderation reports.

“To address this, we increased the size of our Portuguese-language moderation team, added constant moderation sweeps and automated tooling for high-risk areas such as child safety, and hired moderators through an external contracting vendor for the first time,” Bluesky reports.

With content-violation reports flooding Bluesky's moderation team -- which is staffed 24/7 and reviews a high volume of graphic content -- the company says it has been providing moderators with psychological counseling to alleviate the burden of viewing this material.

In 2025, Bluesky plans to accept moderation reports directly within its app, which -- as on X -- will let users track actions and updates more easily. Later, it will also support in-app appeals.

Last week, with Meta loosening its content-moderation policies, Bluesky announced a new funding push in an attempt to position itself as a “free-speech” alternative to Meta and X -- two platforms whose policies have spurred migration to Bluesky.

According to the company, 12 million of its 23 million new users in 2024 joined the app over the month following Trump's presidential election win. During this time, X owner Elon Musk also endorsed the president-elect.

Bluesky, which is now valued at $700 million, is trying to stand out from much bigger billionaire-operated social platforms by making its content-moderation practices as transparent as possible, while building out its user interface to compete with X, Threads, and Mastodon.
