Bluesky released a moderation report for the past year on Friday, noting the significant growth the social network experienced in 2024 and how that increased the Trust & Safety team's workload. It also noted that the largest share of reports came from users flagging accounts or posts for harassment, trolling, or intolerance. This issue has plagued Bluesky as it has grown, at times even sparking mass protests over individual moderation decisions.
The company's report does not explain why it did or did not take action against individual users, including those on its most-blocked list.
Bluesky has become a new destination for former Twitter/X users for a variety of reasons, and the company added more than 23 million users in 2024. Throughout the year, the social network benefited from several changes at X, including changes to how blocking works and the decision to train its AI on user data. Other users left X after the results of the U.S. presidential election, put off by how X owner Elon Musk's politics began to dominate the platform. The app also saw a surge of users while X was temporarily banned in Brazil in September.
To meet the demands of this growth, Bluesky has expanded its moderation team to approximately 100 people and continues to hire, the company said. The company has also begun offering psychological counseling to its team members to help them with the difficult task of constant exposure to graphic content. (Humans aren't built to handle this kind of work, so we hope AI will one day take on more of it.)
There were a total of 6.48 million reports to Bluesky's moderation service, a 17x increase compared to 2023, when there were only 358,000 reports.
Starting this year, Bluesky will begin accepting moderation reports directly in its app. Similar to X, this will let users more easily track actions and updates. Later, it will also support in-app appeals.
When Brazilian users flooded Bluesky in August, the company was receiving as many as 50,000 reports per day at its peak. This slowed its responses to moderation reports and required Bluesky to hire additional Portuguese-speaking staff, including through contract vendors.
Additionally, to combat the influx, Bluesky began automating more categories of reports beyond just spam, which sometimes resulted in false positives. Still, automation cut the processing time for "high-confidence" accounts to just "seconds"; before automation, most reports took 40 minutes to process. Human moderators remain in the loop to deal with false positives and appeals, even if they don't always make the initial decision.
According to Bluesky, 4.57% of active users (1.19 million people) submitted at least one moderation report in 2024, down from 5.6% in 2023. Most of these reports (3.5 million) were about individual posts. Some 47,000 reports concerned account profiles, often for the profile picture or banner photo. Lists were reported 45,000 times; DMs received 17,700 reports; and feeds and Starter Packs received 5,300 and 1,900 reports, respectively.
Most reports were about anti-social behavior such as trolling and harassment, a signal that Bluesky users want a less toxic social network than X.
Other reports fell into the following categories, Bluesky said:
- Misleading content (impersonation, misinformation, or false claims about identity or affiliation): 1.2 million
- Spam (excessive mentions, replies, or repetitive content): 1.4 million
- Unwanted sexual content (nudity or adult content that isn't properly labeled): 630,000
- Illegal or urgent issues (clear violations of the law or Bluesky's terms of service): 933,000
- Other (issues that don't fit into the above categories): 726,000
The company also provided an update on its labeling service, which applies labels to posts and accounts. Human labelers added 55,422 "sexual person" labels, followed by 22,412 "rude" labels, 13,201 "spam" labels, 11,341 "intolerance" labels, and 3,046 "threat" labels.
In 2024, 93,076 users filed a total of 205,000 appeals against Bluesky's moderation decisions.
Additionally, there were 66,308 account takedowns by moderators and 35,842 automated account takedowns. Bluesky also received 238 requests from law enforcement, governments, and law firms; it responded to 182 of these and complied with 146. Most of the requests came from law enforcement agencies in Germany, the United States, Brazil, and Japan.
Bluesky's full report also details other types of issues, including trademark and copyright claims and child safety/CSAM reports. The company noted that it has submitted 1,154 verified CSAM reports to the National Center for Missing and Exploited Children (NCMEC).