Bluesky released its moderation report for the past year on Friday, noting the social network's significant growth in 2024 and how that affected the workload of its trust and safety team. It also noted that the largest number of reports concerned harassment, trolling, or intolerance — an issue that has dogged Bluesky as it has grown, and one that has at times led to large-scale protests over individual moderation decisions.
The report did not address individual users, including those on the most-blocked list, or explain why the company did or did not take action against them.
The company added more than 23 million users in 2024 as Bluesky became a new destination for former Twitter/X users for a variety of reasons. During the year, the social network benefited from several changes at X, including its decision to change how blocks work and to train AI on user data. Other users left X after the U.S. presidential election, as owner Elon Musk's politics began to dominate the platform. The app's user base also surged when X was temporarily banned in Brazil in September.
To meet the demands of this growth, Bluesky said its moderation team has grown to roughly 100 moderators, and it is continuing to hire. The company also began offering psychological counseling to team members to help them cope with the demanding work of constant exposure to graphic content. (We hope AI will one day take on this problem, as humans are not built to handle this kind of work.)
Bluesky's moderation service received 6.48 million reports in 2024, a 17-fold increase from the 358,000 reports it received in 2023.
Starting this year, Bluesky will begin accepting moderation reports directly in its app, which, similar to X, will make it easier for users to track actions and updates. Later, it will also support in-app appeals.
In August, when Brazilian users flooded Bluesky, the company was receiving as many as 50,000 reports per day. This created a backlog of moderation reports and required Bluesky to hire additional Portuguese-speaking staff, including through contract vendors.
To help it handle the influx, Bluesky also began automating more categories of reports beyond just spam, though this sometimes produced false positives. Still, automation cut the processing time for "high-certainty" accounts to "seconds"; before automation, most reports took 40 minutes to process. Human moderators are kept in the loop to handle false positives and appeals, even if they don't always make the initial decision.
Bluesky said 4.57% of its active users (1.19 million) submitted at least one moderation report in 2024, down from 5.6% in 2023. The majority of these (3.5 million reports) concerned individual posts. Account profiles were reported 47,000 times, usually for the profile picture or banner photo. Lists were reported 45,000 times, DMs 17,700 times, and feeds and starter packs received 5,300 and 1,900 reports, respectively.
Most of the reports involved antisocial behavior such as trolling and harassment, a signal that Bluesky users want a less toxic social network than X.
Bluesky said other reports covered the following categories:
The company also provided an update on its labeling service, which applies labels to posts and accounts. Human labelers added 55,422 "Sexual Imagery" labels, followed by 22,412 "Rude" labels, 13,201 "Spam" labels, 11,341 "Intolerance" labels, and 3,046 "Threat" labels.
In 2024, 93,076 users submitted a total of 205,000 appeals of Bluesky's moderation decisions.
Moderators took down 66,308 accounts, and automated systems took down another 35,842. Bluesky also fielded 238 requests from law enforcement, governments, and law firms; it responded to 182 of these and complied with 146. Most of the requests came from law enforcement in Germany, the United States, Brazil, and Japan.
Bluesky's full report also delves into other types of issues, including trademark and copyright claims and child safety/CSAM reports. The company noted that it submitted 1,154 reports of confirmed CSAM to the National Center for Missing & Exploited Children (NCMEC).