The Staggering Scale of Content Moderation: Insights from a Single Day in Europe

08.06.2024 | Christian Kreutz

Under the Digital Services Act (DSA), Very Large Online Platforms (VLOPs) and the two designated Very Large Online Search Engines (VLOSEs), services with more than 45 million monthly active users in the EU, must report every content moderation decision they take, and the EU publishes these reports as daily data dumps. The University of Bremen took this opportunity to conduct a comprehensive analysis of a single day's data dump.

The complete one-day dataset, downloaded on November 5, 2023, encompassed 7,143,981 decisions. After filtering to social media platforms only, the final dataset contained 2,195,906 content moderation decisions described by 37 variables.
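To give a sense of how such an analysis can be reproduced, here is a minimal sketch that loads one daily dump into pandas and narrows it down to the social media platforms covered in this analysis. The file name and the column names platform_name and automated_decision are assumptions about the dump schema, not confirmed field names.

```python
import pandas as pd

# Social media platforms covered in this analysis.
SOCIAL_PLATFORMS = {
    "Facebook", "Pinterest", "TikTok", "Instagram",
    "YouTube", "Snapchat", "X", "LinkedIn",
}

# Load one daily dump; reading only the needed columns keeps memory low.
# NOTE: file name and column names are assumptions, not the exact schema.
df = pd.read_csv(
    "sor-global-2023-11-05-full.csv",                  # hypothetical file name
    usecols=["platform_name", "automated_decision"],   # assumed column names
)

# Keep only decisions made by social media platforms.
social = df[df["platform_name"].isin(SOCIAL_PLATFORMS)]

print(f"Total decisions: {len(df):,}")
print(f"Social media decisions: {len(social):,}")
print(social["platform_name"].value_counts())
```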

This staggering number highlights the challenge content moderation poses for platforms and law enforcement authorities alike. With over 2 million decisions in just 24 hours, purely manual review is impossible, and it takes a dire psychological toll on the moderators who do it. Automation is no cure-all either: programming decision models that accurately handle such a plurality of cases is an enormous algorithmic challenge. Highly accurate content moderation algorithms are virtually impossible, especially given the differing laws across jurisdictions and the wide range of perspectives on what constitutes a violation, for example in the case of artistic expression.

Interestingly, platforms counter this volume of content with different strategies. TikTok relies heavily on automated content moderation, Twitter (X) claims to moderate entirely manually, and YouTube takes a mixed approach (see the image at the bottom).
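A rough way to quantify these strategies is to break the decisions down by platform and automation status. The snippet below continues the sketch above; as before, the automated_decision column name and its value coding are assumptions about the dump schema.

```python
# Breakdown of decisions per platform and automation status.
# Assumes the dump exposes an "automated_decision" column; the exact
# field name and value coding may differ in the real schema.
breakdown = (
    social.groupby(["platform_name", "automated_decision"])
          .size()
          .unstack(fill_value=0)
)
print(breakdown)
```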

Platform     Total decisions
Facebook     903,183
Pinterest    634,666
TikTok       414,744
YouTube      114,713
Instagram    111,379
Snapchat      11,505
X              5,384
LinkedIn         332

The differences also show up in the sheer number of decisions: LinkedIn recorded only 332 in a single day, which seems remarkably low compared to the likely true number of violations.

The problem remains: there is no benchmark for evaluating which social network is most effective at reducing disinformation. As long as the data remains largely closed, we can only speculate. The DSA is a good first step towards increased transparency, but it is important to recognize that these platforms are inherently flawed in their design, driven primarily by corporate profit motives.