Content moderation and user safety
Decentralized social networking platforms introduce new challenges for user trust and safety.
By the nature of the Fediverse, the operators of an instance are solely responsible for moderating its content. Because there is no centralized governance or moderation across the Fediverse, an instance cannot be "removed" from it; another operator can only choose to defederate from that instance, which makes its content inaccessible from the defederating operator's own instance.
Individual instances are responsible for defining their own content policies, which may then be enforced by their staff. Moderation of a Fediverse instance differs significantly from that of traditional social media platforms, as moderators are responsible not only for content posted by users of that instance ("local users"), but also for content posted by users of other instances ("remote users").
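A minimal sketch of how these instance-level controls might look in practice, assuming a hypothetical Python instance backend: the operator maintains a set of defederated domains, incoming ActivityPub activities originating from those domains are rejected, and accepted content is classified as local or remote for moderation purposes. The domain names and function names below are illustrative only and do not correspond to any specific Fediverse software.

```python
from urllib.parse import urlparse

# Hypothetical instance configuration: the instance's own domain and the
# set of domains its operator has chosen to defederate from.
LOCAL_DOMAIN = "example.social"
DEFEDERATED_DOMAINS = {"blocked.example"}

def actor_domain(actor_url: str) -> str:
    """Extract the host from an ActivityPub actor URL,
    e.g. https://remote.example/users/alice -> remote.example."""
    return urlparse(actor_url).hostname or ""

def accept_activity(actor_url: str) -> bool:
    """Reject activities from defederated domains; everything else is
    accepted and becomes the responsibility of this instance's moderators."""
    return actor_domain(actor_url) not in DEFEDERATED_DOMAINS

def is_local(actor_url: str) -> bool:
    """Distinguish local from remote authors: moderators handle both, but the
    available remedies (e.g. suspending the account) differ for remote users."""
    return actor_domain(actor_url) == LOCAL_DOMAIN
```

Defederation in this sketch is purely one-sided: the blocked instance keeps operating and remains reachable from any instance that has not blocked it, which is why content can never be globally removed from the Fediverse.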
Toxic or abusive content is common in the Fediverse, and both the available moderation tools and the legal or financial impetus to moderate content are weaker than those of centralized social media platforms; as a result, the Fediverse exhibits shortcomings in child safety.
A 2023 study by the Stanford Internet Observatory's Cyber Policy Center found that, out of approximately 325,000 Fediverse posts analyzed over a two-day period, 112 were detected as instances of known child sexual abuse material (CSAM); 554 were detected as containing sexually explicit media alongside keywords associated with child sexual exploitation; 713 contained media alongside the top twenty CSAM-related hashtags on the Fediverse; and 1,217 contained text relating to the distribution of CSAM or child grooming. The study additionally noted that, during a test run of its analysis pipeline, its first instance of known CSAM was detected within approximately five minutes of runtime. All detected instances of CSAM were reported to the National Center for Missing and Exploited Children (NCMEC) for triage.
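The study's actual pipeline is not reproduced here, but the general approach it describes, matching attached media against databases of hashes of known CSAM and flagging posts that contain associated keywords or hashtags, can be sketched in outline. The hypothetical Python fragment below uses a plain cryptographic hash as a stand-in for a perceptual hashing service such as PhotoDNA; the hash set, term list, and all names are placeholders, not the study's data or code.

```python
import hashlib
from dataclasses import dataclass, field

# Placeholders: a real pipeline would use perceptual hashes supplied by an
# authority such as NCMEC, not a local SHA-256 set or an ad hoc keyword list.
KNOWN_CSAM_HASHES: set[str] = set()      # assumed to be populated externally
CSAM_RELATED_TERMS = {"example-term"}    # placeholder keyword/hashtag list

@dataclass
class Post:
    text: str
    media: list[bytes] = field(default_factory=list)

def scan_post(post: Post) -> list[str]:
    """Return the reasons a post would be flagged for triage and reporting."""
    reasons = []
    for blob in post.media:
        digest = hashlib.sha256(blob).hexdigest()  # stand-in for a perceptual hash
        if digest in KNOWN_CSAM_HASHES:
            reasons.append("known-hash match")
    tokens = {t.lower().lstrip("#") for t in post.text.split()}
    if tokens & CSAM_RELATED_TERMS:
        reasons.append("keyword/hashtag match")
    return reasons
```

In a production setting, any post flagged by such a scan would be forwarded to human moderators and, where legally required, reported to NCMEC, as was done with the material detected in the study.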