This was just raised in a 1-page comment. Linking to the PDF on GDrive and copying the text below.
Bluesky Community Guidelines Input
August 28, 2025
Bluesky’s own discretionary content moderation rules are important. So is Bluesky’s empowerment of users in a broader ecosystem of participants in content moderation — including third party app developers, relay operators, labelers, and list makers. Our comment pertains to that larger ecosystem, rather than the rules for users on Bluesky’s own app. We write as scholars of platforms and regulation who focus on interoperability, “middleware,” and other competition-related responses to current incumbent platforms’ role in public discourse. The choices Bluesky makes will have significant consequences for these issues.
A flourishing ecosystem of distributed content moderation would be more achievable if certain costly work were at least somewhat centralized. In particular, it would be difficult for each small app developer or other participant in this ecosystem to independently — and redundantly — respond to legal content removal mandates or demands. Third party developers could more easily offer compelling services and diverse approaches to content moderation if they could build upon a corpus of “presumptively legal” user content. Such a corpus would not be subject to any special legal review by its operator. It would, however, exclude any content that the operator knows it has legal reason to remove — such as known CSAM in the U.S. and content identified as unlawful via credible notices under the EU DSA or U.S. DMCA. Importantly, it would not exclude content solely for violating any particular provider’s Community Guidelines.
This comment is not intended as a suggestion for Bluesky’s own app, or as a specific design proposal for storing and communicating about “presumptively legal” content within the architecture of a distributed ecosystem. We will, however, note the significant legal, regulatory, and reputational problems faced by prior protocols like BitTorrent in which every network node was left to do its own legal compliance work. Whatever its design, the availability of a “presumptively legal” corpus of content from a credible provider (or providers) would meaningfully lower barriers to entry for participants in distributed content moderation systems.
To put it bluntly, we wish that Bluesky would fulfill this function. Given its size, resources, and role, the company is unusually well-positioned to identify and host or facilitate access to such a “presumptively legal” corpus. We appreciate that this is not a job Bluesky’s operators necessarily signed up for. And Bluesky can never guarantee the legality of any speech, or the accuracy of its legal removal decisions. This would be impossible even for content on the company’s own servers, and even more so for content distributed across multiple hosts. Issues like spam and varying national content laws make such a mechanism even more complex. This would not be a simple task.
But as Bluesky’s very existence demonstrates, an undertaking can be hard and nonetheless worth doing. We believe that this is such an undertaking.
Daphne Keller
Martin Husovec
Mallory Knodel