There is a line between subjective, product-linked content moderation policies and what the article labels Network Abuse & Infrastructure Moderation. From the article:
Infrastructure providers such as Relays play a different role in the network, and are designed to be a common service provider that serves many kinds of applications. Relays perform simple data aggregation, and as the network grows, may eventually come to serve a wide range of social apps, each with their own unique communities and social norms. Consequently, Relays focus on combating network abuse and mitigating infrastructure-level harms, rather than making granular content moderation decisions.
An example of harm handled at the infrastructure layer is content that is illegal to host, such as child sexual abuse material (CSAM). Service providers should actively detect and remove content that cannot be hosted in the jurisdictions in which they operate. Bluesky already actively monitors its infrastructure for illegal content, and we’re working on systems to advise other services (like PDS hosts) about issues we find.
One place I’ve seen abuse is the PLC directory, with more than half of the accepted operations being bogus.
Seems manageable right now, but it can easily become a problem if we see the same kind of spike we did in April. The storage requirements doubled from that event.
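As a rough illustration of what auditing that kind of PLC abuse could look like: the PLC directory exposes a public `/export` feed of operations as JSON Lines, and one could scan a page of it for accounts with unusually high operation churn. The `/export` endpoint and its `count` parameter exist on plc.directory, but the "high churn = probably bogus" heuristic below is purely an assumption for illustration, not how anyone actually classifies abuse.

```python
# Sketch: scan a page of the PLC directory's public export feed for DIDs
# with suspiciously high operation churn. The heuristic is an assumption
# made for illustration only.
import json
from collections import Counter
from urllib.request import urlopen

PLC_EXPORT = "https://plc.directory/export"  # returns JSON Lines


def flag_high_churn(ops, threshold=10):
    """Return DIDs appearing in more than `threshold` operations.

    A DID rewriting its own document many times within a single export
    page is one (naive, assumed) signal of bogus operations.
    """
    counts = Counter(op["did"] for op in ops)
    return {did for did, n in counts.items() if n > threshold}


def fetch_recent_ops(count=1000):
    """Fetch one page of recent PLC operations as a list of dicts."""
    with urlopen(f"{PLC_EXPORT}?count={count}") as resp:
        return [json.loads(line) for line in resp if line.strip()]


# Usage (requires network access):
#   ops = fetch_recent_ops()
#   print(len(flag_high_churn(ops)), "high-churn DIDs in the latest page")
```

Real abuse detection would obviously need more signals than churn (signature patterns, handle squatting, etc.), but the point is that the export feed makes this kind of independent auditing possible.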
Linking to the PDF on GDrive and copying text below.
Bluesky Community Guidelines Input
August 28, 2025
Bluesky’s own discretionary content moderation rules are important. So is Bluesky’s empowerment of users in a broader ecosystem of participants in content moderation — including third party app developers, relay operators, labelers, and list makers. Our comment pertains to that larger ecosystem, rather than the rules for users on Bluesky’s own app. We write as scholars of platforms and regulation who focus on interoperability, “middleware,” and other competition-related responses to current incumbent platforms’ role in public discourse. The choices Bluesky makes will have significant consequences for these issues.
A flourishing ecosystem of distributed content moderation would be more achievable if certain costly work were at least somewhat centralized. In particular, it would be difficult for each small app developer or other participant in this ecosystem to independently — and redundantly — respond to legal content removal mandates or demands. Third party developers could more easily offer compelling services and diverse approaches to content moderation if they could build upon a corpus of “presumptively legal” user content. Such a corpus would not be subject to any special legal review by its operator. It would, however, exclude any content that the operator knows it has legal reason to remove — such as known CSAM in the U.S. and content identified as unlawful via credible notices under the EU DSA or U.S. DMCA. Importantly, it would not exclude content solely for violating any particular provider’s Community Guidelines.
This comment is not intended as a suggestion for Bluesky’s own app, or as a specific design proposal for storing and communicating about presumptively-legal content within the architecture of a distributed ecosystem. We will, however, note the significant legal, regulatory, and reputational problems faced by prior protocols like BitTorrent in which every network node was left to do its own legal compliance work. Whatever its design, the availability of a “presumptively legal” corpus of content from a credible provider (or providers) would meaningfully lower barriers to entry for participants in distributed content moderation systems.
To put it bluntly, we wish that Bluesky would fulfill this function. Given its size, resources, and role, the company is unusually well-positioned to identify and host or facilitate access to such a “presumptively legal” corpus. We appreciate that this is not a job Bluesky’s operators necessarily signed up for. And Bluesky can never guarantee the legality of any speech, or the accuracy of its legal removal decisions. This would be impossible even for content on the company’s own servers, and even more so for content distributed across multiple hosts. Issues like spam and varying national content laws make such a mechanism even more complex. This would not be a simple task.
But as Bluesky’s very existence demonstrates, an undertaking can be hard and nonetheless worth doing. We believe that this is such an undertaking.
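To make the letter's "presumptively legal corpus" idea concrete, here is a minimal sketch of how a downstream app might consume it: records are filtered against a shared denylist of content identifiers that a trusted operator has flagged under legal removal mandates, while all guidelines-based moderation remains the app's own job. Every name and structure in this sketch is invented for illustration; atproto defines no such service today.

```python
# Hypothetical sketch of consuming a shared "presumptively legal" corpus:
# an app filters records against a denylist of CIDs flagged by a trusted
# operator under legal mandates (known CSAM, DSA/DMCA notices). All names
# here are invented; this is not a real atproto service or API.
from dataclasses import dataclass


@dataclass(frozen=True)
class Record:
    cid: str   # content identifier of the record
    did: str   # author's DID
    text: str


def presumptively_legal(records, legal_denylist):
    """Yield only records whose CID is not on the shared legal denylist.

    Note what this deliberately does NOT do: it applies no Community
    Guidelines and no taste-based moderation -- that remains each app's
    own responsibility.
    """
    for rec in records:
        if rec.cid not in legal_denylist:
            yield rec


records = [
    Record("bafy-aaa", "did:plc:alice", "hello world"),
    Record("bafy-bbb", "did:plc:mallory", "flagged via a DMCA notice"),
]
denylist = {"bafy-bbb"}
visible = list(presumptively_legal(records, denylist))
```

The design point is the separation of concerns: the denylist centralizes only the costly, redundant legal-compliance work, so each small app still makes its own discretionary moderation choices on top.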
I think that Bluesky PBC working with a consortium of network operators to “share the load” of legal compliance would be very good!
Blacksky, EuroSky, and Gander are the first three that come to mind. NorthSky might also be a fit here, but they are PDS hosting first, so a slightly different category.
It’s not clear what “products” other than infrastructure EuroSky might operate, which actually makes them a pretty perfect fit to operate in a consortium with bsky PBC on this. cc @sherifea.com @ivansigal.bsky.social
I’ve brought this up before, but this infrastructure layer reminds me of NANOG (the North American Network Operators’ Group).
Thanks! I think that in addition to sharing the load of deciding what should come down as illegal, there is a possibility to share the operational load of complying with laws like the DSA (providing notice to affected users, handling appeals, etc.).
We are definitely thinking about and working on it; two weeks ago Daphne and I had a long back-and-forth about it as well. She has since been kind enough to give us some specific responses to our preliminary legal analysis of the model.
I’ll add that @pfrazee.com provided some additional clarity this past week about the shape of the common foundation that their AppView/AppServer provides to others:
Yep. There’s something there about “whose users are they,” and I think it needs to be opt-in. EuroSky will definitely need to comply with the DSA if it does PDS hosting.
Ultimately, I split this into the long tail of self-hosting, roughly defined as “people who run this stuff ad hoc, don’t have an entity, and just run something personally.” I personally wouldn’t recommend they run something like that without using the invite system / knowing who the people are, but I also think that at a small scale there is limited risk.
Then there’s “we’ve formed an entity and want to do this properly.” Those entities need to lean in and do the work, and could be part of an early operators’ group / consortium which, yes, Bluesky PBC can help support; EuroSky might provide a shield across EU operators, etc.
In short: there are very few entities doing this work for real, and Bluesky’s actions or inactions aren’t actually what’s holding things up. People need to want to ship products, with real entities, and do the work. That is going to be mostly gated by having a product and having funding.