Perspective

'Composable Moderation' May Protect Bluesky from Political Pressure

Audrey Hingle / Nov 26, 2025

A blurred image of US President Donald Trump with the Bluesky app superimposed. (Official White House photo/Shutterstock)

The Trump administration, led by a President who was previously banned from major social networks for inciting violence and spreading disinformation after the 2020 US election, poses a particular challenge for the upstart platform Bluesky. As Erin Kissane noted in a recent article in Tech Policy Press, Bluesky was designed for openness and interoperability, yet it now finds itself serving as a single point of pressure. If it enforces its rules against harassment and incitement on official Trump administration accounts over some future infraction, it risks political retaliation. If it weakens its rules or shies away from enforcement, it may lose the trust of the communities who turned to the network for protection from coordinated abuse.

Composable moderation, which decentralizes rule-setting by letting users pick the moderation services that best reflect their needs and values, mitigates this problem. It shifts enforcement away from a single platform and into a distributed ecosystem of user-selected moderation services. With no central referee to target, political actors and influencers lose the ability to “work the refs” and pressure a single trust and safety team into making decisions that favor their side.

Spreading the burden of moderation

“Bluesky the app” is the company’s shorthand for distinguishing its consumer-facing social app from the AT Protocol, the decentralized social networking protocol it is building. The app is just one client in what is intended to become a broader ecosystem of services built on the protocol. For now, however, Bluesky the company still carries the full responsibility for moderation and governance across the AT Protocol.

Centralized governance of a decentralized protocol cannot withstand sustained political or social pressure. When one company sets moderation rules for a network that is meant to be open and distributed, it becomes a single point of influence that governments, interest groups and powerful users can target. As AT Protocol’s Ethos statement makes clear, its long-term vision sits at the intersection of three movements: the early web’s open publishing model, the peer-to-peer push for self-certifying and decentralized data, and the large-scale distributed systems that underpin modern internet services.

Bluesky’s goal is for AT Protocol to embody the openness of the web, the user-control of peer-to-peer networks, and the performance of modern platforms. In the future, we could see photo-sharing apps, community forums, research tools and more all using the same identities and social graph. Bluesky is only one expression of the protocol, not the limit of it.

Composable moderation is the feature that will make that possible. Rather than treating a moderation decision as a network-wide ban, it uses labels to describe issues with content or accounts, leaving individual apps to decide how to act on them. Following a letter from Daphne Keller, Martin Husovec, and my colleague Mallory Knodel, Bluesky has committed to this approach.

Instead of blocking someone in a way that removes them from every app built on the protocol, Bluesky will mark a suspended account with a label that only affects how that account appears inside Bluesky. Other apps can choose to hide the account, show it with a warning, or ignore the label entirely. The user’s underlying account also stays intact, because it is stored on their personal data server, or PDS, where their identity and posts live; the PDS itself should only cut someone off for serious issues like illegal content. The result is a more flexible, decentralized system where no single app controls whether someone exists on the network.
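To make that concrete, here is a minimal TypeScript sketch of how two clients built on the same protocol might act differently on the same label. The `Label` shape, the policy maps, and the `resolveAction` helper are illustrative assumptions for this example, loosely modeled on the idea that a label carries a source, a subject, and a value; they are not the official AT Protocol lexicon types or the Bluesky SDK.

```typescript
// Illustrative sketch only: these types and policies are assumptions for
// this example, not the official AT Protocol lexicon or Bluesky SDK.

type Label = {
  src: string; // identifier (DID) of the moderation service that issued the label
  uri: string; // the account or post the label applies to
  val: string; // label value, e.g. "suspended" or "spam"
};

type LabelAction = "hide" | "warn" | "ignore";

// Each app ships, or lets users configure, its own mapping from label
// values to display behavior.
type LabelPolicy = Record<string, LabelAction>;

// Bluesky the app might hide accounts it has labeled as suspended...
const blueskyPolicy: LabelPolicy = { suspended: "hide", spam: "warn" };

// ...while another client on the same protocol might only show a warning.
const otherClientPolicy: LabelPolicy = { suspended: "warn" };

// Pick the most restrictive action any matching label triggers.
function resolveAction(labels: Label[], policy: LabelPolicy): LabelAction {
  const severity: Record<LabelAction, number> = { ignore: 0, warn: 1, hide: 2 };
  return labels.reduce<LabelAction>((worst, label) => {
    const action = policy[label.val] ?? "ignore";
    return severity[action] > severity[worst] ? action : worst;
  }, "ignore");
}

// The same label produces different outcomes in different apps, while the
// account itself remains untouched on the user's PDS.
const labels: Label[] = [
  { src: "did:example:moderation", uri: "did:example:alice", val: "suspended" },
];
console.log(resolveAction(labels, blueskyPolicy));     // "hide"
console.log(resolveAction(labels, otherClientPolicy)); // "warn"
```

The point of the sketch is the separation of concerns: the label only records a judgment, and each app decides what that judgment means for its own users.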

Why this approach solves the potential Trump problem

The closest analogy in existing social media is how Reddit operates: the platform sets a baseline of what is acceptable, but thousands of subreddit communities apply their own rules, filters, and enforcement styles on top. For example, r/AskHistorians expects in-depth, well-sourced answers that reflect current academic research, and moderators routinely remove unsourced or speculative replies that don’t meet that standard. Composable moderation takes that layered, community-defined model and implements it at the protocol level, so many different apps and services can choose the moderation approaches that fit their values.

Because moderation could be provided by many different apps and services, not just Bluesky, this approach would reduce the political vulnerability that comes from having a single company responsible for every enforcement call. Communities can also choose moderation services that reflect their own context and needs, giving vulnerable groups more control over the protections they rely on. And if one app or operator fails or comes under political pressure, others can continue enforcing their own standards without breaking the network.
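As a rough sketch of that redundancy, the TypeScript below composes labels from several user-selected moderation services. The `ModerationService` interface and `fetchLabels` method are hypothetical names for this illustration, not a published API; the point is that a user's protections come from whichever sources they subscribe to, so one failing or pressured service does not silence the rest.

```typescript
// Hypothetical interface for this illustration; not a published API.

type Label = { src: string; uri: string; val: string };

interface ModerationService {
  did: string; // identity of the labeling service
  fetchLabels(subjectUri: string): Promise<Label[]>;
}

// Gather labels from every service the user has chosen to subscribe to.
async function collectLabels(
  subjectUri: string,
  subscribed: ModerationService[],
): Promise<Label[]> {
  const results = await Promise.allSettled(
    subscribed.map((svc) => svc.fetchLabels(subjectUri)),
  );
  // If one service is unreachable, or quietly drops a label under political
  // pressure, labels from the user's other subscriptions still apply.
  return results.flatMap((r) => (r.status === "fulfilled" ? r.value : []));
}
```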

Taken together, this shift could help Bluesky, and future AT Protocol services, navigate the pressures Kissane highlights by distributing power across the network rather than concentrating it in a single company.

Authors

Audrey Hingle
Audrey Hingle is the Editor-in-Chief of The Internet Exchange. Based in London, she is passionate about how people interact with technology and the internet, and how it can influence our lives for better (or worse). Audrey has over 15 years of experience working in strategic communications, and prev...

Related

Perspective
Trump Administration's Arrival on Bluesky Highlights Growing Pains for Open Networks (October 22, 2025)
