Opinion piece: The technical infrastructure of the fediverse — which includes Mastodon — has huge potential as a Twitter alternative that caters to the common good. However, to fulfil this potential, we must provide sufficient public funding and invest in diversity and inclusion as well as continuous training for content moderators.

One thing is clear: the question of social media platform governance has never been as hotly discussed in mainstream discourse as it is today. Elon Musk’s purchase of Twitter and his subsequent reckless and boastful behaviour have served to drive home the challenge that what may be perceived as a public space can be bought by a private individual, a problem that has been passionately debated among digital rights and tech communities for years.

In other words: on the one hand, public discourse is now closely linked to, and increasingly driven by, people’s engagement with major social platforms, which is why we’ve seen more and more regulation to address their influence, notably in the EU with the recent Digital Services Act and Digital Markets Act (DSA, DMA). On the other hand, these platforms are anything but built in the public interest; they answer to the demands and whims of private shareholders.

What is the fediverse?

The Muskian chaos has boosted the popularity of Mastodon (and prompted many opinion pieces about it), which is part of the fediverse. Mastodon first launched in 2016 and currently counts roughly 7 million users and growing; around 4 million of them have joined since Musk announced his takeover of Twitter. The fediverse is a decentralised (federated) network of alternative “microblogging” services that all use the same protocol to communicate with each other. Just as you can choose to set up your email account with Gmail, Gmx, Proton, or Posteo and still write to one another seamlessly, you can set up an account on Mastodon, Diaspora, Hubzilla, Funkwhale, and more, and share your messages and commentary across services. This stands in contrast to Twitter, Facebook, Instagram, or TikTok, which are all closed platforms.
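The email analogy carries over to how fediverse services actually find one another: an address like user@instance is looked up via the WebFinger protocol (RFC 7033), which Mastodon and its peers implement at a well-known endpoint on each instance. A minimal sketch in Python of how such a lookup URL is constructed — the handle and instance in the example are invented for illustration:

```python
from urllib.parse import quote

def webfinger_url(handle: str) -> str:
    """Build the WebFinger lookup URL for a fediverse handle.

    A handle like '@alice@mastodon.social' is resolved by asking the
    instance's /.well-known/webfinger endpoint for the 'acct:' resource,
    much as email-style addressing would suggest.
    """
    user, _, domain = handle.lstrip("@").partition("@")
    if not user or not domain:
        raise ValueError(f"not a user@domain handle: {handle!r}")
    resource = quote(f"acct:{user}@{domain}", safe="")
    return f"https://{domain}/.well-known/webfinger?resource={resource}"

# Hypothetical handle; any fediverse service can be queried the same way.
print(webfinger_url("@alice@mastodon.social"))
# → https://mastodon.social/.well-known/webfinger?resource=acct%3Aalice%40mastodon.social
```

It is this shared, open addressing scheme — rather than any one company’s user database — that lets accounts on entirely different services discover each other.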

The fediverse comes with a set of distinctive features: it (a) does not rely on selling user data for advertising, (b) lets each instance set its own content moderation rules, so users can choose an instance freely, and (c) is built on interoperability, which means switching between instances (or services) is possible and sometimes even encouraged. Say goodbye to the lock-in effects of large social platforms, and to shedding tears over losing a following you’ve built up over years. However, despite all this potential for independence and increased user control, we should not overlook some of the challenges.
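The interoperability in point (c) works because the shared protocol is, in practice, the W3C ActivityPub standard: following someone on another instance — or another service entirely — amounts to delivering a small, standardised “Follow” document to their inbox. A minimal sketch of such an activity (the instance URLs are hypothetical, and a real delivery would also involve authentication details omitted here):

```python
import json

def follow_activity(actor_url: str, target_url: str) -> dict:
    """Build a minimal ActivityPub 'Follow' activity.

    Because every fediverse service speaks the same ActivityStreams
    vocabulary, an account on one instance can follow an account on a
    completely different service by sending a document like this to
    the target's inbox.
    """
    return {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Follow",
        "actor": actor_url,
        "object": target_url,
    }

# Hypothetical accounts on two different services:
activity = follow_activity(
    "https://mastodon.example/users/alice",
    "https://funkwhale.example/users/bob",
)
print(json.dumps(activity, indent=2))
```

The same vocabulary covers posts, boosts, and replies, which is also what makes moving between instances feasible: your social graph is expressed in a format every participating service understands.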

Good intent does not necessarily translate into good outcomes

Let’s be clear, I’m all for interoperability, user choice, and alternatives to surveillance capitalism. Yet that does not mean that services will automatically cater to the common, public good just because they have the technical infrastructure to do so.

Right now, most instances on the fediverse are run and administered by folks based in Europe or North America, and thereby, seemingly, by white, male-presenting folks. Content moderation on the vast majority of instances is managed by a network of volunteers, because business models on the fediverse largely rely on donations. All of this is laudable, and I am convinced that most people mean well. Unfortunately, good intent does not necessarily translate into good outcomes — especially when the burden of calling out harmful or oppressive patterns remains with the people traditionally marginalised, when the work should happen on the side of those with privilege.

Safeguards and inclusion must feature centrally

The thing is that volunteering and spending resources on maintenance come with privilege. Sad (and frustrating) as it is, privilege is still quite firmly reserved for white men (hello, colonialism; hello, patriarchy). We saw similar developments play out during the early days of the internet, and we continue to see them in the Wikimedia community, to name one example, which openly shares its commitments as well as its challenges in recruiting volunteers globally. As it stands, many of the challenges, discrepancies, and exploitative patterns we see on the internet today are a consequence of a lack of safeguards — safeguards that became more and more pressing as perspectives broadened and as needs, conversations, and expectations diversified. Or, put differently: as more non-white, non-male people came online. Diversity and inclusion require work, a lot of reflection, and the unlearning of privilege.

Tracy Chou, founder of Block Party, an anti-harassment tool that allows users to control their own social media timelines without relying on in-app reporting features, recently shared their latest experience with Mastodon. They aptly pointed out on Twitter (no pun intended) that flagging messages that call out white privilege as “racist” indicates that the practice of understanding and working to dismantle the oppressive structures of historical power imbalances and racism will require concerted effort, training — and more diverse perspectives. The lack thereof is currently still one of the primary reasons many marginalised people choose not to migrate to the fediverse.

The potential of the fediverse to provide an alternative to social platforms, one that puts people and the common, public good over profit, is tremendous. However, if we want to realise that potential, we need to ensure (1) that infrastructures are funded appropriately (initiatives like the Sovereign Tech Fund may play their part here), (2) that content moderation is recognised and rewarded as the sensitive, culturally aware, contextual work that it is, and (3) that we make a conscious and deliberate effort to bring everyone along, instead of, yet again, focusing on those of us with historical privilege.

This text is licensed under a Creative Commons Attribution 4.0 International License