In a world where traditional social media platforms dominate the digital conversation, are decentralized alternatives emerging as a promising counterpoint to censorship or a breeding ground for hate speech?
BeInCrypto spoke with Anurag Arjun, co-founder of Avail, a blockchain infrastructure project, about how decentralization could transform online speech and governance.
In October, X (formerly Twitter) suspended the Hebrew-language account of Iranian Supreme Leader Ali Khamenei for “violating platform rules.” The post in question commented on Israel’s reprisal attack on Tehran, reigniting global debates about the power centralized platforms hold over public discourse.
Many asked: Can it be that a nation’s supreme leader isn’t allowed to comment on airstrikes happening within his own borders?
Political sensitivity aside, the same thing happens all the time with everyday creators in much lower-stakes contexts. In the second quarter of 2024, YouTube’s automated flagging system removed approximately 8.19 million videos, while user-generated flagging removed only about 238,000 videos.
In response, decentralized platforms like Mastodon and Lens Protocol are gaining popularity. Mastodon, for example, gained roughly 2.5 million active users following Elon Musk’s acquisition of Twitter in late 2022. These platforms promise to redistribute control, but this raises complex questions about moderation, accountability, and scalability.
“Decentralization doesn’t mean the absence of moderation—it’s about shifting control to user communities while maintaining transparency and accountability,” Anurag Arjun, co-founder of Avail, told BeInCrypto in an interview.
Decentralized platforms aim to remove corporate influence over online speech. These platforms allow the users themselves to define and enforce moderation standards. Unlike Facebook or YouTube, which face accusations about algorithmic biases and shadow bans, decentralized systems claim to promote open dialogue.
However, while decentralization removes single-point control, it certainly doesn’t guarantee fairness. A recent survey from the Pew Research Center found that 72% of Americans believe social media companies wield too much power over public discourse.
This skepticism applies to decentralized systems, where governance must remain transparent to prevent louder voices from monopolizing the conversation.
“Distributed governance ensures no individual or corporation unilaterally decides what can or cannot be said, but it still requires safeguards to balance diverse perspectives,” Arjun explains.
Without centralized oversight, decentralized platforms depend on community-driven moderation. This approach aims to ensure inclusivity but also risks fragmentation when consensus is hard to achieve. Mastodon instances often have varying moderation rules, which can confuse users and splinter communities.
Wikipedia is a great example of successful community-led moderation. It relies on 280,000 active editors to maintain millions of pages globally. Transparent processes and user collaboration ensure trust while protecting free expression.
“Transparency in governance is essential. It prevents exclusion and builds confidence among users, ensuring everyone feels represented,” Arjun says.
Decentralized platforms face the challenge of balancing free speech with controlling harmful content like hate speech, misinformation, and illegal activities. A high-profile example is the controversy surrounding Pump.fun, a platform that allowed livestreams for meme coin promotions.
Misuse of this feature led to harmful broadcasts, including threats of self-harm tied to cryptocurrency price swings.
“This highlights a crucial point. Platforms need layered governance models and proof-verification mechanisms to address harmful content without becoming authoritarian,” Arjun explains.
The seemingly obvious solution is to use artificial intelligence. While AI tools can identify harmful content with up to 94% accuracy, they lack the nuanced judgment required for sensitive cases. For that reason, decentralized systems must combine AI with transparent, human-led moderation for effective results.
So the question remains: How do you protect people from harm or enforce any form of regulation without first agreeing on what constitutes foul play? And what would a community become if it managed to police itself organically?
Governance and New Censorship Risks
Decentralized governance democratizes decision-making but introduces new risks. Voting systems, while participatory, can marginalize minority opinions, replicating the very problems decentralization seeks to avoid.
For instance, on Polymarket, a decentralized prediction platform, majority voting has sometimes suppressed dissenting views, demonstrating the need for safeguards.
“In an age when centralized control of information is a systemic risk, prediction markets offer a way of cutting through misleading narratives and viewing the unvarnished truth. Prediction markets are freedom preserving technology that move societies forward,” a blockchain researcher commented on X (formerly Twitter).
Transparent appeal mechanisms and checks on majority power are crucial to preventing new forms of censorship.
Decentralized platforms also prioritize user privacy, giving individuals control over their data and social graphs.
This autonomy strengthens trust, as users are no longer at the mercy of corporate data breaches like Facebook’s Cambridge Analytica scandal in 2018, which exposed data from 87 million users. In 2017, 79% of Facebook users trusted the company with their privacy. After the scandal, that figure fell by 66%.
However, privacy can complicate efforts to address harmful behaviors. Balancing the two is essential if decentralized networks are to remain safe without compromising their core principles.
Arjun explains, “Privacy cannot come at the expense of accountability. Platforms must adopt mechanisms that protect user data while enabling fair and transparent moderation.”
Legal and Regulatory Concerns in Decentralized Social Media
A primary challenge for decentralized platforms is addressing legal issues like defamation and incitement. Unlike centralized systems such as X, which receives around 65,000 government data requests annually, decentralized platforms lack clear mechanisms for legal recourse. Arjun emphasizes the importance of collaboration between platform creators and lawmakers.
“Engaging regulators can help establish guidelines that protect users’ rights while preserving the ethos of decentralization,” he says.
In authoritarian regimes, decentralized platforms provide a fighting chance to resist censorship. During the Mahsa Amini protests in Iran, for example, government-led internet shutdowns affected 80 million users, stressing the need for censorship-resistant networks. While decentralized platforms are harder to shut down, they are not immune to external pressures.
“Decentralization offers robust tools for resistance, but individual users remain vulnerable. Platforms must develop additional protections to shield them from persecution,” Arjun notes.
“Decentralization began as a movement for user empowerment. To sustain that vision, platforms must prioritize inclusivity, transparency, and technological innovation,” he concludes.
Overall, the future of decentralized social media hinges on addressing these hurdles with creativity and collaboration. If successful, decentralized platforms could redefine the dynamics of online speech, offering a freer and more resilient ecosystem for expression.
The question is not whether decentralization can work but whether it can evolve to balance freedom with responsibility in the digital age.
Disclaimer
Following the Trust Project guidelines, this feature article presents opinions and perspectives from industry experts or individuals. BeInCrypto is dedicated to transparent reporting, but the views expressed in this article do not necessarily reflect those of BeInCrypto or its staff. Readers should verify information independently and consult with a professional before making decisions based on this content.