Deplatforming describes the action of preventing a person deemed to hold unfavorable views from using certain websites, forums, or social media platforms. Deplatforming is undertaken by any individual, group, or organization that wants to censor speakers with controversial opinions by removing their platform. It may be triggered by racist or extremist views, or by any other action deemed to violate a platform's terms of service.
The effectiveness of deplatforming
The platform itself may be a social media account or any public venue where a large or influential group of people has assembled.
Deplatforming assumes that people who are prohibited from speaking about a certain topic will not have their message heard.
However, research has found that deplatforming has mixed results.
During a purge of alt-right accounts in 2016, tens of thousands of former Twitter users simply migrated to Gab.
Consequently, Gab became a haven for extremists and gained notoriety after a mass shooter posted his manifesto there in 2018.
Some argue that this is a classic case of the Streisand Effect, in which attempts to suppress information only make it more visible.
Conversely, a similar purge of discriminatory subreddits in 2015 was shown to be effective.
While a small percentage of banned users migrated to Voat, Reddit itself saw a significant decrease in new accounts promoting hate speech after the purge was completed.
Research has also found that mainstream social media is a primary driver of traffic to websites with extreme content.
When this avenue is closed off, controversial figures lose their ability to whip social media users into a frenzy.
Unintended consequences of deplatforming
In limited cases, deplatforming can backfire on society. Consider the following instances:
After Gab was itself deplatformed, the site managed to survive by adopting decentralized technologies.
Forced to innovate, Gab became a stronger, more unified, and more radicalized community that was more resistant to moderation and censorship.
Decentralization, so often lauded as a concept, also provided Gab users a free and unencumbered means of self-organization.
In another case, deplatforming caused terrorist activity to migrate to Telegram, a less-visible and much less-regulated platform. Although the audience shrank in relative terms, the app was arguably better suited to hate speech.
Telegram protects controversial individuals while offering them both public and private messaging and broadcasting functionality.
This functionality solves what studies have called the "terrorist's dilemma": the search for a medium with an ideal balance of operational security and public outreach.
Key takeaways
- Deplatforming is the prevention of an individual, group, or organization holding controversial views from disseminating those views to an audience.
- Deplatforming effectiveness is mixed, but it is most effective where large social media platforms drive traffic to third-party websites. Purges undertaken by Reddit and Twitter reduced the amount of hate speech on their platforms, with an inconsequential number of users migrating to other platforms.
- Deplatforming can strengthen certain movements by forcing them to unite and innovate. It can also shift extremist groups to platforms that are better suited to their modus operandi.