This is an academic overview derived from the Wikipedia article on Deplatforming.

The Unplatformed Discourse

An academic exploration of the practice of removing individuals or groups from public platforms, examining its historical roots, modern manifestations, and societal implications.


What is Deplatforming?

Definition and Scope

Deplatforming, also referred to as "no-platforming," constitutes a form of boycott wherein access to platforms used for disseminating information or ideas is denied to an individual or group. This practice is particularly prevalent and discussed within the context of contemporary social media environments.

Public Discourse and Expression

The core of deplatforming involves the removal of an individual's or group's ability to express their views through established channels. This action directly impacts public discourse by limiting the visibility and reach of certain ideas or speakers, often based on perceived harm or controversy associated with their content.

Legal and Ethical Considerations

While deplatforming by private entities is generally distinct from government censorship under constitutional protections like the First Amendment in the United States, it raises significant ethical questions regarding free speech, platform responsibility, and the potential for bias in content moderation. The debate often centers on balancing the protection of vulnerable groups from harmful speech against the principle of open expression.

Historical Context

University Campus Precedents

The practice of deplatforming speakers on university campuses has historical roots dating back to the mid-20th century. Universities, through their own policies, began restricting access for speakers deemed unqualified or intending to use the institution as a platform for propaganda. For instance, the University of California's "Speaker Ban" policy, enacted under President Robert Gordon Sproul, targeted individuals based on their political affiliations or perceived ideological agendas.

  • In 1947, former Vice President Henry A. Wallace was prohibited from speaking at UCLA due to his views on Cold War policy.
  • In 1951, Max Shachtman, a socialist, was barred from speaking at UC Berkeley.
  • In 1961, Malcolm X was prevented from speaking at Berkeley in his capacity as a religious leader.

The British "No Platform" Policy

In the United Kingdom, the National Union of Students (NUS) formally adopted a "No Platform" policy as early as 1973. This policy aimed to prevent individuals or groups associated with fascism and racism from speaking on university campuses, reflecting a specific political and social context of the time.

Modern Campus Disruptions

In the United States, contemporary instances involve protests and attempts to disinvite speakers from college campuses. The Foundation for Individual Rights in Education (FIRE) has documented numerous such incidents, categorizing them by successful disinvitations, speaker withdrawals due to pressure, or "heckler's vetoes" where disruptions entirely prevent a speech.

  • Charles Murray (2017): A speech at Middlebury College was disrupted by protestors.
  • Ken Ham (2018): An invitation to speak at the University of Central Oklahoma was rescinded following pressure from an LGBT student group.
  • Christina Hoff Sommers (2018): Protesters attempted to halt a lecture at Lewis & Clark Law School.

Deplatforming of Invited Speakers

Campus Speech Controversies

The phenomenon of deplatforming extends to contemporary academic settings, where invited speakers often face protests and demands for disinvitation. These actions are frequently driven by student groups or faculty concerned about the speaker's views, which may be perceived as offensive, discriminatory, or harmful.

Statistical Overview

According to the Foundation for Individual Rights in Education (FIRE), hundreds of disinvitation or disruption attempts have been documented on American campuses since 2000. These incidents highlight a recurring tension between institutional commitments to free speech and the pressures exerted by activist groups seeking to control the boundaries of acceptable discourse.

Documenting the Trend

The challenges faced by speakers invited to campuses have been documented in various media, including films like "No Safe Spaces," which explores the experiences of figures like Adam Carolla and Dennis Prager with disinvitation efforts. These narratives underscore the perceived impact of deplatforming on open debate and academic freedom.

Deplatforming in the Digital Sphere

Platform Moderation Policies

Social media platforms have increasingly engaged in content moderation, leading to the deplatforming of users and communities. Actions range from banning specific subreddits on Reddit for violating anti-harassment policies to removing accounts of prominent figures for policy violations, such as hate speech or incitement.

  • Reddit: Banned several communities for violating its anti-harassment policies, with studies indicating a reduction in hate speech usage among displaced users.
  • Facebook/Instagram: Banned figures like Louis Farrakhan, Milo Yiannopoulos, and Alex Jones for violating policies against "dangerous individuals and organizations" and hate speech.
  • Twitter: Deplatformed Donald Trump following the January 6, 2021 attack on the U.S. Capitol, citing the risk of further incitement of violence. Elon Musk's subsequent policy changes led to the reinstatement of some previously banned accounts, under a stated approach of "freedom of speech, but not freedom of reach."
  • Alex Jones: Faced widespread deplatforming across multiple platforms (Facebook, Apple, YouTube, Spotify, Vimeo, Twitter, Pinterest, Mailchimp, LinkedIn) for policy violations including hate speech, discriminatory content, and COVID-19 misinformation.
  • Andrew Tate: Banned from platforms like Twitter, Instagram, Facebook, TikTok, and YouTube for misogynistic comments violating hate speech policies.

Demonetization as a Tool

Demonetization, the practice of withholding financial compensation for content that remains available on a platform, is another method used to manage content. Platforms like YouTube may demonetize videos deemed unsuitable for advertisers, impacting content creators' revenue while allowing the content to persist, thereby creating a distinct form of indirect deplatforming.

Impact and Migration

Research suggests that deplatforming can reduce an individual's audience and influence. However, some studies also indicate that deplatformed content creators may migrate to alternative platforms ("alt-tech"), potentially fostering more toxic environments or reinforcing existing echo chambers. The effectiveness and consequences of deplatforming remain subjects of ongoing analysis.

Legislative and Policy Responses

United Kingdom's Approach

The UK government has introduced legislative measures aimed at addressing deplatforming. The Higher Education (Freedom of Speech) Bill proposes allowing speakers to seek compensation for deplatforming incidents on university campuses and empowers regulators to impose fines on institutions and student unions that facilitate such practices. Additionally, the Online Safety Bill seeks to prevent social media networks from discriminating against political viewpoints or removing "democratically important" content.

United States' Regulatory Debates

In the United States, discussions around deplatforming often involve proposals to regulate social media platforms. Some advocate for treating social media as public utilities to safeguard users' constitutional rights, arguing that online presence is essential for 21st-century participation. Efforts have also been made by Republican politicians to amend or repeal Section 230 of the Communications Decency Act, citing concerns that platform moderation policies are not politically neutral.

Perspectives on Deplatforming

Arguments in Support

Proponents of deplatforming often frame it as a necessary tool to combat the proliferation of hate speech, disinformation, and harmful ideologies. They argue that platforms have an editorial responsibility, akin to news organizations, to moderate content and prevent the amplification of damaging narratives. This perspective emphasizes the potential for deplatforming to reduce the reach and impact of malicious actors.

  • Mitigation of hate speech and disinformation.
  • Platform responsibility for content moderation.
  • Preventing the amplification of harmful narratives.
  • Distinction between private platform moderation and government censorship.

Critical Responses

Critics express concern that deplatforming practices by technology companies constitute a form of censorship that stifles dissenting opinions and creates a "chilling effect" on free expression. They argue that platforms may selectively remove ideologically disfavored accounts under the guise of policy enforcement. This perspective also raises concerns about the potential for such actions to impact paying customers and the broader implications for academic and public discourse.

  • Suppression of dissenting opinions.
  • Potential for ideological bias in moderation.
  • Creation of a "chilling effect" on speech.
  • Concerns about platform power and accountability.

Further Exploration

Academic Research

For a deeper understanding of the dynamics and implications of deplatforming, particularly concerning online platforms and extreme content creators, academic research provides valuable insights into user migration patterns and the effectiveness of such interventions.

Rogers, Richard (May 6, 2020). "Deplatforming: Following extreme Internet celebrities to Telegram and alternative social media". European Journal of Communication. 35 (3): 213–229. doi:10.1177/0267323120922066.


