The Deplatforming Debate: What We Can Learn from Research on ISIS and the Alt Right

Amarnath Amarasingam / Jan 12, 2021
Following the decision of multiple social media platforms to remove or restrict Donald Trump’s accounts, the debate over the efficacy and ethics of deplatforming has been renewed. As we consider current headlines, it is worthwhile to review the earlier work on how platforms dealt with violent extremism during the emergence of ISIS. What we have learned over the last six years might be useful today.
- One of the earliest studies to discuss the impact of suspensions of ISIS accounts was The ISIS Twitter Census: Defining and describing the population of ISIS supporters on Twitter, by Jonathon Morgan and J.M. Berger. They found that suspensions did have an impact on replies, retweets, and overall dissemination. After suspensions, the die-hard supporters dedicated themselves to creating new accounts, but others fell away. Berger and Morgan observed that “it appears the pace of account creation has lagged behind the pace of suspensions.”
- On the specific question of how suspensions affect the Twitter network, Berger partnered with Heather Perez on another report, The Islamic State’s Diminishing Returns on Twitter: How suspensions are limiting the social networks of English-speaking ISIS supporters. This report explores how suspensions disrupted these groups, including major disruptions to dissemination and declines in follower counts.
- Another study by Audrey Alexander at the Combating Terrorism Center at West Point, Digital Decay: Tracing Change Over Time Among English-Language Islamic State Sympathizers on Twitter, similarly found that ISIS supporters were finding it hard to “gain traction” after Twitter took a harder stance on the group.
- Yet another study is by Maura Conway at Dublin City University and colleagues, Disrupting Daesh: Measuring Takedown of Online Terrorist Material and its Impacts, which specifically seeks to measure the impact of takedowns on things like community breakdown.
- A piece I return to often is by Brian Fishman, who works on counterterrorism and dangerous organizations at Facebook: Crossroads: Counter-terrorism and the Internet. Fishman offers a mic-drop paragraph that I try to remember in my own work.
- In the realm of far-right-specific research, J.M. Berger published an important piece in 2018, The Alt-Right Census: Defining and Describing the Audience for Alt-Right Content on Twitter, in which he noted that suspensions of far-right accounts were leading to migration to platforms like Gab.
- Another important study, from authors at the Georgia Institute of Technology, Emory University, and the University of Michigan, You Can’t Stay Here: The Efficacy of Reddit’s 2015 Ban Examined Through Hate Speech, looks at the 2015 ban on several hateful subreddits. The study found that “Through the banning of subreddits which engaged in racism and fat-shaming, Reddit was able to reduce the prevalence of such behavior on the site.”
- On the differences between how social media companies deal with jihadist groups versus the far-right, see Maura Conway, Routing the Extreme Right: Challenges for Social Media Platforms.
- On the subject of “online community,” you can read my thoughts on the Islamic State’s online community in my 2015 War on the Rocks post, What Twitter Really Means for Islamic State Supporters. Elizabeth Pearson at King’s College London has also discussed this issue in a piece titled Wilayat Twitter and the Battle Against Islamic State’s Twitter Jihad.
- Elizabeth Pearson, a Lecturer at Swansea University affiliated with CYTREC, published Online as the New Frontline: Affect, Gender, and ISIS-Take-Down on Social Media in 2016. She argued “for a reconsideration of how we assess the impact of suspension or take-down methods on other sites, recommending a shift toward the recognition of the power of affect, emotion, and online community, as well as quantifiable influence, such as numbers of followers and network” in order to understand the effects on the people in these networks.
- Another important piece is from Richard Rogers at the University of Amsterdam, Deplatforming: Following extreme Internet celebrities to Telegram and alternative social media, which looks in part at how users who are deplatformed describe the mainstream platforms that removed them.
- For individuals who derive an enormous amount of meaning and purpose from being a movement leader in the online space, having that disappear overnight could have unpredictable impacts. One such case I’ve written about before is that of Aaron Driver, a young Canadian who joined a loose network of ISIL supporters from around the world and found great meaning in his role in that network.
- My research group published a piece in November 2019 on how ISIS supporters reacted to a major online campaign against them, How Telegram Disruption Impacts Jihadist Platform Migration. It seems we may be going through a similar watershed moment for the far-right today.
Many of the conclusions from the research listed above are relevant to understanding the far-right. Extremists get an immense amount of social and psychological benefit from being connected to like-minded people. Disrupting these networks is ultimately a good thing, but we need to think about how it impacts these individuals and how they may respond. Unintended consequences are indeed consequences.