Taking the alt-right offline
No room for nazis
In the past year, deadly hate crimes have become common enough to feel familiar, and we are sickened by that familiarity. The ease of acquiring firearms in the U.S. makes it the most frequent site of these mass killings, but as the 2017 Quebec City mosque shooting illustrates, Canada, with its own Islamophobic, anti-migrant, colonialist currents, is not without implication. These crimes bear two hallmark features: the murderer acts as a representative of the alt-right, and the victims are marginalized because of racialization, their faith, their gender, or sexuality. It is not at all a stretch to call this violence an epidemic, and one of the many strategies being employed to address its symptoms aims to limit and put a stop to the spread of hate speech in online social circles. This is a strategy called deplatforming.
Deplatforming means exactly what it sounds like: depriving people of a platform to spread their ideas. Specifically, it refers to social media sites like Twitter, Facebook and Instagram banning users who treat social media as a hateful propaganda machine. Not much imagination is needed to bring up examples of this behaviour – anyone who has been on Reddit has seen it happen, as has anyone on Facebook who has dipped a toe into meme-sharing groups, about half of which are dominated by racist, misogynistic, intentionally harmful “jokes.” Some might even say encountering this type of garbage is part and parcel of exploring cyberspace. But as Mashable writer Rachel Kraus wrote in her “2018 Review: Year in Deplatforming,” it doesn’t have to be that way.
“2018 was the year we (kind of) cleaned up the internet [. . . ] if a bigot rants in a forest, and nobody is around to hear him, does anybody care what the [bigot] says?”
After the El Paso shooting on Aug. 3 – where 22 people were killed and 24 injured in a Walmart by a man armed with a hatred for migrant residents and an AR-15 – deplatforming became a public topic. Bennie Thompson, a Democrat and Chairman of the House Committee on Homeland Security, issued a subpoena for the owner of 8chan, a message board site heavily linked with the alt-right. The owner, Jim Watkins – a US military veteran and staunch defender of “free speech” – was ordered to appear before the committee on Sept. 5. Why is the Homeland Security Committee concerned with this little corner of the internet? Because the El Paso shooter, the shooter at the Chabad of Poway synagogue in California, and the shooter in Christchurch, New Zealand have one thing in common besides their alt-right ideological stance: they all posted manifestos or statements of intent on 8chan before committing their crimes. Says Andrew Taylor of USA Today, “at least three acts of deadly white supremacist extremist violence [had] been linked to 8chan in the last six months.” 8chan was later shut down for good.
Fredrick Brennan is the creator of 8chan but no longer its owner, and in a New York Times interview with Kevin Roose he explained that he now spends much of his time trying to distance himself from what the site has become. Brennan started 8chan as a “free speech” utopia where people could say whatever they wanted, uninhibited by the “censors” on its predecessor, 4chan. (For anyone who has been on 4chan, this might seem strange, since it hardly strikes one as the most censored place on the internet.) But Brennan now says he regrets that his site became a watering hole for violent extremism, and that, like many of us, he finds himself sadly unsurprised when a shooting occurs. “Whenever I hear about a mass shooting, I say, ‘All right, we have to research if there’s an 8chan connection,’” Brennan tells the Times. Memes praising the murderers are often made after these shootings, and users have taken to calling a high death toll a “high score.”
Kraus, in her Mashable article, interviewed Steven Hanks from the Southern Poverty Law Center (SPLC), a public-interest legal organization that documented 1,020 active hate groups in the United States in 2018. Hanks says social media platforms “are some of the most effective vehicles for [hate speech] becoming mainstream,” and Brennan agrees: while shutting down 8chan (a move he publicly called for) might not stop the violence, he hoped it would significantly decrease its frequency and ease of transmission. That the Christchurch shooter went so far as to livestream his crime on Facebook is further evidence that a public platform is a valuable tool for these terrorists.
Deplatforming depends on the cooperation of the corporate entities that control who can use their websites. In the case of 8chan, Jim Watkins refused Brennan’s call to take the site down, but when the web infrastructure company Cloudflare decided to cut ties and suspend its services, there was simply no way to keep things running. Something similar happened to the neo-fascist Proud Boys, who were banned from Instagram and Facebook and barred from using PayPal’s services. YouTube also banned the group, but for “copyright infringement” rather than hate speech, meaning they were able to return once those issues were fixed.
So, does deplatforming work? Technology news site TechCrunch suggests it has measurable effects, citing a study of Reddit conducted at the Georgia Institute of Technology. According to the study, Reddit’s foray into deplatforming – which consisted mostly of taking down subreddits full of hate speech – reduced hate speech by the same users by as much as 90 per cent. It also resulted in members of those subreddits leaving the site or migrating to other subreddits, where they did not go on to increase the overall amount of hate speech.
It is still true, however, that bigots forced off one platform will inevitably seek another, and the platforms that take them in are much more resistant to any kind of moderation. Gab is one such site, and it is harder to take down than 8chan: while some companies like PayPal and Stripe have refused to work with it, others step in to fill the gap.
Still, some alt-right figureheads really have been pushed out of the public eye by repeated bans, like far-right provocateur Milo Yiannopoulos, who was banned from Twitter, Facebook and YouTube and told VICE in an interview that he “cannot put food on the table” anymore despite migrating to the alternative platform Telegram. Whether or not deplatforming is the preferred tool for the job, we can at times be reassured that the effort to “clean up the internet” and make hateful voices less deafening is having some success.