Europe is betting the farm on policing social media. Those plans are already coming unstuck.
In Germany, home to arguably some of the world’s toughest online hate speech rules, politicians are struggling to rein in Telegram, a fringe social network laden with extremist, far-right influencers and COVID-19 conspiracy theorists.
After demanding — and failing — to force the platform to take action against piles of hateful and false content, often shared in groups with tens of thousands of followers, the country’s new coalition threatened this month to shut down the social network, even though its power to do so is limited, at best.
Berlin’s stand-off highlights how the real hardcore digital nastiness has migrated from mainstream platforms like Facebook, Google’s YouTube and Twitter to newly popular fringe networks. Yet a spate of new online content rules across the European Union, United States and beyond is unlikely to stop such harmful content from spreading on alt-platforms where there’s almost no content moderation and whose audiences are tiny compared to those of Silicon Valley giants.
On platforms like Telegram, BitChute and Odysee, overt calls to violence from far-right extremists mix seamlessly with debunked claims about COVID-19, based on POLITICO’s review of thousands of messages, videos and comments across these three platforms, in English, French, and German, over the last three months. Anti-vaccine influencers with online followers in the hundreds of thousands share the latest political conspiracy theories, while white supremacist groups post inflammatory videos in attempts to radicalize would-be supporters online.
Without fail, none of these social media posts was subject to any content moderation, nor did any carry disclaimers highlighting that the claims were unproven.
This shift to fringe networks — in part due to the mainstream platforms clamping down on the worst offenders like QAnon followers in the U.S. or Querdenker supporters in Germany — is mostly passing policymakers by in their race to pass new laws to stop hate speech, harmful misinformation and digital radicalization. That could prove to be a massive misstep as these alt-platforms increasingly become a training ground for a new generation of extremists eager to leverage their support on these networks to foment dissent, either on more mainstream social media or in the offline world.
In Germany, the country’s online hate speech rules, known as NetzDG, have barely made a dent in the rise of hate, anti-Semitism and COVID-19 misinformation on Telegram, despite politicians calling for almost a year for the platform to rein in such material. So far, the company, headquartered in Dubai, has ignored those demands, which carry potential fines of up to €50 million for noncompliance, even as numerous violent real-world protests across Germany have been organized on the site.
At the European Union level, separate proposals known as the Digital Services Act have primarily centered on policing large platforms like Facebook and YouTube, requiring these big-hitters to carry out risk assessments on potentially harmful online behavior and threatening blockbuster fines if the firms don’t do more to protect people online.
While the likes of Telegram and Odysee may eventually fall within the EU’s proposed rules, which could be passed as soon as the summer, these companies have mostly ignored governments’ demands to moderate online content. They often invoke people’s right to free speech to bat away officials’ complaints that they aren’t doing enough to remove hateful and violent content from their platforms. In the United Kingdom, where officials are finalizing their separate online content proposals, the role of fringe networks in promoting digital hate barely gets a mention.
Representatives for Telegram, BitChute and Odysee did not respond to requests for comment.
A shift to the fringe
Part of the problem is that misinformation has not disappeared from the mainstream platforms where the vast majority of people spend their time online.
Even as Facebook, YouTube and Twitter take unprecedented action to scrub COVID-19 conspiracy theories and extremist content from their platforms, such material, much of it in languages other than English, is still slipping through their nets, spurring policymakers to pass rules that primarily focus on Big Tech networks.
Yet within alt-social media, whose ranks have swollen since 2020 after the mainstream platforms began to remove tens of thousands of extremists’ accounts, pages and groups, a microcosm of hate now flourishes beyond the rules of traditional content moderation or policymaking.
In the United States, white supremacist groups with Telegram channels of tens of thousands of followers routinely post calls for violence against elected officials and against government efforts to impose mask mandates during the coronavirus pandemic. In Germany, online influencers with more than 200,000 followers similarly use the encrypted messenger to spread anti-Semitic attacks and claims that the country’s government is illegitimate.
On BitChute and Odysee, both video-streaming platforms, copies of the COVID-19 conspiracy theory film Plandemic rack up, collectively, tens of thousands of views, while claims of voter fraud in the U.S., Germany and, worryingly ahead of the upcoming presidential election, France routinely pop up without any disclaimers. Some of these videos have subsequently been shared on the mainstream platforms, but are often removed within days or weeks for breaking those companies’ community standards.
For now, such misinformation and hate have mostly stayed siloed in these fringe networks, rarely breaking out into the wider digital world. But law enforcement, social media experts and policymakers are growing wary that the alt-platforms are fast becoming a breeding ground for coordinated attacks, both online and offline.
In Germany, for instance, hundreds of anti-COVID-19 rallies have been organized within Telegram channels, while in the U.K., far-right extremists have co-opted anti-vaccine online groups to sow messages of hate, including during real-world rallies, based on POLITICO’s review of social media activity. Conspiracy theory videos — in multiple Western languages — shared on fringe platforms are regularly peppered with unmoderated comments calling for violent protests against perceived government abuses associated with the coronavirus pandemic.
With lawmakers focused squarely on tackling misinformation and hate speech on Big Tech platforms, such tactics are getting little, if any, attention from new online content legislation — a potential misstep as these fringe networks become an ever-growing petri dish of online radicalization.