Report finds extremist content getting millions of views on TikTok as app works to root it out

After years of reports and warnings, TikTok is still failing to moderate neo-Nazi and other fascist content on its platform, allowing propaganda producers and distributors to garner tens of millions of views on hateful, violent content and grow a network of pro-Nazi accounts, researchers with the Institute for Strategic Dialogue said in a new report published on Monday.


The pro-Nazi content includes Holocaust denial, the glorification of Nazi Germany leader Adolf Hitler, the promotion of Nazi-era Germany and Nazism “as a solution to contemporary issues,” support for white supremacist mass shooters, and “livestreamed footage or recreations of” massacres carried out by white supremacists, ISD researchers Nathan Doctor, Guy Fiennes, and Ciarán O’Connor concluded.

“Self-identified Nazis are openly promoting hate speech and real-world recruitment on TikTok. Not only is the platform failing to remove these videos and accounts, but its algorithm is amplifying their reach,” they wrote, later adding: “TikTok is failing to adequately and promptly take down accounts pushing pro-Nazi hate speech and propaganda. Although the platform manages to periodically take down accounts, this often comes weeks and months after flagrant rule-breaking activity, during which time the accounts were able to accrue significant viewership.”

WIRED was the first to report on the ISD’s findings.

“Hateful behavior, organizations and their ideologies have no place on TikTok, and we remove more than 98% of this content before it is reported to us. We work with experts to keep ahead of evolving trends and continually strengthen our safeguards against hateful ideologies and groups,” TikTok spokesperson Nicholas Smith said in an email.

In the first three months of 2024, TikTok removed 166 million videos that violated its content policies, including more than 1.6 million that violated the “safety and civility” policies governing extremist content, a source familiar with the company's internal operations told Spectrum News. The source said that 88% of the videos that violated TikTok’s safety and civility policies were removed within 24 hours.

The ISD report found hundreds of accounts “which openly support Nazism and use the video app to promote their ideology and propaganda” and connected many of the accounts to an off-platform effort coordinated on messaging apps like Telegram. Of a sample of 50 accounts that ISD reported to TikTok, which had collected more than 6.2 million views combined, none was banned immediately; all of the reported content was initially determined to have “no violation.” Within a month of the reports, only 23 of the 50 openly Nazi accounts had been banned. The now-banned accounts had garnered at least two million views collectively before their bans.

The TikTok source said the company is reviewing content from the report and pledged to remove anything that violates its guidelines.

The content produced and shared by these Nazi propagandists was also found to be promoted by TikTok’s algorithm. Using a brand-new account created for the purposes of the report, the researchers watched just ten videos from pro-Nazi accounts and found that TikTok’s “For You Page” “immediately began recommending similar content via algorithm from previously unsurfaced accounts.”

They also discovered that Nazis were using TikTok to drive viewers to Telegram channels, where they could access even more extreme far-right propaganda.

“Self-identified Nazis discuss TikTok as an amenable platform to spread their ideology which can be used to reach ‘much more people’, even if accounts are brand new or have a low following,” the researchers wrote.

According to the report, a Telegram account associated with this effort wrote that a TikTok account with zero followers could receive “more views than you could ever have on” the social media platforms X, formerly known as Twitter, and BitChute, a video-hosting site known for its pro-Nazi and hateful content.

“It just reaches much more people,” the Telegram user wrote.

Telegram did not immediately return a request for comment. The Southern Poverty Law Center, a nonprofit civil rights group, calls the messaging app a haven for extremist content. It was briefly suspended in all of Brazil last year for declining to cooperate with authorities investigating neo-Nazi activity.

Among the content found by researchers were videos celebrating the Christchurch, New Zealand, mass shooter who killed 51 Muslim worshippers in 2019, videos celebrating speeches by Hitler and Italian fascist leader Benito Mussolini, 119 videos that utilized a song with the lyrics “Kill the f*****s and Jew,” content “disseminating coded calls for violence against Jews and other pro-Nazi content,” and calls for the extermination of Jews. Content sharing “incomplete instructions to build improvised explosive devices, 3D-printed guns, and ‘napalm on a budget’” was also discovered by ISD researchers.

Content promoting the Christchurch mass shooter is of particular note. ISD published reports in March 2023 and August 2021 identifying the proliferation of content connected to the shooting, including footage from the Facebook livestream the gunman recorded during the attack. While conducting their research in May of this year, ISD researchers found footage of the attack on TikTok “which, despite being distorted to evade moderation, was still recognizable.”

And in a TikTok search for a song the Christchurch shooter played on the livestream as he drove away after murdering 51 people at a mosque and Islamic center, seven “of the top 10 videos celebrated the attack, its motivation, the shooter or even recreated it in a video game.” Those seven videos alone garnered 1.5 million views.

Racist mass shooters since Christchurch have cited clips of the livestream and propaganda connected to those killings as inspiration for their own violence. The white supremacist gunman who murdered 10 Black people at a grocery store in Buffalo, N.Y., in May 2022 wrote in his own manifesto that his “path towards becoming a white supremacist terrorist began upon viewing on the 4chan website a brief clip of a mass shooting at a mosque in Christchurch,” according to a report from the New York state attorney general’s office.

The gunman who launched a deadly attack on a synagogue in Poway, California, in April 2019 and the shooter who killed 23 in an August 2019 shooting targeting the Latino community of El Paso, Texas, also cited the Christchurch shooter as an inspiration, according to a report from Georgetown University researchers.

Another prominent piece of content promoted by the Nazi accounts was the propaganda documentary “Europa: The Last Battle,” which is banned from YouTube and most mainstream social media platforms for its Holocaust denial and other antisemitic conspiracy theories. One Telegram channel with more than 12,000 subscribers dedicated to spreading the film featured “several posts encouraging users to promote the film – one tells subscribers to blanket TikTok with reaction videos to make the film go viral.”

ISD researchers found “countless promotions” of the film on TikTok, including videos with more than 100,000 views.

“TikTok searches for the film, as well as minor variations on its title, yield dozens of videos. Some posted clips using tweaked tags like #EuropaTheLastBattel,” the researchers wrote. “One account posting such snippets has received nearly [900,000] views on their videos, which include claims that the Rothschild family control the media and handpick presidents, as well as other false or antisemitic claims.”

The TikTok source claimed that many of the videos referencing "Europa: The Last Battle" condemn the movie rather than endorse it, and therefore do not meet the company's requirements for deletions and bans. But the ISD report includes screenshots of videos promoting it. And while a cursory search of TikTok for the term "The Last Battle" produced some videos condemning the film, it also produced numerous videos promoting it.

The top result of that search, conducted by Spectrum News on Monday evening, was a TikTok posted in October 2022 that has garnered over 186,000 views, calls the film "perhaps the most important historical film of all time in understanding the world today" and includes a web address where the film can be found. That website in turn links to openly pro-Nazi Telegram channels that contain Nazi propaganda and encourage their nearly 14,000 subscribers to join real-world Nazi organizations in the U.S. and Europe.

While content moderation of extremist ideologies and violent content is something virtually all social media platforms have struggled with, TikTok has been pledging for years to do more about the spread of Nazism and other extremist ideologies on its platform, yet reports like the one released Monday continue to be published. In December 2018, Vice News’ technology vertical Motherboard published a report titled “TikTok Has a Nazi Problem” documenting hate speech and support for neo-Nazi groups after the platform had been available to the public for a little over a year. Nearly six years later, the WIRED report published on Monday, which first reported the details of the ISD research, had the exact same headline.

In that 2018 Motherboard article, the writer notes that TikTok owner ByteDance had recently pledged to expand its moderation team from 6,000 to 10,000 people. Today, TikTok says it has 40,000 employees working on content moderation for its hundreds of millions of global users. ISD’s researchers concluded in their report published on Monday that the company “may not be equipped to handle the coordinated nature of hate speech propagation” and that the tactics used by Nazi propagandists are outmaneuvering whatever safeguards are in place for weeks or months before the offending accounts are finally banned.

And even then, the researchers found, the Nazi users “apply simple cost-benefit analysis,” concluding that their “content is reaching a sufficient audience to make the hassle of periodically creating new accounts worthwhile.” TikTok prohibits ban circumvention, but the researchers noted the pro-Nazi users seem capable of avoiding detection long enough to get millions of views on hateful content.

“Success is not about routing out all forms of Nazism the moment it appears on the platform but rather sufficiently raising the barrier to entry,” they wrote. “Instead, at present, self-identified Nazis are discussing TikTok as [an] amenable platform to spread their ideology.”