TikTok helped promote Germany’s far-right extremist political party to young voters ahead of last month’s EU elections, even when they were searching the app for other political parties or politicians, according to a new report shared exclusively with WIRED.
The report was written by researchers from the nonprofit organization AI Forensics and Interface, a European think tank specializing in information technology. The researchers found that in a quarter of cases, young users in Germany searching the app for specific political parties and their politicians in the weeks leading up to the vote on June 9 were also shown suggestions for other parties. In the majority of these cases, the suggestions were linked to Alternative for Germany (AfD), Germany’s leading far-right party.
It is already well documented that the AfD has successfully leveraged TikTok to spread extremism and disinformation to a younger audience, but the new research suggests that the far-right group, which was labeled “extremist” by a German court earlier this year, was aided by the TikTok algorithm itself.
TikTok, which was provided with a copy of the final report ahead of publication, didn’t dispute the research findings but said that it has, in the past, made some accounts linked to the AfD ineligible for search recommendations as a result of content violations.
“For the regular search, you will see AfD popping up more often, because the AfD is more present on TikTok, but for the search suggestions there’s also this algorithmic aspect where someone makes the decision to relate these two searches,” Martin Degeling, who tracks AI-based recommendation systems at Interface, tells WIRED. “You search for the Green Party, and the AfD pops up, you search for the CDU and the AfD pops up, [but] you search for AfD, no other party pops up.”
The researchers say their findings do not point to any active collaboration between TikTok and far-right parties like the AfD, but that the platform’s structure gives bad actors an opportunity to flourish. “TikTok's built-in features such as the ‘Others Searched For’ suggestions provides a poorly moderated space where the far-right, especially the AfD, is able to take advantage,” Miazia Schüler, a researcher with AI Forensics, tells WIRED.
“During elections, TikTok is not giving equal visibility to all the parties, and it's basically incentivized to create suggestions that are not based on the content inside the app,” says Salvatore Romano, head of research at AI Forensics, adding that further research the group has conducted in France, Poland, Italy, and other EU countries found that “similarly problematic content was being shown across countries.”
TikTok says it put country-specific tools in place to combat the spread of misinformation during the EU elections.
“We protect the integrity of our platform by proactively enforcing firm policies against election misinformation and hate speech, and connecting people to reliable information at our Election Centers, which received over 7.5 million visits before the EU election,” TikTok spokesperson Ariane de Selliers tells WIRED.
The researchers also found that in one-third of searches, users were presented with conspiratorial and clickbait-style search suggestions that had little to do with the terms being searched for.
When searching for content related to the Greens, a political party in Germany, TikTok’s search suggestions included “Habeck’s wife leaves,” a reference to German vice chancellor and leading Green Party politician Robert Habeck. The suggestion appeared even though there is no substance to the claim and no related videos to be found on TikTok.
“The suggestion insinuates a gossip-like curiosity based on a spread of false information for the sake of a seemingly newsworthy headline that has little political implications,” the researchers wrote in the report.
TikTok did not respond to questions about how or why these search terms were suggested.
Other examples include the promotion of random clickbait-style search suggestions such as “bisexual princess” when searching for the Social Democratic Party or fear-mongering suggestions like “Putin’s last warning” when searching for the Greens.
Research shows that even if users don’t click on any of the search suggestions in the app, simply seeing them is enough to make the terms stick in people’s minds, and the more extreme the suggestions, the more likely people are to remember them.
In research conducted last year by Interface and AI Forensics, the group found that users presented with a series of search suggestions most often clicked on the most suggestive headline available. In one case, the headline “Olaf Scholz Caught in a Club,” which was unrelated to any actual incident and had no corresponding videos on TikTok, was also the fake headline most people remembered.
“People actually remember these gossip or clickbait headlines that are kept in the system without actually having any video or any content backing it up,” says Degeling. “We see this as evidence that the search suggestions, just by themselves, whether there's any video related to them, actually stay in people's mind.”
Search is becoming an increasingly important way that users, particularly younger users, discover content on TikTok. Research by Interface and AI Forensics found that 67 percent of 18-to-25-year-old TikTok users in Germany use the app’s search function frequently, which lines up with research from Adobe published earlier this year that found 40 percent of Americans now use TikTok as a search engine and that some Gen Z users rely on it more than Google.
Within TikTok, search suggestions appear in multiple locations, but for this study, the researchers focused on results from the “Others searched for” option. This is a group of eight search terms that appear below the initial results on the search page and, on the face of it, seem to be linked to the search term just used.
TikTok said that many factors contribute to whether a search term is recommended, and which keywords are suggested, including comments and common searches made after watching a video.
The researchers did find that TikTok had taken some steps to limit the spread of inaccurate or incendiary search results for certain parties or politicians, but the moderation efforts were not applied consistently across the platform.
The researchers also said their findings appear to show that TikTok was employing a “blocklist,” because one in three search terms used in the research returned nothing in the “Others searched for” box. These terms spanned the political spectrum, from the AfD to the liberal Free Democratic Party and the Greens.
TikTok told WIRED that it was “inaccurate to assume” TikTok used a blocklist but did not immediately respond to a follow-up question about how it chose which search terms returned blank results.
“If there is a moderation policy in place, it's not consistent,” says Romano.