Linda Yaccarino Says X Needs More Moderators After All

When Elon Musk took over Twitter, since rebranded as X, his favorite letter of the alphabet, he went on a firing spree. Chief among those ejected were people working on trust and safety, the work of keeping bad content, from hate speech to child exploitation, off the platform.

In front of a US Senate committee today, X CEO Linda Yaccarino appeared to tacitly acknowledge that Musk went too far in tearing down the platform’s guardrails, indicating the company was partially reversing course. She said that X had increased the number of trust and safety staff by 10 percent in the past 14 months and planned to hire 100 new moderators in Austin focused on child sexual exploitation.

Yaccarino spoke at a Senate hearing called to discuss social networks’ failure to curb child sexual abuse, alongside the CEOs of Meta, TikTok, Snap, and Discord. She also said multiple times that “less than 1 percent” of X users were under 18. That claim—and her announcement that after 14 months of Musk’s ownership and deep cuts to trust and safety, the company was now hiring new moderators—raised the eyebrows of social platform experts and former Twitter employees.

Theodora Skeadas, a former member of Twitter’s trust and safety team laid off by Musk in November 2022, says that even after making the hires Yaccarino boasted of, X is still woefully understaffed for a major social platform. “Unless their technical systems for flagging and removing content have really improved, 100 is not enough,” says Skeadas. “And that seems unlikely because they’ve fired so many engineers.” X did not immediately respond to a request for comment.

Bonfire of the Mods

Shortly after acquiring Twitter in October 2022, Musk laid off nearly half of Twitter’s employees, making deep cuts to the trust and safety teams. Researchers and civil society organizations that had built relationships with those teams in order to alert them to hateful or problematic content quickly found themselves without anyone left at the platform to contact.

The platform was nearly banned in Brazil in the run-up to the country’s 2022 presidential runoff, after Brazil’s Electoral Court worried that Musk would allow election-related lies to spread. A team of academic researchers found that hate speech spiked after Musk took the helm, and last September, ahead of a historic election year, X fired five of the remaining trust and safety workers focused on combating mis- and disinformation.

Skeadas says that before Musk took over, there were about 400 Twitter staff working on trust and safety, plus some 5,000 contractors who helped review content on the platform. Most of those staffers and more than 4,000 of the contractors were laid off.

Even after the roughly 10 percent increase in trust and safety staff that Yaccarino claimed, the platform likely still has far fewer people working on keeping users safe than it once did. There’s “no way” the company has more trust and safety staff than it did before Musk, Skeadas says. “If there were 20 people left and they hired two people, then that is a 10 percent increase, but that’s still nothing compared to before,” she says.


Adding 100 moderators, as Yaccarino told the Senate today the company plans to do, would not be nearly enough to properly police content involving teenage users and child exploitation, even if those hires focused solely on child sexual abuse, Skeadas says.

Matt Motyl, who formerly worked at Meta and is now a research and policy fellow at the Integrity Institute, a think tank focused on trust and safety, agrees. He’s also skeptical of Yaccarino’s claim that less than 1 percent of X’s users are under 18, which she used to suggest that many of the issues raised by the Senate committee were less relevant to X than for her fellow CEOs giving testimony.

“X doesn’t have much by way of age verification,” Motyl says, so it would be very easy for a young user to lie about their age in order to use the platform. A Pew Research Center study released in December 2023 found that 20 percent of 13- to 17-year-olds in the US say they “ever use” Twitter. The site could be hosting plenty of teens who are simply undercover and not captured in the metrics cited by Yaccarino.

However many teens are on X in 2024, Motyl and Skeadas say they and the site’s other users are less protected than they should be.

Motyl says that Yaccarino, steward of a platform with some 300 to 500 million users worldwide, appears more interested in spectacle than in substance. “One hundred moderators is nothing,” he says. “It’s trust and safety theater.”
