Soon after Elon Musk took control of Twitter, now called X, the platform faced a massive problem: Advertisers were fleeing. But that, the company alleges, was someone else’s fault. On Thursday that argument went before a federal judge, who seemed skeptical of the company's allegations that a nonprofit’s research tracking hate speech on X had compromised user security, and that the group was responsible for the platform’s loss of advertisers.
The dispute began in July when X filed suit against the Center for Countering Digital Hate, a nonprofit that tracks hate speech on social platforms and had warned that the platform was seeing an increase in hateful content. Musk’s company alleged that CCDH’s reports cost it millions in advertising dollars by driving away business. It also claimed that the nonprofit’s research had violated the platform’s terms of service and endangered users’ security by scraping posts using the login of another nonprofit, the European Climate Foundation.
In response, CCDH filed a motion to dismiss the case, alleging that it was an attempt to silence a critic of X with burdensome litigation using what’s known as a “strategic lawsuit against public participation,” or SLAPP.
On Thursday, lawyers for CCDH and X went before Judge Charles Breyer in the Northern California District Court for a hearing to decide whether X’s case against the nonprofit will be allowed to proceed. The outcome of the case could set a precedent for exactly how far billionaires and tech companies can go to silence their critics. “This is really a SLAPP suit disguised as a contractual suit,” says Alejandra Caraballo, clinical instructor at Harvard Law School's Cyberlaw Clinic.
Unforeseen Harms
X alleges that CCDH used the European Climate Foundation’s login to a social media listening tool called Brandwatch, which has a license to access X data through the company’s API. In the hearing Thursday, X’s attorneys argued that CCDH’s use of the tool had forced the company to spend time and money investigating the scraping, costs for which it should be compensated in addition to the advertising revenue it says was lost when the nonprofit’s report spooked advertisers.
Judge Breyer pressed X’s attorney, Jonathan Hawk, on that claim, questioning how scraping posts that were publicly available could violate users’ safety or the security of their data. “If [CCDH] had scraped and discarded the information, or scraped that number and never issued a report, or scraped and never told anybody about it. What would be your damages?” Breyer asked X’s legal team.
Breyer also pointed out that it would have been impossible for anyone agreeing to Twitter's terms of service in 2019, as the European Climate Foundation did when it signed up for Brandwatch, years before Musk’s purchase of the platform, to anticipate how its policies would drastically change later. He suggested it would be difficult to hold CCDH responsible for harms it could not have foreseen.
“Twitter had a policy of removing tweets and individuals who engaged in neo-Nazi, white supremacists, misogynists, and spreaders of dangerous conspiracy theories. That was the policy of Twitter when the defendant entered into its terms of service,” Breyer said. “You're telling me at the time they were excluded from the website, it was foreseeable that Twitter would change its policies and allow these people on? And I am trying to figure out in my mind how that's possibly true, because I don't think it is."
Speaking after the hearing, Imran Ahmed, CEO of CCDH, was optimistic about the direction of the judge’s inquiry. “We were particularly surprised by the implication in X Corp.’s argument today that it thinks that CCDH should somehow be on the hook for paying for X Corp. to help neo-Nazis, white supremacists, and misogynists escape scrutiny of their reprehensible posts,” he says. “We can't help but note that X Corp. really had no response to our assertion that Musk changed X's policies to reinstate white supremacists, neo-Nazis, misogynists, and other propagators of hateful and toxic content.”
Breyer did not indicate Thursday when he would rule on whether the case could move forward.
Broken Trust
After taking over Twitter in late 2022, Musk fired much of the company's trust and safety team, which had kept hateful and dangerous content, as well as disinformation, off the platform. He also offered amnesty to users who had been banned for violating the platform’s policies. CCDH is among a number of organizations and academics that have published evidence showing that X has become a haven for harmful and misleading content under Musk’s watch.
The suit against CCDH was just one of many ways in which platforms have sought to limit transparency in recent years. X now charges $42,000 for access to its API, making analyzing data from the platform financially inaccessible to many researchers and members of civil society. For its part, Meta has wound down CrowdTangle, a tool that allowed researchers and journalists to track the spread of posts, and cut off researchers at New York University who were studying political ads and Covid-19 disinformation.
Both Meta and X filed suit against Bright Data, a third-party data collection service, for scraping their platforms. In January, Meta’s case against Bright Data was dismissed. “The Facebook and Instagram Terms do not bar logged-off scraping of public data; perforce it does not prohibit the sale of such public data,” wrote US federal judge Edward Chen in his ruling. “The Terms cannot bar Bright Data’s logged-off scraping activities.”
Bright Data spokesperson Jennifer Burns calls the platforms’ suits against the company “an effort to build a wall around publicly available data.”
Caraballo, of Harvard Law School, says Elon Musk appears to have decided lawsuits are a good strategy for silencing critics of his social platform. In November, X filed a lawsuit against the watchdog group Media Matters for America, accusing the group of trying to drive advertisers away from the platform by reporting how ads appeared next to neo-Nazi content.
The suit was filed in Texas, where anti-SLAPP laws that can be used to quash frivolous lawsuits do not apply in federal courts, which will make it more difficult for the case to be dismissed, says Caraballo. “I think it's incredibly concerning that this is part of that broader pattern, because these are the mechanisms that hold powerful companies accountable,” she says.
She predicts that while X might be able to move forward with a narrow version of its claim that CCDH breached its terms of service, “most of the claims will get tossed out.”
X did not respond to a request for comment by the time of publication.