YouTube’s Rulings on Gaza War Videos Spark Internal Backlash

A month after Hamas militants from Gaza attacked an Israeli music festival last October, the Hebrew rap duo Ness & Stilla premiered “HarbuDarbu” on YouTube. The military hype song celebrates Israeli forces waging war in Gaza and has drawn over 25 million views; its critics have termed the song a violent and hateful anti-Palestinian “genocide anthem.” “One, two, shoot!” its refrain thunders.

Despite demands from employees and activists for its removal, “HarbuDarbu” has been allowed to stay up on YouTube. Crucially, YouTube determined that the song’s violent rhetoric targets Hamas, not Palestinians as a whole, and that, because Hamas is a US-designated terrorist organization, speech attacking it is not penalized as hate speech, according to three people involved in or briefed on content moderation work at YouTube but not authorized to discuss it.

In the closely tracked decision over “HarbuDarbu,” YouTube’s trust and safety team consulted executives and reviewed internal and external expert interpretations of the lyrics, which include slang and clever phrases with debatable meanings. The ultimate finding was that one of the song’s opening lines, which describes rodents coming out of tunnels, shows that the song is about Hamas (which regularly uses tunnels to navigate and hide in Gaza) and therefore does not qualify as hate speech, according to the sources.

Employees who want the video removed contend that it should count as hate speech because the lyrics urge violence against all Palestinians by invoking Amalek, a Biblical term used throughout history to describe Israel’s enemies. Israeli Prime Minister Benjamin Netanyahu used the term in remarks last October after the music festival attack, but his office later clarified that his intention was to invoke Hamas, not to call for the genocide of Palestinians.

The rationale behind leaving the video up and unrestricted, reported here for the first time, is a prime example of what a handful of employees at YouTube and across the rest of Google who spoke with WIRED believe to be a pattern of inconsistent moderation of content relating to Israel’s war with Hamas. The sources believe management at the world’s most popular video platform has been playing favorites and scrambling to justify takedowns—or to find exceptions to keep content up.

YouTube spokesperson Jack Malon did not dispute WIRED’s reporting on “HarbuDarbu” and other videos cited in this story. But he strongly challenges accusations of bias and calls it misleading to draw broad conclusions about YouTube’s enforcement approach based on “a handful of examples.” He adds that internal disagreements on such cases are common.

“We dispute the characterization that our response to this conflict deviated from our established approach toward major world events,” Malon says. “The suggestion that we apply our policies differently based on which religion or ethnicity is featured in the content is simply untrue. We have removed tens of thousands of videos since this conflict began. Some of these are tough calls, and we don’t make them lightly, debating to get to the right outcome.”

War Cry

Though disputes over what belongs on YouTube and other massive social networks have spilled into the public before, the war in Gaza has made reaching internal consensus on takedowns nearly impossible, sources say, even as decisions about what stays up carry great weight in shaping public response to a crisis that has left Israel on edge and Gaza in ruins.

Sources tell WIRED they wanted to bring more scrutiny to YouTube’s decisionmaking because they feel accountability has been limited, even internally. In the past, YouTube staffers would summarize their logic in emails, chats, and calls to employees from other Google units. Since October, that transparency has largely disappeared as a way to avoid contentious discussions, the sources say. Malon says the flow of information has increased. But as one source puts it, the substance is now missing: “Here’s the decision, we’re moving on, let’s not dwell on it.”


At YouTube, algorithms and about 20,000 people working on moderation are meant to remove anything that promotes war, violence, harassment, or hate speech regardless of who posted the video, said Pedro Pina, the platform’s vice president for Europe, Middle East, and Africa, in a recent talk. “Our number one concern is to keep the platform safe,” he said.

Three sources from the company say that in moments of heightened tension like this one, YouTube typically would lean toward the worst-case interpretation and remove or restrict a video such as “HarbuDarbu.” There’s certainly precedent for caution. YouTube has long removed music videos calling for deadly violence against racial, religious, and ethnic groups, including a Hebrew song purportedly from 2017 in which an unknown artist sings, “I’ll cleanse my country of every Jew,” according to a copy that remains available for educational purposes. Yet with “HarbuDarbu,” YouTube has stood by its narrow interpretation of the targeted group.

YouTube’s Malon says its policies allow for greater criticism of governments and terrorist organizations. He declined to go into detail on specific videos, saying that doing so could provide users a roadmap around YouTube’s rules.

Employees contend that “HarbuDarbu” also could be removed for violating harassment policies, even setting aside the war. By some interpretations, the lyrics warn that “every dog’s day will come” while referring to fashion model Bella Hadid, pop singer Dua Lipa, and former porn star Mia Khalifa—all of whom have called Israel’s campaign genocidal. But YouTube has found that the apparently critical mentions of the stars do not rise to the level of harassment, for reasons that remain unclear. “They choose not to go with common sense,” one of the sources says.

Representatives for the Israeli rappers and the three celebrities didn’t respond to requests for comment. In a post on X last December, Khalifa criticized “HarbuDarbu” for using drill, a hip-hop genre popularized by Black artists in Chicago. “They can’t even call for genocide in their own culture, they had to colonize something to get it to #1,” she wrote, referring to the song’s position on Israeli music charts.

Palestinian activists have publicly compared YouTube’s treatment of “HarbuDarbu” to that of “Hind’s Hall,” a song by US rapper Macklemore expressing support for protests against the war and describing Israeli occupation of Palestinian territories as apartheid. “I want a ceasefire,” he sings.

YouTube placed a content warning on the music video for “Hind’s Hall” when it debuted May 7 and restricted access to only those people claiming to be at least 18 years old, a barrier that can severely limit audience and a creator’s ad revenue. The warning requires viewers to acknowledge that a “video may be inappropriate for some users” before clicking to proceed. Its use on “Hind’s Hall” drew internal and external complaints. YouTube removed the warning on May 8 upon further review, Malon says.

The age restriction has stayed in place due to graphic violence, according to three of the sources. The music video shows fleeting clips of a child buried in rubble and another with a bloodied face. Macklemore on May 8 uploaded a version with only the song’s name displayed in the frame; it hasn’t been restricted but has less than half as many views.


Representatives for Macklemore didn’t respond to a request for comment.

“HarbuDarbu” isn’t age-restricted, a call the sources say is defensible on its own terms, though they believe the video shouldn’t be online at all.

Exceptional Cases

The sources also expressed concerns that YouTube’s broader approach to managing the conflict has been one-sided. For instance, Hebrew-language videos showing Israeli forces treating people purported to be Palestinians in visibly dehumanizing ways have been left up on YouTube, including one in which Israeli soldiers appear to kick and stomp on a detainee. Malon says YouTube tolerates some graphic videos showing government violence because they may be valuable for the public to assess.

On the other hand, YouTube’s “violent extremist or criminal organizations” policy bars uploads from groups that the US or the UN label as terrorists and content “depicting hostages.” So when Hamas, a US-designated terror group that administers Gaza’s government, threatened to broadcast executions of Israeli captives last October, YouTube required its entire trust and safety team of hundreds of employees to quickly train on identifying and removing that content. While critics had called on platforms to take extreme precautionary measures, several think tanks informed the company that the likelihood of executions was remote.

The preparations, as two sources described them, also led to around-the-clock staffing for weeks to monitor for any uploads involving Israeli hostages and to err toward removals. Hostages have died in Hamas’ custody, but there ultimately haven’t been any known execution broadcasts.

The sources say they don’t fault YouTube for taking precautions. But they believe the smaller-scale continuous monitoring adopted during other wars, such as the one in Ukraine, would have been sufficient and wouldn’t have distracted workers from other vital projects. They also say moderation of violent content out of Ukraine has felt more even-handed, with removals prioritized on both sides of the war.

YouTube’s Malon reiterates that it’s false to suggest YouTube’s response to Gaza has been inconsistent with other conflicts. YouTube has gathered feedback from experts across the world and focused on relevant threats, he says.

But some notable groups allegedly have been excluded from formal processes. Google employees have for years suggested that YouTube should tap the Hebrew and cultural expertise of B’Tselem, an Israeli organization founded in 1989 that tracks violations of Palestinians’ human rights. But one of the sources says B’Tselem—scorned at times by Israel’s right wing—hasn’t been brought into YouTube’s priority flagger program, which gives activists such as the antisemitism-focused nonprofit Anti-Defamation League a dedicated line for advising on problematic content. Malon says B’Tselem hasn’t been denied entry; the group has not formally applied.

Being in the program has given 7amleh, an advocacy organization for Palestinians' online rights, an up-close look at what it described in an April report as unfair bias. “We have firsthand experience of YouTube and Google employees telling us that videos clearly inciting violence and racism against Palestinians do not violate the companies’ policies, while educational videos posted by Palestinian creators have to be age-restricted, labeled as graphic, or simply just taken down,” says Eric Sype, 7amleh’s US national organizer. (Its report also recommended that YouTube start sharing ad revenue with creators in the Palestinian territories, an exclusion that WIRED revealed last November.) YouTube’s Malon denies accusations of unequal treatment.


Sources point to The Civil Front, a pro-Israeli channel launched last October 16 that has attracted over 2,000 subscribers, as a stark case of double standards. The channel has been allowed to remain up despite various content violations that would normally merit removal. About 15 YouTube staffers, including senior leaders, collaborated on reviewing the channel, according to one of the sources, an effort they and others described as abnormal for such a small creator.

YouTube removed many of The Civil Front’s popular videos for violating policies against animal abuse, harassment, violent extremism, or attempts to circumvent rules, the source says. Among the takedowns was a video in which children sing in Hebrew, “Within a year we will annihilate everyone,” appearing to refer to Gazans, according to a translation by Middle East Eye, a channel associated with pro-Arab groups.

Malon says nothing was atypical in YouTube’s response in this case. The Civil Front didn’t respond to a request for comment.

Sources inside Google see a contrast in a case from last week. After issuing a single warning in May over a video that appeared to glorify Palestinian militancy, YouTube shut down two channels tied to the organization Samidoun Network for unspecified violations of the violent criminal organizations policy. (Meta also censored Samidoun on Facebook and Instagram, and PayPal blocked the group in 2019. Israel has designated the group a terrorist organization.) Samidoun’s international coordinator, Charlotte Kates, says YouTube has already rejected an appeal and that webinars and speeches uploaded over the past decade appear to be lost from the platform forever. Malon says it was a routine enforcement action.

Accusations of uneven policing by YouTube are long-running, touching topics far beyond Gaza. For instance, months into the war, YouTube left up a video from Olajide Olayinka Williams Olatunji, a creator with millions of followers known as KSI, one of the sources says. It showed KSI’s reaction to AI-generated movie posters making light of police brutality and Nazism. KSI didn’t outright condemn the hateful memes, but he exclaimed “Oh my god” and gazed disapprovingly. YouTube removed thousands of similar poster reactions from smaller creators, but the source was told that taking down a popular star’s clip could spark a firestorm, so YouTube gave KSI a pass.

YouTube’s Malon says a creator’s popularity isn’t considered in policy decisions and points to penalties KSI described receiving in 2021. KSI didn’t respond to a request for comment, but after WIRED’s inquiry he made the video private—just as YouTube’s deliberations had been, until now. Sharing these rare inside accounts of how YouTube approaches a war anthem or just routine hate, sources hope, could finally push the platform’s leadership toward more consistent conclusions.

Some workers critical of YouTube’s enforcement have felt isolated internally and have considered quitting as, in their view, little changes for the better. But as one source says, it’s also difficult for them to sit idly by and not feel compelled to act as the war rages on.
