April 12, 2024

How US Intelligence and an American Company Feed Israel’s Killing Machine in Gaza

Because it isn’t so much the bombs that kill as the list that puts civilians in the way of the bombs.

James Bamford
Employees from the World Central Kitchen were killed in an Israeli air strike on their vehicles in the central Gaza Strip. (Majdi Fathi / NurPhoto via Getty Images)

Glilot Junction, in the Israeli city of Ramat HaSharon, is a crossroads where the Trans-Samaria Highway bisects the Coastal Highway. It is also a key crossroads for Israel’s human and technical spies. On the east side of the highway is Hertzog Camp and to the west is Dane Camp. Together, the 462 acres of land make up Camp Moshe Dayan, the most secret military base in the country.

Surrounded by a variety of intrusion-prevention devices and high steel fences topped with barbed wire, Camp Dayan is home to a number of highly sensitive military intelligence installations, including its central training base; the advanced Targeting Center; Unit 81, a secret technology unit attached to the Special Operations Division; and the IDF Signals Intelligence College.

But by far the most secret is the headquarters of Unit 8200, which specializes in eavesdropping, codebreaking, and cyber warfare—Israel’s equivalent of the American National Security Agency. One of Unit 8200’s newest and most important organizations is the Data Science and Artificial Intelligence Center, which, according to a spokesman, was responsible for developing the AI systems that “transformed the entire concept of targets in the IDF.” Back in 2021, the Israeli military described its 11-day war on Gaza as the world’s first “AI war.” Israel’s ongoing invasion of Gaza offers a more recent—and devastating—example.

More than 70 years ago, that same patch of land was home to the Palestinian village of Ajleel, until the residents were killed or forced to abandon their homes and flee in fear during the Nakba in 1948. Now, soldiers and intelligence specialists are being trained at Camp Moshe Dayan to finish the job—to bomb, shoot, or starve to death the descendants of the Palestinians forced into the squalor of militarily occupied Gaza decades ago.

Earlier this month brought a continuation of that effort: the targeting of three well-marked and fully approved aid vehicles belonging to World Central Kitchen, killing their seven occupants and ensuring that the food would never reach those dying of starvation. The targeting was precise—placing missiles dead center in the aid agency’s rooftop logos. Israel, however, said it was simply a mistake, similar to the “mistaken” killing of nearly 200 other aid workers in just a matter of months—more than all the aid workers killed in all the wars in the rest of the world over the last 30 years combined, according to the Aid Worker Security Database.

Such horrendous “mistakes” are hard to understand, considering the enormous amount of advanced targeting AI hardware and software provided to the Israeli military and spy agencies—some of it by one American company in particular: Palantir Technologies. “We stand with Israel,” the Denver-based company said in posts on X and LinkedIn. “The board of directors of Palantir will be gathering in Tel Aviv next week for its first meeting of the new year. Our work in the region has never been more vital. And it will continue.” As one of the world’s most advanced data-mining companies, with ties to the CIA, Palantir’s “work” was supplying Israel’s military and intelligence agencies with advanced and powerful targeting capabilities—the precise capabilities that allowed Israel to place three drone-fired missiles into three clearly marked aid vehicles.


“I am pretty encouraged about talent here and that we are getting the best people,” Alex Karp, cofounder and CEO of the company, told a group soon after arriving in Tel Aviv last January. “What I see in Israel is this hybrid of talent that is qualitative and argumentative.” Immediately after the talk, Karp traveled to a military headquarters where he signed an upgraded agreement with Israel’s Ministry of Defense. “Both parties have mutually agreed to harness Palantir’s advanced technology in support of war-related missions,” said Executive Vice President Josh Harris.

The project involved selling the ministry an Artificial Intelligence Platform that uses reams of classified intelligence reports to make life-or-death determinations about which targets to attack. In an understatement several years ago, Karp admitted, “Our product is used on occasion to kill people,” the morality of which even he occasionally questions. “I have asked myself, ‘If I were younger at college, would I be protesting me?’” Recently, a number of Karp’s employees decided to quit rather than be involved with a company supporting the ongoing genocide in Gaza. And in London’s Soho Square, dozens of pro-Palestine protesters and health workers gathered at Palantir’s UK headquarters to accuse the firm of being “complicit” in war crimes.

Palantir’s AI machines need data for fuel—data in the form of intelligence reports on Palestinians in the occupied territories. And for decades a key and highly secret source of that data for Israel has been the US National Security Agency, according to documents released by NSA whistleblower Edward Snowden. After fleeing to Hong Kong in 2013 with a pocket full of flash drives containing some of the agency’s highest secrets, Snowden ended up in Moscow where, soon after he arrived, I met with him for Wired magazine. And in the interview, he told me that “one of the biggest abuses” he saw while at the agency was how the NSA secretly provided Israel with raw, unredacted phone and e-mail communications between Palestinian Americans in the US and their relatives in the occupied territories. Snowden was concerned that as a result of sharing those private conversations with Israel, the Palestinians in Gaza and the West Bank would be at great risk of being targeted for arrest or worse.

According to the Top Secret/Special Intelligence agreement between the NSA and Israel, “NSA routinely sends ISNU [Israeli SIGINT National Unit] minimized and unminimized raw collection…as part of the SIGINT relationship between the two organizations.” It adds, “Raw SIGINT includes, but is not limited to, unevaluated and unminimized transcripts, gists, facsimiles, telex, voice and Digital Network Intelligence metadata and content.”


Now, with Israel’s ongoing war in Gaza, critical information from NSA continues to be used by Unit 8200, according to a number of sources, to target tens of thousands of Palestinians for death—often with US-supplied 2,000-pound bombs and other weapons. And it is extremely powerful data-mining software, such as that from Palantir, that helps the IDF to select targets. While the company does not disclose operational details, some indications of the power and speed of its AI can be understood by examining its activities on behalf of another client at war: Ukraine. Palantir is “responsible for most of the targeting in Ukraine,” according to Karp. “From the moment the algorithms set to work detecting their targets [i.e., people] until these targets are prosecuted [i.e., killed]—a term of art in the field—no more than two or three minutes elapse,” noted Bruno Macaes, a former senior Portuguese official who was given a tour of Palantir’s London headquarters last year. “In the old world, the process might take six hours.”

The company is currently developing an even more powerful AI targeting system called TITAN (for “Tactical Intelligence Targeting Access Node”). According to Palantir, TITAN is a “next-generation Intelligence, Surveillance, and Reconnaissance ground station enabled by Artificial Intelligence and Machine Learning to process data received from Space, High Altitude, Aerial and Terrestrial layers.” Although designed for use by the US Army, it’s possible that the company could test prototypes against Palestinians in Gaza. “How precise and accurate can you know a system is going to be unless it’s already been trained and tested on people?” said Catherine Connolly of the Stop Killer Robots coalition, which includes Human Rights Watch and Amnesty International.

The most in-depth examination of the connection between AI and the massive numbers of innocent Palestinian men, women, and children slaughtered in Gaza by Israel comes from an investigation recently published by +972 Magazine and Local Call. Although Palantir is not mentioned by name, the AI systems discussed by the journalists appear to fit into the same category. According to the lengthy investigation, Unit 8200 is currently using a system called “Lavender” to target thousands of alleged Hamas fighters. But the magazine also reported that, while Israel’s rules of engagement before the brutal attack on October 7 tightly restricted the number of noncombatant casualties allowed in targeting a single alleged Hamas militant, those limits have since been loosened to the point of allowing the slaughter of dozens of Palestinian noncombatants (including women and children) for each targeted individual. President Joe Biden’s December warning that Israel was losing international support because of its “indiscriminate bombing” of Gaza appears to have had no effect.

The +972 Magazine report details how the Israeli military uses powerful algorithms to sort through enormous volumes of surveillance data—phone, text, and digital—to come up with lengthy kill lists of targets. And adding to that haul would be the data from the NSA intercepts of Palestinians in the United States communicating with their families in Gaza—a process that continued after Snowden left NSA, according to a number of sources.

In 2014—more than a year after Snowden turned up in Moscow—Unit 8200’s targeting of innocent Palestinians in the occupied territories was so extreme that it even caused 43 veterans of the unit, including many serving in the reserves, to go public and accuse the organization of startling abuses. They declared that they had a “moral duty” to no longer “take part in the state’s actions against Palestinians.”

In a letter to their commanders, Prime Minister Benjamin Netanyahu, and the head of the Israeli army, they charged that Israel used the information collected against innocent Palestinians for “political persecution.” And in testimonies and interviews given to the media, they specified that data were gathered on Palestinians’ sexual orientations, infidelities, money problems, family medical conditions, and other private matters that could be “used to extort/blackmail the person and turn them into a collaborator” or create divisions in their society.

Several years ago, Brig. Gen. Yossi Sariel, the current director of Unit 8200, published a book outlining a supposedly fictional and far-reaching AI system. But the journalists from +972 and Local Call discovered that the super-powerful target generation machine he wrote about then as fiction actually exists. “During the first weeks of the war, the army almost completely relied on Lavender,” they write, “which clocked as many as 37,000 Palestinians as suspected militants—and their homes—for possible air strikes.” And from the beginning, there was little attempt to verify or justify the thousands of names generated by the machine’s “kill list.”

“One source,” they write, “stated that human personnel often served only as a ‘rubber stamp’ for the machine’s decisions, adding that, normally, they would personally devote only about ‘20 seconds’ to each target before authorizing a bombing—just to make sure the Lavender-marked target is male. This was despite knowing that the system makes what are regarded as ‘errors’ in approximately 10 percent of cases and is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all.”

But it gets even worse. They found that the Israeli army deliberately and systematically bombed the homes of targeted individuals—killing entire families—simply because another AI algorithm told them the individual was there. “The result,” they write, “as the sources testified, is that thousands of Palestinians—most of them women and children or people who were not involved in the fighting—were wiped out by Israeli airstrikes, especially during the first weeks of the war, because of the AI program’s decisions.”

Such actions were likely contributing factors in the recent decision by the United Nations Human Rights Council to adopt a resolution calling for Israel to be held accountable for possible war crimes and crimes against humanity committed in Gaza. Twenty-eight countries voted in favor, with only six voting against the resolution. Still, in the White House and on Capitol Hill, there seems little concern for the dangers of Israel’s deadly weaponization of AI and the technology’s connection to the staggering numbers of innocent men, women, children, and whole families massacred in Gaza. Sarah Yager, Washington director for Human Rights Watch, told Politico, “Nobody has any insight including, I would say, U.S. policymakers on how Israel is conducting this war.”

For years, the United States has had strict regulations on the export of weapon systems to foreign countries because of the lack of accountability once they are in the users’ possession, and the potential for serious war crimes. Even Palantir CEO Alex Karp has argued that “the power of advanced algorithmic warfare systems is now so great that it equates to having tactical nuclear weapons against an adversary with only conventional ones.” Israel’s indiscriminate killing in Gaza offers the perfect example of why it’s time to begin far stricter regulation of the export of AI systems as well, like those developed by Palantir: systems that, as Karp suggested, are the digital equivalent of a weapon of mass destruction. After all, it’s not just the bomb that kills, but the list that puts you and your family under it.


James Bamford

James Bamford is a best-selling author, Emmy-nominated filmmaker, and winner of the National Magazine Award for Reporting. His most recent book is Spyfail: Foreign Spies, Moles, Saboteurs, and the Collapse of America's Counterintelligence, published by Twelve Books.
