

Article

12 Apr 2024

Author:
James Bamford, The Nation

Palantir allegedly enables Israel's AI targeting in Gaza, raising concerns over war crimes


"How US Intelligence and an American Company Feed Israel’s Killing Machine in Gaza", 12 April 2024

... One of Unit 8200’s newest and most important organizations is the Data Science and Artificial Intelligence Center, which, according to a spokesman, was responsible for developing the AI systems that “transformed the entire concept of targets in the IDF.” Back in 2021, the Israeli military described its 11-day war on Gaza as the world’s first “AI war.” Israel’s ongoing invasion of Gaza offers a more recent—and devastating—example.

...

Earlier this month saw a continuation of that effort, with the targeting of three well-marked and fully approved aid vehicles belonging to World Central Kitchen, killing their seven occupants and ensuring that the food would never reach those dying of starvation. The targeting was precise—placing missiles dead center in the aid agency’s rooftop logos. Israel, however, said it was simply a mistake, similar to the “mistaken” killing of nearly 200 other aid workers in just a matter of months—more than all the aid workers killed in all the wars in the rest of the world over the last 30 years combined, according to the Aid Worker Security Database.

Such horrendous “mistakes” are hard to understand, considering the enormous amount of advanced targeting AI hardware and software provided to the Israeli military and spy agencies—some of it by one American company in particular: Palantir Technologies. “We stand with Israel,” the Denver-based company said in posts on X and LinkedIn. “The board of directors of Palantir will be gathering in Tel Aviv next week for its first meeting of the new year. Our work in the region has never been more vital. And it will continue.” As one of the world’s most advanced data-mining companies, with ties to the CIA, Palantir’s “work” was supplying Israel’s military and intelligence agencies with advanced and powerful targeting capabilities—the precise capabilities that allowed Israel to place three drone-fired missiles into three clearly marked aid vehicles.

... Immediately after the talk, Karp traveled to a military headquarters where he signed an upgraded agreement with Israel’s Ministry of Defense. “Both parties have mutually agreed to harness Palantir’s advanced technology in support of war-related missions,” said Executive Vice President Josh Harris.

The project involved selling the ministry an Artificial Intelligence Platform that uses reams of classified intelligence reports to make life-or-death determinations about which targets to attack. In an understatement several years ago, Karp admitted, “Our product is used on occasion to kill people,” the morality of which even he himself occasionally questions. “I have asked myself, ‘If I were younger at college, would I be protesting me?’” Recently, a number of Karp’s employees decided to quit rather than be involved with a company supporting the ongoing genocide in Gaza. And in London’s Soho Square, dozens of pro-Palestine protesters and health workers gathered at Palantir’s UK headquarters to accuse the firm of being “complicit” in war crimes.

Palantir’s AI machines need data for fuel—data in the form of intelligence reports on Palestinians in the occupied territories. And for decades a key and highly secret source of that data for Israel has been the US National Security Agency, according to documents released by NSA whistleblower Edward Snowden. ... he told me that “one of the biggest abuses” he saw while at the agency was how the NSA secretly provided Israel with raw, unredacted phone and e-mail communications between Palestinian Americans in the US and their relatives in the occupied territories. Snowden was concerned that as a result of sharing those private conversations with Israel, the Palestinians in Gaza and the West Bank would be at great risk of being targeted for arrest or worse.

According to the Top Secret/Special Intelligence agreement between the NSA and Israel, “NSA routinely sends ISNU [Israeli SIGINT National Unit] minimized and unminimized raw collection…as part of the SIGINT relationship between the two organizations.” It adds, “Raw SIGINT includes, but is not limited to, unevaluated and unminimized transcripts, gists, facsimiles, telex, voice and Digital Network Intelligence metadata and content.”

Now, with Israel’s ongoing war in Gaza, critical information from NSA continues to be used by Unit 8200, according to a number of sources, to target tens of thousands of Palestinians for death—often with US-supplied 2,000-pound bombs and other weapons. And it is extremely powerful data-mining software, such as that from Palantir, that helps the IDF to select targets. While the company does not disclose operational details, some indications of the power and speed of its AI can be understood by examining its activities on behalf of another client at war: Ukraine. Palantir is “responsible for most of the targeting in Ukraine,” according to Karp. “From the moment the algorithms set to work detecting their targets [i.e., people] until these targets are prosecuted [i.e., killed]—a term of art in the field—no more than two or three minutes elapse,” noted Bruno Macaes, a former senior Portuguese official who was given a tour of Palantir’s London headquarters last year. “In the old world, the process might take six hours.”

The company is currently developing an even more powerful AI targeting system called TITAN (for “Tactical Intelligence Targeting Access Node”). According to Palantir, TITAN is a “next-generation Intelligence, Surveillance, and Reconnaissance ground station enabled by Artificial Intelligence and Machine Learning to process data received from Space, High Altitude, Aerial and Terrestrial layers.” Although designed for use by the US Army, it’s possible that the company could test prototypes against Palestinians in Gaza. “How precise and accurate can you know a system is going to be unless it’s already been trained and tested on people?” said Catherine Connolly of the Stop Killer Robots coalition, which includes Human Rights Watch and Amnesty International.

The most in-depth examination of the connection between AI and the massive numbers of innocent Palestinian men, women, and children slaughtered in Gaza by Israel comes from an investigation recently published by +972 Magazine and Local Call. Although Palantir is not mentioned by name, the AI systems discussed by the journalists appear to fit into the same category. According to the lengthy investigation, Unit 8200 is currently using a system called “Lavender” to target thousands of alleged Hamas fighters. ...

The +972 Magazine report details how the Israeli military uses powerful algorithms to sort through enormous volumes of surveillance data—phone, text, and digital—to come up with lengthy kill lists of targets. And adding to that haul would be the data from the NSA intercepts of Palestinians in the United States communicating with their families in Gaza—a process that continued after Snowden left NSA, according to a number of sources.

...

In a letter to their commanders, Prime Minister Benjamin Netanyahu, and the head of the Israeli army, they charged that Israel used the information collected against innocent Palestinians for “political persecution.” And in testimonies and interviews given to the media, they specified that data were gathered on Palestinians’ sexual orientations, infidelities, money problems, family medical conditions and other private matters that could be “used to extort/blackmail the person and turn them into a collaborator” or create divisions in their society.

Several years ago, Brig. Gen. Yossi Sariel, the current director of Unit 8200, published a book outlining a supposedly fictional and far-reaching AI system. But the journalists from +972 and Local Call discovered that the super-powerful target generation machine he wrote about then as fiction actually exists. ...

...

Such actions were likely contributing factors in the recent decision by the United Nations Human Rights Council to adopt a resolution calling for Israel to be held accountable for possible war crimes and crimes against humanity committed in Gaza. ...

For years, the United States has had strict regulations on the export of weapon systems to foreign countries because of the lack of accountability once in the users’ possession, and the potential for serious war crimes. Even Palantir CEO Alex Karp has argued that “the power of advanced algorithmic warfare systems is now so great that it equates to having tactical nuclear weapons against an adversary with only conventional ones.” Israel’s indiscriminate killing in Gaza offers the perfect example of why it’s time to also begin far stricter regulation of the export of AI systems, like those developed by Palantir. Systems that, as Karp suggested, are the digital equivalent of a weapon of mass destruction. After all, it’s not just the bomb that kills, but the list that puts you and your family under it.
