DP Etiquette

First rule: Don't be a jackass.

Other rules: Do not attack or insult people you disagree with. Engage with facts, logic, and beliefs. Out of respect for others, please provide sources for the facts and truths you rely on if asked. If emotion is getting out of hand, get it back in hand. To avoid dehumanizing people, don't call individuals or whole groups disrespectful names, e.g., stupid, dumb, or liar. Insulting people is counterproductive to rational discussion; insults make people angry and defensive. All points of view are welcome: right, center, left, and elsewhere. Just disagree, but don't be belligerent or reject inconvenient facts, truths, or defensible reasoning.

Sunday, April 7, 2024

Israel used military AI program to select targets in Gaza, according to new exposé

 (WaPo: 4/7/24)

by Ishaan Tharoor

This week, Israeli journalist and filmmaker Yuval Abraham published a lengthy exposé on the existence of the Lavender program and its implementation in the Israeli campaign in Gaza that followed Hamas’s deadly Oct. 7 terrorist strike on southern Israel. Abraham’s reporting — which appeared in +972 magazine, a left-leaning Israeli English-language website, and Local Call, its sister Hebrew-language publication — drew on the testimony of six anonymous Israeli intelligence officers, all of whom served during the war and had “first-hand involvement” with the use of AI to select targets for elimination. According to Abraham, Lavender identified as many as 37,000 Palestinians — and their homes — for assassination. (The IDF denied to the reporter that such a “kill list” exists, and characterized the program as merely a database meant for cross-referencing intelligence sources.) White House national security spokesperson John Kirby told CNN on Thursday that the United States was looking into the media reports on the apparent AI tool.

“During the early stages of the war, the army gave sweeping approval for officers to adopt Lavender’s kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based,” Abraham wrote.

“One source stated that human personnel often served only as a ‘rubber stamp’ for the machine’s decisions, adding that, normally, they would personally devote only about ‘20 seconds’ to each target before authorizing a bombing — just to make sure the Lavender-marked target is male,” he added. “This was despite knowing that the system makes what are regarded as ‘errors’ in approximately 10 percent of cases, and is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all.”

This may help explain the scale of destruction unleashed by Israel across Gaza as it seeks to punish Hamas, as well as the high casualty count. Earlier rounds of Israel-Hamas conflict saw the Israel Defense Forces go about a more protracted, human-driven process of selecting targets based on intelligence and other data. At a moment of profound Israeli anger and trauma in the wake of Hamas’s Oct. 7 attack, Lavender could have helped Israeli commanders come up with a rapid, sweeping program of retribution.

“We were constantly being pressured: ‘Bring us more targets.’ They really shouted at us,” said one intelligence officer, in testimony published by Britain’s Guardian newspaper, which obtained access to the accounts first surfaced by +972.

Many of the munitions Israel dropped on targets allegedly selected by Lavender were “dumb” bombs — heavy, unguided weapons that inflicted significant damage and loss of civilian life. According to Abraham’s reporting, Israeli officials didn’t want to “waste” more expensive precision-guided munitions on the many junior-level Hamas “operatives” identified by the program. And they also showed little squeamishness about dropping those bombs on the buildings where the targets’ families slept, he wrote.

“We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity,” A., an intelligence officer, told +972 and Local Call. “On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”

Widespread concerns about Israel’s targeting strategies and methods have been voiced throughout the course of the war. “It is challenging in the best of circumstances to differentiate between valid military targets and civilians” there, Brian Castner, senior crisis adviser and weapons investigator at Amnesty International, told my colleagues in December. “And so just under basic rules of discretion, the Israeli military should be using the most precise weapons that it can that it has available and be using the smallest weapon appropriate for the target.”

In reaction to the Lavender revelations, the IDF said in a statement that some of Abraham’s reporting was “baseless” and disputed the characterization of the AI program. It is “not a system, but simply a database whose purpose is to cross-reference intelligence sources, in order to produce up-to-date layers of information on the military operatives of terrorist organizations,” the IDF wrote in a response published in the Guardian.

“The IDF does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist,” it added. “Information systems are merely tools for analysts in the target identification process.”

This week’s incident involving an Israeli drone strike on a convoy of vehicles belonging to World Central Kitchen, a prominent food aid group, which killed seven of its workers, sharpened the spotlight on Israel’s conduct of the war. In a phone call with Israeli Prime Minister Benjamin Netanyahu on Thursday, President Biden reportedly called on Israel to change course and take demonstrable steps to better preserve civilian life and enable the flow of aid.

Separately, hundreds of prominent British lawyers and judges submitted a letter to their government, urging a suspension of arms sales to Israel to avert “complicity in grave breaches of international law.”

The use of AI technology is still only a small part of what has troubled human rights activists about Israel’s conduct in Gaza. But it points to a darker future. Lavender, observed Adil Haque, an expert on international law at Rutgers University, is “the nightmare of every international humanitarian lawyer come to life.”

 ______________________________________________________________________

Yuval Abraham wrote the following note on +972 Magazine summarizing the impact of his report as of 4/5/24, with lots of links for those who want to pursue the unfolding story in greater depth, tracking the responses of governments, journalists, NGOs, etc. I am pasting it below.

I broke a major story two days ago. Here’s what it has done so far

This week, we at +972 Magazine and Local Call published a huge story that I’ve been working on for a long time. Our new investigation reveals that the Israeli army has developed an artificial intelligence-based program called “Lavender,” which has been used to mark tens of thousands of Palestinians as suspected militants for potential assassination during the current Gaza war.

According to six whistleblowers interviewed for the article, Lavender has played a central role in the Israeli army’s unprecedented bombing of Palestinians since October 7, especially during the first weeks of the war. In fact, according to the sources, the army gave sweeping approval for soldiers to adopt Lavender’s kill lists with little oversight, and to treat the outputs of the AI machine “as if it were a human decision.” While the machine was designed to mark “low level” military operatives, it was known to make what were considered identification “errors” in roughly 10 percent of cases. This was accompanied by a systematic preference to strike Lavender-marked targets while they were in their family homes, along with an extremely permissive policy toward casualties, which led to the killings of entire Palestinian families.

At the political level, meanwhile, White House National Security spokesperson John Kirby noted that the United States was examining the contents of our investigation. Palestinian parliamentarian Aida Touma-Suleiman cited sections of our report in a speech at the Knesset. UN Secretary-General António Guterres said he was “deeply troubled” by our findings, adding: “No part of life and death decisions which impact entire families should be delegated to the cold calculation of algorithms.”

It has been meaningful to see so many readers praising our investigation as one of the most important works of journalism in the war. And we have much more we want to do.

I personally hope that this exposé will help bring us a step closer toward ending this terrible war and confronting the violent systems that enable injustice here in Israel-Palestine. I’m grateful to you for reading our investigation, and for supporting the work that journalists like myself are doing at +972 Magazine.


_____________________________________________

Here is a link to a video from Democracy Now that covers the story in some depth.

