Many of the munitions Israel dropped on targets allegedly selected by Lavender were “dumb” bombs — heavy, unguided weapons that inflicted significant damage and loss of civilian life. According to Abraham’s reporting, Israeli officials didn’t want to “waste” more expensive precision-guided munitions on the many junior-level Hamas “operatives” identified by the program. They also showed little squeamishness, he wrote, about dropping those bombs on the buildings where the targets’ families slept.
“We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity,” A., an intelligence officer, told +972 and Local Call. “On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”
Widespread concerns about Israel’s targeting strategies and methods have been voiced throughout the war.
“It is challenging in the best of circumstances to differentiate between valid military targets and civilians” there, Brian Castner, senior crisis adviser and weapons investigator at Amnesty International, told my colleagues in December. “And so just under basic rules of discretion, the Israeli military should be using the most precise weapons that it can that it has available and be using the smallest weapon appropriate for the target.”
In reaction to the Lavender revelations, the IDF said in a statement that some of Abraham’s reporting was “baseless” and disputed the characterization of the AI program. It is “not a system, but simply a database whose purpose is to cross-reference intelligence sources, in order to produce up-to-date layers of information on the military operatives of terrorist organizations,” the IDF wrote in a response published in the Guardian.
“The IDF does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist,” it added. “Information systems are merely tools for analysts in the target identification process.”
This week, an Israeli drone strike on a convoy of vehicles belonging to World Central Kitchen, a prominent food aid group, killed seven of its workers and sharpened the spotlight on Israel’s conduct of the war. In a phone call with Israeli Prime Minister Benjamin Netanyahu on Thursday, President Biden reportedly called on Israel to change course and take demonstrable steps to better preserve civilian life and enable the flow of aid.
Separately, hundreds of prominent British lawyers and judges submitted a letter to their government, urging a suspension of arms sales to Israel to avert “complicity in grave breaches of international law.”
The use of AI technology is still only a small part of what has troubled human rights activists about Israel’s conduct in Gaza. But it points to a darker future. Lavender, observed Adil Haque, an expert on international law at Rutgers University, is “the nightmare of every international humanitarian lawyer come to life.”
______________________________________________________________________
Yuval Abraham wrote the following note on +972 Magazine summarizing the impact of his report as of April 5, 2024. It contains many links for those who want to pursue the unfolding story in greater depth, tracking the responses of governments, journalists, NGOs, and others. I am pasting it below.
I broke a major story two days ago. Here’s what it has done so far
This week, we at +972 Magazine and Local Call published a huge story that I’ve been working on for a long time. Our new investigation reveals that the Israeli army has developed an artificial intelligence-based program called “Lavender,” which has been used to mark tens of thousands of Palestinians as suspected militants for potential assassination during the current Gaza war.
According to six whistleblowers interviewed for the article, Lavender has played a central role in the Israeli army’s unprecedented bombing of Palestinians since October 7, especially during the first weeks of the war. In fact, according to the sources, the army gave sweeping approval for soldiers to adopt Lavender’s kill lists with little oversight, and to treat the outputs of the AI machine “as if it were a human decision.” While the machine was designed to mark “low level” military operatives, it was known to make what were considered identification “errors” in roughly 10 percent of cases. This was accompanied by a systematic preference to strike Lavender-marked targets while they were in their family homes, along with an extremely permissive policy toward casualties, which led to the killing of entire Palestinian families.
At the political level, meanwhile, White House National Security spokesperson John Kirby noted that the United States was examining the contents of our investigation. Palestinian parliamentarian Aida Touma-Suleiman cited sections of our report in a speech at the Knesset. UN Secretary General António Guterres said he was “deeply troubled” by our findings, adding, “No part of life and death decisions which impact entire families should be delegated to the cold calculation of algorithms.”
It has been meaningful to see so many readers praising our investigation as one of the most important works of journalism in the war. And we have much more we want to do.
I personally hope that this exposé will help bring us a step closer toward ending this terrible war and confronting the violent systems that enable injustice here in Israel-Palestine. I’m grateful to you for reading our investigation, and for supporting the work that journalists like me are doing at +972 Magazine.
_____________________________________________
Here is a link to a video from Democracy Now that covers the story in some depth.