ISRAEL used AI computers to pick tens of thousands of targets in Gaza, an investigation has claimed.
The system known as Lavender identified 37,000 potential targets in the first stage of the six-month war.
Brit aid workers John Chapman, James ‘Jim’ Henderson, and James Kirby were killed in an Israeli airstrike in Gaza on Monday (Picture: PA)
A convoy of World Central Kitchen cars was struck after the aid workers set out to deliver food to Palestinians in Gaza (Picture: EPA)
The victims of the strike were British, Polish, Australian, and Palestinian (Picture: AP)
Details emerged amid global outrage over drone strikes on an aid convoy that left seven charity workers dead, including three British military veterans who were there to provide security.
A series of precision missile strikes killed former SBS hero John Chapman, 57, former Royal Marine James Henderson and former Rifleman and Afghan war veteran James Kirby, 47.
The Lavender system crunched huge amounts of data to identify suspected Hamas terrorists and their homes, Israeli reports claimed.
Almost 33,000 people have been killed since Israel vowed to crush Hamas in revenge for the October 7 atrocities that left 1,200 people dead in Israel and saw 250 people kidnapped.
A joint investigation by +972 magazine and the Hebrew-language news outlet Local Call found Israeli Defence Force commanders issued “sweeping approval for officers to adopt Lavender’s kill lists”.
Sources claimed the troops were not required “to thoroughly check” the computer’s life-or-death decisions.
One source claimed soldiers’ only role was to “rubber stamp” the AI verdicts.
A soldier involved in the process said they spent about 20 seconds confirming each target – long enough to confirm the target was male – before authorising a bombing.
That was “despite knowing” that the system had an error rate of roughly one in ten.
But the sources insisted that Lavender was more reliable than humans who could be influenced by grief and anger in the wake of the October 7 carnage.
One source told investigators: “I have much more trust in a statistical mechanism than a soldier who lost a friend two days ago.
“Everyone [in Gaza], including me, lost people on October 7. The machine did it coldly. And that made it easier.”
The IDF said it used “information management tools” to help identify military targets.
But it denied relying on AI to identify terrorists.
In a statement to the Guardian it said: “Contrary to claims, the IDF does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist.
“Information systems are merely tools for analysts in the target identification process.
“According to IDF directives, analysts must conduct independent examinations, in which they verify that the identified targets meet the relevant definitions in accordance with international law and additional restrictions stipulated in the IDF directives.”
They said each target required an “individual assessment”.
A spokesman added: “The IDF does not carry out strikes when the expected collateral damage from the strike is excessive in relation to the military advantage.
“In accordance with the rules of international law, the assessment of the proportionality of a strike is conducted by the commanders on the basis of all the information available to them before the strike.”
Chilling bodycam shows a Hamas terrorist open fire on innocent civilians (Picture: Israel Defense Forces)
Israeli commandos storm a warehouse to rescue hostages and kill Hamas terrorists (Picture: Twitter/@IDF)