Source: Al Arabiya & Agencies
Tuesday 16 April 2024 16:19:07
Five UN human rights experts, citing recent media reports, have condemned the Israeli military’s use of artificial intelligence (AI) in targeting Gaza homes.
These independent UN experts were responding to investigative reports in Israeli media outlets +972 magazine and Local Call, which stated the military was using AI systems known as ‘Gospel,’ ‘Lavender’ and ‘Where’s Daddy?’ to isolate and identify thousands of Palestinians as potential bombing targets – in some cases with as little as 20 seconds of human oversight.
“Six months into the current military offensive, more housing and civilian infrastructure have now been destroyed in Gaza than in any conflict in memory,” the UN experts said in their statement, which described the widespread destruction of civilian infrastructure in Gaza as “domicide.”
They also expressed concerns that the Israeli military’s use of AI was combined with “lowered human due diligence to avoid or minimize civilian casualties.”
Earlier this month, in statements issued to the media, the Israeli military denied using AI to identify what it termed suspected extremists and targets.
The US was also looking into reports that the Israeli military had been using AI to help identify bombing targets in Gaza, White House national security spokesperson John Kirby told CNN this month.
Kirby said the US had not verified the content of the media reports published in +972 magazine and Local Call.
Meanwhile, UN Secretary-General Antonio Guterres expressed grave concern over reports that Israel was using AI to identify targets in Gaza.
Guterres said he was “deeply troubled by reports that the Israeli military’s bombing campaign includes AI as a tool in the identification of targets, particularly in densely populated residential areas, resulting in a high level of civilian casualties.”
He further said: “No part of life-and-death decisions that impact entire families should be delegated to the cold calculation of algorithms.”
The +972 report claims “the Israeli army has marked tens of thousands of Gazans as suspects for assassination, using an AI targeting system with little human oversight and a permissive policy for casualties.”
During the first weeks of the war in Gaza, the Israeli army relied “almost completely” on the AI system Lavender, “which clocked as many as 37,000 Palestinians as suspected militants – and their homes – for possible air strikes,” the Israeli media outlets reported, citing six Israeli intelligence officers who served in the army during the current war in Gaza and had first-hand involvement in using AI to generate targets for assassination.
Meanwhile, in a rare admission of wrongdoing, Israel acknowledged earlier this month a series of errors and violations of its rules in the killing of seven aid workers in Gaza, saying it had mistakenly believed it was “targeting armed Hamas operatives.”