Controversy surrounds Israel’s use of AI for targeted killings in Gaza.
A joint investigation by the media outlets +972 Magazine and Local Call has raised doubts over Israel’s targeting of militants in the Gaza conflict. It alleged that the Israel Defense Forces (IDF) used an artificial intelligence system called ‘Lavender’ to identify over 37,000 Palestinians in Gaza as potential bombing targets.
The allegations come six months after Hamas launched surprise attacks on Israel on 7 October 2023, prompting an Israeli bombing campaign in Gaza. Over 30,000 people have lost their lives as the Strip has been reduced to rubble by the aerial strikes.
According to the report, Lavender generated ‘kill lists’ used to carry out targeted strikes on suspected militants. The IDF rejected the claims, however, maintaining that it does not use AI to designate individuals as targets.
In a statement, the IDF said, “Information systems are merely tools for analysts in the target identification process.” It insisted the technology is not used to profile people as terrorists or predict militant links.
Nonetheless, the investigation has stirred debate over the opaque nature of Israel’s targeted killing tactics. With the IDF providing little transparency, questions persist over human oversight and due process in operations that rely on emerging technologies.