Israel’s Use of AI Tool ‘Lavender’ in Gaza Conflict
Introduction
Israel has reportedly been using an AI tool named ‘Lavender’ to generate kill lists during the ongoing Israel-Hamas war in Gaza. Here is an overview of the AI-based tool and its implications.
Development of ‘Lavender’
‘Lavender’ was reportedly developed by Unit 8200, the elite signals intelligence division of the Israel Defense Forces.
Functionality of Lavender
The Lavender system identifies potential bombing targets among suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ) by analyzing data collected through mass surveillance of Gaza’s residents. Each individual is rated on a scale of 1 to 100 according to the likelihood that they are a militant.
Accuracy and Implementation
Although Lavender was known to have an error rate of roughly 10%, the Israeli military initially relied heavily on its output. The system’s designations were reportedly treated almost as if they were human decisions, with only minimal human review of each suggested target.
Civilian Casualties and Controversies
Allegations suggest that the Israeli army permitted significant collateral damage when striking low-ranking operatives, leading to civilian casualties. Unguided “dumb” bombs were reportedly preferred for these strikes in order to conserve more expensive precision-guided munitions.
Conclusion
The use of AI systems like ‘Lavender’ raises serious ethical concerns about automated targeting of individuals in conflict zones and the heightened risk of civilian harm.