The Israeli military used artificial intelligence to help select bombing targets in Gaza, a practice that may have degraded the accuracy of strikes and contributed to the deaths of many civilians, according to reports by +972 Magazine and Local Call.
The system, called Lavender, was developed after the Hamas terrorist attacks of October 7th. At its peak, Lavender designated 37,000 Palestinians in Gaza as suspected “Hamas militants” and “authorized” strikes on them.
The Israeli military has denied the existence of such a kill list. A spokesperson said artificial intelligence was not used to identify suspected terrorists, but did not deny the existence of the Lavender system, which the spokesperson described as “simply tools for analysts in the process of identifying targets.” Analysts “are required to conduct independent reviews during which they verify that the identified targets comply with relevant definitions under international law and additional restrictions provided in IDF guidelines,” the spokesperson said.
However, Israeli intelligence officers told +972 and Local Call that they were not required to independently verify Lavender's targets before bombing them; instead, they effectively served as a “rubber stamp” for the machine's decisions. In some cases, an officer's only role in the process was to determine whether the target was male.
To create the Lavender system, a dataset was assembled from information about known Hamas and Palestinian Islamic Jihad operatives. Lavender's training data also reportedly included people closely associated with Hamas, such as employees of Gaza's Ministry of Internal Security.
Lavender was trained to identify “traits” associated with Hamas operatives, such as membership in a WhatsApp group with a known militant, changing cell phones every few months, or changing addresses frequently. That data was then used to rank other Palestinians in Gaza on a scale of 1 to 100 according to how closely they resembled the known Hamas militants in the original dataset. People whose score reached a certain threshold were designated as targets for strikes. The threshold shifted constantly, “because it depends on where you set the bar for who is a Hamas operative,” one military source said.
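The mechanism the sources describe, scoring people by shared “traits” and flagging everyone above a movable threshold, can be sketched abstractly. This is a hypothetical illustration only: the trait names, weights, and data below are invented, and nothing here reflects how Lavender is actually implemented.

```python
# Hypothetical sketch of trait-based scoring with a movable threshold.
# All trait names, weights, and records are invented for illustration.

def score(traits: dict, weights: dict) -> int:
    """Sum the weights of the traits a record exhibits, capped to 1-100."""
    raw = sum(w for t, w in weights.items() if traits.get(t, False))
    return max(1, min(100, raw))

def flag(records: list, weights: dict, threshold: int) -> list:
    """Return the records whose score meets or exceeds the threshold."""
    return [r for r in records if score(r["traits"], weights) >= threshold]

# Invented weights for the kinds of traits the sources mention.
WEIGHTS = {
    "whatsapp_group_with_known_militant": 40,
    "frequent_phone_changes": 30,
    "frequent_address_changes": 30,
}

people = [
    {"id": "A", "traits": {"whatsapp_group_with_known_militant": True}},
    {"id": "B", "traits": {"frequent_phone_changes": True,
                           "frequent_address_changes": True}},
]

# Lowering the threshold sweeps in more people: "it depends on where you
# set the bar," as the source put it.
print(len(flag(people, WEIGHTS, threshold=50)))  # 1 (only B, score 60)
print(len(flag(people, WEIGHTS, threshold=40)))  # 2 (A at 40 now qualifies)
```

The key point of the sketch is that the set of flagged people is not fixed by the model alone; a single tunable number decides how many of them there are.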
According to sources, the system had an accuracy rate of about 90%, meaning roughly one in ten people it flagged was not a Hamas militant at all. Some of the people Lavender identified as targets merely had names or nicknames identical to those of known Hamas militants. Others were relatives of Hamas operatives, or people who used phones that had once belonged to them.
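Taken together, the two reported figures, 37,000 flagged individuals and roughly 90% accuracy, imply misidentifications on the order of thousands. The back-of-envelope arithmetic below assumes (our assumption, not the sources') that the 90% figure applies uniformly to everyone flagged:

```python
# Back-of-envelope check. Assumption: the reported ~90% accuracy applies
# uniformly across all 37,000 people the system flagged, which the
# reporting does not explicitly state.
flagged = 37_000
accuracy = 0.90
misidentified = round(flagged * (1 - accuracy))
print(misidentified)  # 3700
```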
Intelligence officers were given sweeping latitude when it came to civilian casualties, people familiar with the matter said. Permissible collateral civilian deaths were reportedly set at 15-20 when targeting low-level Hamas operatives and at “hundreds” for senior Hamas officials.
Suspected Hamas militants were also targeted in their homes with the help of a second system, called Where’s Daddy?. According to officers, it kept individuals flagged by Lavender under constant surveillance and tracked them until they arrived home, at which point their houses were bombed, often with their entire families inside. In some cases, however, officers bombed houses without checking whether the target was actually present, killing civilians in the process.
The Lavender system is also reported to be an extension of Israel's broader use of surveillance technology against Palestinians in both the Gaza Strip and the West Bank.