Israeli AI used to identify 37,000 targets in Gaza

A man pushes a bicycle along as he walks amid building rubble in the devastated area around Gaza's Al-Shifa hospital on April 3, 2024. (AFP)
  • Testimony reveals Israeli military permitted the killing of multiple civilians per strike, often in attacks on family homes
  • ‘We’ve killed people with collateral damage in the high double digits, if not low triple digits. These are things that haven’t happened before’

LONDON: Israel has used artificial intelligence to identify as many as 37,000 potential targets during its war in Gaza, intelligence sources have revealed. 

Israeli-Palestinian publication +972 Magazine and Hebrew-language outlet Local Call published a report by journalist Yuval Abraham based on interviews with six Israeli intelligence officers who had used the AI system, called Lavender, to identify targets supposedly linked to Hamas or Palestinian Islamic Jihad.

Lavender was developed by an elite division of the Israeli military, Unit 8200, and processes huge amounts of data to identify Hamas and PIJ members and affiliates.

Details of how it works are not available, but the sources said Unit 8200 determined it had a 90 percent accuracy rate in identifying people.

The Israeli military used Lavender to compile a vast database of low-ranking individuals across Gaza, alongside another AI tool called the Gospel, which identified buildings and structures.

The sources said Israeli military officials permitted the killing of large numbers of Palestinian civilians in the early days of the conflict that followed the Oct. 7 Hamas attack on Israel. Strikes on low-ranking militants were allowed to kill 15-20 civilians each, and were often carried out with unguided bombs on residential areas.

“You don’t want to waste expensive bombs on unimportant people — it’s very expensive for the country and there’s a shortage (of those bombs),” one source said. 

Another added: “We usually carried out the attacks with ‘dumb’ (unguided) bombs, and that meant literally dropping the whole house on its occupants.

“But even if an attack is averted, you don’t care — you immediately move on to the next target. Because of the system, the targets never end. You have another 36,000 waiting.”

A third said: “There was a completely permissive policy regarding the casualties of (bombing) operations. A policy so permissive that, in my opinion, it had an element of revenge.”

For higher-ranking Hamas and PIJ figures, the collateral death toll could be much higher. “We’ve killed people with collateral damage in the high double digits, if not low triple digits. These are things that haven’t happened before,” one of the intelligence sources said.

“It’s not just that you can kill any person who is a Hamas soldier, which is clearly permitted and legitimate in terms of international law, but they directly tell you: ‘You are allowed to kill them along with many civilians’ … In practice, the proportionality criterion did not exist.”

Another suggested that the AI made selecting targets in Gaza easier. “I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added-value as a human, apart from being a stamp of approval. It saved a lot of time,” the source said.

Another added that the AI was more trustworthy than a potentially emotional human. “Everyone there, including me, lost people on Oct. 7. The machine did it coldly. And that made it easier.”

The sources told The Guardian that previously, individual targets would be discussed with multiple Israeli military personnel and signed off by a legal advisor, but that after Oct. 7 pressure grew to speed up the identification of potential targets.

“We were constantly being pressured: ‘bring us more targets.’ They really shouted at us,” one source said. “We were told: now we have to f— up Hamas, no matter what the cost. Whatever you can, you bomb.”

Another said: “At its peak, the system managed to generate 37,000 people as potential human targets, but the numbers changed all the time, because it depends on where you set the bar of what a Hamas operative is.”

They added: “There were times when a Hamas operative was defined more broadly, and then the machine started bringing us all kinds of civil defence personnel, police officers, on whom it would be a shame to waste bombs. They help the Hamas government, but they don’t really endanger (Israeli) soldiers.”

The compiled testimony also suggested that the Israeli military used the information it gathered to target people in their homes.

“We were not interested in killing (Hamas) operatives only when they were in a military building or engaged in a military activity,” one source said.

“It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”

Before the conflict, Israeli and US intelligence estimated Hamas’s strength at 25,000-30,000 people.

Gaza’s health authorities say at least 32,000 Palestinians have been killed in the conflict, while the UN says 1,340 Gazan families lost multiple members in the first month of the war alone. Of those, 312 families lost more than 10 members.

Sarah Harrison, a former lawyer at the US Defense Department, told The Guardian: “While there may be certain occasions where 15 collateral civilian deaths could be proportionate, there are other times where it definitely wouldn’t be.

“You can’t just set a tolerable number for a category of targets and say that it’ll be lawfully proportionate in each case.”

In a statement, the Israeli military said its bombing was carried out with “a high level of precision” and Lavender is used “to cross-reference intelligence sources, in order to produce up-to-date layers of information on the military operatives of terrorist organisations. This is not a list of confirmed military operatives eligible to attack.

“The IDF (Israel Defense Forces) does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist. Information systems are merely tools for analysts in the target identification process.”

It added that its procedures “require conducting an individual assessment of the anticipated military advantage and collateral damage expected … The IDF does not carry out strikes when the expected collateral damage from the strike is excessive in relation to the military advantage.

“The IDF outright rejects the claim regarding any policy to kill tens of thousands of people in their homes.”
