Pathfinders – Death by algorithm


The world held its breath last month as April showers of Iranian missiles and drones descended on Israel, only to be blown out of the sky by Israeli and allied air defences. Would regional war flower as a result? Iran has around ten times the population of Israel, but Israel has an elite air force, courtesy of the USA, and some of the best military technology in the world. Now insiders have revealed that it has been using AI in the war in Gaza.

According to +972 Magazine, a non-profit run by Israeli and Palestinian journalists, Israel has been using AI to identify ‘tens of thousands of Gazans as suspects for assassination, using an AI targeting system with little human oversight and a permissive policy for casualties’. According to six Israeli intelligence officers involved in the Gaza war, some of them actually in the targeting rooms, the AI system, known as Lavender, is designed to remove the ‘human bottleneck for both locating the new targets and decision-making to approve the targets’. The officers say that, in the early stages of the war, the Israel Defence Forces (IDF) relied almost completely on Lavender to identify the family homes of up to 37,000 high- to low-ranking Hamas and Islamic Jihad military personnel for potential air strikes. The human oversight demanded by international law was so cursory that Lavender’s outputs were treated ‘as if it were a human decision’, and its kill lists were given ‘sweeping approval’ without anyone checking the raw intelligence data, despite the system’s known 10 percent error rate.

Moreover, the AI system was designed to target militants at their most vulnerable, when they had returned to their domestic dwellings to be with their families. A second AI system, chillingly known as ‘Where’s Daddy?’, identified when the militants actually entered their family homes. ‘We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity,’ one intelligence officer said. ‘It’s much easier to bomb a family’s home. The system is built to look for them in these situations.’

For lower-ranking militants, the army decided not to use precision-guided smart bombs, which can take out an individual or a car, but unguided ‘dumb’ bombs, which destroy entire buildings or apartment blocks on top of the target. ‘You don’t want to waste expensive bombs on unimportant people — it’s very expensive for the country and there’s a shortage’, said one officer. Two others added that the army had also agreed that ‘for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians’, while for a battalion or brigade commander, the permissible civilian deaths could be over 100.

This explains why the number of women and children killed has been so enormous. In previous operations, the IDF had strict rules of proportionality, in compliance with international law, and requirements to cross-check, verify, incriminate and confirm the target’s presence in real time. Because human targeting generated only limited results, it was feasible to stick to these rules. Since October 7, targeting has been handed over almost entirely to AI, which has produced gigantic kill lists, and the verification rules have gone out of the window. Who the system identifies as a target depends on where the users set the bar, and the trigger could be little more than a change of address or mobile phone. As one officer put it: ‘There were times when a Hamas operative was defined more broadly, and then the machine started bringing us all kinds of civil defense personnel, police officers, on whom it would be a shame to waste bombs.’
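The reporting does not describe Lavender’s internals, but the mechanism it implies is a familiar one: a scoring system whose reach is set by an adjustable threshold. Purely as illustration, here is a minimal Python sketch of a generic score-and-threshold classifier; every name, feature and number in it is hypothetical and not drawn from the reporting.

```python
# Illustrative only: a generic score-and-threshold classifier, showing how
# lowering the bar sweeps marginal profiles into the flagged set.
# All identifiers, scores and thresholds here are hypothetical.

profiles = [
    {"id": "A", "score": 0.95},  # strong signals
    {"id": "B", "score": 0.60},  # mixed signals
    {"id": "C", "score": 0.35},  # weak signals, e.g. a recent phone change
]

def flagged(profiles, threshold):
    """Return the IDs whose score meets or exceeds the threshold."""
    return [p["id"] for p in profiles if p["score"] >= threshold]

print(flagged(profiles, 0.9))  # ['A']            -- a high bar flags few
print(flagged(profiles, 0.3))  # ['A', 'B', 'C']  -- a low bar flags nearly all
```

The point is the design, not the arithmetic: once the bar is a tunable parameter rather than a human judgment, widening a kill list becomes a configuration change.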

To call this unprecedented is an understatement. Nothing like it has been seen before in Israel’s military operations, nor in anyone else’s, including the USA’s, even against high-ranking targets like Bin Laden. A US general and former commander of operations against ISIS said: ‘With Osama Bin Laden, you’d have an NCV [Non-combatant Casualty Value] of 30, but if you had a low-level commander, his NCV was typically zero. We ran zero for the longest time.’ For the Israel Defence Forces, however, things were different. ‘There was hysteria in the professional ranks,’ said one officer. ‘They had no idea how to react at all. The only thing they knew to do was to just start bombing like madmen.’

What will be the long-term upshot of this policy of indiscriminate, AI-assisted slaughter? The Israeli whistleblowers are under no illusions: ‘In the short term, we are safer, because we hurt Hamas. But I think we’re less secure in the long run. I see how all the bereaved families in Gaza — which is nearly everyone — will raise the motivation for [people to join] Hamas 10 years down the line. And it will be much easier for [Hamas] to recruit them.’

So, never mind rules, never mind international agreements, this is the shape of wars to come, where the critical factor is not law but LAWS – lethal autonomous weapons systems which track, target and destroy human life with no human oversight at all. The irony is that capitalism’s technological revolutions have created a global productive capability that could put us all beyond any need for outdated capitalist trade relations, where we could live in peace without markets, prices, wages, debts – or wars. Yet capitalism’s internal logic is to compete for profit, to grow or die, and the inevitable extension of that logic is war. Now AI targeting and LAWS are genies out of the bottle. If we don’t abolish capitalism soon, the human race could end up being obliterated by its own technology.

PJS

