Crisis Update: Troubling Lessons Of Israel’s AI Kill Targets
Efficiency or unearthing darker impulses?
Lethality of machine learning replaces human judgement
Moral vacuum of AI allows institutions to manipulate soldiers
Channelling the “small corner of evil” that exists in every person
For the trajectory of events in Gaza and Israel, see these articles:
Israel-Gaza Clash Is More Than Bad Neighbours - Security failure points to bigger agenda (Oct 8, 2023)
Gaza, An Open And Closed Case (Oct 10, 2023)
'Beheaded Babies' Is A Century-Old Trope (Oct 12, 2023)
Gaza's Fate Holds A Warning (Oct 16, 2023)
Hospital Bombings, Denial And Blood Sacrifice (Oct 18, 2023)
The Bottomless Pit - Israel Digs In (Oct 23, 2023)
Gaza Is A Micro-War Testing Ground (Oct 24, 2023)
Gaza Ground Invasion Begins (Oct 28, 2023)
Gaza Depopulation Plan Revealed By Intel Leak (Oct 30, 2023)
Globalists Plan Ban On Any Critique Of Zionism (Nov 6, 2023)
How -Isms Make War (Nov 10, 2023)
Politicians In A Zionist Death March (Nov 14, 2023)
Bin Laden Cameo Role In Gaza Mind War (Nov 16, 2023)
From New Zealand To Gaza; The Coof Shot And Genocide (Dec 4, 2023)
Gaza's Toll Weighs Heavy On Israel and Palestinians (Dec 10, 2023)
Victory Is An Illusion In Israel's War (Dec 23, 2023)
Israel To Counter Charge Of Genocide In Gaza - A finding in World Court would be blow to moral standing (Jan 11, 2024)
Timeless Voice Bids Us 'Come And See' - A film with lessons from Byelorussia to Gaza (Jan 14, 2024)
Casting Rocks At Regimes Built On Lies - While third-rate puppets panic (Feb 08, 2024)
Troubling Lessons Of Israel’s AI Kill Targets - Efficiency or unearthing darker impulses? (Apr 05, 2024)
The Paths Of Diplomacy And Bomb Throwing - Israel’s pause; Zaporizhzhia, Burisma and Crocus; BRICS and CBDC for the masses (Apr 10, 2024)
(2,500 words or about 12 minutes of your company.)
Apr 5, 2024
“Gradually it was disclosed to me that the line separating good and evil passes not through states, nor between classes, nor between political parties either — but right through every human heart — and through all human hearts. This line shifts. Inside us, it oscillates with the years. And even within hearts overwhelmed by evil, one small bridgehead of good is retained. And even in the best of all hearts, there remains… an un-uprooted small corner of evil.”
— Aleksandr Solzhenitsyn
The latest revelations about Israel’s use of artificial intelligence to identify targets in Gaza have sparked outrage. However bloody the results, the death toll is not the only concern.
AI needs human oversight, not because machines may turn on us like HAL in 2001: A Space Odyssey, but because they tempt humans to shirk their own moral responsibility.
From Britain's Metropolitan Police to the U.S. Pentagon, chiefs have insisted that humans will always make the final decision. However, they know from experience how humans behave when they can blame their actions on “orders.” [1]
This makes AI warfare not only a matter of individual overreach but also of deliberate institutional objectives.
Israel’s campaign is currently the biggest testing ground for AI in warfare, and the lessons are not good.
Television celebrities and armchair pundits say of the Palestinians, as Jordan Peterson did, “let them have it.” They are blinkered beyond belief.
Technology cannot reason morally: it cannot know good and thus has no morality. We should watch very closely the attitudes to life that inform military decisions, for they could one day come home to haunt us.
Israel’s experience in suppressing Palestinians has been used to train Western police forces. [2]
How confident are you that the latest technology being tested in Gaza, including drones fed by AI, will not find its way back home to you?
The implementation and degree of human error are less important than the institutional objectives and the tolerance for loss of life in pursuing those goals.
This is a project of international policy-makers. Britain and the U.S. are thought to be supplying precision technology and guidance for Israeli drones and missiles.
Banality of evil
Statistical systems can give a false sense of security, even, or especially, when the computer model does not reflect reality.
The Covid forecasts of infection fatality rates illustrated how computer models can be wrong, or tweaked to deliver results that justify policy. The same is true of the system that identifies Hamas militants in Gaza for extermination.
When it produces fewer targets, the human operator is incentivised to lower the threshold in order to maintain the same number of kills.
Is this a case of the banality of evil, turning death into a statistic, or is there a more troubling discovery: that people behave even worse when relieved of the burden of responsibility?
Humans are eager to unburden themselves when “computer says.” It’s not just that they fail to make good use of the time freed up by labour-saving devices: their judgement actually degrades.
The biggest fault is not even at the individual level.
Neil Ferguson of Imperial College has made wild and incorrect forecasts for decades, from mad cow disease to the Covid fatality rate. He is still employed because his computer models are not meant to interpret the world but to change it.
Rose spectacles
Four months ago the journalist Yuval Abraham of the Israeli magazine +972 described how military sources told him they aimed for maximum damage, not accuracy, using AI systems, one of which they named “The Gospel.”
Now he has revealed another system, called Lavender. [3]
The name is appropriate, for Lavender provides Israel’s military with rose-tinted spectacles through which to justify their actions in Gaza.
“A fundamental difference between the two systems is in the definition of the target: whereas The Gospel marks buildings and structures that the army claims militants operate from, Lavender marks people — and puts them on a kill list.”
The number of civilians a soldier is permitted to kill depends on the seniority of the Hamas operative. When targeting a junior official, up to 20 civilians can be killed; with the rank of battalion or brigade commander, however, more than 100 civilians can be killed.
With tens of thousands of low-level operatives, whom Israel historically did not bother to track, the military abandoned human assessment and relied on AI instead.
Rating everyone
Lavender rates every person in Gaza on a scale of 1 to 100 according to how likely they are to be a militant. It is fed their travel patterns, their WhatsApp group memberships, and how regularly they change cell phones or move address.
Once it has a list of known Hamas operatives, it rates other people according to the features they share with those militants.
“They wanted to allow us to attack [the junior operatives] automatically. That’s the Holy Grail. Once you go automatic, target generation goes crazy,” a senior officer told +972.
Priority was given to creating targets and destroying them, rather than allowing for human intervention or precision.
“Everything was statistical, everything was neat — it was very dry,” one officer is quoted as saying.
The error rate was one in 10 targets: police, civil defence workers or relatives of militants who might share a nickname, buy a phone previously used by a militant, or follow the same travel patterns. Lavender had reached 90 per cent accuracy in identifying affiliation with Hamas. Targets would be bombed when they returned home to their families, so most of the victims would be women and children.
Once the error rate is approved, it becomes statistical. “The machine did it coldly. And that made it easier.”
Slave to machine
Efficiency is what drives automation: the need for more targets, but also the desire not to “waste” bombs on, say, civil defence workers who help the Hamas government but do not endanger Israeli soldiers (in the words of the officer).
The problem with abdicating responsibility to a statistical system is that humans become slaves to the machine. When Lavender produces fewer targets, the human is incentivised to lower the threshold in order to maintain the same number of kills. One soldier told +972 that he had added 1,200 targets to the system of his own accord, to reinvigorate a decreasing number of kills.
Eventually, Lavender targeted minors for bombing in north Gaza.
Something similar happened during the Bolshevik purges, when the Gulag’s prison-labour system became dependent on a steady supply of arrests. The number of random arrests of innocents was increased to feed the system, as documented by Aleksandr Solzhenitsyn among others.
Lethally efficient
Another statistical quirk: while guided missiles were saved for senior officials, unguided “dumb” bombs were used for junior Hamas operatives, meaning that entire buildings would be destroyed, with many more casualties, in the quest to kill a relatively unimportant target.