Israel Lets AI Decide Who Dies in Gaza
The Israeli military has employed yet another AI-based system to select bombing targets in the Gaza Strip, an investigation by +972 Magazine has revealed. The new system has generated sweeping kill lists condemning tens of thousands of Palestinians to death, part of the IDF’s growing dependence on AI to plan lethal strikes.
Citing six Israeli intelligence officers, the Tel Aviv-based magazine said the previously undisclosed AI system, dubbed ‘Lavender,’ has played a “central role in the unprecedented bombing” of Gaza since last October, with the military effectively treating its output “as if it were a human decision.”
“Formally, the Lavender system is designed to mark all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ), including low-ranking ones, as potential bombing targets,” the outlet reported, adding that “during the first weeks of the war, the army almost completely relied on Lavender, which clocked as many as 37,000 Palestinians as suspected militants – and their homes – for possible air strikes.”
However, while thousands have been killed in the resulting air raids, the majority were “women and children or people who were not involved in the fighting,” the officers told the magazine, noting that Israeli field commanders often rely on the AI system without consulting more substantial intelligence.
“Human personnel often served only as a ‘rubber stamp’ for the machine’s decisions,” one source said, adding that many commanders spend a mere “20 seconds” reviewing targets before approving strikes – “just to make sure the Lavender-marked target is male.”
Human input has been relegated to such a minor role in the decision-making process that Lavender’s conclusions are often treated as “an order” by Israeli troops, “with no requirement to independently check why the machine made that choice.”
Such decisions are made despite well-known system errors which result in misidentified targets in at least 10% of cases. Nonetheless, the AI has “systematically” selected the homes of suspected militants for strikes, with IDF bombings frequently carried out late at night, when entire families are more likely to be present.
In targeting lower-level Hamas fighters in the early stages of the war, the military largely resorted to the use of unguided ‘dumb bombs,’ concluding it was permissible to “kill up to 15 or 20 civilians” in such operations, the intelligence sources added. Senior militants, meanwhile, could warrant the deaths of “more than 100 civilians” in some cases.
“You don’t want to waste expensive bombs on unimportant people,” one officer said.
Automated Assassination
Lavender is far from the first AI program used to direct operations for Israel’s military. Another system unveiled by +972 Magazine, known as ‘Where’s Daddy?’, has been used “specifically to track the targeted individuals and carry out bombings when they had entered their family’s residences.”
An unnamed intelligence officer told the outlet that homes are considered a “first option” for targeting, observing that the IDF is “not interested in killing [Hamas] operatives only when they [are] in a military building or engaged in a military activity.”
As of April, Israeli bombings have damaged or destroyed a staggering 62% of all housing units in Gaza – or nearly 300,000 homes – leaving more than 1 million people internally displaced, according to United Nations estimates. The territory’s housing sector has borne the brunt of the Israeli onslaught, representing well over two-thirds of the destruction in Gaza to date.
Earlier reporting has shed further light on Israel’s AI-driven “mass assassination factory,” with another program, ‘the Gospel,’ used to automatically generate massive target lists at a rate vastly exceeding previous methods. Under the guidance of that tool, Israeli forces have increasingly struck what they call “power targets,” including high-rise residential structures and public buildings. Such attacks are reportedly part of an effort to exert “civil pressure” on Palestinian society – a tactic clearly prohibited under international law as a form of collective punishment.
The IDF has long relied on extensive “target banks” in planning operations in Gaza and the West Bank, gathering a long list of suspected militant command posts and installations. In recent years, however, those lists have swelled to include thousands of potential targets as the military outsources decision-making to automated systems.
Adding to the litany of AI programs used to deliver death in Gaza and beyond, Israel’s ‘Fire Factory’ system helps to automatically calculate munitions payloads and assign targets to particular aircraft or drones once they are selected. “What used to take hours now takes minutes, with a few more minutes for human review,” an IDF colonel said of the system in comments to Bloomberg.
Artificial intelligence and AI-powered facial recognition tech have similarly taken a greater role in policing the border between the occupied territories and Israel proper – as well as West Bank checkpoints – with the IDF deploying an array of new systems to identify, surveil and arrest Palestinians in recent years.
Will Porter is assistant news editor at the Libertarian Institute and a regular contributor at Antiwar.com. Find more of his work at Consortium News and ZeroHedge.
Death by Algorithm: Israel’s AI War in Gaza
Remorseless killing initiated by artificial intelligence has long been a subject of nail-biting concern for various members of the computer-digital cosmos, who warn of such machines in war and their potential to displace human will and agency. For all that, the advent of AI-driven, automated systems in war has already become a cold-blooded reality, deployed conventionally, and with utmost lethality, by human operators.
The teasing illusion here is the idea that autonomous systems will become so algorithmically attuned and trained as to render human agency redundant in a functional sense. Provided the targeting is trained, informed, and surgical, a utopia of precision will dawn in modern warfare. Civilian death tolls will be reduced; the mortality of combatants and undesirables will, conversely, increase with dramatic effect.
The staining case study that has put paid to this idea is the pulverising campaign being waged by Israel in Gaza. A report in the magazine +972 notes that the Israel Defense Forces has indulgently availed itself of AI to identify targets and dispatch them accordingly. The process, however, has been far from accurate or forensically informed. As Brianna Rosen of Just Security accurately posits, “Rather than limiting harm to civilians, Israel’s use of AI bolsters its ability to identify, locate, and expand target sets which likely are not fully vetted to inflict maximum damage.”
The investigation opens by recalling the bombastically titled The Human-Machine Team: How to Create Human and Artificial Intelligence That Will Revolutionize Our World, a 2021 publication available in English authored by one “Brigadier General Y.S.”, the current commander of the Israeli intelligence unit 8200.
The author advances the case for a system capable of rapidly generating thousands of potential “targets” in the exigencies of conflict. The sinister and morally arid goal of such a machine would be to resolve a “human bottleneck for both locating new targets and decision-making to approve the targets.” Such a machine not only dispenses with the human need to vet, check and verify the viability of each target, but also with the need to seek human approval for their termination.
The joint investigation by +972 and Local Call identifies the advanced stage of development of such a system, known to the Israeli forces as Lavender. In terms of its murderous purpose, this AI creation goes further than such lethal predecessors as “Habsora” (“The Gospel”), which identifies purportedly relevant military buildings and structures used by militants. Even that form of identification did little to keep the death rate moderate, generating what a former intelligence officer described as a “mass assassination factory.”
Six Israeli intelligence officers, all having served during the current war in Gaza, reveal how Lavender “played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war.” Using the AI machine effectively subsumed the human element, while lending the system’s targeting results a fictional human credibility.
Within the first weeks of the war, the IDF placed extensive, even exclusive reliance on Lavender, with as many as 37,000 Palestinians being identified as potential Hamas and Palestinian Islamic Jihad militants for possible airstrikes. This reliance signalled a shift from the previous “human target” doctrine used by the IDF regarding senior military operatives. In such cases, killing the individual in their private residence would only happen exceptionally, and only to the most senior identified individuals, all to keep in awkward step with principles of proportionality in international law. The commencement of “Operation Swords of Iron” in response to the Hamas attacks of October 7 led to the adoption of a policy by which all Hamas operatives in its military wing irrespective of rank would be designated as human targets.
Officers were given expansive latitude to accept the kill lists without demur or scrutiny, with as little as 20 seconds given to each target before bombing authorisation. Authorisation was granted despite an awareness that the system errs in “approximately 10 percent of cases, and is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all.”
The Lavender system was also supplemented by using the emetically named “Where’s Daddy?”, another automated platform which tracked the targeted individuals to their family residences which would then be flattened. The result was mass slaughter, with “thousands of Palestinians – most of them women and children or people not involved in the fighting” killed by Israeli airstrikes in the initial stages of the conflict. As one of the interviewed intelligence officers stated with grim candour, killing Hamas operatives when in a military facility or while engaged in military activity was a matter of little interest. “On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”
The use of the system entailed resorting to gruesome, and ultimately murderous calculi. Two of the sources interviewed claimed that the IDF “also decided during the first weeks of the war that, for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians.” Were the targets Hamas officials of certain seniority, the deaths of up to 100 civilians were also authorised.
In what is becoming its default position in the face of such revelations, the IDF continues to state, as reported in the Times of Israel, that appropriate conventions are being observed in the business of killing Palestinians. It “does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist”. The process, the claim goes, is far more discerning, involving the use of a “database whose purpose is to cross-reference intelligence sources… on the military operatives of terrorist organizations”.
The UN Secretary General, António Guterres, said he was “deeply troubled” by reports that Israel’s bombing campaign had used “artificial intelligence as a tool in the identification of targets, particularly in densely populated residential areas, resulting in a high level of civilian casualties”. It might be far better to see these matters as cases of willing and reckless misidentification, with a conscious acceptance on the part of IDF military personnel that enormous civilian casualties are simply a matter of course. To that end, we are no longer talking about a form of advanced, scientific war waged proportionately and with precision, but about a technologically advanced form of mass murder.
Germany, Gaza and the World Court: Broadening the Scope of Genocide
Can it get any busier? The World Court, otherwise known as the International Court of Justice, has been swamped by applications on the subject of alleged genocide. The site of interest remains the Gaza Strip, the subject of unremitting slaughter since the October 7, 2023 cross-border attacks by Hamas against Israel. The retaliation by Israel has been of such brute savagery as to draw the attention of numerous states, including those not directly connected to the conflict.
Given that genocide is a crime of universal jurisdiction abominated by international law, and given the broad application of the UN Genocide Convention intended to suppress and punish it, countries not normally associated with the tormented and blood-drenched relationship between Israel and the Palestinians have taken a keen interest. South Africa got matters moving with its December application last year seeking a judicial determination that Israel was committing genocidal acts in the Gaza Strip.
Since then, Pretoria has convinced the court to issue two interim orders, one on January 26 and another on March 28. While the court has yet to decide whether Israel is culpable for genocide in its Gaza campaign, the binding interim orders demand the lifting of restrictions on humanitarian aid, the prevention of starvation and famine, and observance of the UN Genocide Convention. All of this hints strongly at unconscionable conduct on the part of the IDF against the civilian populace.
The implications of such findings also extend to Israel’s allies and partners still keen to supply it with weapons, weapons parts and support of a military-industrial nature. Germany has been most prominent in this regard: in 2023, 30% of Israel’s military equipment purchases, totalling US$326 million, came from Berlin. The Scholz government has also been a firm public supporter of Israel’s offensive. “There is only one place for Germany at this time, and that is by Israel’s side,” Chancellor Olaf Scholz proclaimed to German lawmakers on October 12 last year. Foreign Minister Annalena Baerbock curtly stated that it was “not the job of politicians to tell the guns to shut up.”
Baerbock’s remarks were all the more jarring given the 2006 views of Frank-Walter Steinmeier, then serving as Germany’s foreign minister. With puffed-up confidence, he claimed that Europeans and Germans had played a seminal role in the “silencing of the guns” that ended the conflict between Israel and Hezbollah in Lebanon.
Cognisant of such a stance, Nicaragua is now taking the South African precedent further by alleging that Germany is complicit in a genocidal enterprise. While its own human rights record is coarse – the government of Daniel Ortega boasts a spotty record which involves, among other things, the killing of protesters – Nicaragua has form at the ICJ. Four decades ago, it took the United States to the world court for assisting the counterrevolutionary Contras in their attempt to overthrow the Sandinista government.
Its 43-page submission to the court insists that Germany is responsible for “serious violations of peremptory norms of international law taking place” in Gaza in its failure to prevent genocide “against the Palestinian people” and “contributed” to its commission by violating the Genocide Convention. It further alleges that Germany failed to comply with humanitarian law principles derived from the Geneva Conventions of 1949, its protocols of 1977 and “intransgressible principles of international law” in failing to “ensure respect for these fundamental norms in all circumstances”.
The application also couples Israel’s attack on Gaza with its “continued military occupation of Palestine”, taking issue with Germany’s alleged “rendering aid or assistance” in maintaining that status quo in the Occupied Territories and its “not preventing the illegal regime of apartheid and the negation of the right of self-determination of the Palestinian people.”
Stretches of the Nicaraguan case make for troubling reading. It notes that “by sending military equipment and now defunding UNRWA [United Nations Relief and Works Agency] which provides essential support for the civilian population, Germany is facilitating the commission of genocide” and had failed, in any case, “in its obligation to do everything possible to prevent the commission of genocide”.
Such conduct was all the more egregious “with respect to Israel given that Germany has a self-proclaimed privileged relationship with it, which would enable it to usefully influence its conduct.”
With these considerations in mind, the application by Nicaragua argues that Germany is obligated to “immediately” halt its military support for Israel “that may be used in the commission of genocide, crimes against humanity, grave breaches of the Geneva Conventions of 1949, attacks directed against civilian objects or civilians protected as such, or other war crimes”. Germany is further asked, not merely to “end its assistance to Israel” but “cooperate to uphold international law and to bring the perpetrators of these atrocities to justice.”
On April 8, the ICJ opened preliminary hearings. Alain Pellet, representing Nicaragua, argued that “Germany was and is fully conscious of the risk that the arms it has furnished and continues to furnish Israel” could be used in the commission of genocidal acts. Another legal representative, Daniel Mueller, called the provision of humanitarian airdrops to “Palestinian children, women and men” a “pathetic excuse” given the furnishing of “military equipment that is used to kill and annihilate them”. Nicaragua’s ambassador to the Netherlands, Carlos José Argüello Gómez, derided Berlin’s seeming inability “to be able to differentiate between self-defence and genocide.”
Berlin’s defence follows on April 9. A sense of its bitter flavour can be gathered from the remarks of one of its top legal representatives, Tania von Uslar-Gleichen: “Germany completely rejects the accusations. We never did violate the Genocide Convention nor humanitarian law either directly or indirectly.” Berlin, she insisted, was “committed to the upholding of international law”.
If the defence fails to sway the judges, the case may well chart new territory on third-party responsibilities for preventing genocide under international humanitarian law. At this point, the momentum towards some clarity on the point seems inexorable.