Opinion

Algorithms and AI have turned Gaza into a laboratory of death


February 17, 2026 


A view of makeshift tents amid the rubble left behind by Israeli attacks as Palestinians carry on with their daily lives under harsh conditions in Gaza Strip on February 16, 2026. [Saeed M. M. T. Jaras – Anadolu Agency]

by Sayid Marcos Tenorio
Middle East Monitor


The revelations by +972 Magazine and Local Call have exposed the darkest core of the contemporary war in Gaza, in which genocide is carried out not only by bombs and missiles, but by data, algorithms and global digital platforms.

Reporting on the Israeli artificial intelligence system known as Lavender has confirmed what the Palestinian resistance, Lebanon, and Iran have denounced for years: technology is an organic part of the Zionist war machine, functioning as an instrument of surveillance, target selection, and mass extermination.

The liberal rhetoric of “digital privacy” collapses in the face of the facts. Applications such as WhatsApp insist on the promise of end-to-end encryption, but conceal what is essential: metadata are worth more than messages.

Location, contact networks, patterns of communication, and group affiliations make it possible to map the social life of an entire people. In Gaza, these data have been incorporated into military systems that turn human relationships into algorithmic criteria for death.

Lavender assessed virtually the entire population of the Gaza Strip, more than 2.3 million people, assigning automated “risk scores”. Merely belonging to a WhatsApp group, maintaining frequent contact with someone already marked, or displaying digital patterns considered “suspicious” was enough to place a person on an execution list.

Human supervision was deliberately minimal, reduced to seconds, with conscious acceptance of high error rates. Entire families were killed in their homes, treated as “acceptable collateral damage” in an algorithmic equation that normalises massacre.

This is not a technical deviation. It is a policy of extermination. International Humanitarian Law explicitly prohibits indiscriminate attacks and requires distinction between civilians and combatants.

Systems that automate lethal decisions, pre-accepting the death of innocents, constitute crimes against humanity and reinforce the characterisation of genocide as a technologically organised and rationalised process.

The machinery that sustains this model is global. Twenty-first-century espionage no longer depends on intercepting messages, but on controlling digital ecosystems.

Private platforms function as permanent sensors of planetary social life, feeding databases accessible to intelligence services such as the Mossad and the CIA, through formal cooperation, legal pressure or the exploitation of vulnerabilities. This represents a structural convergence between big tech companies, the military-industrial complex and the imperial security apparatus.

Palestine is the laboratory. In an official statement released during the war, Hamas stated on its Telegram channel that “the occupier has turned every modern tool into a weapon against the Palestinian people, using technology to justify the killing of civilians and to conceal genocide behind technical terms” (free translation).

The denunciation is clear: Israel is not waging a war against combatants, but against Palestinian existence itself, now mediated by algorithms.

Lebanese Hezbollah has warned that this model forms part of a regional hybrid war, combining digital surveillance, technological sabotage, and selective attacks.

After the 2024 attack in Lebanon involving the coordinated explosion of pagers used by its members, Hezbollah declared through institutional channels that “the enemy has turned civilian devices into tools of assassination, proving that its war knows no ethical or human limits” (free translation). The episode revealed a new level in the weaponisation of everyday technology.

This pattern is not isolated. International investigations have already demonstrated the recurring use of military spyware against journalists, activists, and political leaders in various countries, often through smartphones widely available on the global market.

The message is unequivocal: every connected device is a potential instrument of surveillance, control, or death when inserted into the logic of imperial power.

Leaders of the Islamic Republic of Iran have been particularly outspoken. The Iranian Supreme Leader, Ayatollah Ali Khamenei, has stated in various speeches that “the Zionist regime is a cancerous tumour that uses the most modern tools to oppress and massacre peoples”.

Iranian authorities maintain that Gaza foreshadows the future of imperial domination, in a world governed by algorithmic surveillance, selective assassinations, and “clean” wars only in rhetoric.

The Lavender case thus exposes the consolidation of a digital necropolitics. Algorithms decide who lives and who dies; corporations provide the infrastructure; intelligence services operate in the shadows; and technocratic language seeks to normalise the unacceptable. Gaza bleeds so that this model may be tested, refined, and then exported.

Denouncing this machinery is a historic task. It is not merely a matter of solidarity with the Palestinian people, although that solidarity is urgent and non-negotiable.

It is about resisting a world in which data are worth more than lives, in which technology serves colonialism, and in which genocide is presented as an “algorithmic decision”. Today it is Gaza. Tomorrow, any people who dare to resist.

The views expressed in this article belong to the author and do not necessarily reflect the editorial policy of Middle East Monitor.
