Sunday, June 02, 2024


Stop Killer AI

Most people take around 10 minutes to decide what show to stream and around 3 minutes to decide what to order from a restaurant menu. An Israeli intelligence officer said he took about 20 seconds to decide whether to add a human being to a kill list based on suggestions generated by “Lavender,” an artificial intelligence system powered by data Israel has collected through its surveillance of Palestinian communities. The Israeli military then uses another AI system, sadistically named “Where’s Daddy?”, to strike targets at home with their families. Learn more about Lavender, Where’s Daddy?, and the violence made possible by digital dehumanization in our new interactive visual, Stop Killer AI.

Visualizing Palestine is grateful to an anonymous friend who worked with us to create this interactive visual, building on our previous story Automating Genocide.

“When we consider the impact of such [artificial intelligence] systems on human rights, we need to look at the consequences, first, if they malfunction and second, if they work as intended. In both situations, reducing human beings to statistical data points has grave and irreversible consequences for people’s dignity, safety, and lives.” – Marwa Fatafta and Daniel Leufer, “Artificial Genocidal Intelligence: how Israel is automating human rights abuses and war crimes,” Access Now.

Explore how Israel uses artificial intelligence to produce targets for its bombing campaign faster than humanly possible.

“Nothing happens by accident,” said another source. “When a 3-year-old girl is killed in a home in Gaza, it’s because someone in the army decided it wasn’t a big deal for her to be killed […] These are not random rockets. Everything is intentional. We know exactly how much collateral damage there is in every home.” – Yuval Abraham, “‘A mass assassination factory’: Inside Israel’s calculated bombing of Gaza,” +972 Magazine.

Human rights advocates are calling for a ban on AI target-generation systems in warfare, biometric mass surveillance, so-called “social scoring” algorithms, and other technologies that are fundamentally incompatible with human rights.


Visualizing Palestine works at the intersection of communication, social sciences, technology, design, and urban studies for social justice. It uses creative visuals to tell a factual, rights-based narrative of Palestine/Israel. Read other articles by Visualizing Palestine, or visit Visualizing Palestine's website.
