The age of killer robots may have already begun
Bryan Walsh
AXIOS
Sat, May 29, 2021
A drone that can select and engage targets on its own attacked soldiers during a civil conflict in Libya.
Why it matters: If confirmed, it would likely represent the first known case of a machine-learning-based autonomous weapon being used to kill, potentially heralding a dangerous new era in warfare.
Driving the news: According to a recent report by the UN Panel of Experts on Libya, a Turkish-made STM Kargu-2 drone may have "hunted down and ... engaged" retreating soldiers fighting with Libyan Gen. Khalifa Haftar last year.
It's not clear whether any soldiers were killed in the attack, although the UN experts — who call the drone a "lethal autonomous weapons system" — imply they likely were.
Such an event, writes Zachary Kallenborn — a research affiliate with the Unconventional Weapons and Technology Division of the National Consortium for the Study of Terrorism and Responses to Terrorism — would represent "a new chapter in autonomous weapons, one in which they are used to fight and kill human beings based on artificial intelligence."
How it works: The Kargu is a loitering drone that uses computer vision to select and engage targets without a connection between the drone and its operator, giving it "a true 'fire, forget and find' capability," the UN report notes.
Between the lines: Recent conflicts — like those between Armenia and Azerbaijan and between Israel and Hamas in Gaza — have featured extensive use of drones of all sorts.
The deployment of truly autonomous drones could represent a military revolution on par with the introduction of guns or aircraft — and unlike nuclear weapons, they're likely to be easily obtainable by nearly any military force.
What they're saying: "If new technology makes deterrence impossible, it might condemn us to a future where everyone is always on the offense," the economist Noah Smith writes in a frightening post on the future of war.
The bottom line: Humanitarian organizations and many AI experts have called for a global ban on lethal autonomous weapons, but a number of countries — including the U.S. — have stood in the way.