Killer robots: Will they be banned?
The UN conference calls them “lethal autonomous weapons systems.” Critics call them killer robots. They can take the form of drones, land vehicles, or submarines.
These aren’t the drones that deliver your online order. Loaded with cameras, sensors, and explosives, their mission is to drive themselves to a target with an algorithm in the driver’s seat. They destroy themselves along with the target, leaving behind just a pile of electronic detritus. Increasingly, these sorts of weapons are the stuff of manufacturers’ promotional materials rather than science fiction movies. Starting today, a United Nations conference of 80 countries is gathering in Geneva to debate whether to ban them or at least regulate them more strictly.
Autonomous weapons are, as their name suggests, able to select and attack targets on their own. That is unlike piloted drones and other weapons, which a human operator directs from afar. Arms manufacturers are taking advantage of the latest advances in artificial intelligence and machine learning to develop them. Some countries want autonomous weapons banned, arguing that an algorithm should never decide over life and death. Other countries want autonomous weapons regulated, with more or less binding rules of engagement that include some role for human decision-making.
The UN has been meeting twice a year since 2014 to debate the issue. The United States, Russia, and China are the loudest opponents of an outright ban on autonomous weapon systems or binding rules to govern their use.
Russia blocked the last meeting, which was scheduled for March, by refusing to accept the agenda. At that point, Russia’s invasion of Ukraine was a few weeks old. “If an autonomous weapon makes a mistake and possibly commits a war crime, who’s responsible?” asks Vanessa Vohs, who researches autonomous weapons at the German Armed Forces University in Munich. For Vohs, accountability is one of several open questions.
The Geneva gatherings do not appear close to answering many of them, and Russia’s war in Ukraine has added to the uncertainty. For some, the war is further evidence that autonomous weapons should be banned. Others see it as another sign that doing so is hopeless. “There is evidence of Russia using autonomous weapons in this conflict,” said Ousman Noor of the Campaign to Stop Killer Robots, an NGO that wants these weapons banned. “That could lead to an acknowledgment that these weapons urgently need to be regulated before they get sold the world over.”
The US has reportedly sent the Ukrainian army several tactical unmanned “kamikaze” drones that can find their own target and explode on impact. AI experts have long warned of the ease of producing small, armed drones in large numbers, which any IT student could program.
“Without the need of a person to service these weapons, you can dispatch tens of thousands, if not millions, of them,” Stuart Russell, an AI researcher, told DW. “We’re creating weapons with the potential of being deadlier than an atomic bomb.” The war in Ukraine has motivated countries to spend more on their militaries, including investing in the latest weaponry. The additional €100 billion ($102 billion) that Germany is borrowing to top up its defense budget may go partly to buying fleets of armed drones or other advanced weapons systems using AI.
Observers at the Geneva talks say Germany’s representatives there have so far been reluctant to take a clear position. Few believe the multilateral discussion will result in a ban or any binding rules. Yet with reports of autonomous weapons already being deployed on the battlefield, there is an increasing sense of urgency to find a solution. “That’s why we need new rules,” Vohs said. “Before we find ourselves in an apocalyptic scenario when something really goes wrong.”