Powerful video warns of the danger of autonomous ‘slaughterbot’ drone swarms
The scenes depicted are terrifying: an attack on a classroom, targeted political assassinations, nations living in fear of strikes they have no way to counter.
It looks like an episode of "Black Mirror." And like that dystopian series, the short film "Slaughterbots" is a prescient warning about the role of technology in our lives – and its potential consequences.
Produced by a University of California, Berkeley professor, the short film offers a chilling glimpse of a future where autonomous drone swarms have become the norm, with dire consequences for global security and stability.
“The problem with autonomous weapons is that they’re intrinsically scalable,” Professor Stuart Russell told Global News from Paris, France.
“Meaning that because they’re autonomous and therefore don’t need a human to take care of them, so to speak.”
Russell says the film was partly inspired by "The Day After," the seminal Reagan-era television film that depicted the fallout from a nuclear war between the United States and the Soviet Union. Russell's aim was to deliver a similar warning about autonomous, self-piloting drones.
“One human or a small team of humans can launch thousands, or millions, or even billions of them if they can afford to – and kill, essentially, arbitrarily large numbers of people,” Russell said.
“That makes them weapons of mass destruction.”
The video, produced in association with the Future of Life Institute, begins with an actor portraying a weapons-industry CEO unveiling a new kind of autonomous drone weapons system. The miniature drone, piloted by a computer program, is capable of independent facial recognition; armed with an explosive charge, it becomes a targeted assassination tool.
The video then transitions into a sort of faux documentary, showing the impact of these autonomous drone "swarms" on civil society. The images are purposely stark: in one, an attack on the United States Senate targets only "one side of the aisle." In another, a drone swarm attacks a classroom, singling out certain students.
In the video, the perpetrator of these attacks is kept purposely vague – terrorism, government action, even a random attack – highlighting what Russell says is one of the dangers of AI-controlled drone weapons.
“When I talk to people about autonomous weapons, they say ‘Oh you know we could really use those to kill a lot of bad guys,’” Russell said. “You hear this over and over again. And it never occurs to people to think what would happen if the bad guys have them? We would be just as vulnerable if not more so.”
The video was shared at the United Nations Convention on Certain Conventional Weapons on November 12, part of a campaign by several global NGOs to raise awareness about the dangers of autonomous weapons.
And for those who may think this kind of weapons system lies in the distant future, consider that in January of this year the United States announced it had tested 103 miniature "Perdix" drones, deployed from three F/A-18 Super Hornet fighter jets.
“Perdix [drones] are not pre-programmed synchronised individuals, they are a collective organism, sharing one distributed brain for decision-making and adapting to each other like swarms in nature,” William Roper, director of the Strategic Capabilities Office, told BBC News in January. “Because every Perdix communicates and collaborates with every other Perdix, the swarm has no leader and can gracefully adapt to drones entering or exiting the team.”
In May, the United States Navy tested its own autonomous drone weapons system, and China is developing its own fleet of fixed-wing autonomous drone aircraft. Asked for an estimate, Russell said a commercial product similar to what is depicted in his film could be on the market in as little as 18 months.
In the end, it is the weapons' ability to single out specific targets, the practical difficulty of defending against them, and the further difficulty of tracing their users that have Russell worried about their implications.
"They're more dangerous than nuclear weapons," Russell said, "because it's more tempting to use them."