Melbourne robotics experts have joined the international science community in warning that Artificial Intelligence (AI) could be the most dangerous warfare technology since the nuclear arms race.
Physicist Stephen Hawking, SpaceX’s Elon Musk, Apple co-founder Steve Wozniak, Skype co-founder Jaan Tallinn and activist Noam Chomsky are just some of the names on a now 12,000-signature strong petition curated by the Future of Life Institute.
The letter refers to armed quadcopters or drones which “can search for and eliminate people meeting certain pre-defined criteria” and aren’t controlled by humans.
The number of signatories has increased fivefold over the course of today, with nearly 2000 of them from the robotics field.
“Just as most chemists and biologists have no interest in building chemical or biological weapons, most AI researchers have no interest in building AI weapons,” the letter reads.
Melbourne-based Dr Vincent Crocher is a postdoctoral researcher in robotics at the University of Melbourne. He said he signed to push for public discussion about what the “rules of engagement” should be for technology that is already in use.
“We have to ask ourselves these questions now. The public should know that the problem is on the table now, not in 20 years,” he said.
The letter was presented on Tuesday at the International Joint Conference on Artificial Intelligence, which continues today in Buenos Aires, and follows the April meeting of the Convention on Certain Conventional Weapons held at the United Nations in Geneva.
David Beesley is completing a master’s in emerging drone culture at RMIT University. He said he believes the call for UN regulation of autonomous weapons is justified, but noted that semi-autonomous weapons are already in frequent use.
“The Israelis for example, in no-man’s land and on the borders, they’ve already got machine gun platforms that will shoot on movement,” he said.
“The Americans have similar things on the border between Mexico and the southern states. But by taking the human out of that decision-making process and letting the machines take absolute control, I think it’s a recipe for disaster,” he said.
Yet despite the fear around AI, Mr Beesley said advancements in unmanned technology had undeniable benefits for civilian society, taking on “dull, dirty and dangerous jobs” such as surveying and mapping pipelines or farms.
Although he backed calls to group autonomous weapons alongside chemical and biological weapons at the UN level, Mr Beesley said everyday civilians are already creating dangerous, homemade weapons with drone technology.
US authorities are currently investigating the teenager behind a YouTube video uploaded earlier this month showing a homemade drone attached to a handgun firing shots in Connecticut.
And there are dozens of DIY videos showing modified drones shooting fireworks and successfully hitting running targets.
“It’s open to abuse, our worst paranoid nightmares. But I’d like to think that human nature will prevail and there will be generally positive technologies, but there is always that spooky element,” Mr Beesley said.