Saving humanity from killer robots starts today, say scientists
Fully autonomous weapons, or "killer robots," have come under scrutiny at the World Economic Forum in Davos, Switzerland.
It is the first time the annual meeting has considered the subject, and it was discussed amid a general flurry of interest in the world of artificial intelligence.
While there was a focus on the benefits human society can enjoy as the field of robotics advances, one hour-long panel session Thursday considered the darker side: "What if robots go to war?"
The idea of rogue robots causing havoc is nothing new: science fiction has depicted such apocalyptic scenarios for decades.
But scientists, experts, and various organizations have in recent years begun to take the threat seriously.
"It's not about destroying an industry or a whole field," says Mary Wareham, coordinator of the Campaign to Stop Killer Robots, in a phone interview with the Monitor. "It's about trying to ring-fence the dangerous technology."
This coalition of non-governmental organizations, launched in 2013, aims to "preemptively ban fully autonomous weapons," defining these as "weapons systems that select targets and use force without further human intervention."
Renowned physicist Stephen Hawking was one of thousands of researchers, experts, and business leaders to sign an open letter in July 2015, which concludes:
"Starting a military AI [artificial intelligence] arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control."
Yet there are those who see a preemptive ban as a missed opportunity: these technologies may offer the possibility of "reducing noncombatant casualties" in war, as Ronald Arkin, associate dean at the Georgia Institute of Technology in Atlanta, told the Monitor in June 2015.
He did, however, concede that it made sense to have a moratorium on deploying such weapons "until we can show that we have exceeded human-level performance from an ethical perspective."
The panel in Davos included former UN disarmament chief Angela Kane and BAE Systems chair Sir Roger Carr, as well as an artificial intelligence expert and a robot ethics expert.
The chair of BAE Systems described a $40 billion industry working on autonomous weapons in 40 countries.
Mr. Carr went on to say fully autonomous weapons would be "devoid of responsibility" and would have "no emotion or sense of mercy." "If you remove ethics and judgment and morality from human endeavor, whether it is in peace or war, you will take humanity to another level which is beyond our comprehension," he warned.
So, how close are fully autonomous weapons to becoming a reality?
Back in 2012, some predicted it would take a couple of decades, Ms. Wareham tells the Monitor, but even since then, estimates have shrunk, as last year's open letter describes:
"Artificial Intelligence (AI) technology has reached a point where the deployment of such systems is – practically if not legally – feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms."