Roboticist: Lethal Autonomy 'Inevitable'
By Sarah Sicard



Despite continued controversy regarding robots that can apply lethal force autonomously, militaries are continuing to develop these capabilities at a rapid rate, said a roboticist at the International Neuroethics Society meeting Nov. 13.

“Unless regulated by international treaties, lethal autonomy is inevitable,” said Ronald Arkin, associate dean for research and space planning at the School of Interactive Computing at Georgia Tech.

“It’s already here,” he added.

While most weaponized unmanned robots remain under direct human control, many nations are moving toward fuller autonomy — much to the dismay of humanitarian groups and nongovernmental organizations, he said.

Nearly 60 of these groups have called for a worldwide ban on autonomous weapons technologies, he said.

It is an issue that has been, and continues to be, debated heavily at the United Nations. However, banning these systems without further understanding of their capabilities is premature, Arkin said.

Despite continued disparagement from human rights groups, Arkin said the data suggests that there is significantly less collateral damage with the use of certain unmanned systems.

“Atrocities [are] an inherent part of warfare,” Arkin said. Autonomous systems could be used to reduce casualties, he added. The goal now is to make these systems more humane. “We don’t want terminators,” said Arkin.

However, he added, the definitions of autonomy and human control both must be established before governing agencies make laws regarding the development and deployment of these systems.

“Philosophers… have been figuring out the right ways [for men] to kill each other for thousands of years, codifying that for the last several hundred years and actually having international agreements on what is the right way… for us to slaughter each other in the battlefield,” he said.

Now they want robots to conform to those standards as well, he added.

The hope among roboticists in this field is that the technology — being less prone to human emotional error — will one day eliminate some of the unnecessary casualties in warfare, he said.

It is Arkin's hope that these systems will have the capacity not only to decide when to fire, but also when not to fire. They could not only more effectively seek out targets, but also eliminate the use of blunt force, he said.

“There is a weak link in the kill chain,” he said. “Human beings, unfortunately — not all — stray from what should be done” in combat situations.

One of the biggest questions roboticists need to answer is: “Can we find out ways that can make them outperform human warfighters with respect to ethical performance?” Arkin said.

For the military, being able to rely more heavily on robots would allow it to “expand the battlefield,” and influence larger areas for longer periods of time, he added.

“The overall goal, from the military’s point of view is to reduce… casualties,” Arkin said.

While he admitted these systems would not be perfect and would still kill civilians, he said it is likely they would kill far fewer than human warfighters. These systems may one day be made “more humane, which translates into better compliance with international humanitarian law,” he added.

Photo Credit: Thinkstock

Comments

Re: Roboticist: Lethal Autonomy 'Inevitable'


When Arkin says roboticists, he should not use the plural. There is only one roboticist with this belief. Many expert meetings of roboticists from military to civil have made it clear that robot weapons will be indiscriminate.

There are also open letters signed by many from AI and robotics that strongly disagree with Arkin's take on the issue. Why was he the only robotics professor interviewed here when his views are controversial and disagreed with by so many?
RobotLover at 11/19/2014 11:24 AM

Re: Roboticist: Lethal Autonomy 'Inevitable'

Prof. Ron Arkin is right; currently the key thing is to map know-how on lethal autonomy to existing International Humanitarian Law — some kind of Value Sensitive Design. Without such research, many things will come from different countries under the pretext that lethal autonomous weapons cannot comply with IHL. Key devices are the Arkin Ethical Governor and my Codified Key Safety Switch System/Concept, as the drivers for IHL compliance in the realm of lethal autonomy.
Nyagudi Musandu Nyagudi at 11/19/2014 12:52 PM
