DoD Rethinks Lethal, Autonomous Drones
Trying to avoid “unintended engagements,” a.k.a. deadly accidents
By RBR Staff



The U.S. Defense Department has issued a new directive on the use of autonomous and semi-autonomous weapon systems, an attempt to regulate a technology that officials say could be years from becoming reality.

The directive, released Nov. 27, is focused on systems that can select and engage targets without the intervention of a human operator. Non-lethal autonomous systems, such as electronic attack or cyberspace systems, fall outside its scope. So do technologies such as the Patriot missile system, which have autonomous functions but still require human supervision.

Autonomous and semi-autonomous weapon systems “shall be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force,” the doctrine reads. Humans must still play an oversight role, retaining the ability to activate or deactivate system functions should the need arise.

Systems will also be required to go through rigorous verification and validation (V&V) and operational test and evaluation (OT&E) to catch potential malfunctions before they ever see active duty.

Once through the testing stages, systems will require the approval of the undersecretary of defense for policy, the undersecretary of defense for acquisition, technology and logistics, and the chairman of the Joint Chiefs of Staff before their activation.

The overall goal of the new rules is to avoid “unintended engagements,” defined in the doctrine as “damage to persons or objects that human operators did not intend to be the targets of U.S. military operations, including unacceptable levels of collateral damage beyond those consistent with the law of war, [rules of engagement], and commander’s intent.”

The new rules aren’t meant to discourage the development of autonomous weapon systems, said David Ochmanek, deputy assistant secretary of defense for force development, who described the doctrine as “flexible.”

“What it does say is that if you want to [develop an autonomous weapon system], there will be a rigorous review process, and if you expect it to be approved, you will be asked to show that your software and hardware has been subject to rigorous test and validation,” Ochmanek said.

While the department is looking toward the future, the directive’s authors don’t expect to need the new regulations any time soon.

“This directive is, for once, out ahead of events,” Ochmanek said. “This isn’t something where we all of a sudden realized someone’s out there about to develop a Terminator and decided we better get a directive out. That’s not the case.”

Ochmanek declined to guess at a timetable for the technology’s development, but said, “I can say with confidence that there is no development program going on right now that would create an autonomous weapons system.”

The idea of a military UAV that can identify enemies and hunt them down on its own is a longtime staple of science fiction. But even when autonomous military systems become a reality, they are unlikely to resemble something out of “Star Wars” or “The Matrix.”

Inherently lacking in necessary human qualities

“When you hear folks talk about this outside the Pentagon, in reports, they tend to leap to the hardest case ... something making a judgment call that [is] hard for people to make,” said a defense official involved with the drafting of the new doctrine.

The official used the example of two cars driving along a road, one carrying an ally and the other an enemy. A machine would have to process an enormous amount of disparate data to decide which car should be targeted.

“We don’t want to build a robot for that. Machines won’t have an advantage in that case,” said the official, who added that DoD would have a series of meetings with interested parties to brief them on the new doctrine.

The specter of that “hardest case” was raised in a Nov. 19 Human Rights Watch (HRW) report, “Losing Humanity: The Case against Killer Robots.” The report warned of the need to regulate autonomous devices, “which would inherently lack human qualities that provide legal and non-legal checks on the killing of civilians.”

Ochmanek denied any connection between the release of the HRW report and the new doctrine, which was in development for 18 months with the help of representatives from the Joint Staff, DoD’s acquisitions office, the Office of the General Counsel, the U.S. military services, and the research and development community.

