Students and faculty from the Naval Postgraduate School (NPS) and the U.S. Naval Academy (USNA) recently came together with teams of junior officers from U.S. Navy Third Fleet to discuss the ethics of unmanned systems during the 2015 iteration of the Robo-Ethics Continuing Education Series. This year's event was led via video teleconference by NPS Associate Professor Ray Buettner, April 14.
"We are interested in exploring the ethical boundaries of robotic systems … preparing tools to figure out what the future will be like," said Buettner.
Buettner leads the Secretary of the Navy's Consortium for Robotics and Unmanned Systems Education and Research (CRUSER) at NPS, an interdisciplinary working group that seeks to further research in robotic systems. But as student and faculty researchers wade into the at-times turbulent waters of unmanned systems, they are also exploring the many ethical considerations that autonomous combat systems present.
"Should a machine be able to decide to kill, and if so, what does 'decide' mean?" Buettner asked assembled students and others joining via video teleconference from USNA and elsewhere.
"The key concept to consider may be, 'where is the human relative to the selection of the target and the decision to engage,'" said Buettner. "Do we want discrimination authority granted to the human loop?"
Another area of concern being debated is the question of punishment and accountability. Researchers, ethicists and policy makers are asking questions like, 'Who do we hold accountable when a lethal autonomous system engages the wrong target?'
While it may seem counterintuitive to debate whether a human should be "in the decision loop," Buettner points to serious debates among ethicists over whether humans or machines are more likely to make errors that cost human lives.
Coincidentally, while Buettner and his group debated the ethics of unmanned systems, the United Nations' Convention on Certain Conventional Weapons (CCW) was meeting in Geneva to debate a proposed ban and moratorium on Lethal Autonomous Weapons Systems (LAWS).
Buettner believes there is currently no need for a prohibition on lethal autonomous systems, noting that existing law and policy already provide the necessary safeguards. He is referring in part to Directive 3000.09, which the Department of Defense published in 2012 to provide guidance on the development of autonomous systems. The directive places a series of regulatory safeguards on autonomous systems development while still encouraging innovative thinking in the field.
"So far, no country has declared an intent to deploy a totally autonomous lethal system that decides who to kill and when," Buettner noted. "Almost all fully autonomous systems are defensive."
Buettner also noted NPS Professor Wayne Hughes' views on the rapidly changing nature of autonomous systems.
"The fundamental error in a debate over robotic development is to think that we have a choice," Buettner said, quoting Hughes. "This world is coming, rapidly coming."
"We can say whatever we want, but our opponents are going to take advantage of these attributes," he continued. "That world is likely to be sprung upon us if we don't prepare ourselves."
NPS Assistant Professor Timothy Chung has long recognized the utility of research in this area. He is a pioneer in the field of unmanned aerial vehicle (UAV) swarms.
"How do we take revolutionary changes in UAVs and use them to achieve revolutionary effects?" asked Chung.
In addition to exploring the ethics of unmanned combat systems, Buettner and Chung showcased ongoing CRUSER initiatives, many of which were born of student research. Current projects include the use of QR codes in network-deprived environments and the feasibility of wireless underwater computer networks.