Lethal autonomous systems: the ethics of programming robots for war

The World

Now that it’s possible to program unmanned combat vehicles to make decisions about where (and whom) to strike in war situations, new questions of ethics have arisen: In which situations can we allow robots to make their own decisions? Can we program robots to follow the Geneva Conventions? There is a more basic question, too: Do we even want robot soldiers?

Guest’s notes: Ronald Arkin on ethical behavior (and robots)

Historically, research in military autonomous systems has focused on ensuring that robots comply with mission requirements and safely conduct their duties from an operational perspective, whether as individual robots or as teams. It is now time to focus on other aspects as well, including ethical compliance with the Laws of War and the Rules of Engagement. The end goal of this research is not necessarily more efficient killing machines but possibly more humane ones, i.e., machines whose application on the battlefield can potentially reduce collateral damage and noncombatant casualties compared to human performance. This should occur without eroding mission performance.

The research I am conducting on embedding ethical behavior in robots capable of lethal force is premised on two assumptions: first, that warfare is inevitable, and second, that autonomous systems will eventually be used in its conduct. While I maintain the utmost respect for our warfighters, and I believe that the vast majority do the best they can under the circumstances, given the current tempo of the battlefield it is no longer possible for humans to make fully informed and rational decisions regarding the application of lethal force in many instances. The tendency toward ethical infractions among soldiers is well documented in a recent report by the Surgeon General.

It is my belief that the use of robotic technology can potentially reduce the number of atrocities that occur during war, and it is the responsibility of scientists such as myself to look for ways to protect innocent lives while designing advanced technological solutions. I am also committed to providing our warfighters with the best possible tools for their job. These goals need not be in conflict.

It should be noted that I do not foresee the advent of robot warriors sweeping across the countryside as depicted in science fiction; rather, these machines will be embedded with our troops for highly specialized, mission-specific tasks in support of human operations, such as counter-sniper or building-clearing missions. They should not, and likely could not, replace soldiers one-for-one. Nor do I see the results of this research being used in the near future; they are instead geared toward the so-called war after next. It is also intended that these systems be deployed in total war scenarios and not where there are high concentrations of civilians, contrary to many of our current military involvements.

Space limitations prevent a full exposition of any of these positions. My personal research approach to this problem is documented in an upcoming book, available this spring, entitled “Governing Lethal Behavior in Autonomous Robots.” An earlier technical report upon which it is loosely based is also available.

Finally, I should clearly state that I am not an advocate for war or the use of robots as weapons of war. But if they are going to be used for this purpose, which I see as largely inevitable, we must find ways to ensure that they are suitably restrained according to international law. In any case, further discussion should go on at both national and international levels to determine the appropriate use of this new class of weapons to ensure that we as a society understand and accept the consequences of this new technology, even if such a discussion leads to an outright ban on its use.

– Ronald Arkin
