August 15, 2018

Why AI researchers shouldn’t turn their backs on the military

Source: The MIT Technology Review

Journalist: Will Knight

More than 2,400 AI researchers recently signed a pledge promising not to build so-called autonomous weapons—systems that would decide on their own whom to kill. This follows Google’s decision not to renew a contract to supply the Pentagon with AI for analysis of drone footage after the company came under pressure from many employees opposed to its work on a project known as Maven.

Paul Scharre, the author of a new book, Army of None: Autonomous Weapons and the Future of War, believes that AI researchers need to do more than opt out if they want to bring about change.

A former Army Ranger who served in Iraq and Afghanistan and is now a senior fellow at the Center for a New American Security, Scharre argues that AI experts should engage with policymakers and military professionals to explain why researchers are concerned and help them understand the limitations of AI systems.

Scharre spoke with MIT Technology Review senior editor Will Knight about the best way to halt a potentially dangerous AI arms race.

Will Knight: How keen is the US military to develop AI weapons?

Paul Scharre: "US defense leaders have repeatedly stated that their intention is to keep a human “in the loop” and responsible for lethal-force decisions. Now, the caveat is they’ve also acknowledged that if other countries build autonomous weapons, then they may be forced to follow suit. And that’s the real risk—that if one country crosses this line then others may feel they have to respond in kind just to remain competitive."

WK: Can these promises really be trusted, though?

PS: "I think senior US defense officials are sincere that they want humans to remain responsible for the use of lethal force. Military professionals certainly don’t want their weapons running amok. Having said that, it remains an open question how to translate a broad concept like human responsibility over lethal force into specific engineering guidance on what kinds of weapons are allowed. The definition of what constitutes an “autonomous weapon” is contested already, so there may be differing views on how to put those principles into practice."


Read the Full Interview at the MIT Technology Review

Author

  • Paul Scharre

    Executive Vice President and Director of Studies

    Paul Scharre is the executive vice president and director of studies at the Center for a New American Security (CNAS). He is the award-winning author of Four Battlegrounds: Po...