Musk, Hawking: Terminate killer robots before they terminate us

Billionaire entrepreneur Elon Musk, physicist Stephen Hawking, and others have signed an open letter warning of the dangers of combining weapons with artificial intelligence.

Photo: SoftBank Corp CEO Masayoshi Son speaks with the company's robot Pepper, a humanoid robot designed to live with humans, during a press conference in Maihama, near Tokyo, in June. (Shizuo Kambayashi/AP/File)

A global arms race to make artificial-intelligence-based autonomous weapons is almost sure to occur unless nations can ban the development of such weapons, several scientists warn.

Billionaire entrepreneur Elon Musk, physicist Stephen Hawking, and other tech luminaries have signed an open letter warning against the dangers of starting a global arms race in artificial intelligence (AI) technology unless the United Nations supports a ban on weapons that humans "have no meaningful control over."

The letter, which was issued by the Future of Life Institute, is being presented today (July 27) at the International Joint Conference on Artificial Intelligence in Buenos Aires, Argentina. [Super-Intelligent Machines: 7 Robotic Futures]

"The key question for humanity today is whether to start a global AI arms race or to prevent it from starting. If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow," the letter reads, referring to the automatic weapons.

The risks, the signatories say, could be far greater than those posed by nuclear weapons.

Rise of the machines

From self-driving cars to sex bots, more and more of humanity's fundamental tasks are being taken over by robots. The inevitable march of the machines has spurred both utopian and apocalyptic visions of the future. Rogue AI that threatens humanity has featured prominently in science fiction movies such as "The Matrix" and "2001: A Space Odyssey."

But increasingly, these fears aren't just being played out on the silver screen. In fact, artificial-intelligence researchers themselves have voiced concerns over how innovations in the field are being developed. With autonomous AI weapons — such as drone planes that could seek and kill people using a face-recognition algorithm — the technology could be here in a matter of years, the writers of the letter argue.

And while drone fighters could limit battlefield casualties, these autonomous bots could also lower the threshold for initiating conflicts in the first place, the letter states.

In addition, such automatic weapons could conceivably be in the hands of almost every military power on Earth, because AI-based killing machines wouldn't require costly or hard-to-obtain materials. It wouldn't be long before assassins, terrorists and other bad actors could purchase them on the black market and use them for nefarious purposes, the scientists wrote.

"Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group. We therefore believe that a military AI arms race would not be beneficial for humanity," the letter states. 

This isn't the first time science and tech luminaries have warned against the dangers of AI. In 2014, Hawking said the development of full artificial intelligence could spell the end of the human race. Both Hawking and Musk signed a letter by the same organization in January warning that AI holds great dangers unless humanity can ensure that AI systems "will do what we want them to."


