Ethical principles in robots: a possibility?

 |  NCR Today

I'm in Wisconsin with my family, with members from as far away as Alaska. Monday morning on the deck, eating banana bread, my brother-in-law Tom raised the question of whether we can instill ethical principles in robots. My sister said we humans still control the robots; Tom said not necessarily. For now, we can always unplug them. But at what point will a drone be able to decide to preserve itself, say, from an attempt to take control of its motherboard?

Artificial intelligence implies that machines learn from the data they collect. A self-driving car, for instance, would contain the rules of the road, installed by humans. But it would also be reading data from four miles ahead, making decisions about what route to take, what speed is appropriate and even, I suppose, whether it would be more efficient to mow down a jaywalking pedestrian. These are decisions that can't be programmed into the robot ahead of time, because a human cannot foresee road conditions.

Isaac Asimov developed, in his science fiction, three laws of robotics: A robot may not injure a human being or, through inaction, allow a human to come to harm; a robot must obey human orders unless they would cause human harm; a robot must protect itself, unless that self-protection conflicts with the first or second law. Very good thinking on Asimov's part. We just have not programmed these laws into computers. You can read more here: "Preventing an autonomous-systems arms race."
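To give a sense of what "programming these laws into computers" might even mean, here is a minimal sketch of Asimov's three laws as a strict priority check over a proposed robot action. The `Action` fields and the decision rule are my own illustrative assumptions, not any real robotics system; real-world judgments (what counts as "harm"?) are far harder than boolean flags suggest.

```python
# Illustrative sketch only: Asimov's three laws as a priority-ordered check.
# The Action fields and permitted() rule are hypothetical, not a real API.
from dataclasses import dataclass

@dataclass
class Action:
    harms_human: bool = False       # would this action injure a human?
    ordered_by_human: bool = False  # was it commanded by a human?
    protects_self: bool = False     # does it preserve the robot itself?

def permitted(action: Action) -> bool:
    """Check an action against the three laws, highest priority first."""
    # First Law: never harm a human, no matter what else is true.
    if action.harms_human:
        return False
    # Second Law: obey human orders (harmful orders were already rejected above).
    if action.ordered_by_human:
        return True
    # Third Law: self-preservation is allowed only when neither higher law applies.
    if action.protects_self:
        return True
    # Anything else: no law compels or permits it; refuse by default.
    return False
```

The point of the priority ordering is that a harmful action is refused even when a human orders it, while harmless self-protection is allowed, which is exactly the conflict the drone example raises.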


Human commitment to these robotic laws, or to some ethical code, would end drone warfare, the topic of an editorial in Sunday's New York Times, "Reining in the Drones." It calls for better controls on, and public accountability for, drone warfare. The Times refers to a 77-page report by the Stimson Center on the risks of creating more opposition and more drone warfare. And Richard A. Clarke, author of Sting of the Drone, writes: "Since [Nov. 12, 2001], the United States has killed at least two thousand people in five countries using armed drones. And the killing continues."

All this is only a surface discussion of tactical warfare. The deeper ethical questions about the powers of what we are making don't get discussed often. My brother-in-law's point is that today's computers, from cameras to bombs, were programmed by other computers. The programmers who wrote the original code assume the machines will keep the ball rolling.

We have a limited time frame to decide if we want our machines to hold some ethical norms. We haven't shown much ability so far to control technology. Soon we may lose control.









In This Issue

July 14-27, 2017