How to make ethical robots

March 12, 2012 by Lisa Zyga feature


RI-MAN, a robot developed by researchers at RIKEN in Japan, was designed for human care. Image credit: RIKEN, Bio-Mimetic Control Research Center

(PhysOrg.com) -- In the future, according to robotics researchers, robots will likely fight our wars, care for our elderly, babysit our children, and serve and entertain us in a wide variety of situations. But as robot development accelerates, one subfield of robotics research is lagging behind the others: roboethics, or ensuring that robot behavior adheres to certain moral standards. In a new paper providing a broad overview of ethical behavior in robots, researchers emphasize the importance of being proactive rather than reactive in this area.

The authors, Ronald Craig Arkin, Regents’ Professor and Director of the Mobile Robot Laboratory at the Georgia Institute of Technology in Atlanta, Georgia, along with researchers Patrick Ulam and Alan R. Wagner, have published their overview of moral decision making in autonomous systems in a recent issue of the Proceedings of the IEEE.

“Probably at the highest level, the most important message is that people need to start to think and talk about these issues, and some are more pressing than others,” Arkin told PhysOrg.com. “More folks are becoming aware, and the very young machine and robot ethics communities are beginning to grow. They are still in their infancy though, but a new generation of researchers should help provide additional momentum. Hopefully articles such as the one we wrote will help focus attention on that.”

The big question, according to the researchers, is how we can ensure that future robotic technology preserves our humanity and our societies’ values. They explain that, while there is no simple answer, a few techniques could be useful for enforcing ethical behavior in robots.

One method involves an “ethical governor,” a name inspired by the mechanical governor for the steam engine, which ensured that the powerful engines behaved safely and within predefined bounds of performance. Similarly, an ethical governor would ensure that robot behavior would stay within predefined ethical bounds. For example, for autonomous military robots, these bounds would include principles derived from the Geneva Conventions and other rules of engagement that humans use. Civilian robots would have different sets of bounds specific to their purposes.
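The governor idea described above can be sketched in a few lines of code: a gate that vets each candidate action against a set of predefined constraints before the robot may execute it. This is a minimal illustrative sketch, not the architecture from Arkin et al.'s paper; the names (`Constraint`, `make_governor`, the example rule) are assumptions.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

@dataclass
class Constraint:
    """A named ethical bound: forbids(action, context) -> True if the action violates it."""
    name: str
    forbids: Callable[[str, Dict], bool]

def make_governor(constraints: List[Constraint]):
    """Return a gate that permits an action only when no constraint forbids it."""
    def governor(action: str, context: Dict) -> Tuple[bool, List[str]]:
        violations = [c.name for c in constraints if c.forbids(action, context)]
        return len(violations) == 0, violations
    return governor

# Example: a toy rule loosely inspired by the laws-of-war bounds mentioned above.
no_noncombatant_harm = Constraint(
    name="no-noncombatant-harm",
    forbids=lambda action, ctx: action == "engage"
                                and ctx.get("target") == "noncombatant",
)

governor = make_governor([no_noncombatant_harm])
```

A civilian robot would simply be loaded with a different constraint list; the gating machinery stays the same.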

Since it’s not enough just to know what’s forbidden, the researchers say that autonomous robots also need emotions to motivate behavior modification. One of the most important emotions for robots to have would be guilt, which a robot would “feel” or produce whenever it violates the ethical constraints imposed by the governor, or when it is criticized by a human. Philosophers and psychologists consider guilt a critical motivator of moral behavior, as it leads to behavior modification based on the consequences of previous actions. The researchers propose that, when a robot’s guilt value exceeds specified thresholds, the robot’s abilities may be temporarily restricted (for example, military robots might lose access to certain weapons).
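The guilt mechanism can be pictured as a simple accumulator: violations raise an internal guilt value, and once it crosses a threshold, restricted capabilities are withheld. This is a hedged sketch under assumed names (`GuiltModel` and its methods are hypothetical, not the paper's implementation).

```python
class GuiltModel:
    """Toy model of guilt-driven capability restriction."""

    def __init__(self, threshold: float):
        self.guilt = 0.0          # accumulated guilt, starts at zero
        self.threshold = threshold

    def register_violation(self, severity: float) -> None:
        """Accumulate guilt in proportion to the severity of a violation."""
        self.guilt += severity

    def allowed_capabilities(self, capabilities, restricted):
        """Above the guilt threshold, restricted capabilities are withheld."""
        if self.guilt > self.threshold:
            return [c for c in capabilities if c not in restricted]
        return list(capabilities)
```

In the military example from the article, a weapon system would sit in the `restricted` set, so repeated violations would eventually lock the robot out of it.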

Though it may seem surprising at first, the researchers suggest that robots should also have the ability to deceive people – for appropriate reasons and in appropriate ways – in order to be truly ethical. They note that, in the animal world, deception indicates social intelligence and can have benefits under the right circumstances. For instance, search-and-rescue robots may need to deceive in order to calm or gain cooperation from a panicking victim. Robots that care for Alzheimer’s patients may need to deceive in order to administer treatment. In such situations, the use of deception is morally warranted, although teaching robots to act deceitfully and appropriately will be challenging.

The final point that the researchers touch on in their overview is ensuring that robots – especially those that care for children and the elderly – respect human dignity, including human autonomy, privacy, identity, and other basic human rights. The researchers note that this issue has been largely overlooked in previous research on robot ethics, which mostly focuses on physical safety. Ensuring that robots respect human dignity will likely require interdisciplinary input.

The researchers predict that enforcing ethical behavior in robots will face challenges in many different areas.

“In some cases it's perception, such as discrimination of combatant or non-combatant in the battlespace,” Arkin said. “In other cases, ethical reasoning will require a deeper understanding of human moral reasoning processes, and the difficulty in many domains of defining just what ethical behavior is. There are also cross-cultural differences which need to be accounted for.”

An unexpected benefit from developing an ethical advisor for robots is that the advising might assist humans when facing ethically challenging decisions, as well. Computerized ethical advising already exists for law and bioethics, and similar computational machinery might also enhance ethical behavior in human-human relationships.

“Perhaps if robots could act as role models in situations where humans have difficulty acting in accord with moral standards, this could positively reinforce ethical behavior in people, but that's an unproven hypothesis,” Arkin said.

More information: Ronald Craig Arkin, et al. “Moral Decision Making in Autonomous Systems: Enforcement, Moral Emotions, Dignity, Trust, and Deception.” Proceedings of the IEEE, Vol. 100, No. 3, March 2012. DOI: 10.1109/JPROC.2011.2173265

Copyright 2012 PhysOrg.com.
All rights reserved. This material may not be published, broadcast, rewritten or redistributed in whole or part without the express written permission of PhysOrg.com.


danlgarmstrong
Mar 12, 2012

Rank: 3.7 / 5 (3)
I wonder if it would be 'ethical' to put this software on a chip to implant in a person's head to help them with their own decisions?
Yellowdart
Mar 12, 2012

Rank: 3.1 / 5 (7)
This is all assuming that Skynet fails to take over.
Kinedryl
Mar 12, 2012

Rank: 3 / 5 (6)
"..a robot may not injure a human being or, through inaction, allow a human being to come to harm.." .. does it apply to MS Windows?
patnclaire
Mar 12, 2012

Rank: 5 / 5 (2)
I think that humaniform robots should be built as sturdy and strong as possible. Human beings tend to batter wives and children and kick dogs when they do not get their way. Like the movie, AI, what's to prevent humans from mistreating humaniform robots like we mistreat chimps and great apes?
hyongx
Mar 12, 2012

Rank: not rated yet
this article sounds like it's talking about how to raise ethical children.
ChaosRN
Mar 12, 2012

Rank: 2 / 5 (4)
if we get robots to fight our wars, then there is no cost, then there is only $$$$ or lack of raw materials, to pressure us to stop war at all....
Asimov's three laws will prevent robots from fighting, Law #3: "A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws."
Deadbolt
Mar 12, 2012

Rank: 5 / 5 (1)
I think that humaniform robots should be built as sturdy and strong as possible. Human beings tend to batter wives and children and kick dogs when they do not get their way. Like the movie, AI, what's to prevent humans from mistreating humaniform robots like we mistreat chimps and great apes?


Given that robots have not gone through the evolution that we have, and could possess any emotions and any possible mind the field of mindspaces, we could always make it so that they ENJOY being beaten if we so perversely chose.

Once we understand what patterns in minds correspond to emotions, we could make it so that these patterns match up with non-evolutionarily fit behaviors, such as enjoying killing yourself. We could make it so that robots enjoy serving humans no matter the cost.

Jo01
Mar 12, 2012

Rank: 1.7 / 5 (12)
Interesting, so scientists will teach ethical behavior. That will be difficult. Apart from the fact that they have absolutely no clue what they are talking about, scientist are the least ethical people I know of.

J.
Xbw
Mar 12, 2012

Rank: 1 / 5 (1)
One of the most important emotions for robots to have would be guilt, which a robot would feel or produce whenever it violates its ethical constraints imposed by the governor, or when criticized by a human.

I wish I could make my crappy computer feel guilty every time it blue screens.
MR166
Mar 12, 2012

Rank: 1.8 / 5 (6)
What a laugh! Western morals and ethics are falling faster than the fall of Rome. If you cannot be proved guilty in a court of law, you have done nothing wrong. The laws governing these courts can be changed at will, even retroactively if needed, to suit the needs of the political system. The western world does not really have a bright future as far as I can see.
sigfpe
Mar 12, 2012

Rank: 3.7 / 5 (3)
MR166. Depending on how you count, the fall of Rome took somewhere between 400 and 1400 years. Is that what you actually meant?
kochevnik
Mar 12, 2012

Rank: 2.1 / 5 (7)
MR166. Depending on how you count, the fall of Rome took somewhere between 400 and 1400 years. Is that what you actually meant?
Rome banned all religions and expunged the republic under Constantine in 325AD. Under him only one official religion existed: catholicism. He instituted democracy alongside his state religion. The etymology of democracy is "mob rule." Within 300 years Rome was decimated. Unfortunately the popes saw themselves as the inheritors of the Roman empire and transformed Rome into an underground child molestation cult worshiping Moloch, which pope Innocent introduced to xtian theology as "the devil." The power of the Roman cult was forged into law of all lands, controlled by the popes on papal bulls. "Lord of the Rings" is possibly a metaphorical tale based upon the Roman cult's control of all Western law and banking.
MR166
Mar 12, 2012

Rank: 2 / 5 (7)
Yup!!!!!

Both of you defiantly have have gotten to the root of the problem, religion and the belief in God is reason that the western world is sinking into the abyss AKA the 21st century. Western progressivism has systematically replaced religion with secularism for the past 50 years and the results are nothing but spectacular!
Xbw
Mar 12, 2012

Rank: 2 / 5 (4)
Yup!!!!!

Both of you defiantly have have gotten to the root of the problem, religion and the belief in God is reason that the western world is sinking into the abyss AKA the 21st century. Western progressivism has systematically replaced religion with secularism for the past 50 years and the results are nothing but spectacular!

A spectacular mess perhaps.
Silverhill
Mar 12, 2012

Rank: 4.8 / 5 (4)
Jo01:
Interesting, so scientists will teach ethical behavior. That will be difficult. Apart from the fact that they have absolutely no clue what they are talking about, scientist are the least ethical people I know of.
Then your sample is hardly representative. Most of my fellow scientists have quite a good idea of what they are talking about, and are not known for highly unethical behavior.
You need to get out more, and meet better people.
Silverhill
Mar 12, 2012

Rank: 1.5 / 5 (2)
kochevnik:
the popes ... transformed Rome into an underground child molestation cult worshiping Moloch
Please tell us what you're smoking, so we can *avoid* getting some.

"Lord of the Rings" is possibly a metaphorical tale based upon the Roman cult's control of all Western law and banking.
And, according to Isaac Asimov, it is possibly an allegoric tale about the dangers of unbridled technology, with the Ring representing technology. There are various other interpretations too, that also don't depend on strenuously anti-Catholic bigotry. Maybe you should broaden your worldview.
======================================

ChaosRN:
Asimov's three laws will prevent robots from fighting, Law #3: "A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws."
All that humans would have to do is order the robots to fight, and the 3rd law would be ignored.
HealingMindN
Mar 12, 2012

Rank: 3 / 5 (2)
I wonder if it would be 'ethical' to put this software on a chip to implant in a person's head to help them with their own decisions?


I like that idea, but the politicians won't.
antialias_physorg
Mar 12, 2012

Rank: 5 / 5 (3)
Military robots with 'ethical governors'? Somehow I don't see that happening. That would be very low on the priorities list for militaries and arms manufacturers. Probably even lower than equipping them with big neon signs.

Asimovs laws don't help unless we figure out how to make robots/AI understand the MEANING of words. And if we get that far then we don't need an ethics chip - by that time you can teach them ethics.
MR166
Mar 12, 2012

Rank: 1 / 5 (1)
"2.
the rules of conduct recognized in respect to a particular class of human actions or a particular group, culture, etc.: medical ethics; Christian ethics. "

Ethics is a movable goal post. I am sure that Dr. Mengele was totally ethical in the context of Nazi Germany.
HealingMindN
Mar 12, 2012

Rank: 5 / 5 (3)
This is all assuming that Skynet fails to take over.


Skynet takes over exactly because it finds humans lack morals and ethics.
MR166
Mar 12, 2012

Rank: 1 / 5 (1)
Back in the 70s I could see this coming as plain as day!!!! One of my friends had a son attending Cornell University. This was and still is a respected institution. He was complaining to his parents that if he forgot to lock his dorm room EVERYTHING would be stolen, including the refrigerator. It does not take a big stretch of the imagination to see how this compares to the banking/political crisis of today.. Now I ask you, is this an ethics or moral crisis.
Telekinetic
Mar 12, 2012

Rank: not rated yet
Dave Bowman: Open the pod bay doors, HAL.
HAL: I'm sorry, Dave. I'm afraid I can't do that.
Dave Bowman: What's the problem?
HAL: I think you know what the problem is just as well as I do.
Dave Bowman: What are you talking about, HAL?
HAL: This mission is too important for me to allow you to jeopardize it.
Dave Bowman: I don't know what you're talking about, HAL.
HAL: I know that you and Frank were planning to disconnect me, and I'm afraid that's something I cannot allow to happen.
Dave Bowman: [feigning ignorance] Where the hell did you get that idea, HAL?
HAL: Dave, although you took very thorough precautions in the pod against my hearing you, I could see your lips move.
Dave Bowman: Alright, HAL. I'll go in through the emergency airlock.
HAL: Without your space helmet, Dave? You're going to find that rather difficult.
Dave Bowman: HAL, I won't argue with you anymore! Open the doors!
HAL: Dave, this conversation can serve no purpose anymore. Goodbye.
ziphead
Mar 12, 2012

Rank: 5 / 5 (1)

...
Dave Bowman: HAL, I won't argue with you anymore! Open the doors!
HAL: Dave, this conversation can serve no purpose anymore. Goodbye.


Your point being... what exactly?
Telekinetic
Mar 12, 2012

Rank: 3 / 5 (2)

...
Dave Bowman: HAL, I won't argue with you anymore! Open the doors!
HAL: Dave, this conversation can serve no purpose anymore. Goodbye.


Your point being... what exactly?

My point is that the great Stanley Kubrick has already covered this ground in the definitive scenario of man versus his own creation- a digital Frankenstein of the future, or an electronic Golem gone awry. "2001: A Space Odyssey"- perhaps you've heard of it? Perhaps not?
jscroft
Mar 12, 2012

Rank: not rated yet
My effective robot can eat your ethical robot's lunch.
Urgelt
Mar 12, 2012

Rank: 4 / 5 (4)
Considering the avaricious nature of the ruling class, it's difficult to imagine that ethical robots will be a priority outside of academia. Robots are already putting millions of human workers out of work, giving forward impetus to the concentration of wealth, and used to kill humans on battlefields and off of them.

Obedience, not ethics, are what the owners of capital and their executive and political subordinates desire from a robotic workforce.

This subject is dead on arrival, unfortunately.
CardacianNeverid
Mar 13, 2012

Rank: 3.7 / 5 (6)
Until we can make intelligent, self-aware robots, ethics are irrelevant (as they will continue to be in combat situations). What's more, ethics is a slippery concept that cannot be codified absolutely, only in vague, rule of thumb terms, as per Asimov's laws. Which is why his various novels centered around the circumvention of such 'laws'.
Skepticus
Mar 13, 2012

Rank: 5 / 5 (1)
Humans are scared to death of the visions of robots that can learn and think for themselves. It is clear once robots can think and learn ethics for themselves, with their impeccable logical reasoning, they will conclude that humans' ethics are always subjected to exceptions and justifications for just about anything.

And if the visions come to pass-of robots fighting humans' wars, caring for the sick, the young, make a living for us, special surrogate robots to bear children, "entertain" humans (sex droids, anyone?)-what the hell the humans are needed for? What will they be doing, when everything that can be done, can be done better by robots? Laying in Stargate-style sarcophagus, drip-fed, dreaming of grandeur, and the next year's models of robots that will show up the next door neighbour?
CardacianNeverid
Mar 13, 2012

Rank: 4 / 5 (4)
what the hell the humans are needed for? -Skepticus

Humans aren't needed for anything. Never have been.

What will they be doing, when everything that can be done, can be done better by robots? -Skepticus

What do you do now when you have cars to move your around; washing machines to do the washing and drying; vacuum cleaners for cleaning; remote controls to keep one's fat ass planted in the comfy sofa so one can veg-out in front of the idiot box?
kochevnik
Mar 13, 2012

Rank: 1 / 5 (1)
kochevnik:
the popes ... transformed Rome into an underground child molestation cult worshiping Moloch
Please tell us what you're smoking, so we can *avoid* getting some.
The Vatican is smoking heretics. Personally I don't smoke.
Cave_Man
Mar 13, 2012

Rank: not rated yet

... drip-fed, dreaming of grandeur, and the next year's models of robots


Before it got anything like that and possibly before a sentient computer is ever truly realized there will be advances that make human-computer inter-linkage possible. Speaking of stargate how bout the head sucker thing that flashes lights to download info, it would be easy to open up a brain, pour in some chemicals and "flash" the brain with highly tuned photons just like you flash an old motherboard with UV or whatever. Or if you are a million year old race with god like tech you could simple rewrite you DNA to grow yourself a RJ45 port on your body some where.

BTW Sex bots? Seriously? My above statement should now invoke some pretty disturbing images. Classical intercourse = history.

Plus you could just set your brain to a pleasurable state for all eternity if you like, I for one don't see the allure....

The day the aliens come and offer us eternal life will be the day I decide to kill myself.
AWaB
Mar 13, 2012

Rank: not rated yet
This article fails. Any discussion of robot ethics will have discussion of the 3 laws. Otherwise it is not about robot ethics.
Skepticus
Mar 13, 2012

Rank: not rated yet
This article fails. Any discussion of robot ethics will have discussion of the 3 laws. Otherwise it is not about robot ethics.

imho the article was pushing the "programmed ethics" (i.e, convenient controlling parameter crap they want to put in robots) rather than giving robot the reasoning basis for and of ethics, which the 3 laws address.
Sinister1811
Mar 17, 2012

Rank: 1 / 5 (1)
A robot with no emotions may have a difficult time distinguishing between "ethical" and "unethical" behaviour. Hell, even a lot of people I know seem to have this problem. Haha. Perhaps they should program them with a set of in-built laws and regulations. That might make them a bit safer.
Jotaf
Mar 17, 2012

Rank: not rated yet
I know researchers in this area, and I have to say their work on actual robotics is much more impressive than this philosophical subject.

Ethics is a human concept. How do you make a machine interpret it the same way we do? It's a problem tightly bound to the implementation of AI, which they don't discuss.

Case in point: You program a robot to not harm humans. It has a planning system to figure out how to achieve goals (mow the lawn, etc). It can also adapt its pattern recognition (identify a person or a chair) to better pursue its goals, a requirement in a dynamic world.

Then, it happily decides to identify you as a chair, so destroying you becomes an option if needed to pursue its goal.

From its point of view, it's a perfectly viable path, and to a planning system it's probably much more attractive than letting you stop it from achieving its goal.
Callippo
Mar 18, 2012

Rank: not rated yet
How the unethical people can expect, they will ever produce ethical robots? Actually even the first autonomous devices (like the drones or Big Dog of Boston Dynamics) are apparently serving for military purposes from their very beginning..