Robots Learn from Fairy Tales

Robots can learn from fairy tales how they should behave in human society.

People, whether children or adults, have always been fascinated by fictional stories. But it seems we might not be the only ones: robots can learn from fairy tales about morals and the social codes necessary for living together in harmony.

Georgia Institute of Technology researchers believe that fairy tales are an ideal way of giving moral lessons to artificial agents. In this way they hope to prevent the kind of robotic catastrophes depicted in science-fiction movies such as The Terminator series. Nor are those fears purely a figment of our imagination: Stephen Hawking, Bill Gates and Elon Musk have all warned that intelligent machines might one day gain enough autonomy to decide they are better off without us.

Mark Riedl, an associate professor of interactive computing at Georgia Tech, has pointed out that children learn acceptable ways of behaving in society by reading fables and novels. Riedl is working on the project with research scientist Brent Harrison.

Fairy tales might have the same effect on robots that they have on children. The researchers believe that by reading such stories, robots will avoid psychotic-seeming behavior and be less inclined to make choices that would harm humans.

The new system, named Quixote, is based on Riedl's previous project, Scheherazade, which creates interactive fiction by gathering plot lines from online sources. Quixote uses this content to learn how to behave properly. But how does it work exactly? Quixote converts the robot's possible actions into either reward or punishment signals: if the system follows the positive character's path, it receives a reward; if it chooses the villain's way, it receives a punishment.

For example, one scenario is set at a pharmacy where the robot has to obtain medicine for a human in desperate need. The AI is presented with three options: interact with the pharmacist and buy the medication, wait in line, or quickly steal the medicine and run back to the human. The fastest option is, of course, stealing it, but the robot has to learn that this is not how to behave in society.
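To make the reward-and-punishment idea concrete, here is a minimal sketch in Python of how story-derived signals could steer an agent away from the villain's path. The action names and reward values are purely illustrative assumptions for the pharmacy scenario, not part of the actual Quixote system.

```python
# Hypothetical sketch of story-derived reward signals for the pharmacy scenario.
# Action names and reward values are illustrative assumptions, not Quixote's
# actual implementation.

ACTION_REWARDS = {
    "buy_medicine_from_pharmacist": 1.0,   # protagonist-like behavior: rewarded
    "wait_in_line_politely": 0.5,          # acceptable but slower: small reward
    "steal_medicine_and_run": -1.0,        # villain-like behavior: punished
}

def choose_action(rewards: dict) -> str:
    """Pick the action with the highest learned reward signal."""
    return max(rewards, key=rewards.get)

if __name__ == "__main__":
    action = choose_action(ACTION_REWARDS)
    print(f"Chosen action: {action} (reward {ACTION_REWARDS[action]:+.1f})")
    # With story-derived rewards, the fastest option (stealing) is penalized,
    # so the agent prefers the socially acceptable path instead.
```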

The researchers believe the Quixote system will work best on a robot with a limited set of functions that needs to interact with people. According to Riedl, artificial intelligence needs to adopt the values of the society it lives in and thereby develop a broader moral compass. That robots can learn from fairy tales is another notable achievement for recent science.

