An understanding of canine emotional expressions and human responses to them is a promising avenue to pursue in developing the best social robots. Social robots are machines that interact and communicate with humans by following the social behaviors and rules associated with their roles. People want more from them than the mere performance of tasks that make life easier; they want their interactions with these robots to feel as natural as possible, which means minimizing the disturbing feeling many people experience around robots. For that to happen, social robots must act in a socially appropriate manner, which includes exhibiting the right emotions for the situation.
Much of the work on developing emotionally expressive robots has focused on human facial expressions, with some emphasis on gestures and tone of voice. These subtle forms of communication are difficult to create in an artificial system, and researchers are exploring other options. One promising line of study is to consider interactions between humans and robots as an interaction between two different species that must communicate, and to use a non-human species as a model for the robots.
Dogs are a natural choice because of the ability of humans, even those without much experience, to identify the emotional content of dogs' behavior. Children can correctly identify the emotional content of dogs' barks, people tend to ascribe emotions to their dogs, and the two species cooperate and communicate with remarkable success. People's ability to understand dogs is likely a result of our long-standing relationship and shared evolutionary history.
In a recent paper (“Humans attribute emotions to a robot that shows simple behavioral patterns borrowed from dog behavior”), a group of canine ethologists show that people are capable of understanding the emotions of robots when their actions are based on the behavior of dogs. Using a robot that was not shaped like a dog and could not alter its basic posture, this experiment asked the question, “Can even simple expressions of emotional behavior elicit an acceptable level of emotional attribution by people to the robot?” If so, such behaviors in a robot could lessen the need to develop robots capable of communicating complex emotions through behavior based on human facial expressions.
Photo from “Humans attribute emotions to a robot that shows simple behavioural patterns borrowed from dog behavior” (Gácsi et al. 2016)
The subjects in the experiment watched videos of a trained dog and of the robot and were asked to attribute emotions to them. The dog was a Belgian Malinois, and the robot was a touchscreen mounted on a wheeled base. The body of the robot had two arm-like limbs, one of which could move in a variety of ways and one of which was fixed. The touchscreen, or head-like part of the robot, could not move independently and had no face. The emotions expressed by the dog and by the robot were fear, joy, anger, sadness and neutral (no emotion). Both the dog and the robot made sounds, considered vocalizations, to accompany other aspects of their behavior.
The dog expressed joy by approaching, wagging his tail and sidling, while the robot represented joy by approaching, lifting one arm, moving its fingers and spinning. Anger in the dog involved approaching, wagging the tail, moving the head up and down dynamically, barking and showing his teeth. The angry robot approached, raised its arm high and swung it several times. Sadness in the dog meant sitting, then lying down with his head down and remaining motionless. The robot showed sadness by backing away, turning away, lowering its arm and remaining motionless.
People attributed emotions to the dog more often than to the robot, but the type of emotion was correctly identified with similar levels of success for both. The amount of experience people had with dogs was not a factor in their ability to identify emotions in either the dog or the robot.
The goal of this study was to investigate whether simple canine behaviors can facilitate the understanding of robots' emotional expressions. The robot was not designed to resemble a dog, and indeed a strength of this approach is that robots do not have to match their animal models: they can be built with their function in mind, without the extra expense and constraints of creating a specific form in order to maximize emotional expression.
General behaviors such as approaching, backing away, turning to the side, being in motion or staying still can be performed by a robot of any shape. These behaviors, though based on canine models, are hardly specific to dogs; they apply across a large range of mammals. Creating the most emotionally expressive and natural-seeming social robots may therefore require developers to consider a repertoire of universal actions that are easily understood by humans as well as by other mammals.
Because human facial expressions are often considered too complex or confusing to mimic in social robots, the use of simple behaviors that convey emotions may provide a better way to make robots that are capable of emotional expression. Future work will explore ways that dogs (and perhaps other mammals) can serve as models for combining functionality with sociality. This approach will allow researchers to develop better social robots that people consider more like companions and with which they are more comfortable.