Researchers at Cornell University in New York have developed a robot prototype that is capable of expressing its emotions. When it "feels" angry, its eyes catch fire and its skin turns spiky.
The robot expresses these emotional changes through its outer surface, which is made of texture units. Underneath the skin are inflatable fluidic actuators programmed to respond to a particular stimulus or mood, including happy, sleepy, and sad. The prototype has two shapes -- goosebumps and spikes -- whose actuation units are unified into texture modules. These come with fluidic chambers linking lumps of the same type.
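The mapping described above -- a mood driving the inflation of one texture module or the other -- can be sketched in code. This is a minimal illustration, not the Cornell team's actual control software; every name, mood label, and pressure value here is a hypothetical stand-in.

```python
from dataclasses import dataclass


@dataclass
class TextureCommand:
    """A command for one texture module, mirroring the two shape types
    described in the article (goosebumps and spikes)."""
    shape: str       # which texture module to actuate
    pressure: float  # hypothetical normalized inflation: 0.0 (flat) to 1.0 (fully raised)


# Hypothetical mood-to-texture mapping; the real prototype's mappings
# and actuation levels are not specified in the article.
MOOD_TO_TEXTURE = {
    "angry":  TextureCommand(shape="spikes", pressure=1.0),
    "happy":  TextureCommand(shape="goosebumps", pressure=0.6),
    "sleepy": TextureCommand(shape="goosebumps", pressure=0.2),
    "sad":    TextureCommand(shape="spikes", pressure=0.1),
}


def express(mood: str) -> TextureCommand:
    """Return the actuation command for a mood, defaulting to a flat skin."""
    return MOOD_TO_TEXTURE.get(mood, TextureCommand(shape="goosebumps", pressure=0.0))


print(express("angry"))  # an "angry" mood fully inflates the spike module
```

The point of the sketch is only the structure: each emotional state selects a texture module and an inflation level, so the skin itself becomes the expressive channel.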
According to Guy Hoffman, an assistant professor of mechanical and aerospace engineering at Cornell University, the team drew heavily on the animal world in designing the robot and its nonverbal cues, adding that robots should not be mere "copies of humans."
"I've always felt that robots shouldn't just be modeled after humans or be copies of humans. We have a lot of interesting relationships with other species. Robots could be thought of as one of those 'other species,' not trying to copy what we do but interacting with us with their own language, tapping into our own instincts."
#Robot prototype will let you feel how it’s ‘feeling’ @CornellMAE @CornellEng @guyhoffman https://t.co/ThsrDE6jbV pic.twitter.com/kZ8euPKL6X
— Cornell Chronicle (@CU_Chronicle) July 9, 2018
Hoffman said one of the challenges they are facing is the size and noise level of the design, considering "that a lot of shape-changing technologies are quite loud." He has no specific application for the robot in mind at this point; nonetheless, he thinks this gives the industry "another way to think about how robots could be designed."
"At the moment, most social robots express [their] internal state only by using facial expressions and gestures. We believe that the integration of a texture-changing skin, combining both haptic [feel] and visual modalities, can thus significantly enhance the expressive spectrum of robots for social interaction," the researchers said in the paper.
You can read the full text of the research titled "Soft Skin Texture Modulation for Social Robotics" here.