The Texture Robot Project seeks to create a new expressive nonverbal channel for social robots in the form of texture-changing skin.
Collaborators: Yuhan Hu, Zhengnan Zhao, Abheek Vimal, and Guy Hoffman (Cornell).
The vast majority of social robots use body movement and facial expressions to express their internal states. These nonverbal modalities are inspired by human and animal modes of nonverbal expression, but many biological systems also change their skin texture to express emotional states. Despite being widespread and easily readable, this behavior has gone largely unused in social robotics.
The Texture Robot project implements texture-changing skin for social robots using soft fluidic actuators. Our work covers the fabrication and actuation of the skin, as well as a framework for mapping emotional states to texture changes. We believe that integrating a texture-changing skin, which combines haptic and visual effects, can significantly broaden the expressive range of robots in social interaction.
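As a rough illustration of what such a mapping could look like, the sketch below translates a valence-arousal emotional state into parameters for fluidically actuated texture units. The parameter names, pressure ranges, and the mapping itself are illustrative assumptions, not the project's actual framework.

    # Illustrative sketch (assumed, not the project's framework): map a
    # valence-arousal emotional state to parameters of a texture-changing
    # skin driven by soft fluidic (pneumatic) actuators.

    from dataclasses import dataclass


    @dataclass
    class EmotionalState:
        valence: float  # -1.0 (negative) .. 1.0 (positive)
        arousal: float  #  0.0 (calm)     .. 1.0 (excited)


    @dataclass
    class TextureCommand:
        amplitude_kpa: float  # peak inflation pressure of the texture units (assumed range)
        frequency_hz: float   # rate of inflation/deflation cycles
        spikiness: float      # 0 = smooth, rounded bumps; 1 = sharp, spike-like shapes


    def map_emotion_to_texture(state: EmotionalState) -> TextureCommand:
        """Hypothetical mapping: arousal drives the intensity and speed of the
        texture change, while negative valence biases toward sharper shapes."""
        amplitude_kpa = 5.0 + 25.0 * state.arousal  # 5..30 kPa (assumed actuator range)
        frequency_hz = 0.2 + 1.8 * state.arousal    # slow "breathing" .. rapid pulsing
        spikiness = max(0.0, -state.valence)        # only negative valence adds spikes
        return TextureCommand(amplitude_kpa, frequency_hz, spikiness)


    if __name__ == "__main__":
        # Example: a calm, content state vs. an agitated, fearful state.
        print(map_emotion_to_texture(EmotionalState(valence=0.8, arousal=0.2)))
        print(map_emotion_to_texture(EmotionalState(valence=-0.7, arousal=0.9)))

In a full system, the resulting command would be handed to a low-level controller that regulates the pneumatic pressure of each texture unit.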