Abstract
Social robots use gestures to express internal and affective states, but their interactive capabilities are limited by their reliance on preprogrammed or hand-animated behaviors, which can be repetitive and predictable. We propose a method for automatically synthesizing affective robot movements from manually generated examples. Our approach is based on techniques adapted from deep learning, specifically generative adversarial networks (GANs).