Emotion Expression in a Socially Assistive Robot for Persons with Parkinson's disease

2020

Conference: 13th PErvasive Technologies Related to Assistive Environments Conference (PETRA ’20)

Andrew P. Valenti and Avram Bock and Meia Chita-Tegmark and Michael Gold and Matthias Scheutz

Emotions are crucial for human social interaction, and thus people communicate emotions through a variety of modalities: kinesthetic (facial expressions, body posture, and gestures), auditory (the acoustic features of speech), and semantic (the content of what they say). Sometimes, however, communication channels for certain modalities are unavailable (e.g., when texting), and sometimes they are compromised by a disorder such as Parkinson's disease (PD), which may affect facial, gestural, and speech expressions of emotion. To address this, we developed a prototype for an emoting robot that can detect emotions in one modality, specifically the content of speech, and then express them in another, specifically through gestures. This paper is part of a larger project to develop a prototype socially assistive robot for persons with PD. Its goal is to present the technical implementation of one robot capability: emotion expression.

@inproceedings{Valenti2020a,
  title={Emotion Expression in a Socially Assistive Robot for Persons with Parkinson's disease},
  author={Andrew P. Valenti and Avram Bock and Meia Chita-Tegmark and Michael Gold and Matthias Scheutz},
  year={2020},
  month={July},
  booktitle={13th PErvasive Technologies Related to Assistive Environments Conference (PETRA '20)},
  url={https://hrilab.tufts.edu/publications/Valenti2020a.pdf}
}