Investigating Human Perceptions of Robot Capabilities in Remote Human-Robot Team Tasks based on First-Person Robot Video Feeds

2014

Collection: Proceedings of IROS

Cody Canning and Thomas J. Donahue and Matthias Scheutz

It is well-known that a robot's appearance and its observable behavior can affect a human interactant's perceptions of the robot's capabilities and propensities in settings where humans and robots are co-located; for remote interactions, the specific effects are less clear. Here, we use a remote interaction setting to investigate possible effects of simulated versus real first-person robot video feeds. The first experiment uses subject-level comparisons of the two video conditions in a multi-robot setting, while the second and third experiments focus on a single robot and a single video condition, using a larger population (via Amazon Mechanical Turk) to study between-subjects effects. The latter experiments also probe the effects of robot appearance, video feed type, and the stake humans have in the task. We observe a complex interplay between interaction, robot appearance, and video feed type as they affect the perceived collaboration, utility, competence, and warmth of the robot.

@inproceedings{iros14rp,
  title={Investigating Human Perceptions of Robot Capabilities in Remote Human-Robot Team Tasks based on First-Person Robot Video Feeds},
  author={Cody Canning and Thomas J. Donahue and Matthias Scheutz},
  year={2014},
  booktitle={Proceedings of IROS},
  url={https://hrilab.tufts.edu/publications/iros14rp.pdf},
  doi={10.1109/IROS.2014.6943178}
}