In this paper we show how a social robot's actions can face explanatory demands on three fronts: how the system came to act on its decision; what goals, tasks, or purposes its design intended those actions to pursue; and what norms or social constraints the system recognizes in the course of its action. We argue, as a result, that explanations for social robots must accurately represent the system's operation along causal, purposive, and justificatory lines.
@article{arnold2021thri,
  title     = {Explaining In Time: Meeting Interactive Standards of Explanation for Robotic Systems},
  author    = {Thomas Arnold and Daniel Kasenberg and Matthias Scheutz},
  journal   = {ACM Trans. Hum.-Robot Interact.},
  publisher = {ACM},
  year      = {2021},
  url       = {https://hrilab.tufts.edu/publications/arnold2021thri.pdf}
}