
I Can See It in Your Eyes

Over the past years, extensive research has been dedicated to developing robust platforms and data-driven dialog models to support long-term human-robot interactions. However, little is known about how people's perception of robots and engagement with them develop over time, and how these can be accurately assessed through implicit and continuous measurement techniques. In this paper, we explore this by involving participants in three interaction sessions with multiple days of zero exposure in between. Each session consists of a joint task with a robot as well as two short social chats with it before and after the task. We measure participants' gaze patterns with a wearable eye-tracker and gauge their perception of the robot and their engagement with it and the joint task using questionnaires.

The results reveal that gaze aversion in a social chat is an indicator of a robot's uncanniness, and that the more people gaze at the robot in a joint task, the worse they perform. In contrast with most HRI literature, our results show that gaze toward an object of shared attention, rather than gaze toward a robotic partner, is the most meaningful predictor of engagement in a joint task. Furthermore, the analyses of gaze patterns in repeated interactions reveal that people's mutual gaze in a social chat develops congruently with their perception of the robot over time. These are key findings for the HRI community, as they imply that gaze behavior can be used as an implicit measure of people's perception of robots in a social chat, and of their engagement and task performance in a joint task.

Metadata

Author details: Giulia Perugia (ORCID), Maike Paetzel-Prüsmann (ORCID), Madelene Alanenpää, Ginevra Castellano (ORCID)
DOI: https://doi.org/10.3389/frobt.2021.645956
ISSN: 2296-9144
PubMed ID: https://pubmed.ncbi.nlm.nih.gov/33898532
Title of parent work (English): Frontiers in Robotics and AI
Subtitle (English): Gaze as an implicit cue of uncanniness and task performance in repeated interactions with robots
Publisher: Frontiers Media
Place of publishing: Lausanne
Publication type: Article
Language: English
Date of first publication: 2021/04/07
Publication year: 2021
Release date: 2023/12/11
Tags: engagement; long-term interaction; mutual gaze; perception of robots; uncanny valley
Volume: 8
Article number: 645956
Number of pages: 18
Funding institution: Swedish Foundation for Strategic Research under the COIN project [RIT15-0133]
Organizational units: Humanwissenschaftliche Fakultät / Strukturbereich Kognitionswissenschaften / Department Linguistik
DDC classification: 0 Computer science, information & general works / 00 Computer science, knowledge & systems / 004 Data processing; computer science
Peer review: Refereed
Publishing method: Open Access / Gold Open Access
Listed in DOAJ
License: CC BY 4.0 International (Attribution)