dc.contributor.author | Rossi, Alessandra | |
dc.contributor.author | Scheunemann, Marcus | |
dc.contributor.author | L’Arco, Gianluca | |
dc.contributor.author | Rossi, Silvia | |
dc.contributor.editor | Li, Haizhou | |
dc.contributor.editor | Ge, Shuzhi Sam | |
dc.contributor.editor | Wu, Yan | |
dc.contributor.editor | Wykowska, Agnieszka | |
dc.contributor.editor | He, Hongsheng | |
dc.contributor.editor | Liu, Xiaorui | |
dc.contributor.editor | Li, Dongyu | |
dc.contributor.editor | Perez-Osorio, Jairo | |
dc.date.accessioned | 2021-12-13T17:15:01Z | |
dc.date.available | 2021-12-13T17:15:01Z | |
dc.date.issued | 2021-11-02 | |
dc.identifier.citation | Rossi, A, Scheunemann, M, L’Arco, G & Rossi, S 2021, Evaluation of a Humanoid Robot’s Emotional Gestures for Transparent Interaction. in H Li, S S Ge, Y Wu, A Wykowska, H He, X Liu, D Li & J Perez-Osorio (eds), Social Robotics: 13th International Conference, ICSR 2021, Singapore, Singapore, November 10–13, 2021, Proceedings. vol. 13086, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 13086 LNAI, Springer Nature, pp. 397-407, 13th International Conference on Social Robotics, Singapore, 10/11/21. https://doi.org/10.1007/978-3-030-90525-5_34 | |
dc.identifier.citation | conference | |
dc.identifier.isbn | 9783030905248 | |
dc.identifier.isbn | 9783030905255 | |
dc.identifier.issn | 0302-9743 | |
dc.identifier.other | ORCID: /0000-0002-0815-7024/work/104970618 | |
dc.identifier.uri | http://hdl.handle.net/2299/25258 | |
dc.description | © Springer Nature Switzerland AG 2021. This is the accepted manuscript version of an article which has been published in final form at https://doi.org/10.1007/978-3-030-90525-5_34 | |
dc.description.abstract | Effective and successful interactions between robots and people are possible only when both are able to infer the other’s intentions, beliefs, and goals. In particular, robots’ mental models need to be transparent in order to be accepted by people and to facilitate collaboration between the involved parties. In this study, we investigate how to create legible emotional behaviours for robots that make their decision-making process more transparent to people. Specifically, we used emotions to express the robot’s internal status and feedback during an interactive learning process. We involved 28 participants in an online study where they rated the robot’s behaviours, designed in terms of colours, icons, movements, and gestures, according to the perceived intention and emotions. | en
dc.format.extent | 11 | |
dc.format.extent | 745358 | |
dc.language.iso | eng | |
dc.publisher | Springer Nature | |
dc.relation.ispartof | Social Robotics | |
dc.relation.ispartofseries | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | |
dc.subject | Affective robotics | |
dc.subject | Human-robot interaction | |
dc.subject | Social robotics | |
dc.subject | Transparency | |
dc.subject | Theoretical Computer Science | |
dc.subject | Computer Science(all) | |
dc.title | Evaluation of a Humanoid Robot’s Emotional Gestures for Transparent Interaction | en |
dc.contributor.institution | School of Physics, Engineering & Computer Science | |
dc.contributor.institution | Department of Computer Science | |
dc.date.embargoedUntil | 2022-11-02 | |
dc.identifier.url | http://www.scopus.com/inward/record.url?scp=85119859504&partnerID=8YFLogxK | |
rioxxterms.versionofrecord | 10.1007/978-3-030-90525-5_34 | |
rioxxterms.type | Other | |
herts.preservation.rarelyaccessed | true | |