
dc.contributor.author: Rossi, Alessandra
dc.contributor.author: Scheunemann, Marcus
dc.contributor.author: L’Arco, Gianluca
dc.contributor.author: Rossi, Silvia
dc.contributor.editor: Li, Haizhou
dc.contributor.editor: Ge, Shuzhi Sam
dc.contributor.editor: Wu, Yan
dc.contributor.editor: Wykowska, Agnieszka
dc.contributor.editor: He, Hongsheng
dc.contributor.editor: Liu, Xiaorui
dc.contributor.editor: Li, Dongyu
dc.contributor.editor: Perez-Osorio, Jairo
dc.date.accessioned: 2021-12-13T17:15:01Z
dc.date.available: 2021-12-13T17:15:01Z
dc.date.issued: 2021-11-02
dc.identifier.citation: Rossi, A, Scheunemann, M, L’Arco, G & Rossi, S 2021, Evaluation of a Humanoid Robot’s Emotional Gestures for Transparent Interaction. In H Li, S S Ge, Y Wu, A Wykowska, H He, X Liu, D Li & J Perez-Osorio (eds), Social Robotics: 13th International Conference, ICSR 2021, Singapore, Singapore, November 10–13, 2021, Proceedings. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 13086 LNAI, Springer, pp. 397-407, 13th International Conference on Social Robotics, Singapore, 10/11/21. https://doi.org/10.1007/978-3-030-90525-5_34
dc.identifier.citation: conference
dc.identifier.isbn: 9783030905248
dc.identifier.isbn: 9783030905255
dc.identifier.issn: 0302-9743
dc.identifier.other: PURE: 26366243
dc.identifier.other: PURE UUID: bd581f3f-2801-46d3-bbf6-357aaa9a865e
dc.identifier.other: Scopus: 85119859504
dc.identifier.other: ORCID: /0000-0002-0815-7024/work/104970618
dc.identifier.uri: http://hdl.handle.net/2299/25258
dc.description: © Springer Nature Switzerland AG 2021. This is the accepted manuscript version of an article which has been published in final form at https://doi.org/10.1007/978-3-030-90525-5_34
dc.description.abstract: Effective and successful interactions between robots and people are possible only when both are able to infer the other’s intentions, beliefs, and goals. In particular, robots’ mental models need to be transparent in order to be accepted by people and to facilitate collaboration between the involved parties. In this study, we investigate how to create legible emotional robot behaviours that make the robot’s decision-making process more transparent to people. In particular, we used emotions to express the robot’s internal status and feedback during an interactive learning process. We involved 28 participants in an online study in which they rated the robot’s behaviours, designed in terms of colours, icons, movements and gestures, according to the perceived intention and emotions.
dc.format.extent: 11
dc.language.iso: eng
dc.publisher: Springer
dc.relation.ispartof: Social Robotics
dc.relation.ispartofseries: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
dc.subject: Affective robotics
dc.subject: Human-robot interaction
dc.subject: Social robotics
dc.subject: Transparency
dc.subject: Theoretical Computer Science
dc.subject: Computer Science (all)
dc.title: Evaluation of a Humanoid Robot’s Emotional Gestures for Transparent Interaction
dc.contributor.institution: School of Physics, Engineering & Computer Science
dc.contributor.institution: Department of Computer Science
dc.date.embargoedUntil: 2022-11-02
dc.identifier.url: http://www.scopus.com/inward/record.url?scp=85119859504&partnerID=8YFLogxK
dc.relation.school: School of Physics, Engineering & Computer Science
dcterms.dateAccepted: 2021-11-02
rioxxterms.version: AM
rioxxterms.versionofrecord: https://doi.org/10.1007/978-3-030-90525-5_34
rioxxterms.type: Other
herts.preservation.rarelyaccessed: true

