
dc.contributor.author: Beck, A.
dc.contributor.author: Cañamero, Lola
dc.contributor.author: Bard, K.A.
dc.identifier.citation: Beck, A., Cañamero, L. & Bard, K.A. 2010, 'Towards an Affect Space for robots to display emotional body language', in Procs of the 19th IEEE Int Symposium on Robot and Human Interactive Communication, RO-MAN, IEEE, pp. 464-469, Viareggio, Italy, 12/09/10.
dc.identifier.other: PURE: 429718
dc.identifier.other: PURE UUID: e5e15e02-d200-47ee-9843-37484624ce75
dc.identifier.other: dspace: 2299/5166
dc.identifier.other: Scopus: 78649809554
dc.description: Original article can be found at: "This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder." "Copyright IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE."
dc.description.abstract: In order for robots to be socially accepted and to generate empathy, it is necessary that they display rich emotions. For robots such as Nao, body language is the best medium available, given their inability to convey facial expressions. Displaying emotional body language that can be interpreted whilst interacting with the robot should significantly improve its sociability. This research investigates the creation of an Affect Space for the generation of emotional body language to be displayed by robots. To create an Affect Space for body language, one has to establish the contribution of the different positions of the joints to the emotional expression. The experiment reported in this paper investigated the effect of varying a robot's head position on the interpretation, Valence, Arousal and Stance of emotional key poses. It was found that participants were better than chance level in interpreting the key poses. This finding confirms that body language is an appropriate medium for robots to express emotions. Moreover, the results of this study support the conclusion that Head Position is an important body posture variable. Head Position up increased correct identification for some emotion displays (pride, happiness, and excitement), whereas Head Position down increased correct identification for other displays (anger, sadness). Fear, however, was identified well regardless of Head Position. Head up was always evaluated as more highly Aroused than Head straight or down. Evaluations of Valence (degree of negativity to positivity) and Stance (degree to which the robot was aversive to approaching), however, depended on both Head Position and the emotion displayed. The effects of varying this single body posture variable were complex. [en]
dc.relation.ispartof: Procs of the 19th IEEE Int Symposium on Robot and Human Interactive Communication, RO-MAN
dc.title: Towards an Affect Space for robots to display emotional body language [en]
dc.contributor.institution: Science & Technology Research Institute
dc.contributor.institution: School of Computer Science
dc.contributor.institution: Centre for Computer Science and Informatics Research
dc.relation.school: School of Computer Science

Files in this item


There are no files associated with this item.
