Show simple item record

dc.contributor.author	Beck, Ariel
dc.contributor.author	Stevens, Brett
dc.contributor.author	Bard, Kim A.
dc.contributor.author	Cañamero, Lola
dc.date.accessioned	2013-02-04T14:30:23Z
dc.date.available	2013-02-04T14:30:23Z
dc.date.issued	2012-03
dc.identifier.citation	Beck, A., Stevens, B., Bard, K. A. & Cañamero, L. 2012, 'Emotional Body Language Displayed by Artificial Agents', ACM Transactions on Interactive Intelligent Systems, vol. 2, no. 1, 2. https://doi.org/10.1145/2133366.2133368
dc.identifier.issn	2160-6455
dc.identifier.other	PURE: 696542
dc.identifier.other	PURE UUID: 38867b1f-fdf2-4468-8c94-2ae574422674
dc.identifier.other	Scopus: 84983513884
dc.identifier.uri	http://hdl.handle.net/2299/9823
dc.description.abstract	Complex and natural social interaction between artificial agents (computer-generated or robotic) and humans necessitates the display of rich emotions in order to be believable, socially relevant, and accepted, and to generate the natural emotional responses that humans show in the context of social interaction, such as engagement or empathy. Whereas some robots use faces to display (simplified) emotional expressions, for other robots such as Nao, body language is the best medium available given their inability to convey facial expressions. Displaying emotional body language that can be interpreted whilst interacting with the robot should significantly improve naturalness. This research investigates the creation of an affect space for the generation of emotional body language to be displayed by humanoid robots. To do so, three experiments investigating how emotional body language displayed by agents is interpreted were conducted. The first experiment compared the interpretation of emotional body language displayed by humans and agents. The results showed that emotional body language displayed by an agent or a human is interpreted in a similar way in terms of recognition. Following these results, emotional key poses were extracted from an actor's performances and implemented in a Nao robot. The interpretation of these key poses was validated in a second study where it was found that participants were better than chance at interpreting the key poses displayed. Finally, an affect space was generated by blending key poses and validated in a third study. Overall, these experiments confirmed that body language is an appropriate medium for robots to display emotions and suggest that an affect space for body expressions can be used to improve the expressiveness of humanoid robots.	en
dc.format.extent	29
dc.language.iso	eng
dc.relation.ispartof	ACM Transactions on Interactive Intelligent Systems
dc.subject	Human-Robot Interaction
dc.subject	Nonverbal Interaction
dc.subject	Humanoid Robots
dc.subject	Emotion Modeling
dc.subject	Affective Robotics
dc.subject	Affective Computing
dc.title	Emotional Body Language Displayed by Artificial Agents	en
dc.contributor.institution	Science & Technology Research Institute
dc.contributor.institution	School of Computer Science
dc.contributor.institution	Centre for Computer Science and Informatics Research
dc.contributor.institution	Adaptive Systems
dc.description.status	Peer reviewed
rioxxterms.version	VoR
rioxxterms.versionofrecord	https://doi.org/10.1145/2133366.2133368
rioxxterms.type	Journal Article/Review
herts.preservation.rarelyaccessed	true
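
The abstract above describes generating an affect space by blending emotional key poses implemented on a Nao robot. As a rough illustration of what such pose blending can look like in code, the Python sketch below linearly interpolates joint angles between two key poses. The joint names, angle values, and the blend function are assumptions chosen for illustration only; they are not taken from the paper or from the Nao/NAOqi API.

```python
# Illustrative sketch only: blends emotional key poses by linear interpolation
# of joint angles. Joint names and angle values are hypothetical placeholders,
# not taken from the paper or from the Nao/NAOqi API.

from typing import Dict

Pose = Dict[str, float]  # joint name -> angle in radians

# Hypothetical key poses anchoring two points of an affect space.
KEY_POSES: Dict[str, Pose] = {
    "sad":   {"HeadPitch": 0.40,  "LShoulderPitch": 1.60, "RShoulderPitch": 1.60},
    "happy": {"HeadPitch": -0.30, "LShoulderPitch": 0.90, "RShoulderPitch": 0.90},
}

def blend(pose_a: Pose, pose_b: Pose, weight: float) -> Pose:
    """Linearly interpolate between two key poses.

    weight = 0.0 returns pose_a, weight = 1.0 returns pose_b,
    and intermediate weights produce intermediate body expressions.
    """
    weight = max(0.0, min(1.0, weight))
    return {
        joint: (1.0 - weight) * pose_a[joint] + weight * pose_b[joint]
        for joint in pose_a
        if joint in pose_b
    }

if __name__ == "__main__":
    # A pose halfway between the "sad" and "happy" anchors.
    midpoint = blend(KEY_POSES["sad"], KEY_POSES["happy"], 0.5)
    for joint, angle in midpoint.items():
        print(f"{joint}: {angle:.2f} rad")
```

In this toy setup, moving the blend weight continuously between the anchor poses is one simple way to realise a continuous expression space; the paper's actual affect space construction should be consulted for the method the authors used.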


Files in this item


There are no files associated with this item.
