University of Hertfordshire Research Archive

        Towards an Affect Space for robots to display emotional body language

        Author
        Beck, A.
        Cañamero, Lola
        Bard, K.A.
        Abstract
        In order for robots to be socially accepted and generate empathy, it is necessary that they display rich emotions. For robots such as Nao, body language is the best medium available given their inability to convey facial expressions. Displaying emotional body language that can be interpreted whilst interacting with the robot should significantly improve its sociability. This research investigates the creation of an Affect Space for the generation of emotional body language to be displayed by robots. To create an Affect Space for body language, one has to establish the contribution of the different positions of the joints to the emotional expression. The experiment reported in this paper investigated the effect of varying a robot's head position on the interpretation, Valence, Arousal and Stance of emotional key poses. It was found that participants were better than chance level in interpreting the key poses. This finding confirms that body language is an appropriate medium for robots to express emotions. Moreover, the results of this study support the conclusion that Head Position is an important body posture variable. Head Position up increased correct identification for some emotion displays (pride, happiness, and excitement), whereas Head Position down increased correct identification for other displays (anger, sadness). Fear, however, was identified well regardless of Head Position. Head up was always evaluated as more highly Aroused than Head straight or down. Evaluations of Valence (degree of negativity to positivity) and Stance (degree to which the robot was aversive to approaching), however, depended on both Head Position and the emotion displayed. The effects of varying this single body posture variable were complex.
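
        As a rough, hypothetical illustration of the idea described in the abstract (this is not the authors' implementation; the joint angles, sign convention, and helper names below are invented for the sketch), an emotional key pose can be represented as a set of joint angles from which head-position variants are generated, in Python:

from dataclasses import dataclass, field
from typing import Dict

@dataclass
class KeyPose:
    """An emotional key pose: a named set of joint angles in radians."""
    emotion: str
    joints: Dict[str, float] = field(default_factory=dict)

    def with_head_position(self, head_pitch: float) -> "KeyPose":
        # Copy the pose and change only the head pitch; the sign convention
        # (negative = head up, positive = head down) is assumed for this sketch.
        joints = dict(self.joints)
        joints["HeadPitch"] = head_pitch
        return KeyPose(self.emotion, joints)

# Illustrative key pose: joint names follow Nao's naming, but the angle
# values are made up for the example.
pride = KeyPose("pride", {"HeadPitch": 0.0, "LShoulderPitch": 1.2, "RShoulderPitch": 1.2})

# The experiment compared three head positions (up, straight, down) for each
# key pose; the angles here are placeholders.
HEAD_POSITIONS = {"up": -0.4, "straight": 0.0, "down": 0.4}
variants = {name: pride.with_head_position(angle) for name, angle in HEAD_POSITIONS.items()}

for name, pose in variants.items():
    print(name, pose.joints)

        Under this kind of representation, each head-position variant of a key pose could then be shown to participants and rated for interpretation, Valence, Arousal and Stance, as the experiment did for the real poses.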
        Publication date
        2010
        Published in
        Proceedings of the 19th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)
        Published version
        https://doi.org/10.1109/ROMAN.2010.5598649
        Other links
        http://hdl.handle.net/2299/9860