
dc.contributor.author: Amirabdollahian, Farshid
dc.contributor.author: Walters, Michael
dc.contributor.author: Heffernan, Rory
dc.contributor.author: Fletcher, Sarah
dc.contributor.author: Webb, Phil
dc.date.accessioned: 2017-06-19T17:13:50Z
dc.date.available: 2017-06-19T17:13:50Z
dc.date.issued: 2017-04-25
dc.identifier.citation: Amirabdollahian, F., Walters, M., Heffernan, R., Fletcher, S. & Webb, P. 2017, 'Using myoelectric signals for gesture detection: a feasibility study', Paper presented at Ergonomics and Human Factors 2017, Daventry, United Kingdom, 25/04/17 - 27/04/17, pp. 353-358.
dc.identifier.citation: conference
dc.identifier.other: ORCID: /0000-0002-0047-1377/work/62748277
dc.identifier.uri: http://hdl.handle.net/2299/18367
dc.description: Farshid Amirabdollahian, Michael Walters, Rory Heffernan, Sarah Fletcher, and Phil Webb, 'Using myoelectric signals for gesture detection: a feasibility study'. Paper presented at the Ergonomics and Human Factors 2017 Conference, 25 - 27 April 2017, Daventry, United Kingdom.
dc.description.abstract: The purpose of this study was to assess the feasibility of using myoelectric signals acquired with an off-the-shelf device, the Myo armband from Thalmic Labs. Background: With the technological advances in sensing human motion, and its potential to drive and control mechanical interfaces remotely, a multitude of input mechanisms are used to link actions between the human and the robot. In this study we explored the feasibility of using the human arm's myoelectric signals with the aim of identifying a number of gestures automatically. Material and methods: Participants (n = 26) took part in a study aimed at assessing gesture detection accuracy using myoelectric signals. The Myo armband was worn on the forearm. The session was divided into three phases: familiarisation, where participants learned how to use the armband; training, where participants reproduced a number of requested gestures to train our machine learning algorithm; and recognition, where gestures presented on screen were reproduced by participants and simultaneously recognised by the machine learning routines. Results: One participant did not complete the study due to technical errors during the session. The remaining participants (n = 25) completed the study, allowing us to calculate individual accuracy for grasp detection using this medium. Our overall accuracy was 65.06%, with the cylindrical grasp achieving the highest accuracy of around 7.20% and the tripod grasp achieving the lowest recognition accuracy of 60.15%. Discussions: The recognition accuracy for the grasps performed is significantly lower than in our earlier work, where a mechatronic device was used. This could be due to the choice of grasps for this study, which is not ideal given the placement of the armband.
While tripod, cylindrical and lateral grasps have different finger and wrist articulations, their demands on supporting forearm muscles (mainly biceps and triceps) are less definite, and therefore their myoelectric signals are less distinct. Furthermore, the drop in accuracy could be caused by the fact that human muscles, and consequently the myoelectric signals, are substantially variable over time. Muscles change their relative intensity based on the speed of the produced gesture. In our earlier study, gesture production speed was damped by the worn orthosis, normalising the speed of gestures, whereas in the current study hand motion was not restricted. Despite this, the recognition accuracy is still significant. Future work: Questions remain about the feasibility of using myoelectric signals as an input to a remotely controlled robot on a factory floor, as it is anticipated that such a system would enhance control and efficiency in production processes. These questions require further investigation into the usability of the armband in its intended context, to ensure users are able to effectively control and manipulate the robot using the myoelectric system and enjoy a positive user experience. Future studies will focus on the choice of gestures, so that they are distinct and better identifiable, but also on other key human factors and system design features that will enhance performance, in compliance with relevant standards such as ISO 9241-210:2010 (ergonomic design principles for human-system interaction). Furthermore, whether a machine learning algorithm should use individually learned events to recognise an individual's gestures, or whether a normative representation of a substantial set of learnt events could achieve higher accuracy, remains an interesting area for our future work.
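The train-then-recognise pipeline described in the abstract can be sketched in a few lines. This is an illustration only: the abstract does not specify the features or classifier used in the study, so the mean-absolute-value features, nearest-centroid classifier, and synthetic 8-channel EMG windows below are assumptions chosen for clarity, not the authors' actual method.

```python
import numpy as np

GESTURES = ["cylindrical", "tripod", "lateral"]

def mav_features(window):
    """Mean absolute value per EMG channel for one window (channels x samples)."""
    return np.mean(np.abs(window), axis=1)

def train_centroids(windows, labels):
    """Training phase: average the feature vectors of each gesture's windows."""
    feats = np.array([mav_features(w) for w in windows])
    labels = np.array(labels)
    return {g: feats[labels == g].mean(axis=0) for g in set(labels)}

def classify(window, centroids):
    """Recognition phase: pick the gesture whose centroid is nearest."""
    f = mav_features(window)
    return min(centroids, key=lambda g: np.linalg.norm(f - centroids[g]))

# Synthetic stand-in for armband data: 8 channels, 200 samples per window,
# one amplitude profile per gesture so the classes are separable.
rng = np.random.default_rng(0)
profiles = {g: rng.uniform(0.2, 1.0, size=8) for g in GESTURES}

def make_window(gesture):
    return profiles[gesture][:, None] * rng.standard_normal((8, 200))

train_w, train_y = [], []
for g in GESTURES:
    for _ in range(20):
        train_w.append(make_window(g))
        train_y.append(g)

centroids = train_centroids(train_w, train_y)

# Recognition-phase accuracy, analogous to the per-grasp accuracies reported.
correct = total = 0
for g in GESTURES:
    for _ in range(10):
        correct += classify(make_window(g), centroids) == g
        total += 1
print(f"overall accuracy: {correct / total:.2%}")
```

On clean, well-separated synthetic data this toy classifier scores far higher than the 65.06% reported in the study, which is consistent with the authors' point that real grasps sharing similar forearm-muscle demands produce much less distinct myoelectric signals.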
dc.format.extent: 6
dc.format.extent: 8223153
dc.language.iso: eng
dc.relation.ispartof:
dc.subject: gesture detection
dc.subject: classification
dc.subject: machine learning
dc.subject: human-robot interface
dc.title: Using myoelectric signals for gesture detection: a feasibility study
dc.contributor.institution: School of Computer Science
dc.contributor.institution: Centre for Computer Science and Informatics Research
dc.contributor.institution: Adaptive Systems
dc.contributor.institution: Science & Technology Research Institute
dc.description.status: Peer reviewed
rioxxterms.type: Other
herts.preservation.rarelyaccessed: true

