Show simple item record

dc.contributor.author	Livatino, Salvatore
dc.contributor.author	Guastella, Dario C.
dc.contributor.author	Muscato, Giovanni
dc.contributor.author	Rinaldi, Vincenzo
dc.contributor.author	Cantelli, Luciano
dc.contributor.author	Melita, Carmelo D.
dc.contributor.author	Caniglia, Alessandro
dc.contributor.author	Mazza, Riccardo
dc.contributor.author	Padula, Gianluca
dc.date.accessioned	2021-02-14T00:07:58Z
dc.date.available	2021-02-14T00:07:58Z
dc.date.issued	2021-02-08
dc.identifier.citation	Livatino, S., Guastella, D. C., Muscato, G., Rinaldi, V., Cantelli, L., Melita, C. D., Caniglia, A., Mazza, R. & Padula, G. 2021, 'Intuitive Robot Teleoperation through Multi-Sensor Informed Mixed Reality Visual Aids', IEEE Access, vol. 9, pp. 25795-25808. https://doi.org/10.1109/ACCESS.2021.3057808
dc.identifier.issn	2169-3536
dc.identifier.uri	http://hdl.handle.net/2299/23904
dc.description	© 2021 The Author(s). This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0/.
dc.description.abstract	Mobile robotic systems have evolved to include sensors capable of describing robot status and the operating environment more accurately and reliably than ever before. Exploiting this sensor data effectively is challenging because of the cognitive load imposed on the operator by the large amount of data and its time-dependency constraints. This paper addresses this challenge in remote-vehicle teleoperation by proposing an intuitive way of presenting sensor data to users through mixed reality and visual aids within the user interface. We propose a method for organizing information presentation and a set of visual aids that facilitate the visual communication of data in teleoperation control panels. The resulting sensor-information presentation is coherent and intuitive, making it easier for an operator to grasp and comprehend the meaning of the information. This increases situational awareness and speeds up decision-making. Our method is implemented on a real mobile robotic system operating outdoors, equipped with on-board internal and external sensors, GPS, and a reconstructed 3D graphical model provided by an assistant drone. Experimentation verified feasibility, and an assessment confirmed intuitive and comprehensive visual communication, encouraging further development.
dc.format.extent	14
dc.format.extent	1887033
dc.language.iso	eng
dc.relation.ispartof	IEEE Access
dc.subject	virtual reality and interfaces
dc.subject	mixed reality
dc.subject	human-robot interface
dc.subject	head mounted display
dc.subject	robot teleoperation
dc.subject	stereo vision
dc.subject	field robotics
dc.subject	telerobotics
dc.subject	human-robot interaction
dc.subject	graphical user interfaces
dc.subject	augmented reality
dc.subject	user interfaces
dc.subject	virtual reality
dc.subject	General Engineering
dc.subject	General Materials Science
dc.subject	General Computer Science
dc.title	Intuitive Robot Teleoperation through Multi-Sensor Informed Mixed Reality Visual Aids
dc.contributor.institution	Centre for Engineering Research
dc.contributor.institution	Communications and Intelligent Systems
dc.contributor.institution	School of Physics, Engineering & Computer Science
dc.contributor.institution	Department of Engineering and Technology
dc.description.status	Peer reviewed
dc.identifier.url	https://drive.google.com/file/d/1nwdLXeioe4U4-MnGAKbhVSF0CP6FKP-I/view?usp=sharing
dc.identifier.url	http://www.scopus.com/inward/record.url?scp=85101100460&partnerID=8YFLogxK
rioxxterms.versionofrecord	10.1109/ACCESS.2021.3057808
rioxxterms.type	Journal Article/Review
herts.preservation.rarelyaccessed	true

