
dc.contributor.author  Heffernan, Rory
dc.date.accessioned  2017-06-30T13:33:37Z
dc.date.available  2017-06-30T13:33:37Z
dc.date.issued  2016-04-24
dc.identifier.citation  Heffernan, R 2016, 'Adaptive Smart Environments: Detecting Human Behaviour from Multimodal Observation', Paper presented at 9th International Conference on Advances in Computer-Human Interactions (ACHI), Venice, Italy, 24/04/16 - 28/04/16, pp. 353-358.
dc.identifier.citation  conference
dc.identifier.uri  http://hdl.handle.net/2299/18710
dc.description  Rory Heffernan, ‘Adaptive Smart Environments: Detecting Human Behaviour from Multimodal Observation’, paper presented at the 9th International Conference on Advances in Computer-Human Interactions (ACHI) 2016, Venice, Italy, 24-28 April 2016.
dc.description.abstract  It is desirable to enhance the social capabilities of a smart home environment so that it becomes more aware of the context of its human occupants’ activities. Taking human behavioural and contextual information into account will potentially improve decision making by the various smart house systems. Full-mesh Wireless Sensor Networks (WSN) can be used for passive localisation and tracking of people or objects within a smart home: by monitoring changes in the propagation field of the monitored area, derived from the link quality measurements collected from all the nodes of the network, it is feasible to infer target locations. It is planned to apply techniques from Radio Tomographic Imaging (RTI), together with machine vision methods adapted to the idiosyncrasies of RTI, to facilitate real-time multiple-target tracking in the University of Hertfordshire Robot House (UHRH). Using the Robot Operating System (ROS) framework, these data may then be fused with concurrent data acquired from other sensor systems (e.g. 3-D video tracking and ambient audio detection) in order to develop a high-level contextual data model for human behaviour in a smart environment. We present experimental results which could provide support for human activity recognition in smart environments.
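The mechanism the abstract relies on, inferring target locations from per-link RSS changes across the mesh, follows the standard RTI formulation. The following Python sketch is a minimal illustration under assumed details (node layout, elliptical link-weight model, Tikhonov regularisation parameter); it is not the paper's implementation, and the RSS measurements here are placeholders.

import numpy as np

def build_weight_matrix(nodes, pixels, ellipse_width=0.3):
    """Weight of each pixel on each link: pixels inside the ellipse whose foci
    are the two link endpoints contribute 1/sqrt(link length), others zero."""
    n = len(nodes)
    links = [(i, j) for i in range(n) for j in range(i + 1, n)]
    W = np.zeros((len(links), len(pixels)))
    for l, (i, j) in enumerate(links):
        d = np.linalg.norm(nodes[i] - nodes[j])
        for p, px in enumerate(pixels):
            # sum of distances from the pixel to the two link endpoints
            d_sum = np.linalg.norm(px - nodes[i]) + np.linalg.norm(px - nodes[j])
            if d_sum < d + ellipse_width:
                W[l, p] = 1.0 / np.sqrt(d)
    return W, links

def rti_image(delta_rss, W, lam=0.1):
    """Tikhonov-regularised least-squares attenuation image from per-link
    RSS changes: x = (W^T W + lam*I)^-1 W^T y."""
    WtW = W.T @ W
    return np.linalg.solve(WtW + lam * np.eye(WtW.shape[0]), W.T @ delta_rss)

# Illustrative setup: 8 nodes on the perimeter of a 5 m x 5 m area, 20 x 20 pixel grid.
nodes = np.array([[0, 0], [2.5, 0], [5, 0], [5, 2.5],
                  [5, 5], [2.5, 5], [0, 5], [0, 2.5]], dtype=float)
grid = np.linspace(0.25, 4.75, 20)
pixels = np.array([[x, y] for y in grid for x in grid])

W, links = build_weight_matrix(nodes, pixels)
delta_rss = np.random.default_rng(0).normal(0, 0.5, len(links))  # placeholder measurements
x = rti_image(delta_rss, W)
est = pixels[np.argmax(x)]  # brightest pixel taken as the estimated target position
print("Estimated target location:", est)

In a deployment such as the UHRH, the placeholder delta_rss would be replaced by the measured change in link quality relative to a calibration period with the area empty, and the resulting position estimates could be published for fusion with the other sensor streams.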
dc.format.extent  6
dc.format.extent  1606525
dc.language.iso  eng
dc.relation.ispartof
dc.subject  radio tomography; device-free passive localisation; wireless sensor networks; human-computer interaction; sensor fusion
dc.title  Adaptive Smart Environments: Detecting Human Behaviour from Multimodal Observation
dc.contributor.institution  Science & Technology Research Institute
dc.contributor.institution  School of Computer Science
dc.description.status  Peer reviewed
rioxxterms.type  Other
herts.preservation.rarelyaccessed  true


Files in this item


There are no files associated with this item.

