Show simple item record

dc.contributor.author: Ragusa, Francesco
dc.contributor.author: Furnari, Antonio
dc.contributor.author: Livatino, Salvatore
dc.contributor.author: Farinella, Giovanni
dc.date.accessioned: 2021-01-10T00:02:13Z
dc.date.available: 2021-01-10T00:02:13Z
dc.date.issued: 2021-01-09
dc.identifier.citation: Ragusa, F., Furnari, A., Livatino, S. & Farinella, G. 2021, 'The MECCANO Dataset: Understanding Human-Object Interactions from Egocentric Videos in an Industrial-like Domain', Paper presented at 2021 IEEE Winter Conference on Applications of Computer Vision, Waikoloa, United States, 5/01/21 - 9/01/21. <https://arxiv.org/abs/2010.05654>
dc.identifier.citation: conference
dc.identifier.other: PURE: 22956063
dc.identifier.other: PURE UUID: f28b5f0f-ea29-41a3-99d9-4a7aec45e716
dc.identifier.uri: http://hdl.handle.net/2299/23660
dc.description.abstract: Wearable cameras allow the collection of images and videos of humans interacting with the world. While human-object interactions have been thoroughly investigated in third person vision, the problem has been understudied in egocentric settings and in industrial scenarios. To fill this gap, we introduce MECCANO, the first dataset of egocentric videos to study human-object interactions in industrial-like settings. MECCANO has been acquired by 20 participants who were asked to build a motorbike model, for which they had to interact with tiny objects and tools. The dataset has been explicitly labeled for the task of recognizing human-object interactions from an egocentric perspective. Specifically, each interaction has been labeled both temporally (with action segments) and spatially (with active object bounding boxes). With the proposed dataset, we investigate four different tasks including 1) action recognition, 2) active object detection, 3) active object recognition and 4) egocentric human-object interaction detection, which is a revisited version of the standard human-object interaction detection task. Baseline results show that the MECCANO dataset is a challenging benchmark to study egocentric human-object interactions in industrial-like scenarios. We publicly release the dataset at https://iplab.dmi.unict.it/MECCANO. [en]
dc.language.iso: eng
dc.title: The MECCANO Dataset: Understanding Human-Object Interactions from Egocentric Videos in an Industrial-like Domain. [en]
dc.contributor.institution: Centre for Engineering Research
dc.contributor.institution: Communications and Intelligent Systems
dc.contributor.institution: School of Physics, Engineering & Computer Science
dc.contributor.institution: Department of Engineering and Technology
dc.description.status: Peer reviewed
dc.identifier.url: https://arxiv.org/abs/2010.05654
rioxxterms.version: AM
rioxxterms.type: Other
herts.preservation.rarelyaccessed: true

