RH-HAR-SK: A Multi-view Dataset with Skeleton Data for Ambient Assisted Living Research
Authors
Shahabian Alashti, Mohamad Reza
Bamorovat Abadi, Mohammad
Holthaus, Patrick
Menon, Catherine
Amirabdollahian, Farshid
Handle: 2299/26988
Abstract
Detecting humans and their activities has always been a vital task in Human-Robot Interaction (HRI) scenarios, such as those involving assistive robots. In particular, skeleton-based Human Activity Recognition (HAR) offers a robust and effective detection method grounded in human biomechanics. Recent advancements in human pose estimation have made it possible to extract skeleton positioning data accurately and quickly using affordable cameras. When interacting with a human, a robot can therefore capture detailed information from a close distance and a flexible perspective. However, recognition accuracy is susceptible to robot movements, and the robot often fails to capture the entire scene. To address this, we propose the adoption of external cameras to improve the accuracy of activity recognition on a mobile robot. In support of this proposal, we present the dataset RH-HAR-SK, which combines multiple camera perspectives augmented with human skeletons extracted using HRNet pose estimation. We apply qualitative and quantitative analysis techniques to the extracted skeletons and their joints to demonstrate the additional value of external cameras to the robot's recognition pipeline. Results show that while the robot's camera can provide optimal recognition accuracy in specific scenarios, an external camera increases overall performance.
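As an illustration of the kind of per-joint quantitative analysis described above, the sketch below compares the mean detection confidence of each skeleton joint between a robot-mounted view and an external view. It assumes HRNet-style output in the 17-joint COCO keypoint format, where each joint is an (x, y, confidence) triple; the array names, shapes, and the synthetic data are assumptions for demonstration, not the dataset's actual file layout.

```python
import numpy as np

# Hypothetical skeleton sequences: arrays of shape (num_frames, 17, 3),
# one (x, y, confidence) triple per COCO keypoint, as produced by a
# pose estimator such as HRNet. The values here are synthetic.
rng = np.random.default_rng(0)
robot_view = rng.uniform(0.0, 1.0, size=(100, 17, 3))
external_view = rng.uniform(0.0, 1.0, size=(100, 17, 3))

def mean_joint_confidence(skeletons: np.ndarray) -> np.ndarray:
    """Average detection confidence per joint across all frames."""
    return skeletons[..., 2].mean(axis=0)  # shape: (17,)

def external_wins_per_joint(robot: np.ndarray, external: np.ndarray) -> np.ndarray:
    """Boolean mask: True for joints the external view detects more confidently."""
    return mean_joint_confidence(external) > mean_joint_confidence(robot)

if __name__ == "__main__":
    wins = external_wins_per_joint(robot_view, external_view)
    print("joints where the external view is more confident:", int(wins.sum()), "of 17")
```

A per-joint comparison like this is one way to quantify when an external camera adds value over the robot's own camera, e.g. for joints the moving robot frequently fails to capture.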