
'Elbows Out' - Predictive Tracking of Partially Occluded Pose for Robot-Assisted Dressing

Chance, Gregory; Jevtic, Aleksandar; Caleb-Solly, Praminda; Alenya, Guillem; Torras, Carme; Dogramadzi, Sanja

Authors

Gregory Chance Greg.Chance@uwe.ac.uk

Aleksandar Jevtic


Praminda Caleb-Solly Praminda.Caleb-solly@uwe.ac.uk
Professor in Assistive Robotics and Intelligent Health Technologies

Guillem Alenya

Carme Torras



Abstract

© 2018 IEEE. Robots that can assist in activities of daily living, such as dressing, may support older adults, addressing the needs of an aging population in the face of a growing shortage of care professionals. Using depth cameras during robot-assisted dressing can lead to occlusions and loss of user tracking, which may result in unsafe trajectory planning or prevent the planning task from proceeding altogether. For the dressing task of putting on a jacket, which is addressed in this letter, tracking of the arm is lost when the user's hand enters the jacket, which may lead to unsafe situations for the user and a poor interaction experience. Using occlusion-free motion tracking data gathered from a human-human interaction study on an assisted dressing task, recurrent neural network models were built to predict the elbow position of a single arm based on other features of the user pose. The best features for predicting the elbow position were explored using regression trees, which indicated the hips and shoulder as possible predictors. Engineered features were also created based on observations of real dressing scenarios, and their effectiveness was explored. A comparison between position- and orientation-based datasets was also included in this study. A 12-fold cross-validation was performed for each feature set and repeated 20 times to improve statistical power. Using position-based data, the elbow position could be predicted with a 4.1 cm error; adding engineered features reduced the error to 2.4 cm. Adding orientation information to the data did not improve the accuracy, and aggregating univariate response models failed to make significant improvements. The model was evaluated on Kinect data for a robot dressing task and, although not without issues, demonstrates potential for this application. While this has been demonstrated for jacket dressing, the technique could be applied to a number of different situations involving occluded tracking.
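The core idea in the abstract, predicting an occluded joint (the elbow) from visible pose features with a recurrent model, can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the feature set (3-D hip and shoulder positions), the hidden size, and the untrained random weights stand in for the trained model; this is not the authors' architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

n_features = 9   # assumed: 3-D positions of both hips and the shoulder
n_hidden = 32    # assumed hidden-state size
n_out = 3        # predicted elbow position (x, y, z)

# Randomly initialised weights stand in for trained parameters.
W_xh = rng.normal(0, 0.1, (n_hidden, n_features))
W_hh = rng.normal(0, 0.1, (n_hidden, n_hidden))
b_h = np.zeros(n_hidden)
W_hy = rng.normal(0, 0.1, (n_out, n_hidden))
b_y = np.zeros(n_out)

def predict_elbow(sequence):
    """Run a simple RNN over a (T, n_features) pose sequence and
    return an elbow estimate from the final hidden state."""
    h = np.zeros(n_hidden)
    for x in sequence:
        # Standard tanh recurrence: mix current features with prior state.
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
    return W_hy @ h + b_y

# A synthetic 10-frame sequence of pose features.
seq = rng.normal(0, 1, (10, n_features))
elbow = predict_elbow(seq)  # a 3-vector: the predicted elbow position
```

In practice the recurrence lets the prediction carry over motion context from frames captured before the hand entered the jacket, which is what makes a recurrent model a natural fit for bridging occlusions.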

Citation

Chance, G., Jevtic, A., Caleb-Solly, P., Alenya, G., Torras, C., &amp; Dogramadzi, S. (2018). 'Elbows Out' - Predictive Tracking of Partially Occluded Pose for Robot-Assisted Dressing. IEEE Robotics and Automation Letters, 3(4), 3598-3605. https://doi.org/10.1109/LRA.2018.2854926

Journal Article Type Article
Acceptance Date Jul 1, 2018
Online Publication Date Jul 11, 2018
Publication Date Oct 1, 2018
Journal IEEE Robotics and Automation Letters
Electronic ISSN 2377-3766
Publisher Institute of Electrical and Electronics Engineers
Peer Reviewed Yes
Volume 3
Issue 4
Pages 3598-3605
DOI https://doi.org/10.1109/LRA.2018.2854926
Keywords human-robot interaction
Public URL https://uwe-repository.worktribe.com/output/857822
Publisher URL http://dx.doi.org/10.1109/LRA.2018.2854926
Additional Information (c) 2018 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works.
