Visual-tactile robot grasping based on human skill learning from demonstrations using a wearable parallel hand exoskeleton
Lu, Zhenyu; Chen, Lu; Dai, Hengtai; Li, Haoran; Zhao, Zhou; Zheng, Bofang; Lepora, Nathan; Yang, Chenguang
Abstract
Soft fingers and strategic grasping skills enable human hands to grasp objects stably. This letter models human grasping skills and transfers the learned skills to robots to improve grasping quality and success rate. First, we designed a wearable, tool-like parallel hand exoskeleton equipped with optical tactile sensors to acquire multimodal information, including hand positions and postures, the relative distance between the exoskeleton claws, and tactile images. From the demonstration data we identified three characteristics of human demonstrations: varying-speed actions, grasping effects read from tactile images, and grasping strategies for different positions. These characteristics were then used in the robot skill modelling to achieve a more human-like grasp. Since no force sensors are fixed to the claws, we introduced a new variable, called 'grasp depth', to represent the grasping effect on the object. The robot grasping strategy is constructed as follows. First, grasp quality is predicted using a linear array network (LAN) with global visual images as inputs, and the grasp conditions, such as width, depth, position, and angle, are predicted alongside it. Second, with the grasp width and depth of the object determined, dynamic movement primitives (DMPs) are employed to mimic human grasp actions with varying velocities. To further enhance grasp quality, a final action adjustment based on tactile detection is performed near grasp time. The proposed strategy was validated through experiments conducted with a Franka robot and a self-designed gripper. The results demonstrate that the robot grasping tests achieved an increase in the grasping success rate from 82% to 96%, compared to the results obtained by pure LAN and constant-grasp-depth testing.
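For readers unfamiliar with the DMP component of the pipeline, the sketch below shows how a single demonstrated trajectory can be encoded and reproduced with a standard one-dimensional discrete DMP, including replay at a different speed (relevant to the varying-velocity actions mentioned in the abstract). This is a minimal illustration of the general Ijspeert-style formulation under assumed parameter values, not the authors' implementation; the class name `DMP1D` and all constants are hypothetical, and the paper's exact skill model may differ.

```python
import numpy as np

class DMP1D:
    """Minimal 1-D discrete dynamic movement primitive (illustrative sketch)."""

    def __init__(self, n_basis=20, alpha=25.0, beta=6.25, alpha_x=4.0):
        self.n = n_basis
        self.alpha, self.beta, self.alpha_x = alpha, beta, alpha_x
        # Basis centres spaced along the canonical variable x in (0, 1]
        self.c = np.exp(-alpha_x * np.linspace(0.0, 1.0, n_basis))
        diffs = np.diff(self.c)
        self.h = 1.0 / np.hstack([diffs, [diffs[-1]]]) ** 2  # basis widths
        self.w = np.zeros(n_basis)

    def _psi(self, x):
        # Gaussian basis activations at canonical phase x
        return np.exp(-self.h * (x - self.c) ** 2)

    def fit(self, y, dt):
        """Learn forcing-term weights from one demonstrated trajectory y."""
        T = len(y)
        tau = T * dt
        yd, ydd = np.gradient(y, dt), np.gradient(np.gradient(y, dt), dt)
        y0, g = y[0], y[-1]
        x = np.exp(-self.alpha_x * np.arange(T) * dt / tau)  # canonical phase
        # Invert the transformed system to get the target forcing term
        f_target = tau**2 * ydd - self.alpha * (self.beta * (g - y) - tau * yd)
        s = x * (g - y0)
        # Locally weighted regression: one weight per basis function
        for i in range(self.n):
            psi_i = np.exp(-self.h[i] * (x - self.c[i]) ** 2)
            self.w[i] = (s * psi_i) @ f_target / ((s * psi_i) @ s + 1e-10)
        self.y0, self.g, self.tau = y0, g, tau

    def rollout(self, dt, g=None, tau=None):
        """Reproduce the motion, optionally with a new goal g or duration tau."""
        g = self.g if g is None else g
        tau = self.tau if tau is None else tau
        y, z, x, out = self.y0, 0.0, 1.0, []
        while x > np.exp(-self.alpha_x):  # roughly one full movement period
            psi = self._psi(x)
            f = (psi @ self.w) / (psi.sum() + 1e-10) * x * (g - self.y0)
            zd = (self.alpha * (self.beta * (g - y) - z) + f) / tau
            z += zd * dt
            y += z / tau * dt
            x += -self.alpha_x * x / tau * dt
            out.append(y)
        return np.array(out)

if __name__ == "__main__":
    t = np.linspace(0.0, 1.0, 200)
    demo = np.sin(np.pi * t)              # a toy demonstrated 1-D path
    dmp = DMP1D()
    dmp.fit(demo, dt=t[1] - t[0])
    fast = dmp.rollout(dt=t[1] - t[0], tau=0.5 * dmp.tau)  # same shape, 2x speed
    print(len(demo), len(fast))
```

Scaling `tau` at rollout changes the execution speed without retraining, which is the property that makes DMPs a natural fit for reproducing the varying-speed grasp actions observed in the human demonstrations.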
| Journal Article Type | Article |
| --- | --- |
| Acceptance Date | Jun 20, 2023 |
| Online Publication Date | Jul 13, 2023 |
| Publication Date | Sep 30, 2023 |
| Deposit Date | Jul 21, 2023 |
| Journal | IEEE Robotics and Automation Letters |
| Print ISSN | 2377-3766 |
| Publisher | Institute of Electrical and Electronics Engineers |
| Peer Reviewed | Peer Reviewed |
| Volume | 8 |
| Issue | 9 |
| Pages | 5384-5391 |
| DOI | https://doi.org/10.1109/lra.2023.3295296 |
| Keywords | Artificial Intelligence, Control and Optimization, Computer Science Applications, Computer Vision and Pattern Recognition, Mechanical Engineering, Human-Computer Interaction, Biomedical Engineering, Control and Systems Engineering |
| Public URL | https://uwe-repository.worktribe.com/output/10967496 |