Date created
2022-06-22
Authors/Contributors
Abstract
We present an algorithm that discovers grasp pose solutions for multiple grasp types for a multi-fingered mechanical gripper using partially-sensed point clouds of unknown objects. The algorithm introduces two key ideas: 1) a histogram of finger contact normals is used to represent a grasp "shape" to guide a gripper orientation search in a histogram of object(s) surface normals, and 2) voxel grid representations of gripper contacts and object(s) are cross-correlated to match finger contact points, i.e. grasp "scale", to discover a grasp pose. Collision constraints are incorporated in the cross-correlation computation. We show via simulations and preliminary experiments that 1) grasp poses for three grasp types (i.e. lateral, power, and tripodal) are found quickly without interrupting the robot's motion, 2) the quality of grasp pose solutions is consistent with respect to voxel resolution changes for both partial and complete point cloud scans, 3) grasp type definitions are scalable to n contacts and can incorporate constraints for collision checks in one integrated step, and 4) planned grasp poses are successfully executed with a mechanical gripper, demonstrating the robustness of grasp pose solutions.
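The voxel-grid cross-correlation idea described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the grid sizes, the FFT-based correlation, and the `collision_weight` penalty term (standing in for the paper's collision constraints) are all assumptions. An object occupancy grid is correlated with a gripper contact template, and overlap with a gripper body template is subtracted in the same pass, so every candidate translation is scored in one integrated step.

```python
import numpy as np

def grasp_pose_scores(object_grid, contact_template, body_template, collision_weight=10.0):
    """Score every translation of the gripper template over the object grid.

    Cross-correlates the object occupancy grid with the finger contact
    template (rewarding contact matches) and subtracts a weighted
    correlation with the gripper body template (penalizing collisions).
    All inputs are 3-D occupancy grids (1.0 = occupied, 0.0 = free).
    """
    # Full-correlation output size along each axis.
    shape = [o + t - 1 for o, t in zip(object_grid.shape, contact_template.shape)]
    F_obj = np.fft.rfftn(object_grid, shape)

    def corr(tmpl):
        # Correlation = convolution with the template flipped on every axis.
        return np.fft.irfftn(F_obj * np.fft.rfftn(tmpl[::-1, ::-1, ::-1], shape), shape)

    return corr(contact_template) - collision_weight * corr(body_template)

# Toy example: a single occupied voxel and a matching one-contact template.
obj = np.zeros((8, 8, 8)); obj[4, 4, 4] = 1.0
contacts = np.zeros((3, 3, 3)); contacts[1, 1, 1] = 1.0
body = np.zeros((3, 3, 3))  # empty body grid: no collision penalty here
score = grasp_pose_scores(obj, contacts, body)
best = np.unravel_index(np.argmax(score), score.shape)
```

The peak of `score` gives the gripper translation whose contact voxels best coincide with occupied object voxels while avoiding body-object overlap; in the full method this search would be repeated over the candidate orientations selected by the surface-normal histogram matching.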
Description
The full text will be made available in July 2023 due to the embargo policy of Springer Nature, the publisher of Autonomous Robots. If you require access to the full text prior to July 2023, please contact summit@sfu.ca.
SFU DOI
Scholarly level
Peer reviewed?
Yes
Language
English
Member of collection