Improving Grasp Stability using Tactile Feedback
BibTeX:
@misc{kolamuri2021improving,
  title={Improving Grasp Stability with Rotation Measurement from Tactile Sensing},
  author={Raj Kolamuri and Zilin Si and Yufan Zhang and Arpit Agarwal and Wenzhen Yuan},
  year={2021},
  eprint={2108.00301},
  archivePrefix={arXiv},
  primaryClass={cs.RO}
}
Grasping is one of the prime modalities of manipulation for robotics. External vision sensors like RGB-D cameras have traditionally been used to guide robots through manipulation tasks such as pick-and-place. However, these sensors are often positioned away from the point of grasp and provide little information about whether a grasp succeeds or fails, or about the mode of failure. In this work, we address one such failure mode: rotation of the object about the grasp point. As shown in the figure below (left), if an object is grasped at a point away from its center of gravity, it rotates, and this rotation leads to grasp failure. Because the rotation occurs in the local region of the gripping point, it is difficult for vision sensors to detect and measure.
We solve this problem using a high-resolution, vision-based tactile sensor, the GelSight sensor. The GelSight sensor outputs images of its gel pad, which has markers painted on it, as shown in the figure below (middle and right).

These markers trace the motion of the object in contact with the sensor. As shown in the figure below, when the object undergoes rotation, these markers exhibit rotational patterns.

We designed an algorithm that detects these rotational patterns as they appear and measures both the degree of rotation and its orientation; we call this the rotation-measurement algorithm. The algorithm is model-based, runs in real time, and processes each frame in ~0.04 s. Its core idea is to detect the Center of Rotation (COR) and then measure the rotation angle about that center. We formulate COR detection as finding the intersection of the normals to the displacement vectors shown in the figure below, solve for the COR with a least-squares solver, and then measure the rotation angle about it.
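As an illustration, the COR fit can be sketched as a small least-squares problem. This is a minimal sketch, not the authors' implementation: the lines below are taken as the perpendicular bisectors of the marker displacement chords, which coincide with the normals through the markers in the small-rotation limit.

```python
import numpy as np

def estimate_cor_and_angle(points, displacements):
    """Estimate the center of rotation (COR) and the rotation angle from
    2-D marker positions and their displacement vectors.

    For a pure rotation, the line through the midpoint of each
    displacement chord, normal to the chord, passes through the COR.
    One such constraint per marker gives the overdetermined linear
    system  d_i . c = d_i . (p_i + d_i / 2),  solved by least squares.
    """
    mid = points + 0.5 * displacements
    b = np.einsum('ij,ij->i', displacements, mid)
    cor, *_ = np.linalg.lstsq(displacements, b, rcond=None)

    # Signed rotation of each marker about the COR, averaged over markers.
    r0 = points - cor                      # radius vectors before motion
    r1 = points + displacements - cor      # radius vectors after motion
    angles = np.arctan2(r0[:, 0] * r1[:, 1] - r0[:, 1] * r1[:, 0],
                        np.einsum('ij,ij->i', r0, r1))
    return cor, float(np.mean(angles))

# Synthetic check: markers rotated 5 degrees about a known center.
rng = np.random.default_rng(0)
pts = rng.uniform(-10.0, 10.0, size=(40, 2))
theta = np.deg2rad(5.0)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
true_cor = np.array([3.0, 2.0])
disp = (pts - true_cor) @ R.T + true_cor - pts
cor, angle = estimate_cor_and_angle(pts, disp)
```

The fit needs displacement directions that are not all parallel, which the rotational marker pattern provides; a near-pure translation makes the system ill-conditioned.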

We show the effectiveness of the algorithm in experiments where several objects are grasped and lifted at different locations that induce these rotational motions. We then compare the output of our algorithm with ground-truth rotation values collected from an external sensor. The plots below show how the algorithm performed against the ground-truth data for some of the experimental objects.

The method above works well for objects with flat contact surfaces. For objects with irregular shapes or small contact areas, we propose a complementary contour-tracking method that tracks the contour of the contact area and the rotation angle of its principal axis, as shown in the figure below.
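One simple way to get a principal-axis angle from a contact contour is PCA on the contour points; the tracked quantity is then the change of this angle between frames, wrapped to account for the axis's 180° ambiguity. This is a sketch under our own conventions, not necessarily the authors' exact formulation.

```python
import numpy as np

def principal_axis_angle(contour_pts):
    """Angle in [0, pi) of the principal axis of a 2-D contact contour,
    from the leading eigenvector of the point covariance. Assumes an
    elongated contour (distinct eigenvalues), else the axis is undefined."""
    centered = contour_pts - contour_pts.mean(axis=0)
    cov = centered.T @ centered / len(centered)
    evals, evecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
    major = evecs[:, -1]                    # direction of largest variance
    return np.arctan2(major[1], major[0]) % np.pi   # an axis, not a direction

def rotation_since(ref_angle, angle):
    """Signed change of the axis angle, wrapped to [-pi/2, pi/2)."""
    return (angle - ref_angle + np.pi / 2) % np.pi - np.pi / 2

# Synthetic check: an ellipse contour rotated from 30 to 40 degrees.
t = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
ellipse = np.stack([3.0 * np.cos(t), np.sin(t)], axis=1)  # major axis along x
def rot(a):
    return np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])
before = ellipse @ rot(np.deg2rad(30.0)).T
after = ellipse @ rot(np.deg2rad(40.0)).T
delta = rotation_since(principal_axis_angle(before), principal_axis_angle(after))
```

In practice the contour points would come from segmenting the contact region in the GelSight image (e.g. thresholding the deformation) before the PCA step.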

We also conducted experiments to show how rotation detection and measurement can help a robot move from an unstable grasp pose to a stable one. We had a UR5e robot grasp several objects at multiple locations along their centroidal axis, as shown in the figure below. When the grasp location is away from the object's center of gravity, the object rotates; with the GelSight sensor and our rotation-measurement algorithm, the robot can now be alerted to put the object back down when the rotation exceeds a threshold.

We first estimate the scale of the object with an RGB-D camera, measuring its principal-axis length L. The robot starts with an initial grasp near the geometric center of the object. If rotation is detected at that grasp, the robot releases the object and moves along the object's principal axis to an updated grasp location. The initial regrasp step size is 0.4 L, reduced to 0.17 L for later regrasps. If the rotation orientation changes between two regrasps, the robot has passed the center of gravity, which must therefore lie between the two grasp locations; the robot then steps in the backward direction, and the step size is halved after each such grasp for finer adjustment. With this coarse-to-fine grasp adjustment, our algorithm can quickly find the center of gravity within a few grasp trials.

We performed these closed-loop regrasping experiments on 109 different configurations of objects, grasping speeds, and forces. In 105 of the 109 cases (96.3% success rate), the robot reached a stable grasp pose using feedback from the GelSight sensor.
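The regrasp schedule can be sketched as a simple 1-D search along the principal axis. The `measure_rotation` callback below is a hypothetical stand-in for a full grasp-lift-measure cycle with the sensor; the step sizes follow the schedule described above.

```python
def regrasp_search(measure_rotation, L, threshold, max_trials=15):
    """Coarse-to-fine search along the principal axis for a grasp point
    near the center of gravity (CoG). Step schedule from the text:
    0.4 L for the first regrasp, 0.17 L afterwards, halved whenever the
    rotation direction flips (i.e. the search has passed the CoG)."""
    x = 0.0                                # start near the geometric center
    step = 0.4 * L
    prev_sign = 0
    for trial in range(1, max_trials + 1):
        theta = measure_rotation(x)        # grasp, lift, read the sensor
        if abs(theta) < threshold:
            return x, trial                # stable grasp found
        sign = 1 if theta > 0 else -1      # side the object tips toward
        if prev_sign and sign != prev_sign:
            step *= 0.5                    # passed the CoG: step back, refine
        x += sign * step                   # move toward the heavy side
        if trial == 1:
            step = 0.17 * L                # switch to the finer fixed step
        prev_sign = sign
    return x, max_trials

# Hypothetical stand-in: measured rotation proportional to the CoG offset.
cog = 0.3
grasp_x, trials = regrasp_search(lambda x: cog - x, L=1.0, threshold=0.02)
```

With this toy rotation model the search lands within a few percent of L of the CoG in three trials, mirroring the coarse-to-fine behavior described above.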
