Grasp Stability Prediction with Sim-to-Real Transfer from Tactile Sensing

We integrate simulation of robot dynamics and vision-based tactile sensors by modeling the physics of contact. This contact model uses simulated contact forces at the robot’s end-effector to inform the generation of realistic tactile outputs. To close the sim-to-real gap, we calibrate our robot-dynamics simulator, contact model, and tactile optical simulator with real-world data. We then demonstrate the effectiveness of our system on a zero-shot sim-to-real grasp stability prediction task, achieving high accuracy across a variety of objects.

Paper link: https://arxiv.org/abs/2208.02885

Code link: https://github.com/CMURoboTouch/Taxim/tree/taxim-robot

Bibtex:

@article{si2022grasp,
  title={Grasp Stability Prediction with Sim-to-Real Transfer from Tactile Sensing},
  author={Si, Zilin and Zhu, Zirui and Agarwal, Arpit and Anderson, Stuart and Yuan, Wenzhen},
  journal={arXiv preprint arXiv:2208.02885},
  year={2022}
}

Supplementary Material

The supplementary video demonstrates grasping various objects under different grasp configurations using the simulation framework proposed in the paper. It also compares the external camera view and the tactile sensor view between simulated and real-world data.

Simulation framework

Our integrated simulation framework with tactile sensors has three parts: a physics simulation, a contact simulation, and a tactile simulation. We use PyBullet to simulate the robot dynamics, and map the simulated contact forces and poses to the contact deformation of a GelSight tactile sensor. The tactile simulator then renders tactile images from this contact deformation.

Our proposed simulation framework includes physics simulation, contact simulation and tactile simulation.
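To make the contact-simulation step concrete, here is a minimal sketch of mapping a simulated normal contact force to a deformed gel surface. The linear force-to-depth relation, the spherical indenter, and all constants are simplifying assumptions for illustration, not the paper's calibrated contact model:

```python
import numpy as np

# Assumed linear gel compliance (mm of indentation per Newton) -- not the
# calibrated parameters from the paper, just an illustrative constant.
GEL_STIFFNESS_MM_PER_N = 0.04

def indentation_depth(normal_force_n: float) -> float:
    """Approximate indentation depth (mm) from the normal contact force."""
    return GEL_STIFFNESS_MM_PER_N * normal_force_n

def deformed_height_map(normal_force_n, radius_mm=4.0, size=32, pitch_mm=0.5):
    """Height map (mm) of the gel pressed by a hypothetical spherical indenter."""
    depth = indentation_depth(normal_force_n)
    xs = (np.arange(size) - size / 2) * pitch_mm
    xx, yy = np.meshgrid(xs, xs)
    rr2 = xx**2 + yy**2
    # Spherical cap where the indenter penetrates the undeformed gel surface.
    cap = np.sqrt(np.maximum(radius_mm**2 - rr2, 0.0)) - (radius_mm - depth)
    return np.maximum(cap, 0.0)

hmap = deformed_height_map(normal_force_n=10.0)
print(round(float(hmap.max()), 3))  # → 0.4 (deepest indentation at the center)
```

A height map like this is what a tactile optical simulator such as Taxim would then render into an RGB tactile image.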

Sim-to-real grasp stability prediction

We execute the same grasp procedure in both the simulated and the real experiments. We initialize the robot above the object, move the gripper down to a preset height, close the gripper with a preset grasping force to grasp the object, and then lift it. We record the tactile readings from a GelSight sensor after grasping.

Grasp pipeline for both simulation and real experiments.
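The grasp pipeline above can be sketched as a short script. The `Gripper` and `read_tactile` names below are hypothetical stand-ins for the PyBullet robot interface and GelSight readout used in the paper; only the step order mirrors the text (descend, close, record tactile, lift):

```python
from dataclasses import dataclass, field

@dataclass
class GraspLog:
    events: list = field(default_factory=list)

class Gripper:
    """Hypothetical stand-in for the simulated/real gripper interface."""
    def __init__(self, log):
        self.log = log
    def move_to_height(self, z_m):
        self.log.events.append(("descend", z_m))
    def close(self, force_n):
        self.log.events.append(("close", force_n))
    def lift(self, dz_m):
        self.log.events.append(("lift", dz_m))

def read_tactile(log):
    # In the real pipeline this returns a GelSight tactile image.
    log.events.append(("tactile", "gelsight_frame"))
    return "gelsight_frame"

def run_grasp(grasp_height_m=0.05, grasp_force_n=20.0, lift_m=0.10):
    log = GraspLog()
    g = Gripper(log)
    g.move_to_height(grasp_height_m)  # move down to the preset height
    g.close(grasp_force_n)            # close with the preset grasping force
    tactile = read_tactile(log)       # record tactile reading after grasping
    g.lift(lift_m)                    # lift the object
    return log, tactile

log, _ = run_grasp()
print([e[0] for e in log.events])  # → ['descend', 'close', 'tactile', 'lift']
```

Keeping the same scripted sequence in simulation and on the real robot is what makes the zero-shot transfer comparison meaningful.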

We show examples of tactile readings under different grasping scenarios. Different grasping locations (marked on the object) and different grasping forces lead to different grasp outcomes. We show the sequence of tactile readings during grasping, where the contact geometry can be used to predict the grasp outcome.

Examples of tactile readings under different grasping scenarios.
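As a toy illustration of the prediction step: the paper trains a learned classifier on sequences of simulated tactile images, but a hand-crafted heuristic on the same signal (contact area and how it changes during the grasp) shows what information the tactile sequence carries. The thresholds below are arbitrary assumptions, not values from the paper:

```python
import numpy as np

# Assumed thresholds, chosen only for this illustration.
CONTACT_THRESHOLD_MM = 0.05   # minimum indentation to count a pixel as contact
MIN_STABLE_AREA_PX = 50       # minimum final contact area for a "stable" call

def contact_area(height_map_mm):
    """Number of gel pixels in contact (indentation above threshold)."""
    return int((height_map_mm > CONTACT_THRESHOLD_MM).sum())

def predict_stable(tactile_sequence):
    """Heuristic: stable if contact stays large and does not shrink sharply."""
    areas = [contact_area(h) for h in tactile_sequence]
    return areas[-1] >= MIN_STABLE_AREA_PX and areas[-1] >= 0.5 * areas[0]

# Synthetic sequences: a firm grasp keeps its contact patch while lifting,
# while a slipping grasp loses most of it.
firm = [np.full((32, 32), 0.3) for _ in range(3)]
slip = [np.full((32, 32), 0.3), np.zeros((32, 32)), np.zeros((32, 32))]
print(predict_stable(firm), predict_stable(slip))  # → True False
```

The learned predictor in the paper replaces this heuristic with a network trained on simulated tactile sequences, which is what transfers zero-shot to the real sensor.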