Tactile Optical Simulation

We built a fully general optical tactile simulation system for GelSight using physics-based rendering techniques. We propose physically accurate light models and present an in-depth analysis of the individual components of our simulation pipeline. The technique is fully general and can be extended to simulate sensors of arbitrary geometry, surface material, and light sources. Our system outperforms previous simulation techniques both qualitatively and quantitatively on image similarity metrics such as SSIM.

Bibtex
@article{agarwal2020simulation,
  title={Simulation of Vision-based Tactile Sensors using Physics based Rendering},
  author={Agarwal, Arpit and Man, Tim and Yuan, Wenzhen},
  journal={arXiv preprint arXiv:2012.13184},
  year={2020}
}

Overview

We develop an optical simulation system using physics-based rendering (PBR) techniques. PBR focuses on accurately modeling the physics of light scattering. It allows modifying the physical placement of optical elements such as cameras and lights, changing the deformable surface geometry, and changing the optical properties of the sensor surface (shiny vs. diffuse). Our system can therefore serve both as a design tool and as simulation software for producing accurate tactile images. It successfully captures the variation in color, light intensity, and shape at different locations on the elastomer surface.
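To illustrate how lights at different positions produce the characteristic color-shaded tactile images, here is a minimal sketch that shades a heightfield under tinted directional lights using a simple Lambertian model. This is a toy stand-in for the full PBR pipeline described in the paper; the geometry, light directions, and colors below are made-up illustration values, not the sensor's actual parameters.

```python
import numpy as np

def render_heightfield(height, lights):
    """Shade a heightfield under tinted directional lights (Lambertian toy model)."""
    # Surface normals from heightfield gradients: n = (-dh/dx, -dh/dy, 1), normalized.
    gy, gx = np.gradient(height)
    n = np.dstack([-gx, -gy, np.ones_like(height)])
    n /= np.linalg.norm(n, axis=2, keepdims=True)
    img = np.zeros(height.shape + (3,))
    for direction, color in lights:
        d = np.asarray(direction, float)
        d /= np.linalg.norm(d)
        lam = np.clip(n @ d, 0.0, None)  # Lambert's cosine term, clamped at grazing
        img += lam[..., None] * np.asarray(color, float)
    return np.clip(img, 0.0, 1.0)

# A hemispherical bump pressed into a flat gel patch (toy geometry).
y, x = np.mgrid[-1:1:64j, -1:1:64j]
bump = np.clip(0.2 - (x**2 + y**2), 0.0, None)
# Three tinted lights from different sides, loosely GelSight-style (assumed values).
lights = [((1, 0, 0.5), (0.9, 0.1, 0.1)),
          ((-1, 0, 0.5), (0.1, 0.9, 0.1)),
          ((0, 1, 0.5), (0.1, 0.1, 0.9))]
img = render_heightfield(bump, lights)
```

Because each light arrives from a different direction with a different tint, opposite slopes of the bump pick up different colors, which is the cue the real sensor exploits.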

We propose specific models of the lights, the translucent “Gelcore”, and the deformable elastomer surface generation process, which together allow us to simulate GelSight.

Simulated sensor with physically accurate components (bottom row, left to right): a) Gelcore, b) light models, c) lights mounted on the Gelcore, d) outer supporting shell.

Surface heightfield generation
We use a 2D heightfield representation in our simulation to model the geometry of the deformable elastomer surface. This representation allows deformations to be modeled with simple image processing operations and leads to faster rendering times.
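As a sketch of the kind of image processing operation this representation enables, the following toy contact model imprints an object's depth map into a flat gel heightfield with a pixelwise minimum and then smooths the result. The contact rule and the blur are assumptions for illustration, not the paper's deformation model.

```python
import numpy as np

def box_blur(a, k=5):
    """Simple uniform blur, standing in for elastomer smoothing."""
    p = k // 2
    padded = np.pad(a, p, mode='edge')
    out = np.zeros_like(a, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + a.shape[0], dx:dx + a.shape[1]]
    return out / (k * k)

def press_object(gel, object_depth, press_depth):
    """Imprint an object's depth map into a gel heightfield (toy contact model).

    gel          -- 2D array, undeformed gel surface height
    object_depth -- 2D array, object height above its lowest contact point
    press_depth  -- how far the object's lowest point is pushed into the gel
    """
    # The gel surface cannot lie above the pressed-in object surface.
    object_surface = gel - press_depth + object_depth
    deformed = np.minimum(gel, object_surface)
    # Smooth the imprint to mimic the elastomer resisting sharp creases.
    return box_blur(deformed)

# Press a hemisphere 0.1 units into a flat gel patch.
y, x = np.mgrid[-1:1:64j, -1:1:64j]
r2 = x**2 + y**2
hemisphere = np.where(r2 < 0.25, 0.5 - np.sqrt(np.maximum(0.25 - r2, 0.0)), 1.0)
deformed = press_object(np.zeros((64, 64)), hemisphere, press_depth=0.1)
```

The deformed heightfield can then be fed directly to the renderer, which is what makes the 2D representation convenient.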


Optimization for simulation parameters
We leverage differentiable rendering techniques to optimize the light intensities and the elastomer surface properties. The technique is fully general and works for arbitrarily complex, physically valid light and material models. This allows us not only to simulate new sensor designs but also to match existing built sensors and use them for Sim2Real applications.
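The idea can be sketched with a toy stand-in for differentiable rendering: if the rendered image is linear in the per-light intensities, the gradient of a photometric loss with respect to those intensities is exact, and plain gradient descent recovers them. The basis images, target intensities, and learning rate below are illustrative assumptions; the paper optimizes through a full PBR pipeline, not this linear model.

```python
import numpy as np

# Toy differentiable forward model: the rendered image is a linear combination
# of per-light "basis" images, weighted by the light intensities w.
rng = np.random.default_rng(0)
basis = rng.random((3, 32, 32))      # one rendered image per unit-intensity light
true_w = np.array([0.8, 0.5, 0.3])  # "real sensor" intensities to recover
target = np.tensordot(true_w, basis, axes=1)

w = np.ones(3)  # initial guess for the light intensities
for _ in range(500):
    pred = np.tensordot(w, basis, axes=1)
    resid = pred - target
    # Gradient of L = 0.5 * ||pred - target||^2 with respect to w.
    grad = np.tensordot(basis, resid, axes=([1, 2], [0, 1]))
    w -= 1e-3 * grad
```

The same loop structure carries over to a real differentiable renderer, with the analytic gradient replaced by one obtained via automatic differentiation.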

Experiments

We collected images using our sensor prototype and compared them with the images generated by our simulation when objects are pressed against the sensor at multiple spatial locations, including arbitrarily complex objects. Please refer to our paper for a complete description of the experiments and to the supplementary material for additional ablation studies.

Spatial variation: shapes with multiple surface geometries are pressed at different locations on the deformable sensor surface. The comparison highlights the need to model the illumination variation across the sensor surface.

Simulating complex shapes: our approach can model arbitrarily complex geometry and apply image processing operations to match real-world surface deformations.

Resources

3D Printable STL files: objects used in our experiments

Image Dataset 1: real sensor images used for comparison with prior work

Simulated Images 1: images generated by our method for Dataset 1

SSIM Maps 1: SSIM maps representing the pixelwise error for Dataset 1

Image Dataset 2: real sensor images used for comparing performance at various sensor locations

Simulated Images 2: images generated by our method for Dataset 2

SSIM Maps 2: SSIM maps representing the pixelwise error for Dataset 2
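For reference, a pixelwise SSIM map like the ones listed above can be computed as follows. This is a minimal uniform-window sketch of the standard SSIM definition, assuming images scaled to [0, 1] and the usual constants (c1 = 0.01², c2 = 0.03²); the paper's maps may use a different window or implementation.

```python
import numpy as np

def ssim_map(a, b, win=7, c1=0.01**2, c2=0.03**2):
    """Pixelwise SSIM map for two same-size images in [0, 1] (uniform window)."""
    def local_mean(x):
        p = win // 2
        padded = np.pad(x, p, mode='edge')
        out = np.zeros_like(x, dtype=float)
        for dy in range(win):
            for dx in range(win):
                out += padded[dy:dy + x.shape[0], dx:dx + x.shape[1]]
        return out / win**2
    mu_a, mu_b = local_mean(a), local_mean(b)
    var_a = local_mean(a * a) - mu_a**2
    var_b = local_mean(b * b) - mu_b**2
    cov = local_mean(a * b) - mu_a * mu_b
    # SSIM combines a luminance term and a contrast/structure term.
    return ((2 * mu_a * mu_b + c1) * (2 * cov + c2)) / \
           ((mu_a**2 + mu_b**2 + c1) * (var_a + var_b + c2))

# Identical images score 1 everywhere; mismatched regions score lower.
img = np.random.default_rng(1).random((32, 32))
m_same = ssim_map(img, img)
m_diff = ssim_map(img, 1.0 - img)
```

Averaging the map over all pixels gives the scalar SSIM score used for the quantitative comparison.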