Emergent Mind

Evetac: An Event-based Optical Tactile Sensor for Robotic Manipulation

(2312.01236)
Published Dec 2, 2023 in cs.RO and cs.LG

Abstract

Optical tactile sensors have recently become popular. They provide high spatial resolution, but struggle to offer fine temporal resolutions. To overcome this shortcoming, we study the idea of replacing the RGB camera with an event-based camera and introduce a new event-based optical tactile sensor called Evetac. Along with hardware design, we develop touch processing algorithms to process its measurements online at 1000 Hz. We devise an efficient algorithm to track the elastomer's deformation through the imprinted markers despite the sensor's sparse output. Benchmarking experiments demonstrate Evetac's capabilities of sensing vibrations up to 498 Hz, reconstructing shear forces, and significantly reducing data rates compared to RGB optical tactile sensors. Moreover, Evetac's output and the marker tracking provide meaningful features for learning data-driven slip detection and prediction models. The learned models form the basis for a robust and adaptive closed-loop grasp controller capable of handling a wide range of objects. We believe that fast and efficient event-based tactile sensors like Evetac will be essential for bringing human-like manipulation capabilities to robotics. The sensor design is open-sourced at https://sites.google.com/view/evetac .

Overview

  • Evetac is a high-resolution, event-based optical tactile sensor that mimics human touch sensors and operates at 1000 Hz.

  • The sensor uses a commercial event-based camera and a soft silicone gel with imprinted markers to track deformations, detecting vibrations up to 498 Hz.

  • Included algorithms process sparse event data into meaningful representations for tracking deformations and estimating shear forces.

  • Evetac has been applied to slip detection and grasp control, demonstrating its ability to predict and react to slip events for stable manipulation.

  • The paper concludes that Evetac is a promising open-source tool that may advance the dexterity of robotic hands in various manipulation tasks.

Event-Based Tactile Sensing for Robotic Manipulation

Introduction to Event-Based Tactile Sensing

Tactile sensing plays a crucial role in robotic manipulation, allowing robots to interact with objects in a manner that emulates human touch. Advances in tactile sensors have opened new doors for enhancing robotic perception and dexterity, with event-based cameras at the forefront due to their high temporal resolution and data efficiency. These tactile sensors are designed to mimic the sophisticated sensing capabilities of human skin, providing both high spatial resolution and the ability to detect minute vibrations, which are essential for tasks requiring delicate manipulation.

Evetac: A New Optical Tactile Sensor

Evetac is a novel event-based optical tactile sensor that promises to offer substantial improvements in resolution and data processing speed. It is built from a commercially available event-based camera and a soft silicone gel with imprinted markers to track deformations. The sensor operates at 1000 Hz, maintaining high spatial resolution similar to that of human tactile receptors. Evetac stands out by achieving higher temporal resolutions, allowing it to detect rapid tactile events such as vibrations up to 498 Hz. Moreover, its event-driven nature significantly reduces data rates compared to traditional RGB optical tactile sensors.
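To make the 1000 Hz processing concrete, the sketch below accumulates an asynchronous event stream into fixed 1 ms frames, which is one common way to feed event-camera output to downstream algorithms. The `(timestamp_us, x, y, polarity)` tuple format and the 480×640 resolution are assumptions for illustration, not Evetac's actual driver API.

```python
import numpy as np

def bin_events(events, window_us=1000, height=480, width=640):
    """Accumulate an event stream into fixed 1 ms frames.

    events: iterable of (t_us, x, y, polarity) tuples, a hypothetical
    generic event-camera format. Positive-polarity events add +1 to the
    pixel, negative-polarity events add -1. Returns one signed count
    image per 1 ms window, in temporal order.
    """
    events = np.asarray(events)
    if events.size == 0:
        return []
    t0 = events[0, 0]
    frames = {}
    for t, x, y, p in events:
        idx = int((t - t0) // window_us)  # which 1 ms window this event falls in
        frame = frames.setdefault(idx, np.zeros((height, width), dtype=np.int32))
        frame[int(y), int(x)] += 1 if p > 0 else -1
    return [frames[k] for k in sorted(frames)]
```

Because only pixels that actually changed generate events, most entries in each frame stay zero, which is the source of the data-rate savings over a dense RGB stream.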

Touch Processing Algorithms and Features

Accompanying the hardware, a collection of algorithms has been developed to efficiently process the raw output of Evetac. These algorithms transform the sparse event data into meaningful representations which aid in tracking the gel's deformation and estimating shear forces. The advanced touch processing algorithms maintain awareness of the global configuration of the gel, even given the sensor's sparse outputs. Notable features that have been utilized for sensing tasks include overall event count, events per dot, and dot displacement, enabling a balance between sensing efficiency and the ability to resolve detailed touch-related phenomena.
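The features named above (overall event count and events per dot) can be sketched as follows. This is a minimal illustration assuming events arrive as pixel coordinates and marker-dot centers are known from a separate tracker; the function name, radius parameter, and data layout are hypothetical, not the paper's implementation.

```python
import numpy as np

def tactile_features(events_xy, dot_centers, radius=5.0):
    """Compute two simple per-frame features: total event count and
    events per marker dot (events within `radius` pixels of each dot).

    events_xy:   (N, 2) array of event pixel coordinates for one frame.
    dot_centers: (D, 2) array of current marker-dot centers, e.g. from
                 a tracker that updates them over time (dot displacement
                 would be the change of these centers between frames).
    """
    events_xy = np.asarray(events_xy, dtype=float)
    dot_centers = np.asarray(dot_centers, dtype=float)
    total = len(events_xy)
    per_dot = np.zeros(len(dot_centers), dtype=int)
    if total:
        # distance of every event to every dot center, shape (N, D)
        d = np.linalg.norm(events_xy[:, None, :] - dot_centers[None, :, :], axis=-1)
        nearest = d.argmin(axis=1)          # closest dot for each event
        within = d.min(axis=1) <= radius    # drop events far from all dots
        np.add.at(per_dot, nearest[within], 1)
    return total, per_dot
```

Assigning each event only to its nearest dot keeps the representation compact while still localizing where on the gel the contact activity occurs.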

Slip Detection and Grasp Control Experiments

The paper further details how the sensor and its algorithms lend themselves to the task of slip detection. Using neural network models trained on labelled data, the paper describes how slip can be effectively identified and even predicted ahead of time, demonstrating the potential for reactive and adaptive robotic manipulation. The successful integration of Evetac into a closed-loop grasp controller illustrates its capability for handling a diverse range of objects and maintaining grip stability in the presence of disturbances.
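A closed-loop controller of this kind can be sketched as a simple reactive rule: when the learned model's slip probability crosses a threshold, increase the commanded grip force, otherwise hold. The gains, limits, and threshold below are illustrative placeholders, not values from the paper.

```python
def grasp_control_step(slip_prob, force, f_min=0.5, f_max=8.0,
                       gain=1.5, threshold=0.5):
    """One control step of a minimal slip-reactive grip-force rule.

    slip_prob: slip probability from a learned detector/predictor in [0, 1].
    force:     currently commanded grip force in newtons.
    Returns the next commanded force, clamped to [f_min, f_max].
    All constants are hypothetical, chosen only for illustration.
    """
    if slip_prob > threshold:
        # Predicted slip: tighten the grasp multiplicatively, up to f_max.
        force = min(force * gain, f_max)
    # Never command less than a minimal holding force.
    return max(force, f_min)
```

Running this rule at the sensor's 1 kHz update rate is what allows the gripper to react to (or pre-empt) slip before the object moves appreciably; a predictive model lets the force increase begin slightly before slip onset.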

Conclusion and Future Work

Evetac is an open-source contribution to the field of tactile sensing. The paper's thorough experimental validation shows its potential to enhance robot manipulation skills, with notable results in grasping activities and adaptive force control. Looking ahead, further research could explore the integration of Evetac with more complex robotic hands and the application of other neural network architectures that could improve latency and feature extraction. The advancements presented by Evetac mark a significant step towards attaining human-like dexterity in robotic manipulation.
