Emergent Mind

Learning active tactile perception through belief-space control

(2312.00215)
Published Nov 30, 2023 in cs.RO and cs.AI

Abstract

Robots operating in an open world will encounter novel objects with unknown physical properties, such as mass, friction, or size. These robots will need to sense these properties through interaction prior to performing downstream tasks with the objects. We propose a method that autonomously learns tactile exploration policies by developing a generative world model that is leveraged to 1) estimate the object's physical parameters using a differentiable Bayesian filtering algorithm and 2) develop an exploration policy using an information-gathering model predictive controller. We evaluate our method on three simulated tasks where the goal is to estimate a desired object property (mass, height or toppling height) through physical interaction. We find that our method is able to discover policies that efficiently gather information about the desired property in an intuitive manner. Finally, we validate our method on a real robot system for the height estimation task, where our method is able to successfully learn and execute an information-gathering policy from scratch.

Overview

  • Introduction of a framework to enable robots to perceive physical properties of unknown objects through tactile exploration.

  • Use of a generative world model predicting object reactions using a differentiable Bayesian filter for property inference.

  • Development of an exploration policy based on an information-gathering model predictive controller, allowing the robot to discover exploration motions autonomously.

  • Robots tested in simulation and on real systems, outperforming a Deep Reinforcement Learning baseline in efficiency and accuracy.

  • Demonstration of enhanced robot adaptability in unstructured environments and independent learning of manipulation for information gathering.

Introduction

The research introduces a framework for teaching robots to infer the physical properties of objects in their environment without any prior knowledge of those objects. The aim is to equip robots with the ability to actively perceive properties such as mass, friction, and size through tactile exploration, similar to how humans naturally probe unfamiliar objects by pushing, lifting, or poking them.

Methodology

The framework outlined in the study consists of two main components. First, a generative world model is learned to predict how an object will respond to the robot's actions. This model is used within a differentiable Bayesian filtering algorithm that infers the object's physical parameters from a sequence of sensor readings gathered over the course of physical interaction.
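The flavor of this filtering step can be illustrated with a minimal sketch. Here the belief over a single parameter (mass) is represented on a discrete grid and updated with a Gaussian observation likelihood; the generative model `predict_observation`, the noise scale `sigma`, and the force-to-acceleration dynamics are illustrative assumptions, not the paper's actual (learned, differentiable) model.

```python
import numpy as np

def predict_observation(theta, action):
    """Stand-in generative model: predicted sensor reading for a candidate
    parameter value theta after applying an action (here: a push force).
    Toy dynamics: observed acceleration = force / mass."""
    return action / theta

def bayes_update(thetas, belief, action, observation, sigma=0.05):
    """One Bayes-filter step over a discretised parameter grid:
    multiply the prior belief by a Gaussian likelihood and renormalise."""
    predicted = predict_observation(thetas, action)
    likelihood = np.exp(-0.5 * ((observation - predicted) / sigma) ** 2)
    posterior = belief * likelihood
    return posterior / posterior.sum()

# Candidate masses and a uniform prior belief.
thetas = np.linspace(0.5, 5.0, 100)
belief = np.ones_like(thetas) / len(thetas)

# Simulated interaction: true mass 2.0 kg, push with 1.0 N, observe acceleration.
true_mass, action = 2.0, 1.0
observation = action / true_mass
belief = bayes_update(thetas, belief, action, observation)

estimate = thetas[np.argmax(belief)]  # posterior mode ~ 2.0 kg
```

In the paper the filter is differentiable end to end, so gradients can flow through the belief update when training the world model; the grid-based update above only conveys the Bayesian structure of the inference.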

Second, an exploration policy is developed using model predictive control with an information-gathering objective. The learned policy produces different motions depending on which property is being estimated. Importantly, this approach enables the robot to discover exploration strategies autonomously rather than relying on pre-programmed or human-engineered motions.

Experimental Procedures

The framework was evaluated in both simulated environments and on a real robot system. Three experimental tasks were designed to validate its ability to autonomously learn and refine exploration strategies: estimating the mass, height, and minimum toppling height of various objects through interactions such as pushing or poking.

A comparative analysis was also conducted against a Deep Reinforcement Learning baseline. The method proposed in this study was found to yield better data efficiency and more accurate property estimation, underlining the potential of this framework for developing robots that can engage in meaningful physical interactions with their environments.

Conclusion

The study marks a significant step forward in robot perception capabilities. By demonstrating that robots can independently learn how to manipulate objects to gather the information they need, the framework promises to enhance the adaptability of robotic systems in unstructured environments. Its successful application in simulated tasks and on a real robot system, where the height-estimation policy was learned and executed from scratch, underscores the practicality of this active perception approach.
