Active inference body perception and action for humanoid robots

(1906.03022)
Published Jun 7, 2019 in cs.RO and cs.AI

Abstract

Endowing artificial agents with the computational models proposed for biological systems is one way to understand how intelligent behaviours may emerge. We present an active inference model of body perception and action, working for the first time on a humanoid robot. The model relies on the free energy principle proposed for the brain, in which the goal of both perception and action is to minimise prediction error through gradient descent on the variational free energy bound. The body state (a latent variable) is inferred by minimising the difference between the observed (visual and proprioceptive) sensor values and the predicted ones. Simultaneously, action changes how sensory data are sampled so that they better correspond to the predictions of the internal model. We formalised the algorithm, implemented it on the iCub robot, and tested it in 2D and 3D visual spaces for online adaptation to visual changes, sensory noise, and discrepancies between the model and the real robot. We also compared our approach with classical inverse kinematics in a reaching task, analysing the suitability of such a neuroscience-inspired approach for real-world interaction. The algorithm gave the robot adaptive body perception and toddler-like upper-body reaching with head object tracking, and it was able to incorporate visual features online (in a closed-loop manner) without increasing computational complexity. Moreover, our model predicted involuntary actions in the presence of sensorimotor conflicts, pointing toward a potential proof of active inference in humans.
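To make the abstract's perception/action loop concrete, the following Python snippet is a minimal sketch of gradient descent on a variational free energy reduced to precision-weighted squared prediction errors. It is an illustration under stated assumptions, not the paper's implementation: the forward models g_proprio and g_visual, the Jacobian, the variances, and the assumption that proprioception responds directly to the motor command are all hypothetical choices.

```python
import numpy as np

# Sketch of the loop the abstract describes: perception updates the
# believed body state mu, and action updates the motor command a, both
# by descending the gradient of the (simplified) free energy.

def g_proprio(mu):
    # Predicted proprioception (joint angles) from the believed state.
    return mu

def g_visual(mu):
    # Toy visual forward model: predicted 2D hand position of a
    # two-joint planar arm with unit link lengths (an assumption).
    return np.array([np.cos(mu[0]) + np.cos(mu[0] + mu[1]),
                     np.sin(mu[0]) + np.sin(mu[0] + mu[1])])

def jac_visual(mu):
    # Jacobian of g_visual with respect to the body state mu.
    s01, c01 = np.sin(mu[0] + mu[1]), np.cos(mu[0] + mu[1])
    return np.array([[-np.sin(mu[0]) - s01, -s01],
                     [ np.cos(mu[0]) + c01,  c01]])

def step(mu, a, s_p, s_v, dt=0.01, var_p=1.0, var_v=1.0):
    # Precision-weighted prediction errors: observed minus predicted.
    e_p = (s_p - g_proprio(mu)) / var_p   # proprioceptive error
    e_v = (s_v - g_visual(mu)) / var_v    # visual error

    # Perception: move the belief down the free-energy gradient
    # (with g_proprio the identity, its Jacobian is the identity).
    mu = mu + dt * (e_p + jac_visual(mu).T @ e_v)

    # Action: change the motor command so the sampled sensations come
    # to match the predictions, assuming ds_p/da is the identity.
    a = a - dt * e_p
    return mu, a
```

Called once per control tick with the latest encoder readings and tracked hand position, this loop pulls the belief toward the sensations while the action pulls the sensations toward the predictions. In the paper's reaching experiments a goal would additionally enter as a prior over the latent state; that ingredient is omitted from this sketch.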
