Real-time brain machine interaction via social robot gesture control

(1711.07462)
Published Nov 20, 2017 in cs.HC , cs.SY , and q-bio.NC

Abstract

Brain-Machine Interaction (BMI) systems have produced interesting and promising results in forward/feedback control consistent with human intention, and they hold great promise for advancements in patient care and neurorehabilitation. Here, we propose a novel neurofeedback-based BCI robotic platform that uses a personalized social robot to assist patients with cognitive deficits through bilateral rehabilitation and mental training. For initial testing of the platform, electroencephalography (EEG) brainwaves of a human user were collected in real time during imaginary-movement tasks. First, the brainwaves associated with imagined body kinematics parameters were decoded to control a cursor on a computer screen in a training protocol. The experienced subject was then able to interact with a social robot via our real-time BMI robotic platform: corresponding to the subject's imagery performance, he/she received specific gesture movements and eye color changes as neural-based feedback from the robot. This hands-free neurofeedback interaction not only enables mind control of a social robot's movements, but also sets the stage for enhancing and recovering mental abilities such as attention by providing humans with real-time neurofeedback from a social robot during training.
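
To make the closed-loop architecture described above more concrete (real-time EEG decoding of imagined kinematics driving first cursor control and then robot gesture/eye-color feedback), here is a minimal Python sketch of such a pipeline. It is an illustrative assumption, not the authors' implementation: the channel count, sampling rate, linear decoder, feature choice, gesture mapping, and the RobotInterface class are all hypothetical stand-ins, and the EEG stream and robot are simulated so the loop runs standalone.

```python
# Minimal sketch of a closed-loop BMI pipeline (assumed design, not the paper's code).
import time
import numpy as np

N_CHANNELS = 8          # assumed EEG montage size
FS = 256                # assumed sampling rate (Hz)
WINDOW = FS // 4        # 250 ms decoding window

rng = np.random.default_rng(0)
# Hypothetical linear decoder: maps per-channel log band power to 2-D cursor velocity.
W = rng.normal(scale=0.1, size=(2, N_CHANNELS))

def read_eeg_window():
    """Stand-in for a real-time EEG acquisition call (simulated noise here)."""
    return rng.normal(size=(N_CHANNELS, WINDOW))

def decode_velocity(eeg):
    """Decode an imagined-kinematics velocity from log band power (illustrative)."""
    power = np.log(np.mean(eeg ** 2, axis=1) + 1e-12)   # one feature per channel
    return W @ power                                     # (vx, vy)

class RobotInterface:
    """Placeholder for the social robot; prints the feedback it would perform."""
    def gesture(self, name, eye_color):
        print(f"robot gesture: {name:>10s} | eyes: {eye_color}")

def velocity_to_feedback(v):
    """Map decoded cursor velocity to a gesture and eye color (assumed mapping)."""
    if np.linalg.norm(v) < 0.05:
        return "idle", "white"
    if abs(v[0]) > abs(v[1]):
        return ("wave_right", "green") if v[0] > 0 else ("wave_left", "blue")
    return ("nod", "yellow") if v[1] > 0 else ("shake_head", "red")

robot = RobotInterface()
cursor = np.zeros(2)
for _ in range(10):                          # a few iterations of the feedback loop
    v = decode_velocity(read_eeg_window())
    cursor += v                              # cursor-control training stage
    robot.gesture(*velocity_to_feedback(v))  # neural-based feedback stage
    time.sleep(WINDOW / FS)                  # pace the loop at the window length
```

In a real deployment, read_eeg_window() would pull samples from an EEG amplifier stream and RobotInterface would wrap the robot's motion and LED APIs; the structure of the loop (acquire, decode, update cursor, deliver robot feedback) is the point of the sketch.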
