Emergent Mind

Drone Control based on Mental Commands and Facial Expressions

(2102.01429)
Published Feb 2, 2021 in cs.HC

Abstract

Drones can be controlled in many different ways through various devices: motions such as facial movements, special gloves with sensors, RGB cameras on a laptop, or even smartwatches that pick up gestures with motion sensors. This paper proposes a method for controlling drones using brainwaves, without any of those devices. The drone control system of the current research was developed using electroencephalogram (EEG) signals acquired from the user's brain by an Emotiv Insight headset. The headset processes the signals and sends them to a computer over Bluetooth Low Energy. The user's brain is trained so that the generated EEG data can be used as commands. The final signal is transmitted to a Raspberry Pi Zero via the MQTT messaging protocol, and the Raspberry Pi controls the movement of the drone according to the incoming signal from the headset. In the future, brain control could replace conventional input sources such as keyboards and touch screens, enhancing interactive experiences and providing new ways for disabled people to engage with their surroundings.
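The relay stage of the pipeline described above, where a classified mental command is mapped to a drone action and forwarded to the Raspberry Pi Zero over MQTT, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the command labels, topic name, and mapping are assumptions, and the MQTT client is abstracted so any paho-mqtt-style object with a `publish(topic, payload)` method can be plugged in.

```python
# Hypothetical sketch of the command-relay stage: a mental-command label
# (as a classifier on the computer might emit it) is translated into a
# drone movement command and published over MQTT to the Raspberry Pi Zero.

# Illustrative mapping from mental-command labels to drone actions
# (labels and actions are assumptions, not taken from the paper).
COMMAND_MAP = {
    "push": "forward",
    "pull": "backward",
    "lift": "up",
    "drop": "down",
    "neutral": "hover",
}

DRONE_TOPIC = "drone/commands"  # assumed MQTT topic name


def command_to_payload(mental_command: str) -> str:
    """Translate a classified mental command into the drone action string;
    unknown or unrecognized commands default to hovering for safety."""
    return COMMAND_MAP.get(mental_command, "hover")


def publish_command(client, mental_command: str) -> None:
    """Publish the mapped action over MQTT; `client` is any object with a
    paho-mqtt-style publish(topic, payload) method."""
    client.publish(DRONE_TOPIC, command_to_payload(mental_command))


if __name__ == "__main__":
    print(command_to_payload("push"))   # forward
    print(command_to_payload("blink"))  # hover (unrecognized command)
```

Defaulting unrecognized commands to "hover" is one plausible safety choice for a brain-controlled drone, since misclassified EEG signals are common and should not translate into uncontrolled motion.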
