
Abstract

The uncertainty and variability of the underwater environment create a need to control underwater robots dynamically and in real time, especially in scenarios where humans and robots must work collaboratively in the field. However, the underwater environment imposes harsh restrictions on typical control and communication methods. Since gestures are a natural and efficient way for humans to interact, we implement a real-time gesture recognition system based on a convolutional neural network that can recognize 50 kinds of gestures from images captured by a single ordinary monocular camera, and we apply this recognition system to human-underwater-robot interaction. We design A Flexible and Extendable Interaction Scheme (AFEIS) through which underwater robots can be programmed in situ underwater by human operators using a customized gesture-based sign language. This paper elaborates on the design of the gesture recognition system and AFEIS, and presents the results of field trials in which we applied this system and scheme to underwater robots.
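As an illustration of the recognition component described in the abstract, the sketch below shows a minimal convolutional classifier that maps a single monocular RGB frame to one of 50 gesture classes. The abstract does not specify the network architecture, so the layer layout, input resolution, and use of PyTorch here are illustrative assumptions; only the 50-class output reflects the paper's stated capability.

```python
# Minimal sketch of a 50-class gesture classifier for single monocular RGB
# frames. This is NOT the authors' architecture (the abstract does not give
# one); layer widths, input size, and framework choice are assumptions.
import torch
import torch.nn as nn

NUM_GESTURES = 50  # the abstract states the system recognizes 50 gestures


class GestureCNN(nn.Module):
    def __init__(self, num_classes: int = NUM_GESTURES):
        super().__init__()
        # Three convolutional blocks: RGB frame -> downsampled feature maps.
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # Global pooling + linear head so the classifier works for any frame size.
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))


if __name__ == "__main__":
    model = GestureCNN()
    frame = torch.randn(1, 3, 224, 224)      # one 224x224 RGB frame (assumed size)
    logits = model(frame)                    # shape: [1, 50]
    gesture_id = logits.argmax(dim=1).item() # predicted gesture class index
    print(logits.shape, gesture_id)
```

In a real-time pipeline such as the one the paper describes, each camera frame would be preprocessed and passed through a model of this kind, and the predicted gesture indices would then feed a higher-level scheme like AFEIS that interprets gesture sequences as robot instructions.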
