Abstract

While there is no replacement for the learned expertise, devotion, and social benefits of a guide dog, there are cases in which a robot navigation assistant could be helpful for individuals with blindness or low vision (BLV). This study investigated the potential for an industrial agile robot to perform guided navigation tasks. We developed two interface prototypes that allowed spatial information to be exchanged within a human-robot pair: a voice-based app and a flexible, responsive handle. The participants (n=21) completed simple navigation tasks and a post-study survey about the prototype functionality and their trust in the robot. All participants successfully completed the navigation tasks, demonstrating that the interface prototypes were able to pass spatial information between the human and the robot. Future work will include expanding the voice-based app to allow the robot to communicate obstacles to the handler and adding haptic signals to the handle design.
