Abstract

Quadruped locomotion is currently a vibrant research area that has reached a level of maturity and performance enabling some of the most advanced real-world applications of autonomous quadruped robots in both academia and industry. Blind, robust quadruped locomotion has advanced considerably in control and hardware over recent decades. In complex environments, however, capabilities such as terrain perception and path planning are still required, and visual perception is indispensable for legged locomotion to meet this demand. This study explores a vision-based navigation method for Pegasus-Mini, a small-scale quadruped robot, aiming to enable efficient and reliable navigation on small-scale platforms where computational resources and space are limited. A CNN-based semantic segmentation model performs real-time path segmentation in the outdoor environment. The desired traversal trajectory is generated by continuously updating the path's middle line, computed from the edge positions of the segmented path in the images. To improve the stability of path planning based directly on semantic segmentation, a trajectory compensation method that incorporates temporal information revises untrustworthy planned paths. Semantic segmentation and navigation experiments in a garden scene demonstrate the effectiveness of the proposed method.
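The abstract describes two key steps: computing a middle line from the left and right edges of the segmented path, and revising untrustworthy estimates using temporal information. A minimal sketch of both ideas is below; the function names, the per-row edge-midpoint formulation, and the jump-rejection/blending parameters are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def path_midline(mask):
    """For each image row containing path pixels in a binary segmentation
    mask, take the midpoint between the leftmost and rightmost path pixels.
    Returns an array of (row, mid_x) pairs approximating the middle line."""
    midline = []
    for row in range(mask.shape[0]):
        cols = np.flatnonzero(mask[row])
        if cols.size > 0:
            left, right = cols[0], cols[-1]
            midline.append((row, (left + right) / 2.0))
    return np.array(midline)

def smooth_midline(new_mid, prev_mid, alpha=0.7, max_jump=20.0):
    """Hypothetical temporal compensation: reject a frame whose midline
    estimate jumps implausibly far from the previous one, otherwise blend
    the new and previous estimates exponentially."""
    if prev_mid is None:
        return new_mid
    if abs(new_mid - prev_mid) > max_jump:
        return prev_mid  # untrustworthy frame: keep the previous estimate
    return alpha * new_mid + (1 - alpha) * prev_mid
```

In a navigation loop, `path_midline` would run on each segmented frame and `smooth_midline` would filter a summary statistic of the result (e.g. the midline's mean x-offset) before it is handed to the trajectory planner.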
