Emergent Mind

Walk the Lines: Object Contour Tracing CNN for Contour Completion of Ships

(2004.06587)
Published Apr 14, 2020 in cs.CV and cs.LG

Abstract

We develop a new contour tracing algorithm to enhance the results of the latest object contour detectors. The goal is to achieve a perfectly closed, 1 pixel wide and detailed object contour, since this type of contour can be analyzed with methods such as Fourier descriptors. Convolutional Neural Networks (CNNs) are rarely used for contour tracing. However, we find that CNNs are tailor-made for this task, which is why we present the Walk the Lines (WtL) algorithm, a standard regression CNN trained to follow object contours. As a first step, we train the CNN only on ship contours, but the principle is also applicable to other objects. Input data are the image and the associated object contour prediction of the recently published RefineContourNet. The WtL gets a center pixel, which defines an input section, and an angle for rotating this section. Ideally, the center pixel moves along the contour, while the angle describes upcoming directional contour changes. The WtL predicts its steps pixelwise in a self-routing way. To obtain a complete object contour, the WtL runs in parallel at different image locations and the traces of its individual paths are summed. In contrast to the comparable Non-Maximum Suppression method, our approach produces connected contours with finer details. Finally, the object contour is binarized under the condition of being closed. When all procedures work as desired, excellent ship segmentations with high IoUs are produced, showing details such as antennas and ship superstructures that are easily omitted by other segmentation methods.
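The tracing loop the abstract describes (a center pixel and rotation angle defining an input section, a CNN predicting directional changes, parallel traces summed into one map) can be sketched as follows. This is a minimal illustration, not the authors' implementation: `predict_step` is a hypothetical stand-in for the trained regression CNN, and the patch size, step length, and stopping rule are assumptions.

```python
import numpy as np

def rotate_patch(img, center, angle, size=15):
    # Extract a size x size input section around `center`, rotated by
    # `angle`, using nearest-neighbour sampling (patch size is an assumption).
    half = size // 2
    ys, xs = np.mgrid[-half:half + 1, -half:half + 1]
    c, s = np.cos(angle), np.sin(angle)
    ry = np.clip(np.rint(center[0] + c * ys - s * xs).astype(int), 0, img.shape[0] - 1)
    rx = np.clip(np.rint(center[1] + s * ys + c * xs).astype(int), 0, img.shape[1] - 1)
    return img[ry, rx]

def walk_the_lines(contour_map, start, start_angle, predict_step, n_steps=100):
    """Self-routing trace: repeatedly crop a rotated section, let the
    model predict the directional change, and advance one pixel.

    `predict_step(patch) -> angle delta` stands in for the regression CNN
    (a hypothetical interface, not the paper's actual model)."""
    trace = np.zeros_like(contour_map)
    pos = np.array(start, dtype=float)
    angle = start_angle
    for _ in range(n_steps):
        patch = rotate_patch(contour_map, pos, angle)
        angle += predict_step(patch)                      # predicted direction change
        pos += np.array([np.sin(angle), np.cos(angle)])   # one-pixel step
        y, x = int(round(pos[0])), int(round(pos[1]))
        if not (0 <= y < trace.shape[0] and 0 <= x < trace.shape[1]):
            break
        trace[y, x] += 1.0  # traces from parallel runs would be summed like this
    return trace
```

Running several such walks from different start pixels and summing their `trace` maps, then binarizing under a closure condition, mirrors the pipeline the abstract outlines.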

