
Abstract

Automatic recognition and classification of tasks in robotic surgery is an important stepping stone toward automated surgery and surgical training. Recent technical breakthroughs in data gathering have made data-driven model development possible. In this paper, we propose a framework for high-level robotic surgery task recognition using motion data. We present a novel classification technique that distinguishes three important surgical tasks through quantitative analysis of motion: knot tying, needle passing, and suturing. The proposed technique integrates state-of-the-art data mining and time series analysis methods. The first step of this framework develops a time series distance-based similarity measure using derivative dynamic time warping (DDTW). The distance-weighted k-nearest neighbor algorithm is then used to classify task instances. The framework was validated on an extensive dataset. Our results demonstrate the strength of the proposed framework in recognizing fundamental robotic surgery tasks.
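The pipeline the abstract describes — compute first-derivative estimates of each motion trace, measure pairwise similarity with dynamic time warping on those derivatives, then vote with a distance-weighted k-nearest neighbor rule — can be sketched as below. This is a minimal illustration, not the authors' implementation: the derivative estimator follows the standard Keogh-Pazzani DDTW formula, and the function names (`ddtw_derivative`, `dtw_distance`, `classify_task`) and the inverse-distance weighting scheme are assumptions for the sketch.

```python
import numpy as np

def ddtw_derivative(x):
    """Keogh-Pazzani derivative estimate used by DDTW:
    d_i = ((x_i - x_{i-1}) + (x_{i+1} - x_{i-1}) / 2) / 2,
    with the endpoints padded by their nearest interior value."""
    x = np.asarray(x, dtype=float)
    d = ((x[1:-1] - x[:-2]) + (x[2:] - x[:-2]) / 2.0) / 2.0
    return np.concatenate(([d[0]], d, [d[-1]]))

def dtw_distance(a, b):
    """Classic O(n*m) dynamic time warping with squared-error local cost."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (a[i - 1] - b[j - 1]) ** 2
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return np.sqrt(D[n, m])

def classify_task(query, train, k=3, eps=1e-12):
    """Distance-weighted kNN over DDTW distances.

    `train` is a list of (series, label) pairs; each of the k nearest
    neighbors votes with weight 1 / (distance + eps)."""
    dq = ddtw_derivative(query)
    scored = sorted(
        (dtw_distance(dq, ddtw_derivative(x)), y) for x, y in train
    )
    votes = {}
    for d, y in scored[:k]:
        votes[y] = votes.get(y, 0.0) + 1.0 / (d + eps)
    return max(votes, key=votes.get)

# Toy usage: two synthetic "task" shapes (rising vs. falling traces).
train = [
    (np.arange(10.0), "up"),
    (np.arange(10.0) + 1.0, "up"),
    (10.0 - np.arange(10.0), "down"),
    (11.0 - np.arange(10.0), "down"),
]
print(classify_task(np.arange(10.0) * 1.1, train))  # a rising query
```

In a real setting each series would be a multi-dimensional kinematic trace (e.g. tool-tip trajectories) rather than a scalar sequence, and the DTW step would sum the local cost across dimensions.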

