Characterization of Real-time Haptic Feedback from Multimodal Neural Network-based Force Estimates during Teleoperation (2109.11488v3)
Abstract: Force estimation using neural networks is a promising approach to enable haptic feedback in minimally invasive surgical robots without end-effector force sensors. Various network architectures have been proposed, but none have been tested in real time with surgical-like manipulations. Thus, questions remain about the real-time transparency and stability of force feedback from neural network-based force estimates. We characterize the real-time impedance transparency and stability of force feedback rendered on a da Vinci Research Kit teleoperated surgical robot using neural networks with vision-only, state-only, and combined state-and-vision inputs. Networks were trained on an existing dataset of teleoperated manipulations without force feedback. To measure real-time stability and transparency during teleoperation with force feedback to the operator, we modeled a one-degree-of-freedom human operator and surgeon-side manipulandum that drove the patient-side robot to perform manipulations on silicone artificial tissue across various robot configurations, camera configurations, and tools. We found that networks using state inputs displayed more transparent impedance than the vision-only network. However, the state-based networks exhibited large instabilities when used to provide force feedback during lateral manipulation of the silicone. In contrast, the vision-only network remained stable in all evaluated directions. We confirmed the performance of the vision-only network for real-time force feedback in a demonstration with a human teleoperator.
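The combined state-and-vision approach described above can be sketched as a late-fusion estimator: vision features (e.g., from a CNN encoder) are concatenated with the robot state vector and passed through a small regression head that outputs an end-effector force estimate. The architecture, dimensions, and weights below are illustrative assumptions, not details from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

# Illustrative dimensions (assumed, not from the paper): a 64-d vision
# embedding and a 14-d robot state vector (e.g., joint positions/velocities).
VIS_DIM, STATE_DIM, HIDDEN, OUT = 64, 14, 32, 3

# Randomly initialized weights stand in for trained parameters.
W1 = rng.standard_normal((VIS_DIM + STATE_DIM, HIDDEN)) * 0.1
b1 = np.zeros(HIDDEN)
W2 = rng.standard_normal((HIDDEN, OUT)) * 0.1
b2 = np.zeros(OUT)

def estimate_force(vision_feat: np.ndarray, state: np.ndarray) -> np.ndarray:
    """Late-fusion force estimate: concatenate modalities, then an MLP head."""
    x = np.concatenate([vision_feat, state])
    h = relu(x @ W1 + b1)
    return h @ W2 + b2  # estimated (Fx, Fy, Fz) at the end effector

# One timestep of (synthetic) teleoperation data.
f_hat = estimate_force(rng.standard_normal(VIS_DIM),
                       rng.standard_normal(STATE_DIM))
print(f_hat.shape)  # (3,)
```

A vision-only or state-only network, as compared in the paper, would simply drop one of the two input branches before the fusion step.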