Initial Analysis of Data-Driven Haptic Search for the Smart Suction Cup (2401.06354v1)
Abstract: Suction cups offer a useful gripping solution, particularly in industrial robotics and warehouse applications. Vision-based grasp algorithms such as Dex-Net show promise but struggle to accurately perceive dark or reflective objects, sub-resolution features, and occlusions, resulting in suction cup grip failures. In our prior work, we designed the Smart Suction Cup, which estimates the flow state within the cup and provides a mechanically resilient end-effector that can inform arm feedback control through a sense of touch. We then demonstrated how this cup's signals enable haptically driven search behaviors that find better grasping points on adversarial objects. That prior work used a model-based approach to predict the desired motion direction, which raises the question: does a data-driven approach perform better? This technical report provides an initial analysis using the previously collected data. Specifically, we compare the model-based method with a preliminary data-driven approach for accurately estimating the lateral pose adjustment direction that improves grasp success.
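To make the comparison concrete, the sketch below shows one way a preliminary data-driven baseline for this task might be structured: a classifier that maps Smart Suction Cup flow signals to a discrete lateral adjustment direction. The channel count, direction labels, and choice of classifier are illustrative assumptions and not the authors' implementation; the synthetic data stands in for the previously collected dataset.

```python
# Hypothetical sketch: predicting a lateral pose adjustment direction from
# Smart Suction Cup flow signals. Channel count, labels, and classifier
# choice are assumptions for illustration, not the paper's method.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Assume each contact event yields a feature vector of per-chamber flow
# readings (here: 4 channels) and a label giving the lateral direction the
# cup should translate toward a better seal (+x, -x, +y, -y).
N_SAMPLES, N_CHANNELS = 500, 4
X = rng.normal(size=(N_SAMPLES, N_CHANNELS))   # stand-in sensor features
y = rng.integers(0, 4, size=N_SAMPLES)         # stand-in direction labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# A simple multinomial classifier acts as the data-driven baseline that
# would be compared against the model-based direction estimate.
clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)

pred = clf.predict(X_test)
print(f"Direction-classification accuracy: {accuracy_score(y_test, pred):.2f}")
```

With real labeled data, the comparison would amount to scoring this kind of learned direction estimate and the model-based prediction against the ground-truth adjustment direction on held-out grasp attempts.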