
Abstract

Suction cups offer a useful gripping solution, particularly in industrial robotics and warehouse applications. Vision-based grasp algorithms such as Dex-Net show promise but struggle to accurately perceive dark or reflective objects, sub-resolution features, and occlusions, resulting in suction cup grip failures. In our prior work, we designed the Smart Suction Cup, which estimates the flow state within the cup and provides a mechanically resilient end-effector that can inform arm feedback control through a sense of touch. We then demonstrated how this cup's signals enable haptically-driven search behaviors for better grasping points on adversarial objects. That prior work used a model-based approach to predict the desired motion direction, which raises the question: does a data-driven approach perform better? This technical report provides an initial analysis of the previously collected data. Specifically, we compare the model-based method against a preliminary data-driven approach for accurately estimating the lateral pose adjustment direction to improve grasp success.
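
The comparison described above could be set up roughly as follows. This is a minimal sketch under assumed details not given in the abstract: the number of sensing channels in the cup, the per-channel mean/std features, the four-way direction labels, the synthetic data standing in for the previously collected dataset, and the logistic-regression classifier are all illustrative choices, not the report's actual method.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical setup: each grasp attempt yields a short window of flow/pressure
# readings from several sensing channels inside the cup; the label encodes the
# lateral adjustment direction (0..3 for +x, -x, +y, -y) that improved the seal.
rng = np.random.default_rng(0)
n_samples, n_channels, window = 200, 4, 50

# Synthetic stand-in for the previously collected dataset.
X_raw = rng.normal(size=(n_samples, n_channels, window))
y = rng.integers(0, 4, size=n_samples)

# Model-based baseline (illustrative only): move toward the channel with the
# largest mean signal, assuming it indicates the leak side.
def model_based_direction(signals):
    return int(np.argmax(signals.mean(axis=-1)))

# Data-driven alternative: simple per-channel mean and std features fed to a
# multinomial logistic-regression classifier.
features = np.concatenate([X_raw.mean(axis=-1), X_raw.std(axis=-1)], axis=1)
X_train, X_test, y_train, y_test = train_test_split(
    features, y, test_size=0.25, random_state=0
)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

print("model-based prediction, first sample:", model_based_direction(X_raw[0]))
print("data-driven test accuracy:", clf.score(X_test, y_test))
```

In practice the two predictors would be evaluated on the same held-out grasp attempts, comparing how often each one's suggested lateral adjustment direction leads to a successful grasp.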
