
Stable Object Placing using Curl and Diff Features of Vision-based Tactile Sensors (2403.19129v1)

Published 28 Mar 2024 in cs.RO

Abstract: Ensuring stable object placement is crucial to prevent objects from toppling over, breaking, or causing spills. When an object makes initial contact with a surface and some force is exerted, the moment of rotation caused by the instability of the placement can cause the object to rotate in a certain direction (henceforth referred to as the direction of corrective rotation). Existing methods often employ a Force/Torque (F/T) sensor to estimate the direction of corrective rotation by detecting the moment of rotation as a torque. However, its effectiveness may be hampered by sensor noise and by the tension of the robot's external cabling. To address these issues, we propose a method for stable object placing that uses GelSight vision-based tactile sensors as an alternative to F/T sensors. Our method estimates the direction of corrective rotation from the displacement of the black dot pattern on the elastomeric surface of the GelSight. We compute the Curl, from vector analysis, which indicates the magnitude and direction of the rotational field of the dot displacements. Simultaneously, we compute the difference (Diff) between the dot displacements of the left and right fingers' GelSights. The robot can then manipulate the object's pose using the Curl and Diff features, facilitating stable placing. Across experiments handling 18 objects with different characteristics, our method achieves precise placing accuracy (less than 1-degree error) in nearly 100% of cases. An accompanying video is available at the following link: https://youtu.be/fQbmCksVHlU
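The abstract's two features can be illustrated with a short sketch. This is a hedged reconstruction, not the authors' implementation: it assumes the dot displacements have already been extracted as 2D vector fields (one per finger), estimates the z-component of the curl (dv/dx - du/dy) with finite differences, and takes the Diff as the difference of mean displacements between the two fingers. The function names `curl_feature` and `diff_feature` are invented for illustration.

```python
import numpy as np

def curl_feature(u, v):
    """Mean z-component of the curl of a 2D displacement field.

    u, v: 2D arrays holding the x- and y-displacements of the dot
    pattern on one sensor, sampled on a regular grid (row = y, col = x).
    A nonzero mean curl indicates a net rotational field, whose sign
    suggests the direction of corrective rotation.
    """
    dv_dx = np.gradient(v, axis=1)  # d(v)/dx along columns
    du_dy = np.gradient(u, axis=0)  # d(u)/dy along rows
    return float(np.mean(dv_dx - du_dy))

def diff_feature(disp_left, disp_right):
    """Difference of mean dot displacements between the two fingers.

    disp_left, disp_right: (N, 2) arrays of per-dot displacement
    vectors from the left and right GelSight, respectively.
    """
    return np.mean(disp_left, axis=0) - np.mean(disp_right, axis=0)
```

For a rigid in-plane rotation by angular rate omega, the field u = -omega*y, v = omega*x gives a mean curl of 2*omega, while a pure translation gives zero curl and shows up only in the Diff feature.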

Authors (3)
  1. Kuniyuki Takahashi (17 papers)
  2. Shimpei Masuda (7 papers)
  3. Tadahiro Taniguchi (74 papers)

