
Multi-fingered Dynamic Grasping for Unknown Objects (2310.17923v3)

Published 27 Oct 2023 in cs.RO

Abstract: Dexterous grasping of unseen objects in dynamic environments is an essential prerequisite for advanced manipulation by autonomous robots. Prior advances rely on several assumptions that simplify the setup, including a stationary environment, pre-defined objects, and low-dimensional end-effectors. Although these assumptions ease the problem and have enabled progress, they understate the complexity of the real world. Aiming to relax them, we present a dynamic grasping framework for unknown objects that uses a five-fingered hand with visual servo control and can compensate for external disturbances. To establish such a system on real hardware, we leverage recent advances in real-time dexterous generative grasp synthesis and introduce several techniques to secure the robustness and performance of the overall system. Our experiments on real hardware verify the ability of the proposed system to reliably grasp unknown dynamic objects in two realistic scenarios: objects on a conveyor belt and human-to-robot handover. To the best of our knowledge, no prior work had achieved dynamic multi-fingered grasping for unknown objects at the time of writing. We hope our pioneering work in this direction inspires the community and paves the way for further algorithmic and engineering advances on this challenging task. A video of the experiments is available at https://youtu.be/b87zGNoKELg.
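Grasping a moving object requires continuously estimating and predicting its pose so the visual-servo loop can track it despite sensing noise and latency. The sketch below is a minimal, hypothetical illustration of one standard component for this: a constant-velocity Kalman filter over the object's 3-D position. The matrix dimensions, noise parameters `q` and `r`, and function names are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def make_cv_kalman(dt, q=1e-2, r=1e-3):
    """Constant-velocity Kalman filter matrices for a 3-D position target.
    State x = [px, py, pz, vx, vy, vz]; only position is observed.
    q and r are illustrative tuning values, not from the paper."""
    F = np.eye(6)
    F[:3, 3:] = dt * np.eye(3)                    # position integrates velocity
    H = np.hstack([np.eye(3), np.zeros((3, 3))])  # measurement picks out position
    Q = q * np.eye(6)                             # process noise covariance
    R = r * np.eye(3)                             # measurement noise covariance
    return F, H, Q, R

def kf_step(x, P, z, F, H, Q, R):
    """One predict/update cycle given measured object position z."""
    # Predict: propagate state and covariance through the motion model.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: correct with the latest tracked position.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(6) - K @ H) @ P
    return x, P
```

The estimated velocity lets the controller servo toward where the object will be rather than where it was last seen, which matters for targets on a conveyor belt or held by a moving human hand.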

Authors (6)
  1. Yannick Burkhardt
  2. Qian Feng
  3. Jianxiang Feng
  4. Karan Sharma
  5. Zhaopeng Chen
  6. Alois Knoll
