All the Feels: A dexterous hand with large-area tactile sensing (2210.15658v3)

Published 27 Oct 2022 in cs.RO and cs.LG

Abstract: High cost and lack of reliability have precluded the widespread adoption of dexterous hands in robotics. Furthermore, the lack of a viable tactile sensor capable of sensing over the entire area of the hand impedes the rich, low-level feedback that would improve learning of dexterous manipulation skills. This paper introduces an inexpensive, modular, robust, and scalable platform -- the DManus -- aimed at resolving these challenges while satisfying the large-scale data collection capabilities demanded by deep robot learning paradigms. Studies on human manipulation point to the criticality of low-level tactile feedback in performing everyday dexterous tasks. The DManus comes with ReSkin sensing on the entire surface of the palm as well as the fingertips. We demonstrate the effectiveness of the fully integrated system in a tactile-aware task -- bin picking and sorting. Code, documentation, design files, detailed assembly instructions, trained models, task videos, and all supplementary materials required to recreate the setup can be found at https://sites.google.com/view/roboticsbenchmarks/platforms/dmanus.
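
To make the "tactile-aware" setup concrete, the sketch below shows one plausible way whole-hand ReSkin readings could be combined with joint positions into a policy observation for a task like bin picking and sorting. This is a minimal illustration under assumed interfaces, not the paper's actual pipeline: the board counts, channel layout, and the `read_reskin_board` driver call are hypothetical placeholders.

```python
import numpy as np

# Hypothetical layout: each ReSkin board is assumed to expose 5 tri-axial
# magnetometers (15 values). Counts below are illustrative only and are not
# the DManus's actual sensor configuration.
PALM_BOARDS = 4
FINGERTIP_BOARDS = 3
CHANNELS_PER_BOARD = 15
NUM_BOARDS = PALM_BOARDS + FINGERTIP_BOARDS


def read_reskin_board(board_id: int) -> np.ndarray:
    """Placeholder for a driver call returning one board's magnetometer
    readings; a real setup would read these over serial/I2C."""
    return np.zeros(CHANNELS_PER_BOARD, dtype=np.float32)


def tactile_observation(baseline: np.ndarray) -> np.ndarray:
    """Concatenate all boards and subtract a per-channel baseline so the
    policy sees contact-induced deviations rather than raw magnetic fields."""
    raw = np.concatenate([read_reskin_board(i) for i in range(NUM_BOARDS)])
    return raw - baseline


def policy_input(joint_pos: np.ndarray, baseline: np.ndarray) -> np.ndarray:
    """Stack proprioception with tactile deviations into one observation vector."""
    return np.concatenate([joint_pos, tactile_observation(baseline)])


if __name__ == "__main__":
    # Calibrate a contact-free baseline by averaging a few idle readings.
    idle = np.stack([
        np.concatenate([read_reskin_board(i) for i in range(NUM_BOARDS)])
        for _ in range(50)
    ])
    baseline = idle.mean(axis=0)

    obs = policy_input(joint_pos=np.zeros(16, dtype=np.float32), baseline=baseline)
    print(obs.shape)  # 16 joints + 7 boards * 15 tactile channels = (121,)
```

Baseline subtraction is shown because magnetometer-based skins like ReSkin report absolute field values that drift with the ambient environment; differencing against an idle reading is a common, simple way to expose contact signal to a learner.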
