A Mixed Reality System for Interaction with Heterogeneous Robotic Systems (2307.05280v2)

Published 11 Jul 2023 in cs.RO

Abstract: The growing spread of robots for service and industrial purposes calls for versatile, intuitive, and portable interaction approaches. In industrial environments in particular, operators should be able to interact with robots in a fast, effective, and possibly effortless manner. To this end, reality-enhancement techniques have been used to achieve efficient management and to simplify interactions, notably in manufacturing and logistics processes. Building upon this, in this paper we propose a mixed reality system that provides a ubiquitous interface for heterogeneous robotic systems in dynamic scenarios, where users are involved in different tasks and need to interact with different robots. By means of mixed reality, users can interact with a robot by manipulating its virtual replica, which is always co-located with the user and is extracted when interaction is needed. The system has been tested in a simulated intralogistics setting, where different robots are present and require sporadic intervention by human operators who are engaged in other tasks. In our setting we consider the presence of drones and AGVs with different levels of autonomy, calling for different user interventions. The proposed approach has been validated in virtual reality, considering quantitative and qualitative assessments of performance and users' feedback.
