Cluster Haptic Texture Database: Haptic Texture Database with Varied Velocity-Direction Sliding Contacts (2407.16206v2)

Published 23 Jul 2024 in cs.HC

Abstract: Haptic sciences and technologies benefit greatly from comprehensive datasets that capture tactile stimuli under controlled, systematic conditions. However, existing haptic databases collect data through uncontrolled exploration, which hinders the systematic analysis of how motion parameters (e.g., motion direction and velocity) influence tactile perception. This paper introduces Cluster Haptic Texture Database, a multimodal dataset recorded using a 3-axis machine with an artificial finger to precisely control sliding velocity and direction. The dataset encompasses 118 textured surfaces across 9 material categories, with recordings at 5 velocity levels (20-60 mm/s) and 8 directions. Each surface was tested under 160 conditions, yielding 18,880 synchronized recordings of audio, acceleration, force, position, and visual data. Validation using convolutional neural networks demonstrates classification accuracies of 96% for texture recognition, 88.76% for velocity estimation, and 78.79% for direction estimation, confirming the dataset's utility for machine learning applications. This resource enables research in haptic rendering, texture recognition algorithms, and human tactile perception mechanisms, supporting the development of realistic haptic interfaces for virtual reality systems and robotic applications.
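The factorial design described above can be sketched in a few lines. Note the specific velocity values, direction angles, and per-condition repetition count below are assumptions for illustration; the abstract states only the ranges (5 velocity levels over 20-60 mm/s, 8 directions, 160 conditions per surface), from which a repetition count of 160 / (5 x 8) = 4 follows:

```python
# Hypothetical sketch of the dataset's condition grid, inferred from the
# abstract. Evenly spaced velocities, 45-degree direction increments, and
# 4 trials per (velocity, direction) pair are assumptions, not stated facts.
from itertools import product

velocities_mm_s = [20, 30, 40, 50, 60]        # 5 levels spanning 20-60 mm/s
directions_deg = [i * 45 for i in range(8)]   # 8 sliding directions
trials = range(4)                             # assumed repetitions per pair

# Every (velocity, direction, trial) combination for one surface
conditions = list(product(velocities_mm_s, directions_deg, trials))
assert len(conditions) == 160                 # conditions per surface

n_surfaces = 118
print(n_surfaces * len(conditions))           # total synchronized recordings
```

Multiplying the 160 per-surface conditions by the 118 surfaces reproduces the 18,880 recordings reported in the abstract, which is consistent with the assumed 4-trial repetition count.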

