Active Clothing Material Perception using Tactile Sensing and Deep Learning (1711.00574v2)

Published 2 Nov 2017 in cs.RO

Abstract: Humans represent and discriminate objects within a category by their properties, and an intelligent robot should be able to do the same. In this paper, we build a robot system that can autonomously perceive object properties through touch. We work on the common object category of clothing. The robot moves under the guidance of an external Kinect sensor and squeezes the clothes with a GelSight tactile sensor, then recognizes 11 properties of the clothing from the tactile data. Those properties include physical properties, like thickness, fuzziness, softness, and durability, and semantic properties, like wearing season and preferred washing method. We collect a dataset of 153 varied pieces of clothing and conduct 6,616 robot exploration iterations on them. To extract useful information from the high-dimensional sensory output, we apply Convolutional Neural Networks (CNNs) to the tactile data for recognizing the clothing properties, and to the Kinect depth images for selecting exploration locations. Experiments show that, using the trained neural networks, the robot can autonomously explore unknown clothes and learn their properties. This work proposes a new framework for active tactile perception with a combined vision-touch system, and has the potential to enable robots to help humans with varied clothing-related housework.

Authors (4)
  1. Wenzhen Yuan (43 papers)
  2. Yuchen Mo (7 papers)
  3. Shaoxiong Wang (18 papers)
  4. Edward Adelson (21 papers)
Citations (119)

Summary

  • The paper presents a robotic framework utilizing tactile sensing and deep learning for autonomous perception and classification of clothing material properties.
  • The system combines a UR5 robot arm, a GelSight sensor, and CNNs trained on tactile data to classify properties such as thickness and softness, achieving strong accuracy on garments seen during training.
  • This work supports home robots for tasks like laundry and suggests future research focus on larger datasets and refined neural networks to improve generalization.

Overview of "Active Clothing Material Perception using Tactile Sensing and Deep Learning"

The research outlined in "Active Clothing Material Perception using Tactile Sensing and Deep Learning" addresses an essential facet of intelligent robotics: enabling robots to discern fine-grained material properties through tactile sensing. The authors develop a robotic framework that autonomously perceives and categorizes the material properties of clothing by integrating tactile sensing with deep learning. The paper is a notable contribution to robotics, particularly in its use of tactile data for detailed material property recognition, with direct implications for domestic robotic applications.

Tactile Perception and Robotic Autonomy

The research focuses on creating a robotic system that mimics human-like perception of clothing attributes such as thickness and fuzziness, as well as semantic properties like wearing season and preferred washing method. The authors detail a setup comprising a UR5 robot arm, a GelSight tactile sensor mounted on the gripper, and an external Kinect sensor for guidance. Together, these components allow the robot to autonomously explore clothing items, acquiring tactile data by squeezing the fabric with a controlled gripper, as sketched below.
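As a rough illustration of how such a perception loop fits together, the sketch below wires a depth-guided grip-point selector to a squeeze-and-record cycle. All class and method names here are hypothetical stand-ins for the real drivers; the paper selects grip locations with a CNN over the Kinect depth image, whereas the placeholder sampler below merely draws random candidates.

```python
import numpy as np

def sample_grip_points(depth, n=16, rng=np.random.default_rng()):
    """Placeholder candidate sampler: the paper scores wrinkle locations with
    a CNN over the Kinect depth image; uniform sampling stands in for that."""
    h, w = depth.shape
    return [(int(r), int(c))
            for r, c in zip(rng.integers(0, h, n), rng.integers(0, w, n))]

class ExplorationLoop:
    """One vision-guided squeeze: pick a grip point, close the gripper on the
    fabric, and record a GelSight tactile video of the contact."""

    def __init__(self, kinect, arm, gripper, gelsight, location_cnn):
        self.kinect = kinect              # external depth camera
        self.arm = arm                    # UR5 motion interface
        self.gripper = gripper            # parallel gripper, GelSight on one finger
        self.gelsight = gelsight          # tactile image stream
        self.location_cnn = location_cnn  # scores candidate grip points

    def explore_once(self):
        depth = self.kinect.capture_depth()          # H x W depth image
        candidates = sample_grip_points(depth)
        scores = self.location_cnn.predict(depth, candidates)
        target = candidates[int(np.argmax(scores))]  # best-rated location
        self.arm.move_above(target)
        self.gripper.close_slowly()                  # squeeze the fabric
        frames = self.gelsight.record(n_frames=30)   # tactile video of the squeeze
        self.gripper.open()
        return frames                                # input to the property CNN
```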

Neural Network Application and Performance

To handle the complexity of the tactile data, the research employs Convolutional Neural Networks (CNNs). The networks classify the 11 clothing properties from high-resolution tactile images and videos, and were trained on a dataset gathered over 6,616 robot exploration iterations on 153 garments, covering a diverse range of clothing attributes and semantic features. However, the networks have difficulty generalizing beyond the training instances, particularly for nuanced fabric characteristics. A sketch of this multi-property classification setup follows.
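A minimal sketch of how a shared CNN backbone with one classification head per property could be set up, assuming a PyTorch/torchvision environment. The property names and label counts below are illustrative, and the paper's exact architecture (which also handles tactile video) differs; this only shows the general multi-head pattern.

```python
import torch
import torch.nn as nn
from torchvision import models

# Illustrative label bins; the paper defines 11 properties with its own bins.
PROPERTY_CLASSES = {"thickness": 5, "fuzziness": 4, "softness": 5,
                    "season": 4, "washing_method": 6}

class PropertyNet(nn.Module):
    """Shared feature extractor with one small classification head per
    clothing property, trained jointly with a cross-entropy term per head."""

    def __init__(self, prop_classes=PROPERTY_CLASSES):
        super().__init__()
        backbone = models.resnet18()        # randomly initialized here; in
        feat_dim = backbone.fc.in_features  # practice a pretrained net helps
        backbone.fc = nn.Identity()         # expose pooled 512-d features
        self.backbone = backbone
        self.heads = nn.ModuleDict(
            {name: nn.Linear(feat_dim, k) for name, k in prop_classes.items()})

    def forward(self, x):                   # x: (B, 3, 224, 224) tactile image
        feats = self.backbone(x)
        return {name: head(feats) for name, head in self.heads.items()}

# Training sketch: one cross-entropy term per property head.
# loss = sum(nn.functional.cross_entropy(logits[p], labels[p]) for p in logits)
```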

Experimental Findings and Results

The system's efficacy was tested on both seen and unseen clothing items, with varying levels of success. Predictions from single-frame and video-based tactile data recognized the properties of known samples effectively, but left room for improvement in generalizing to novel items. The paper reports strong accuracy for properties like thickness and softness, especially on seen samples. Challenges persist for certain semantic properties, such as washing preference, which depend on context beyond tactile sensation alone. The seen/unseen comparison can be made concrete as shown below.
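The evaluation protocol can be summarized with a small helper along these lines; the function is an illustrative reconstruction of the seen/unseen split described above, not code from the paper.

```python
from collections import defaultdict

def per_property_accuracy(predictions, labels, split_of_item):
    """predictions / labels: {item_id: {property: class_index}};
    split_of_item: {item_id: "seen" or "unseen"}.
    Returns accuracy[split][property], making generalization gaps visible."""
    correct = defaultdict(lambda: defaultdict(int))
    total = defaultdict(lambda: defaultdict(int))
    for item, preds in predictions.items():
        split = split_of_item[item]
        for prop, pred in preds.items():
            total[split][prop] += 1
            correct[split][prop] += int(pred == labels[item][prop])
    return {split: {prop: correct[split][prop] / total[split][prop]
                    for prop in total[split]}
            for split in total}
```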

Practical Implications and Future Directions

This research opens avenues for practical deployment in home assistant robots, specifically for tasks such as laundry sorting and garment organization. By enabling fine-grained material differentiation, robots could take on a larger share of daily chores. Tactile data also complements existing vision-based recognition systems, providing richer information for material assessment.

Considering the limitations acknowledged, future research could explore larger, more diverse datasets to enhance CNN training and robustness. Further, refining neural network designs specifically tailored to tactile imaging data might bridge the generalization gap observed with unseen clothing items. The integration of additional sensory data, such as auditory or proprioceptive feedback during material manipulation, could further bolster model reliability and performance.

In conclusion, the paper presents a compelling framework for active tactile exploration in robotic systems, laying a foundation for expanded robotic autonomy in material perception. The trajectories defined herein suggest fruitful opportunities to augment the role of robotics in routine human activities, guided by nuanced understanding and interaction with complex physical environments.
