Open Access System for Information Sharing

Thesis
Cited 0 times in Web of Science; cited 0 times in Scopus
Full metadata record
Files in This Item:
There are no files associated with this item.
dc.contributor.author: 김재현
dc.date.accessioned: 2024-05-10T16:34:23Z
dc.date.available: 2024-05-10T16:34:23Z
dc.date.issued: 2024
dc.identifier.other: OAK-2015-10344
dc.identifier.uri: http://postech.dcollection.net/common/orgView/200000734345 (ko_KR)
dc.identifier.uri: https://oasis.postech.ac.kr/handle/2014.oak/123296
dc.description: Doctor
dc.description.abstract:
Robotic hands have been developed to mimic complex human manipulation and have evolved to perform everything from simple movements to complex object manipulation. These developments were inspired by humans' complex sensory perception and how humans use it: by sensing a changing environment in real time and feeding that information back, humans carry out complex operations stably. Stable grasping is a prerequisite for dexterous object manipulation, and it in turn requires detailed object recognition, slip detection, and the ability to sense and respond to the environment. With the introduction of tactile sensors, robotic hands can perform a variety of tasks by accurately detecting and processing pressure, friction, temperature, and the shape and position of objects. Tactile sensors have therefore been developed to provide diverse and accurate tactile perception to robotic hands. Early tactile sensors could only determine contact with, or grip of, an object through pressure and force measurements. Later, multi-array tactile sensors provided more specific information about contact location, and multimodal tactile sensors provided various kinds of information about objects and environments, such as temperature, pressure, and strain. However, tactile sensors developed so far still fall short of human-level tactile perception for stable grasping and dexterous manipulation, and much progress is still needed.

In this thesis, to approach human-level dexterous manipulation and stable grasping, the structural design of soft tactile sensors is used to achieve (1) omnidirectional tactile sensing, (2) conformal object surface sensing, and (3) grasping safety margin perception. First, omnidirectional tactile sensing was achieved by fabricating a CNT/PU foam-based deformable pressure sensor. An insulating mesh structure formed inside the PU foam reduces the signal interference caused by the deformation that occurs when the sensor is integrated onto a 3D structure. Second, a thin, stretchable pressure sensor was integrated onto a soft 3D substrate to implement conformal object surface sensing. The thin, stretchable sensor was fabricated from a fiber mat with an entangled-fiber structure, and it senses only the area in contact with the object, regardless of deformation of the soft substrate. Third, I developed a multimodal soft tactile sensor that can recognize friction changes, making grasping safety margin perception possible. A strain sensor array and a pressure sensor array were integrated in a vertical structure and designed to move independently of each other, allowing the static friction limit and the real-time friction to be obtained from the pressure and strain information. Tracking changes in the static friction limit and the real-time friction made it possible to confirm how the grasping safety margin changes with the external force acting on the grasped object (a simple numerical sketch of this margin follows the record below).
dc.language: eng
dc.publisher: 포항공과대학교 (Pohang University of Science and Technology)
dc.title: Structural design of soft tactile sensors for stable grasping in robotic hand
dc.type: Thesis
dc.contributor.college: 신소재공학과 (Department of Materials Science and Engineering)
dc.date.degree: 2024-2
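The grasping safety margin described in the abstract can be illustrated with a minimal sketch, assuming a simple Coulomb friction model: the pressure array gives the normal force, the strain array gives the tangential (real-time friction) force, and the margin is the remaining friction budget before slip. The function name, parameters, and numbers below are hypothetical illustrations, not the thesis's actual method or calibration.

```python
# Minimal sketch (not from the thesis): estimating a grasping safety margin
# from combined pressure (normal force) and strain (tangential force) readings,
# assuming a Coulomb friction model. All names and values are hypothetical.

def grasp_safety_margin(normal_force_n: float,
                        tangential_force_n: float,
                        static_friction_coeff: float) -> float:
    """Return the remaining friction budget before slip, in newtons.

    The static friction limit is mu_s * N under the Coulomb model; the
    margin is that limit minus the tangential load currently measured
    on the contact.
    """
    static_friction_limit = static_friction_coeff * normal_force_n
    return static_friction_limit - tangential_force_n


if __name__ == "__main__":
    # Example: a pressure-sensor reading converted to 4.0 N of normal force,
    # a strain-sensor reading converted to 1.5 N of tangential force,
    # and an assumed static friction coefficient of 0.6.
    margin = grasp_safety_margin(normal_force_n=4.0,
                                 tangential_force_n=1.5,
                                 static_friction_coeff=0.6)
    print(f"Grasping safety margin: {margin:.2f} N")
    if margin <= 0:
        print("Slip imminent: increase grip force or re-plan the grasp.")
```

In this toy example the margin is 0.9 N; if an external force raises the tangential load or reduces the normal load, the margin shrinks toward zero, which is the change in grasping safety margin the multimodal sensor is designed to track.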

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
