Multimodal Sensing Interface for Haptic Interaction

Date created
2017-04-04
Authors/Contributors
Carlos Diaz, Shahram Payandeh
Abstract
This paper investigates the integration of a multimodal sensing system for exploring the limits of vibrotactile haptic feedback when interacting with 3D representations of real objects. In this study, the spatial locations of the objects are mapped to the work volume of the user using a Kinect sensor. The position of the user’s hand is obtained using marker-based visual processing. The depth information is used to build a vibrotactile map on a haptic glove enhanced with vibration motors. Users can perceive the location and dimensions of remote objects by moving their hand inside a scanning region. A marker detection camera provides the location and orientation of the user’s hand (glove), which is used to map the corresponding tactile message. A preliminary study was conducted to explore how different users perceive such haptic experiences. Factors such as the total number of objects detected, object separation resolution, and dimension-based and shape-based discrimination were evaluated. The preliminary results showed that the localization and counting of objects can be attained with a high degree of success. The users were able to classify groups of objects of different dimensions based on the perceived haptic feedback.
Document
Published as
Carlos Diaz and Shahram Payandeh, “Multimodal Sensing Interface for Haptic Interaction,” Journal of Sensors, vol. 2017, Article ID 2072951, 24 pages, 2017. DOI: 10.1155/2017/2072951.
Publication title
Journal of Sensors
Document title
Multimodal Sensing Interface for Haptic Interaction
Date
2017
Volume
2017
First page
1
Last page
24
Publisher DOI
10.1155/2017/2072951
Copyright statement
Copyright is held by the author(s).
Peer reviewed?
Yes
Language
English
Attachment
2072951.pdf (12.22 MB)