Open Access System for Information Sharing


Article
Cited 24 times in Web of Science · Cited 28 times in Scopus

Saliency-Driven Real-Time Video-to-Tactile Translation SCIE SCOPUS

Title
Saliency-Driven Real-Time Video-to-Tactile Translation
Authors
Kim, M; Lee, S; Choi, S
Date Issued
2014-07
Publisher
IEEE
Abstract
Tactile feedback coordinated with visual stimuli has proven its worth in mediating immersive multimodal experiences, yet its authoring has relied on content artists. This article presents a fully automated framework for generating tactile cues from streaming images to provide synchronized visuotactile stimuli in real time. The spatiotemporal features of video images are analyzed on the basis of visual saliency and then mapped into tactile cues that are rendered on tactors installed on a chair. We also conducted two user experiments for performance evaluation. The first experiment investigated the effects of visuotactile rendering against visual-only rendering, demonstrating that visuotactile rendering made the movie-watching experience more interesting, immersive, and understandable. The second experiment compared the effectiveness of authoring methods and found that the automated authoring approach, used with care, can produce plausible tactile effects similar in quality to manual authoring.
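The abstract's pipeline (saliency analysis of video frames, then mapping to a spatial array of tactors) can be illustrated with a minimal sketch. This is not the authors' algorithm: it uses a crude temporal-difference proxy for saliency and a simple grid pooling; the function names and the 2x2 tactor layout are illustrative assumptions only.

```python
def frame_saliency(prev, curr):
    # Crude spatiotemporal saliency proxy: absolute per-pixel
    # difference between consecutive grayscale frames.
    return [[abs(c - p) for p, c in zip(pr, cr)]
            for pr, cr in zip(prev, curr)]

def tactile_cues(sal, grid=(2, 2)):
    # Pool the saliency map over a coarse grid, one cell per tactor,
    # and normalize the means to [0, 1] vibration intensities.
    h, w = len(sal), len(sal[0])
    gh, gw = grid
    cues = []
    for i in range(gh):
        row = []
        for j in range(gw):
            cell = [sal[y][x]
                    for y in range(i * h // gh, (i + 1) * h // gh)
                    for x in range(j * w // gw, (j + 1) * w // gw)]
            row.append(sum(cell) / len(cell))
        cues.append(row)
    peak = max(max(r) for r in cues)
    return [[v / peak if peak else 0.0 for v in r] for r in cues]

# Example: motion confined to the top-left quadrant of an 8x8 frame
# drives only the corresponding (top-left) tactor.
prev = [[0.0] * 8 for _ in range(8)]
curr = [[255.0 if y < 4 and x < 4 else 0.0 for x in range(8)]
        for y in range(8)]
cues = tactile_cues(frame_saliency(prev, curr))
```

In the actual system a saliency model replaces the frame difference, and the intensities drive tactors mounted on a chair in synchrony with the video.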
URI
https://oasis.postech.ac.kr/handle/2014.oak/27269
DOI
10.1109/TOH.2013.58
ISSN
1939-1412
Article Type
Article
Citation
IEEE TRANSACTIONS ON HAPTICS, vol. 7, no. 3, page. 394 - 404, 2014-07
Files in This Item:
There are no files associated with this item.

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
