Publication
ACM-ASSETS 2002
Conference paper
Auditory and tactile interfaces for representing the visual effects on the Web
Abstract
In this paper, we describe auditory and tactile interfaces that represent visual effects nonvisually for blind users, allowing intuitive recognition of the visual content that appears on the Web. This research examines how visual effects can be recognized by blind subjects through the senses of hearing and touch, with the aim of integrating the results into a practical system in the future. As an initial step, two experiments were performed: one on sonification and tactilization of a page overview based on color-based fragmented groupings without speech, and one on sonification and tactilization of emphasized text based on analyzing rich text information with speech. The subjects could recognize the visual representations presented through the auditory and tactile interfaces throughout the experiments, and were aware of the importance of the visual structures. We believe this shows that our approach may be practical and viable in the future. We summarize our results and discuss what kind of information is suitable for each sense, as well as the next planned experiment and other future work.
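To make the second experiment's idea concrete, the sketch below illustrates one possible way to extract emphasized text from rich (HTML) markup and map it to nonspeech audio cue parameters. This is not the authors' implementation; the tag set, pitch, and duration values are illustrative assumptions only.

```python
# Minimal sketch (not the paper's system): map emphasized text in an HTML
# fragment to hypothetical nonspeech audio cue parameters.
from html.parser import HTMLParser

# Assumed mapping from emphasis tags to cue parameters (pitch in Hz, duration in ms).
CUE_FOR_TAG = {
    "h1": {"pitch_hz": 880, "duration_ms": 400},
    "h2": {"pitch_hz": 660, "duration_ms": 300},
    "b": {"pitch_hz": 523, "duration_ms": 150},
    "strong": {"pitch_hz": 523, "duration_ms": 150},
    "em": {"pitch_hz": 440, "duration_ms": 150},
}


class EmphasisExtractor(HTMLParser):
    """Collect (text, cue) pairs for emphasized spans in an HTML fragment."""

    def __init__(self):
        super().__init__()
        self.stack = []  # currently open emphasis tags
        self.cues = []   # (text, cue) pairs in document order

    def handle_starttag(self, tag, attrs):
        if tag in CUE_FOR_TAG:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()

    def handle_data(self, data):
        text = data.strip()
        if text and self.stack:
            # The innermost emphasis tag determines the cue for this span.
            self.cues.append((text, CUE_FOR_TAG[self.stack[-1]]))


if __name__ == "__main__":
    parser = EmphasisExtractor()
    parser.feed("<h1>News</h1><p>Read the <strong>updated</strong> schedule.</p>")
    for text, cue in parser.cues:
        print(f"{text!r}: tone at {cue['pitch_hz']} Hz for {cue['duration_ms']} ms")
```

In a full system, the printed cue parameters would instead drive an audio or tactile output device alongside speech output; how that rendering is done is outside the scope of this sketch.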