2004 Conference Proceedings



TACTILE GRAPHICS SUPPORTING AUDIO INFORMATION DEPICTING VISUAL REPRESENTATIONS OF WINDOWS SCREENS

Presenters
Janice Walth
San Joaquin Delta College
Disabled Students Programs and Services
5151 Pacific Ave.
Stockton, California 95207
(209) 365-0345
Email: junitt@softcom.net

Ted Wattenberg, M.S., C.R.C.
San Joaquin Delta College
Disabled Students Programs and Services
5151 Pacific Ave.
Stockton, California 95207
(916) 736-2251
Email: twattenberg@deltacollege.edu

Introduction

Growing numbers of college students with visual impairments need to learn how to use computers to support higher levels of cognition and learning (Yesilada, Stevens, & Goble, 2003). One common method people with visual impairments use to access computer information is specialized screen readers. Current studies have identified the need for all computer users to form mental images of Windows applications and Web-based information in order to efficiently perform the cognitive processes of learning (Aldrich & Sheppard, 2000; Saariluoma & Kalakoski, 1997).

People with visual impairments have difficulty forming mental images of Windows screens because they must translate audio descriptions of graphical information and often have little previous experience with the visual relationships used in computer design (Yesilada, Stevens, & Goble, 2003).

Studies tracking bi-modal use of Braille and audio information have shown that tactile representations of visual information support retention and cognitive processing for people with vision impairments (Sadato et al., 1996). However, the ability to form mental images, whether through vision or supported by bi-modal tactile/audio formats, is not naturally understood or performed by people using graphically presented information in computer applications (Aldrich & Sheppard, 2000). Instructors teaching people with low vision how to use a computer with a specialized screen reader application do not have instructional aids, curriculum, or experience in helping students learn how to form mental images of the Windows environment (Yesilada, Stevens, & Goble, 2003).

The purpose of this study is to develop instructional aids, curriculum, and an experiential base of knowledge that supports visually impaired students in learning how to use a specialized screen reader for operating a computer using Windows applications. Furthermore, the authors of this report present a method of evaluating the effectiveness of the instructional strategies in helping students learn how to form mental images from audio information.

The Use of Graphics and Visual Information in Computer Applications

Windows-based computer application designers use graphical interfaces to facilitate navigation, the organization of information, and the explanation of information on computer systems (Yesilada, Stevens, & Goble, 2003). Graphic presentation of information offers advantages over text-based formats by making content more concise, memorable, and explanatory of complex data relationships (Aldrich & Sheppard, 2000). To make use of graphical images on computer systems, a computer user must be able to travel between Windows screens, understand the information needed to make navigational decisions, and ultimately retrieve and process the information required to complete the original task (Yesilada, Stevens, & Goble, 2003).

Travel within and between Windows application screens and on the Web is defined as the "confident navigation and orientation with purpose, ease and accuracy within an environment, that is to say, the notion of travel extends navigation and orientation to include environment, mobility and purpose of the journey" (Yesilada, Stevens, & Goble, 2003, p. 422). Navigation refers to potential movement within the Windows environment. Orientation is the cognitive awareness and understanding of the spatial relationships and objects within that environment. The level of mobility is the ease and confidence with which travel is accomplished. Often, visually impaired people have difficulty accessing computers because the graphical designs of applications or the Web environment are incompatible with assistive technologies, or because the user is unable to fully understand the mental-image concepts.

Mental Images and the Relationship between Visual, Audio, and Tactile Information

Higher cognitive learning occurs when humans are able to process mental images of given information, facilitated by cross-modal activity between working memory, long-term memory, and diverse sensory input modalities (Sadato et al., 1996; Saariluoma & Kalakoski, 1997).

Mental imagery is necessary for the human perceptual systems to activate the cognitive processes needed to understand mental content, the symbolic representation of information, and to further activate the higher-order cognitive processes of reflection and creative thinking. Current studies using PET scanning technologies have associated specific brain locations with the processing of different sensory information modalities for audition, touch, and vision (Ladavas, di Pellegrino, Farne, & Zeloni, 1998; Knudsen & Brainard, 1991). Until recently, the existence of, and relationships between, these brain locations and functions were not well understood. It is now known that cross-modal links exist and that when one modality is injured or lost, the other sensory modalities can offset the cognitive loss by supplying information to other brain locations. Initial studies by Sadato et al. (1996), validated by Macaluso et al. (2000), establish that tactile and audio cross-modal links allow people with loss of vision to develop and use mental images for higher cognitive learning processes.

Methodology

The authors of this report developed tactile representations of Windows applications to help students learning the Jaws screen reader application develop mental images from the audio descriptions of screen images. Jaws provides complex information about screen images, allowing users to navigate, orient themselves to the content of the page, and make decisions. Experiential knowledge of visual images varies widely among people with visual impairments, ranging from those born without vision to those who have lost vision as adults. It is not known whether previous visual experience affects the impact of tactile support for visual imagery. Twenty-three graphic representations of Windows screens were created depicting the initial Desktop, use of the Start functions, opening and setting up Jaws, using Microsoft Word, and the use of Windows Explorer.

Items placed on the tactile images were chosen from the audio information Jaws provides within each screen. All audio descriptors were placed on the tactile image. Icons and written text were enlarged to allow for adequate tactile sensory stimulation. The tactile pages were made by printing onto specialized paper that is run through the Picture In a Flash machine, which heats the paper, causing the darkened areas to rise higher than the lighter areas.

The original twenty-three tactile representations were presented to three visually impaired college students and one visually impaired practicing attorney. All four subjects are Jaws users, with capabilities ranging from beginner to expert. One participant has been blind since birth; the others have experienced gradual vision loss since childhood. All participants reported that the tactile representations aided their understanding of the visual images of the Windows environment and expressed excitement about the experience. The two students with the least Jaws experience voiced feelings of "Aha, now I know what this means." Another participant, who is active in providing support services for people with disabilities, thought that the product could be very important in that work.

The authors of this report developed a usability evaluation process based on Nielsen's discount usability heuristics, commonly used in the computer industry (Nielsen, 1994). The target population consists of five groups:

  1. Totally blind since birth.
  2. Totally blind since before adolescence.
  3. Totally blind during adolescence.
  4. Totally blind during adulthood.
  5. Limited vision, but considered legally blind.

The authors of this report developed the following heuristics to help evaluate usability observations:

  1. Visibility of system status: The system should always keep users informed about what is going on, through appropriate feedback within a reasonable time period. Do the tactile representations aid in user visibility?

  2. Match between system and the real world: The system should speak the users' language, with words, phrases and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order. Do the tactile representations aid the user in understanding computer terminology and the logical order of system design?

  3. User control and freedom: Users often choose system functions by mistake and will need a clearly marked "emergency exit" to leave the unwanted state without having to go through an extended dialogue. Do the tactile representations aid in locus of control?

  4. Error prevention: Even better than good error messages is a careful design that prevents a problem from occurring in the first place. Do the tactile representations aid in preventing user navigational errors?

  5. Flexibility and efficiency of use: Accelerators may often speed up the interaction so that users can process information quickly enough to make efficient use of their working memory. Do the tactile representations help lessen the amount of time a user navigates before making a decision?

Summary

The authors of this report developed tactile instructional materials that help students with visual impairments form mental images supporting their use of computers. The tactile representations of Windows screens have shown initial promise as instructional aids.

Additional usability evaluations, using the proposed heuristics, are needed to validate the tactile instructional products and to identify further design criteria and instructional strategies. The authors of this report also feel that additional studies are needed on the potential effect these instructional aids might have in supporting higher cognitive learning processes for people with visual impairments.

References

Aldrich, F. K., & Sheppard, L. (2000). Graphicacy: The fourth R. Primary Science Review, 64(1), 8-11.

Knudsen, E. I., & Brainard, M. S. (1991, July 5). Visual instruction of the neural map of auditory space in the developing optic tectum. Science, 253(5015), 85-88.

Ladavas, E., di Pellegrino, G., Farne, A., & Zeloni, G. (1998, September). Neuropsychological evidence of an integrated visuotactile representation of peripersonal space. Journal of Cognitive Neuroscience, 10(5), 581-590.

Macaluso, E., Frith, C. D., & Driver, J. (2000, August 18). Modulation of human visual cortex by crossmodal spatial attention. Science, 289(5482), 1206-1213.

Nielsen, J. (1994). Heuristic evaluation. In J. Nielsen & R. L. Mack (Eds.), Usability inspection methods. New York, NY: John Wiley & Sons.

Saariluoma, P., & Kalakoski, V. (1997, Summer). Skilled imagery and long-term working memory. American Journal of Psychology, 110(2), 177-202.

Sadato, N., Pascual-Leone, A., Grafman, J., Ibanez, V., Deiber, M., Dold, G., & Hallett, M. (1996, April 11). Activation of the primary visual cortex by Braille reading in blind subjects. Nature, 380(6574), 526-528.

Yesilada, Y., Stevens, R., & Goble, C. (2003, May). A foundation for tool based mobility support for visually impaired Web users. ACM, 1(1), 422-430.




Reprinted with author(s) permission. Author(s) retain copyright.