2003 Conference Proceedings

A NEW APPROACH TO INTERACTIVE AUDIO/TACTILE COMPUTING: THE TALKING TACTILE TABLET

Presenter
Steven Landau
President of Touch Graphics
140 Jackson Street
Brooklyn, NY 11211
Email: sl@touchgraphics.com

Karen Gourgey
Director of the Computer Center for Visually Impaired People
Baruch College
17 Lexington Avenue, Box H648
New York, NY 10010
Email: karen_gourgey@baruch.cuny.edu

Introduction

Over the last five years, collaborators from Baruch College's Computer Center for Visually Impaired People and the Touch Graphics Company of Brooklyn, New York, have worked to develop a new approach to audio-tactile computing. These materials are now being introduced to the marketplace as a unified suite of hardware and software components that promise to open up new opportunities for interactive learning and entertainment for individuals who are blind or visually impaired. In this presentation, we will describe our research, the products themselves, and the potential the system shows for establishing a new benchmark in accessible multi-media.

Background

Teachers, map-makers and others have long understood that raised-line and textured (tactile) diagrams and pictures can be useful for illustrating spatial concepts for people with severe visual disabilities. Typically, these materials depict maps, graphs, pictorial representations and many other kinds of images. Often, the tactile graphic illustration is annotated with Braille text as a way of labeling various parts and regions (Edman, 1991). While often very effective, this technique has limitations, and some blind and visually impaired people and their teachers are dubious about such materials after experiences of confusion or an inability to interpret what the tactile artist is trying to show (Ungar et al., 1996). If, however, the tactile drawing is placed on a touch-sensitive tablet, then a computer can be used to give voice to appropriate audio labels, making for a more intuitive, direct and universal learning experience. In the late 1980s, such a device was developed by Dr. Donald Parkes of the University of New South Wales and brought to market. The Nomad, as it is known, allowed a user to mount tactile sheets and press on parts of a drawing to hear synthesized-speech descriptions (Parkes, 1994). This concept of "touch-and-tell" proved to be quite potent; however, the Nomad did not succeed commercially, because it was expensive and had low tactile resolution, and because good-quality dedicated software and accompanying tactile media were not produced in sufficient quantity to support the product.

The Talking Tactile Tablet

We have created a new system that goes well beyond the pioneering work carried out by Dr. Parkes and his colleagues. Our Talking Tactile Tablet incorporates a high-resolution touch screen built into a rugged housing. The device is connected to a PC or Macintosh computer with a single USB cable. A user opens the hinged frame, mounts a tactile sheet, and closes the frame, thereby trapping the tactile sheet motionless against the touch-sensitive surface (Landau & Gourgey, 2001).


Figure 1 (left): A student using the Talking Tactile Tablet. He is pressing the control buttons to navigate through the application.

Figure 2 (right): The Talking Tactile Tablet connected to a laptop computer, shown with an assortment of tactile overlay sheets.

The TTT acts as a pointing device, and the tactile overlay as a static display. When a user touches a point on the tactile overlay, the finger pressure is transmitted through the flexible PVC sheet to the touch pad below, and the computer interprets that touch as a set of x,y coordinates. The computer compares the coordinates of each touch to a database of regions of any shape. By this means, the computer can identify each tactile entity on the overlay, as long as the pre-defined regions in the current database match the tactile image. By developing applications with the multi-media authoring system Macromedia Director, it becomes possible to construct sophisticated interactive computer applications that are completely accessible to a person with low or no vision. This is an important achievement, because it demonstrates that blind and visually impaired people are not automatically excluded from the burgeoning world of multi-media simply because they cannot see a video screen or manipulate a mouse.
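
The following Python sketch illustrates the basic touch-lookup idea described above. It is not the production TTT software (which is authored in Macromedia Director), and the region names, shapes, coordinates and units are hypothetical.

    # Illustrative sketch only: match a touch at raw coordinates (x, y) against
    # a table of named regions defined for the overlay currently in place.
    from dataclasses import dataclass
    from typing import List, Optional, Tuple

    @dataclass
    class Region:
        name: str                           # e.g. a country on an atlas plate
        polygon: List[Tuple[float, float]]  # vertices outlining the raised shape

    def point_in_polygon(x: float, y: float, poly: List[Tuple[float, float]]) -> bool:
        """Standard ray-casting test: is the point (x, y) inside the polygon?"""
        inside = False
        for i in range(len(poly)):
            x1, y1 = poly[i]
            x2, y2 = poly[(i + 1) % len(poly)]
            if (y1 > y) != (y2 > y):        # this edge crosses the ray's height
                if x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
                    inside = not inside
        return inside

    def identify_touch(x: float, y: float, regions: List[Region]) -> Optional[str]:
        """Return the name of the first region containing the touch, or None."""
        for region in regions:
            if point_in_polygon(x, y, region.polygon):
                return region.name
        return None

    # Two hypothetical rectangular regions on an overlay, in touch-pad units.
    regions = [
        Region("workspace", [(100, 100), (900, 100), (900, 700), (100, 700)]),
        Region("repeat button", [(20, 20), (80, 20), (80, 60), (20, 60)]),
    ]
    print(identify_touch(50, 40, regions))  # -> "repeat button"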

The Tactile Graphical User Interface

The TTT concept relies on a standardized approach to laying out the tactile surface; this mirrors the highly successful approach of Windows computing, which allows users familiar with the system to approach any new program intuitively. Touch Graphics has designed and tested a Tactile Graphical User Interface (TGUI) that is particularly well-suited to a visually impaired user (see fig. 3, an illustration of the TGUI as it appears in a plate from the Talking Tactile Atlas of the World). In this system, as in Windows, all applications designed to run on the TTT employ an identical set of control and data entry tools arrayed around a standard rectangular box that represents the "workspace". All features unique to a particular application appear in the workspace; a rough sketch of this convention follows.
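
As a rough illustration of this standardized layout, the Python sketch below separates the control regions shared by every overlay from the application-specific regions inside the workspace. The control names, coordinates and units are assumptions, not the shipped TTT layout.

    # Controls common to every overlay, keyed by name: (left, top, right, bottom).
    STANDARD_CONTROLS = {
        "repeat": (20, 20, 80, 60),
        "help":   (100, 20, 160, 60),
        "volume": (180, 20, 240, 60),
    }

    # The rectangular workspace box shared by all applications.
    WORKSPACE = (100, 100, 900, 700)

    def build_overlay(app_regions):
        """Combine the standard controls with one application's workspace content."""
        layout = dict(STANDARD_CONTROLS)  # every plate starts with the same controls
        layout.update(app_regions)        # only the workspace content changes
        return layout

    # A hypothetical atlas plate adds its own regions inside the workspace.
    africa_plate = build_overlay({
        "Egypt":   (600, 150, 720, 260),
        "Nigeria": (380, 360, 470, 450),
    })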


Figure 3: The Africa continent plate of the Talking Tactile Atlas of the World, produced in collaboration with National Geographic Maps.

When a user begins a session with the TTT, or each time a new tactile overlay is called for, an initialization process is carried out according to the virtual narrator's spoken instructions. This process consists of two steps: first, the unit must be calibrated; then the user must identify to the computer which of a potentially large collection of tactile overlays has been mounted. For the first step, the user is asked to press small raised dots in three corners, one at a time. An audio confirmation sound is heard as these points are pressed. When this is complete, the computer can calculate a correction factor that compensates for any misalignment, reversal or dimensional instability of the tactile overlay. Every subsequent pick is then interpreted by correcting for any scalar, translational or rotational error.

Next, the user is instructed to lightly run her finger across the top edge of the overlay, where she will encounter a long horizontal line with three distinct raised bars along its length. The user is directed to run her finger along the line and press down on each raised bar encountered from left to right. A different confirming tone indicates that each pick has been registered. These bars are in different locations along the line for each tactile overlay, and the computer can determine which plate of which application is currently fixed into place on the device by comparing the positions of the bars to a list of known combinations. At this point, the start-up application invokes the executable file appropriate to the desired application, and then relinquishes control of the system until it is time to put the next tactile overlay in place or the session ends. In "expert mode" (without extensive verbal instructions and cues), this entire sequence consists of a total of seven picks, and once the user is accustomed to the system, it can be carried out in about ten seconds.
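
The Python sketch below (using NumPy) shows the two start-up steps in simplified form. It is not the production TTT code, and all coordinate values, the tolerance, and the overlay table are invented for the example: the three corner presses determine an affine correction, and the corrected bar positions are matched against a table of known combinations.

    # Illustrative sketch of the two-step initialization described above.
    import numpy as np

    # Step 1: calibration. The three raised corner dots have known positions in
    # the overlay's design coordinates; the user's presses give their positions
    # in raw touch-pad coordinates. Three point pairs determine a 2D affine
    # transform, which absorbs translational, rotational and scalar error.
    def fit_affine(raw_pts, design_pts):
        """Return a function mapping raw touch coordinates to design coordinates."""
        A = np.array([[x, y, 1.0] for x, y in raw_pts])
        M = np.linalg.solve(A, np.array(design_pts, dtype=float))  # 3x2 coefficients
        return lambda x, y: tuple(np.array([x, y, 1.0]) @ M)

    # Step 2: identification. Each overlay places its three raised bars at a
    # unique combination of positions along the top edge; matching the measured
    # positions against a table of known combinations names the plate.
    OVERLAYS = {                               # hypothetical bar positions
        (120, 480, 830): "Atlas: Africa plate",
        (150, 430, 900): "Match Game: board 1",
    }

    def identify_overlay(bar_xs, tolerance=15):
        for known, name in OVERLAYS.items():
            if all(abs(m - k) <= tolerance for m, k in zip(sorted(bar_xs), known)):
                return name
        return None

    # Usage: calibrate from the three corner presses, then decode the bar presses.
    to_design = fit_affine(raw_pts=[(32, 28), (988, 35), (30, 742)],
                           design_pts=[(0, 0), (960, 0), (0, 720)])
    bar_xs = [to_design(x, y)[0] for (x, y) in [(152, 40), (510, 44), (858, 41)]]
    print(identify_overlay(bar_xs))            # -> "Atlas: Africa plate"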

User testing has indicated that individuals with a wide range of visual impairments (including total blindness) are able to use the system with a minimum of training and find the experience both rewarding and entertaining. It is important to note that not all blind or visually impaired people are skillful tactile readers; however, great attention has been paid in the design of the system to creating a simple and intuitive interface that requires a minimum of physical dexterity or tactile graphics experience.

Applications for the TTT

We have created a variety of compelling software titles for use with the Talking Tactile Tablet, including the Match Game, the Talking Tactile Atlas of the World and the Teacher's Authoring System described below.

Purchase Information

The TTT is now available for purchase. The device and the applications are offered separately; however, the Match Game is bundled with the unit as a "starter" program. The introductory package includes:

Touch Graphics will provide one year of unlimited phone and e-mail technical support, and the device is under warranty for one year from the date of purchase against manufacturer's defects. The price of the TTT is $1,100 for a single device, or $880 per unit for orders of ten or more.

The Talking Tactile Atlas of the World will be released in August 2003; this product, which includes 43 tactile and color-printed map sheets and the application program on CD-ROM, will be available for $550 per copy.

The Teacher's Authoring System is currently available for $350; this price includes a starter set of 50 capsule-paper sheets that can be used to create custom audio-tactile materials.

Those interested in learning more about this new approach to audio-tactile interactive computing should contact the authors.

References

Edman, P. (1991). Tactile Graphics. New York: American Foundation for the Blind.

Espinosa, M. A., & Ochaita, E. (1998). Using tactile maps to improve the practical spatial knowledge of adults who are blind. Journal of Visual Impairment and Blindness, 92, 338-345.

Landau, S., & Gourgey, K. (2001). Development of a Talking Tactile Tablet. Information Technology and Disabilities, April 2001.

Landau, S., & Russell, K. Development of an audio/tactile accommodation for administering standardized math tests to blind and visually impaired students. In review.

Parkes, D. (1994). Audio-tactile systems for designing and learning complex environments as a vision impaired person: static and dynamic spatial information access. In J. Steele & J. G. Hedberg (Eds.), Cognitive Mapping: Past, Present and Future. London: Routledge.

Ungar, S., Blades, M., & Spencer, C. (1996). The use of tactile maps to aid navigation by blind and visually impaired people in unfamiliar urban environments. Proceedings of the Royal Institute of Navigation Orientation and Navigation Conference 1997. Oxford: Royal Institute of Navigation.




Reprinted with author(s) permission. Author(s) retain copyright.