1998 Conference Proceedings



Chris Law
Email: law@tracecenter.org 

Gregg Vanderheiden
Email: gv@tracecenter.org 

Trace R&D Center
Department of Industrial Engineering
College of Engineering
University of Wisconsin-Madison
Madison, WI 53706
Day phone: 608-262-6966


Increasingly, electronic products from Video Cassette Recorders (VCRs) to telephones to microwave ovens are being designed with electronic interfaces. While these add features and convenience for some, they are also making some products that were previously accessible now inaccessible to individuals with low vision or blindness, or with reading, cognitive, or physical impairments. The increasing use of sound is also creating new barriers for individuals who are hard of hearing or deaf. A package of coordinated, inter-compatible strategies has been developed that allows individuals with a wide range of disabilities to access and use these new products. At the same time, these techniques make the products easier for the population as a whole to use. The techniques have already been deployed commercially in over thirty touchscreen kiosk systems, and designs are being developed for incorporating them into VCRs, telephones, and other common electronic products. This paper describes the techniques and how they can be applied on various devices.


Since 1991, it has been mandated through the Americans with Disabilities Act Accessibility Guidelines (ADAAG) that Automated Teller Machines (ATMs) and similar machines which provide public services (a class of machines hereinafter referred to as "Information / Transaction Machines" (ITMs)) be accessible to people with disabilities, including those with physical, hearing, and vision impairments (ATBCB, 1991). To address the ADAAG mandate, a practical set of techniques which provide cross-disability access to input and output elements used on ITMs is required.

One popular interface choice for providers of ITMs is a touchscreen. Touchscreens provide users with a means of direct selection: items such as instructions, informational texts, pictures, multimedia movies, and buttons can be shown on screen, and the buttons can be pressed directly by the user. Ordinarily, the use of a touchscreen requires the ability to see, the ability to physically touch the objects on screen and, if sounds are made by the device, the ability to hear. A set of techniques which provides cross-disability access to touchscreens has been presented and demonstrated (Vanderheiden, 1997a, 1997b). These interface techniques, collectively called "EZ Access", use voice output and enhanced visual and auditory displays to allow users to adjust the interface according to their needs. The techniques provide direct access to touchscreen systems for people who have low vision, are blind, are hard of hearing, are deaf, have trouble reading, are unable to read at all, or who have physical disabilities. In addition, indirect access via an infrared link extends use to people who are completely paralyzed or deaf-blind. The EZ Access techniques have been practically applied in a commercial setting: currently over 30 public information kiosks have been installed in and around the Minneapolis area, in libraries, stores, and shopping malls (including two jobs kiosks in the Mall of America).


EZ Access has been developed as a set of software and hardware enhancements: a small number of powerful, flexible interface extensions which together provide great adaptability in how a user interacts with a touchscreen interface. The user is given a means to select which modalities are used to present information ('Flex-Sensory-Modal'), so that the senses and abilities they do have can augment the ones they do not. Thus, if the user cannot see a visual display, the mode can be adjusted so that the interface is auditory; if the user cannot hear sounds emanating from the interface, visual events can be added to augment the display. If the user cannot use the regular input technique (touching on-screen buttons), the way input is given can be adjusted; this constitutes flexible, adaptable input (or simply 'Flex-Input'). Users can adjust the way they operate devices via menus, shortcuts, or by having their preferred means of interaction stored on a personal card (for devices which accept cards) or another means of personal identification, if available.
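The Flex-Sensory-Modal / Flex-Input idea described above can be illustrated with a small sketch: each interface element carries parallel presentations, and an active user profile selects which channels are rendered. All class and function names below are illustrative assumptions for this paper, not part of the EZ Access implementation.

```python
# Illustrative sketch only: names and structure are assumptions,
# not the actual EZ Access software.

class Element:
    """One on-screen item, with parallel presentations per modality."""
    def __init__(self, text, audio=None, visual_cue=None):
        self.text = text                    # label shown on the display
        self.audio = audio or text          # spoken equivalent for voice output
        self.visual_cue = visual_cue        # visual event mirroring any sound

class UserProfile:
    """The user's adjustments, e.g. loaded from a menu or personal card."""
    def __init__(self, speech=False, visual_alerts=False):
        self.speech = speech                # user cannot see the display
        self.visual_alerts = visual_alerts  # user cannot hear the interface

def render(element, profile):
    """Return the presentation channels used for one element."""
    out = {"display": element.text}
    if profile.speech:
        out["speech"] = element.audio       # auditory mode augments the display
    if profile.visual_alerts and element.visual_cue:
        out["flash"] = element.visual_cue   # visual event replaces the sound
    return out

button = Element("Start", audio="Start button", visual_cue="highlight border")
print(render(button, UserProfile(speech=True)))        # spoken output added
print(render(button, UserProfile(visual_alerts=True))) # visual event added
```

The point of the sketch is that the content is authored once; only the set of active channels changes per user, so adjusting the device for one person does not affect how it works for others.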

It was determined that the flexible techniques (Vanderheiden, 1997b) for accessing touchscreens could be modified to provide the same level of access to other electronic devices. A package has therefore evolved which extends the touchscreen strategies to essentially any electronic device. The touchscreen techniques have been modified to exist as a 'generic set' which can be applied differently depending upon the device, but which keeps the same underlying structure to allow users to access all devices, without going through a different learning process for each.


Note: each device also has a Standard Mode (the mode that most devices would normally use as the default).


The following demonstrates, by way of example, how the EZ Access package can be incorporated into public information kiosks and into two everyday products: telephones and VCRs. Note: the following explanations are given as if the individuals were experienced users. The EZ Access-based devices also have a tutorial module built in to allow new users to discover and master the techniques, though the tutorial is not described here.


The process of implementing EZ Access varies depending on the product and the company that makes it. In many cases, adding the techniques will entail adding functionality such as speech output or audio system interoperability. Such changes are, in general, inexpensive when applied to mass-market products: the cost of creating voice output (either for "canned", small vocabularies or for open synthesized vocabularies) is dropping rapidly, making it possible to add speech to a wide variety of products (Meisel, 1997). Changes will also have to be made to adjust (or add to) a device's existing software. To make these software changes easier, a 'Common Table Architecture' is proposed (Vanderheiden, 1997b), which enables the storage of the information required for EZ Access in a modality-independent or modality-parallel form. An 'EZ Access Development Package' is being created to enable developers from different organizations to create devices with cross-product consistency for EZ Access elements, should they decide to install them.
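A minimal sketch of what a modality-parallel table might look like follows. The field names and fallback rule here are assumptions for illustration; the published Common Table Architecture (Vanderheiden, 1997b) defines its own format.

```python
# Hypothetical sketch of a 'Common Table Architecture': each interface
# element is stored once, with parallel columns per modality, so the same
# table can drive the visual display, voice output, or other modes.
# Field names are illustrative assumptions, not the published format.

COMMON_TABLE = {
    "greeting": {
        "text":   "Welcome. Touch a topic to begin.",
        "speech": "Welcome. Touch a topic to begin, or hold your finger "
                  "on the screen to hear each item read aloud.",
    },
    "help": {
        "text":   "Help",
        "speech": "Help button",
    },
}

def present(element_id, modality):
    """Fetch one element in the requested modality, falling back to text."""
    entry = COMMON_TABLE[element_id]
    return entry.get(modality, entry["text"])

print(present("help", "speech"))   # spoken form of the Help element
print(present("help", "braille"))  # no braille column: falls back to text
```

Because content is keyed by element rather than by screen layout, adding a new modality means adding a column, not rewriting the application, which is what makes retrofitting existing device software tractable.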


With standardized electronic components, and interface adjustments which are consistent across products, users can adjust devices to work in the way that suits them, without affecting the way the devices work for others. Lower cost components, and the availability of Flex-Sensory-Modal / Flex-Input common architectures for interface elements, can allow cross-disability access to be integrated into diverse and widespread public ITMs and consumer electronic products. Further information on the components of EZ Access can be obtained from the authors at the addresses given at the head of the paper.


This project is funded in part by the National Institute on Disability and Rehabilitation Research of the Department of Education under grant numbers H133E30012 & H133E5002. The opinions herein are those of the grantee and do not necessarily reflect those of the Department of Education.


ATBCB (Architectural & Transportation Barriers Compliance Board) (1991) Americans with Disabilities Act Accessibility Guidelines (ADAAG). Published in the Federal Register, July 26, 1991.

Meisel, W.S. (1997) Developments in the telephony voice user interface. Tarzana, CA: TMA Associates.

Vanderheiden, G.C. (1997a) Cross-disability Access to Touch Screen Kiosks and ATMs. Advances in Human Factors / Ergonomics, 21A. pp.417-420. Presented at the HCI International Conference, San Francisco 1997, August 24-29, 1997.

Vanderheiden, G.C. (1997b) Use of a common table architecture for creating hands free, eyes free, noisy environment (flex-modal, flex-input) interfaces. Advances in Human Factors / Ergonomics, 21A. pp.449-452. Presented at the HCI International Conference, San Francisco 1997, August 24-29, 1997.

Vanderheiden, G.C., Kelso, D., and Brykman, L. (submitted) Proposal for a Universal Remote Console Communication (URCC) protocol. Paper submitted for inclusion in the 1998 RESNA (Rehabilitation Engineering Society of North America) Annual Conference, Minneapolis, MN, June 26 - July 1, 1998.


Reprinted with author(s) permission. Author(s) retain copyright.