Return to 1998 Conference Table of Contents
Trace R&D Center
Department of Industrial Engineering
College of Engineering
University of Wisconsin-Madison
Madison, WI 53706
Day phone: 608-262-6966
Increasingly, electronic products, from Video Cassette Recorders (VCRs) to telephones to microwave ovens, are being designed with electronic interfaces. While these add features and convenience for some, they are also making some products that were previously accessible now inaccessible to individuals who have low vision, are blind, or who have reading, cognitive, or physical impairments. The increasing use of sound is also creating new barriers for individuals who are hard of hearing or deaf. A package of coordinated, inter-compatible strategies has been developed that allows individuals with a wide range of disabilities to access and use these new products. At the same time, these techniques make the products easier for the population as a whole to use. The techniques have already been deployed commercially in over thirty touchscreen kiosk systems, and designs are being developed for incorporating them into VCRs, telephones, and other common electronic products. This paper describes the techniques and how they can be applied to various devices.
Since 1991, it has been mandated through the Americans with Disabilities Act Accessibility Guidelines (ADAAG) that Automated Teller Machines (ATMs) and similar machines which provide public services (a class of machines hereinafter referred to as "Information / Transaction Machines" (ITMs)) be accessible to people with disabilities, including those with physical, hearing, and vision impairments (ATBCB, 1991). To address the ADAAG mandate, a practical set of techniques which provide cross-disability access to input and output elements used on ITMs is required.
One popular interface choice for providers of ITMs is a touchscreen. Touchscreens provide users with a means of direct selection: items such as instructions, informational texts, pictures, multimedia movies, and buttons can be shown on screen, and the buttons can be directly pressed by the user. Ordinarily, the use of a touchscreen requires the ability to see, the ability to physically touch the objects on screen and, if sounds are made by the device, the ability to hear. A set of techniques which provides cross-disability access to touchscreens has been presented and demonstrated (Vanderheiden, 1997a, 1997b). These interface techniques, collectively called "EZ Access", use voice output and enhanced visual and auditory displays to allow the user to adjust the interface according to their needs. The techniques provide direct access to touchscreen systems for people who have low vision, are blind, are hard of hearing, are deaf, have trouble reading, are unable to read at all, or who have physical disabilities. In addition, via indirect access (an infrared link), access is also provided to people who are completely paralyzed or deaf-blind. The EZ Access techniques have been practically applied in a commercial setting: currently over 30 public information kiosks have been installed in and around the Minneapolis area, in libraries, stores, and shopping malls (including two jobs kiosks in the Mall of America).
EZ Access has been developed as a set of software and hardware enhancements: a small number of powerful, flexible interface extensions which together provide great adaptability in how a user interacts with a touchscreen interface. The user is given a means to select among the modalities used to present information ('Flex-Sensory-Modal'), so that the senses and abilities they have available can augment the ones they do not. Thus, if the user cannot see a visual display, the mode can be adjusted so that the interface is auditory; if the user cannot hear sounds emanating from the interface, visual events can be added to augment the display. If the user cannot use the regular input technique (touching on-screen buttons), the way input is given can be adjusted; this constitutes Flexible / adaptable Input (or simply 'Flex-Input'). Users can adjust the way they operate devices via menus, shortcuts, or by having their preferred means of interaction stored on a personal card (for devices which accept cards) or through other means of personal identification, if available.
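The Flex-Sensory-Modal idea can be sketched in code as a small set of user preferences that substitute available senses for unavailable ones. This is only an illustrative model; the class and field names are assumptions, not taken from the actual EZ Access implementation.

```python
from dataclasses import dataclass

@dataclass
class EZAccessPreferences:
    """Hypothetical per-user interface settings (names are illustrative)."""
    speech_output: bool = False      # speak on-screen items aloud
    visual_events: bool = False      # show visual equivalents of sounds
    input_mode: str = "standard"     # e.g. "standard", "select_confirm", "scanning"
    speech_rate: float = 1.0         # 1.0 = normal speaking rate

def adjust_for_user(prefs, cannot_see=False, cannot_hear=False):
    """Augment missing senses with the ones the user has available."""
    if cannot_see:
        prefs.speech_output = True   # auditory interface replaces visual display
    if cannot_hear:
        prefs.visual_events = True   # visual events replace sounds
    return prefs
```

A stored preference set like this is also what a personal card could carry, so a device can configure itself as soon as the card is inserted.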
It was determined that the flexible techniques (Vanderheiden, 1997b) for accessing touchscreens could be modified to provide the same level of access to other electronic devices. A package has therefore evolved which extends the touchscreen strategies to essentially any electronic device. The touchscreen techniques have been modified to exist as a 'generic set' which can be applied differently depending upon the device, but which keeps the same underlying structure to allow users to access all devices, without going through a different learning process for each.
The following demonstrates, by way of example, how the EZ Access package can be incorporated into public information kiosks, and into two everyday products: telephones and VCRs. Note: the following explanations are given as if the individuals were experienced users. EZ Access-based devices also have a built-in tutorial module to allow new users to discover and master the techniques, though the tutorial is not described here.
To use the touchscreen kiosk, the person invokes an auditory List Mode: all items on screen are put into a list which is displayed vertically on the edge of the screen. As they slide their finger down the vertical edge, listed items are spoken aloud. When the user gets to the item they want, they hit a confirmation button, usually located below the screen. To use the telephone, the person invokes the same mode: all of the telephone functions are put into lists which are spoken one item at a time (incidentally, items are also displayed visually on a one-line LCD for users who can see). The user presses volume up and down buttons (or other specified controls) to step up and down the list. On each step the function is spoken through the unit's speaker or the telephone handset. Items are selected using a diamond-shaped pushbutton which is similar to (though smaller than) that used on the kiosk. To use the VCR, the user uses a 'jog / shuttle' control, which is a detented (or non-detented) disc on the fascia and / or the remote control which can be rotated left or right and pushed in for selection. This control (or any other up/down or left/right control) is used to navigate the auditory list.
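The List Mode interaction above can be sketched as a minimal model: items become a spoken list stepped through with up/down controls and chosen with a separate confirm button. The `speak` callback stands in for real text-to-speech output; this is a sketch of the interaction pattern, not the actual kiosk software.

```python
class AuditoryListMode:
    """Sketch of List Mode: step through items, hear each, confirm to select."""

    def __init__(self, items, speak=print):
        self.items = list(items)
        self.index = 0
        self.speak = speak
        self.speak(self.items[self.index])   # announce the starting item

    def step(self, direction):
        # direction is +1 (down/right) or -1 (up/left); clamp at the list ends
        self.index = max(0, min(len(self.items) - 1, self.index + direction))
        self.speak(self.items[self.index])

    def confirm(self):
        # the diamond button (or jog/shuttle push-in) selects the current item
        return self.items[self.index]
```

The same object could sit behind volume buttons on a telephone or a jog / shuttle disc on a VCR; only the physical controls mapped to `step` and `confirm` change.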
On the kiosk screen, the user with low vision might see blurred shapes for the buttons or icons. They may be able to read, with difficulty, some of the larger fonts which have been used for screen titles but not the smaller text. To use the touchscreen, these users would use an auditory Select and Confirm Mode: when the user touches the blurred objects, they are both highlighted and spoken aloud. Each screen can be explored to find the buttons which are desired for the transaction. When the desired button on each screen is found, the user presses and holds their finger on the screen for a half second, which activates the button. On the telephone, the same problem is experienced, and so the same mode is invoked. As the person touches buttons, the sounds come through the handset or the speaker on the unit. If the user were, say, transferring a call, the commands would be spoken into their earpiece, but the caller on the other end would not hear them. On the VCR, the same feature can be used: when the items on the list are spoken, the volume from the VCR is automatically attenuated. An alternative, if attenuation were not desired for some reason (for example, while watching an opera), is to have the commands shown in very large letters on the TV screen.
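The touch-to-explore, hold-to-activate behavior of Select and Confirm Mode can be sketched as follows. Timestamps are passed in explicitly so the dwell logic stays testable; the half-second threshold is the one named above, and everything else is illustrative.

```python
class SelectConfirmMode:
    """Sketch of Select and Confirm: touching announces an item without
    activating it; holding past a dwell threshold activates it."""

    DWELL_SECONDS = 0.5   # hold time named in the text; could be user-configured

    def __init__(self, speak=print):
        self.speak = speak
        self.touched = None
        self.touch_time = None

    def touch(self, item, t):
        # announce and highlight the touched object, but do not activate it
        self.touched, self.touch_time = item, t
        self.speak(item)

    def release(self, t):
        held = t - self.touch_time
        activated = self.touched if held >= self.DWELL_SECONDS else None
        self.touched = self.touch_time = None
        return activated
```

Because activation requires a deliberate dwell, the user can safely slide across or brush against buttons while exploring the screen.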
The same modes for people who have difficulty seeing are useful for people who have moderate or severe difficulty reading. In addition, there is a useful mode for people with a mild reading (or vision) difficulty: the Quick-Read Mode. In this mode, the user can hold down the diamond-shaped confirmation button, and touch items on the touchscreen, telephone or VCR. As items are touched, they are spoken aloud. Thus, a user who can read most of the items can use this mode momentarily (when they release the diamond button the speech is terminated, and the device returns to its standard operating mode).
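Quick-Read Mode can be sketched the same way: while the diamond button is held, touches are spoken rather than activated, and releasing the button instantly restores normal operation. The class and method names are illustrative only.

```python
class QuickReadMode:
    """Sketch of Quick-Read: hold the confirm (diamond) button to have
    touched items spoken aloud instead of activated."""

    def __init__(self, speak=print):
        self.speak = speak
        self.reading = False

    def diamond_down(self):
        self.reading = True

    def diamond_up(self):
        self.reading = False     # speech stops; standard mode resumes

    def touch(self, item):
        if self.reading:
            self.speak(item)     # spoken only, not activated
            return None
        return item              # normal touch activates the item
```

This momentary design suits a user who can read most items and only occasionally needs one spoken aloud.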
If video clips with voice-overs are shown on kiosks, people who cannot hear the dialog will not be able to access the information using the Standard Mode. The Showsounds Mode can instead be turned on which causes the text captions which accompany the clip to be displayed. Turning on the same mode on the VCR would have the same effect as showing 'closed captions' on regular TV sets, but with an important enhancement: other visual events can be shown on the screen when the VCR would normally give only an audible error beep (for example, when a user tries to record on a write-protected cassette). Although a person who is deaf cannot use a telephone for spoken conversation, the addition of a standard audio connection jack would allow them to attach a portable text telephone (TTY) that they might carry in their pocket (e.g. an enhanced TTY-emulating Personal Digital Assistant (PDA)). Thus a TTY could be used from anywhere, including payphones, by means of a standardized connector.
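At its core, Showsounds Mode is a mapping from sounds a device would normally emit to visual events shown instead. The sketch below illustrates that mapping; the event names and messages are assumptions, not from any actual VCR firmware.

```python
# Hypothetical table mapping audio events to their visual equivalents.
SHOWSOUNDS_EVENTS = {
    "error_beep": "ERROR: cannot record, cassette is write-protected",
    "key_click": "*",            # brief on-screen flash per keypress
}

def render_event(event, showsounds_on):
    """Return the visual text to display for an audio event, or None if
    Showsounds Mode is off and only the sound should play."""
    if showsounds_on:
        # unknown sounds still get a generic visual indication
        return SHOWSOUNDS_EVENTS.get(event, "[sound: " + event + "]")
    return None
```

The important property is the generic fallback: even sounds with no scripted caption still produce some visual indication, so no audio-only event is silently lost on a user who is deaf.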
The Showsounds Mode described above can assist people who have difficulty hearing, as can the inclusion of volume controls which have a broad enough range. In addition, the same audio connection jack for TTY use could also be used to carry an audio signal which enables a user to plug in a set of personal headphones, or a direct connection into a wearable hearing aid.
The auditory feedback providing voice support described above can be used to give assistance to some people with mild or even moderate cognitive impairments (i.e. those with cognitive impairments which could be ameliorated by having text spoken out loud instead of having to be read). If the person has difficulty understanding text which is spoken too fast, they might slow it down, and vice versa. Depending upon the level of ability, the user may not be able to navigate menus to adjust the necessary settings. In such cases, user preferences can be determined in advance with assistance elsewhere, and stored on a bank card. Inserting this card immediately tells the kiosk which user preference settings to use. In the case of personal telephones and VCRs the default mode can be changed to reflect the needs of the owner (as would be the case for all modes).
The Select and Confirm Mode can be used with voice support, or highlighting, or with a text display, or combinations of these. This mode allows users to bump or touch other keys without activating them. For example, when using the touchscreen to enter letters on an on-screen alphabetic keyboard, a user who did not have fine motor control might hit 'B','D','F','D' when they really wanted to hit 'C'. With Select and Confirm they could keep trying until they hit 'C', at which point they remove their hand from the screen and hit the confirm button. The same technique would work on the telephone, but because the buttons have a tactile element to them (the touchscreen has no tactile feedback), the same user might be able to rest their finger on the button when they have found it, using it as a kind of 'anchor', for delayed-activation. Again, when they hold it down for half a second (or longer if configured to the user's preference), the button would be activated. Another alternative is to use the Scanning Mode for a user who has a severe movement difficulty and who can only reliably press one button. In this case, the afore-mentioned confirm button is used as a scanning switch: the items on the interface are highlighted sequentially, and the user presses (or releases if configured differently) the confirmation button to make a selection.
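Scanning Mode reduces the entire interface to one switch: items are highlighted in turn on a timer, and a single press selects whichever item is currently highlighted. A minimal sketch, with illustrative names:

```python
class ScanningMode:
    """Sketch of single-switch scanning: a timer advances the highlight,
    and the one confirm button selects the highlighted item."""

    def __init__(self, items, highlight=print):
        self.items = list(items)
        self.index = -1          # nothing highlighted until the first tick
        self.highlight = highlight

    def tick(self):
        # called on each timer interval; wraps back to the first item
        self.index = (self.index + 1) % len(self.items)
        self.highlight(self.items[self.index])

    def press(self):
        # the single switch (here, the confirm button) makes the selection
        return self.items[self.index]
```

In a real device the tick interval, and whether selection happens on press or on release, would be configured to the user's preference, as the text notes.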
A person who is quadriplegic would not be able to access any of the devices directly. However, if they used a sip-and-puff control for an electric wheelchair, and a laptop computer, they could control all of the devices remotely using one software program installed on the laptop. The different devices (kiosk, telephone, VCR, etc.) would all accept a single standard infrared protocol (the protocol currently proposed is the "Universal Remote Console Communication (URCC) protocol"; Vanderheiden et al., submitted). Using URCC, the user would approach the kiosk and be able to get an image of the screen (or a schematic if need be) sent to the laptop. The user would select a button on their laptop screen, using direct selection or text input, and it would be transmitted to the kiosk as if it were a regular screen-touch. The same system could be used, in conjunction with the speakerphone system, to control any telephone with URCC capability. This functionality could be included in private household phones, business ('PBX') phones, and payphones. URCC could also be used as an alternative to the regular remote control of the VCR.
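The paper does not specify the URCC wire format, so the sketch below simply assumes a small request/response exchange of JSON messages over the infrared link, purely to illustrate the interaction pattern (request the screen, then press a button remotely). All message fields here are invented for illustration.

```python
import json

def make_screen_request():
    """Remote console asks the device for its current screen (or schematic)."""
    return json.dumps({"type": "get_screen", "format": "schematic"})

def make_button_press(button_id):
    """Remote console asks the device to act as if this button were touched."""
    return json.dumps({"type": "press", "button": button_id})

def handle_message(message, screen_items, pressed):
    """Device-side dispatch: answer screen requests, record button presses."""
    msg = json.loads(message)
    if msg["type"] == "get_screen":
        return json.dumps({"type": "screen", "items": screen_items})
    if msg["type"] == "press":
        pressed.append(msg["button"])   # treated exactly like a screen-touch
        return json.dumps({"type": "ok"})
```

The same exchange serves the Direct Text Control case: a device like a Braille Lite would request the screen as plain text and send presses back over the identical link.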
A commonly used tool by people who are deaf-blind is a device called a "Braille Lite", which is a personal computer which can be used for note-taking (via a Braille keyboard), and reading blocks of text (via a dynamic Braille display). An infrared port, and software, can be added to the device which would enable the URCC protocol to be used with all devices. This Direct Text Control technique uses the same information which is given in List Mode (described above), sent as a text stream to the Braille Lite. URCC supports plain text for such devices, and also schematic or realistic screen representations for devices with visual displays, as described for the user who is quadriplegic.
The process of implementing EZ Access varies depending on the product and the company that makes it. In many cases adding the techniques will entail adding functionality such as speech output or audio system interoperability. Such changes are, in general, inexpensive when applied to mass-market products: the cost of creating voice output (either for "canned", small vocabularies or for open synthesized vocabularies) is dropping rapidly, making it possible to add speech to a wide variety of products (Meisel, 1997). Changes will have to be made to adjust (or add to) existing software code in a device. To make the software change-process easier, a 'Common Table Architecture' is proposed (Vanderheiden, 1997b), which enables the storage of information required for EZ Access in a modality-independent or a modality-parallel form. An 'EZ Access Development Package' is being created to enable developers from different organizations to create devices with cross-product consistency for EZ Access elements, should they decide to install them.
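The Common Table Architecture idea, storing each interface element once in a modality-parallel form, can be sketched as a lookup table keyed first by element and then by modality. The field names below are assumptions for illustration, not the schema from the cited paper.

```python
# Each element is defined once; every output mode reads the column it needs.
COMMON_TABLE = {
    "play": {"label": "Play", "speech": "Play the tape", "braille": "play"},
    "rec":  {"label": "Rec",  "speech": "Record",        "braille": "rec"},
}

def present(item_id, modality):
    """Look up one element's representation for the requested modality."""
    return COMMON_TABLE[item_id][modality]
```

Because every mode (visual label, voice output, Braille stream) draws on the same table, adding or renaming an element updates all modalities at once, which is what makes cross-product consistency cheap to maintain.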
With standardized electronic components, and interface adjustments which are consistent across products, users can adjust devices to work in the way that suits them, without having to affect the way the devices work for others. Lower cost components, and the availability of Flex-Sensory-Modal / Flex-Input common architectures for interface elements, can allow cross-disability access to be integrated into diverse and widespread public ITMs and consumer electronic products. Further information on the components of EZ Access can be obtained from the authors at the website address given at the head of the paper.
This project is funded in part by the National Institute on Disability and Rehabilitation Research of the Department of Education under grant numbers H133E30012 and H133E5002. The opinions herein are those of the grantee and do not necessarily reflect those of the Department of Education.
ATBCB (Architectural & Transportation Barriers Compliance Board) (1991) Americans with Disabilities Act Accessibility Guidelines (ADAAG). Published in the Federal Register, July 26, 1991.
Meisel, W.S. (1997) Developments in the telephony voice user interface. Tarzana, CA: TMA Associates.
Vanderheiden, G.C. (1997a) Cross-disability Access to Touch Screen Kiosks and ATMs. Advances in Human Factors / Ergonomics, 21A. pp.417-420. Presented at the HCI International Conference, San Francisco 1997, August 24-29, 1997.
Vanderheiden, G.C. (1997b) Use of a common table architecture for creating hands free, eyes free, noisy environment (flex-modal, flex-input) interfaces. Advances in Human Factors / Ergonomics, 21A. pp.449-452. Presented at the HCI International Conference, San Francisco 1997, August 24-29, 1997.
Vanderheiden, G.C., Kelso, D., and Brykman, L. (submitted) Proposal for a Universal Remote Console Communication (URCC) protocol. Paper submitted for inclusion in the 1998 RESNA (Rehabilitation Engineering Society of North America) Annual Conference, Minneapolis, MN, June 26 - July 1, 1998.