Return to 2004 Table of Contents
330 West 38 Street Suite 1204
New York, NY 10018
University of Arizona
College of Education
Department of Special Education, Rehabilitation and School Psychology
The speakers will present findings from research on the usefulness of the Talking Tactile Tablet, a new audio/tactile computer peripheral device, for delivering statewide assessments and other standardized tests to students who are blind, visually impaired or otherwise print disabled. The presentation will focus on results from recently completed human-subjects trials carried out in Tucson, Phoenix, Philadelphia and Boston.
Increasingly in the United States, high-stakes testing is relied on as a means of measuring educational outcomes and informing policy decisions. As this emphasis on testing continues to grow, individual states are motivated to find ways to improve access for populations who have not always been able to demonstrate their competency in some areas due to limitations of the test instruments themselves. Blind and visually impaired students have historically performed well below the mean on test items that reference graphic images, such as geometrical figures, pie charts, maps and graphs (Bennett et al., 1985). This outcome is not always the result of an intellectual shortcoming on the part of the student: existing methods for illustrating and annotating these images are partly to blame for poor test scores.
At the same time, states are beginning to develop computer systems for delivering standardized tests, to take advantage of the efficiencies of automated delivery and grading, as well as opportunities for increased accuracy in assessment of the test-taker's abilities. As with other areas where computers have become ubiquitous, people who cannot see a video monitor clearly or manipulate a mouse are at risk of being further isolated, unless accommodation for their disabilities is considered by those who design these systems (Hansen, Lee & Forer, 2002). In the case of standardized testing, the migration to computerized platforms represents both a risk and an opportunity for this group. The creation of new test vehicles could erect further barriers to full inclusion for blind and visually impaired students, and thus exacerbate the problems associated with depressed test scores, or it could open the way to unprecedented levels of participation and accomplishment for these individuals. The project discussed here explores the development of a new approach that seeks to derive the maximum benefit from the advent of computerized testing. The project could lead to the creation of new products that help states improve statutory compliance; improve the correlation between blind and visually impaired students' test scores and their academic ability and knowledge; and better prepare members of this group for future academic and professional achievement.
The Talking Tactile Tablet
The Talking Tactile Tablet is a new low-cost computer peripheral device that was developed by Touch Graphics and is manufactured and marketed by American Thermoform Corporation of La Verne, California (see figure 1, a photograph of the TTT connected to a laptop computer, and figure 2, a photograph of a student using the TTT). The TTT allows users who are blind or visually impaired to interact with sophisticated audio/tactile computer applications, including the National Geographic Talking Tactile World Atlas, the TTT Authoring Tool, and the TTT Match Game. A new curriculum for college-level statistics is in development. The device is designed to be used in conjunction with computer-generated plastic raised-line and textured (tactile) overlay sheets.
Inspired by the groundbreaking Nomad device from the early 1990s (Parkes, 1994), the TTT acts as a pointing device, and the tactile overlay, as a static display. When a user touches a point on the tactile overlay, his or her finger pressure is transmitted through the flexible PVC sheet to the touch pad below, and the computer interprets that touch as a set of x,y coordinates. The computer compares the coordinates of each touch to a database of regions of any shape. By this means, the computer can identify each tactile entity on the overlay, as long as the pre-defined regions in the current database match the tactile image. By developing applications with the multi-media authoring system Macromedia Director, it becomes possible to construct sophisticated interactive computer applications that are completely accessible to a person with no or low vision. This is an important achievement, because it demonstrates that blind and visually impaired people are not automatically excluded from the burgeoning world of multi-media simply because they cannot see a video screen or manipulate a mouse.
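The touch-to-region lookup described above can be sketched as follows. This is a minimal illustration, not the actual TTT software: the region names and coordinates are hypothetical, and regions are simplified to axis-aligned rectangles, although the real database can hold regions of any shape.

```python
# Sketch of audio/tactile hit-testing: map a touch point, reported as
# x,y coordinates by the touch pad, to a named region of the current
# overlay. Region geometry here is a hypothetical placeholder.

# Hypothetical region database for one overlay: name -> (x1, y1, x2, y2)
REGIONS = {
    "circle_button": (10, 10, 30, 30),
    "up_arrow":      (40, 10, 60, 30),
    "workspace":     (0, 40, 200, 160),
}

def region_at(x, y, regions=REGIONS):
    """Return the name of the region containing (x, y), or None."""
    for name, (x1, y1, x2, y2) in regions.items():
        if x1 <= x <= x2 and y1 <= y <= y2:
            return name
    return None

print(region_at(20, 20))    # a touch inside the circle button
print(region_at(500, 500))  # a touch outside every defined region
```

Once a touch resolves to a region name, the application can trigger the appropriate recorded or synthesized speech for that tactile entity.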
The TTT concept relies on a standardized approach to laying out the tactile surface of each sheet; this mirrors the successful approach to Windows-style computing that allows users familiar with the system to approach any new program intuitively. Touch Graphics has designed and tested a Tactile Graphics User Interface (TGUI; see figure 3, an illustration of a sample sheet from the Test Taker with explanatory text labels). In this system, as in Windows, all applications designed to run on the TTT employ an identical set of control and data entry tools arrayed around a standard rectangular region that represents the "workspace".
When a user begins a session with the TTT, or each time a new tactile overlay sheet is mounted, an initialization process is carried out according to the virtual narrator's spoken instructions. This process consists of two steps: first, it is necessary to calibrate the unit, then the user must identify to the computer which of a potentially extensive collection of tactile overlays has been placed. For the first step, the user is asked to press small raised dots in the upper left and then lower right corners. An audio confirmation sound is heard as these points are pressed. When this is complete, the computer can calculate a correction factor that compensates for any misalignment, reversal or dimensional instability (e.g., stretching) of the tactile overlay. Each subsequent pick is then interpreted by correcting for scalar, translational or rotational error. This process is required by the high level of tactile resolution of the TTT's touch screen, and makes possible the kinds of precise and accurate responses that are required by applications such as the World Atlas maps.
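The correction factor described above can be derived from just the two set-up dots: two point correspondences determine a similarity transform (uniform scale, rotation and translation). The sketch below shows one way to compute it; the coordinates are hypothetical, and this is an illustration of the general technique rather than the TTT's actual calibration code.

```python
# Sketch of two-point calibration: from the measured positions of the
# upper-left and lower-right set-up dots, derive a similarity transform
# (uniform scale, rotation, translation) mapping raw touch coordinates
# onto the overlay's design coordinates. Complex arithmetic keeps the
# algebra short: multiplying by a complex number scales and rotates.

def make_corrector(design_pts, measured_pts):
    """Return f(x, y) -> (x', y') mapping raw touches to design space.

    design_pts / measured_pts: two (x, y) pairs each -- the expected
    and the actually measured calibration-dot positions.
    """
    d1, d2 = (complex(*p) for p in design_pts)
    m1, m2 = (complex(*p) for p in measured_pts)
    a = (d2 - d1) / (m2 - m1)   # encodes scale and rotation
    b = d1 - a * m1             # encodes translation
    def correct(x, y):
        z = a * complex(x, y) + b
        return (z.real, z.imag)
    return correct

# Hypothetical example: the overlay sits 5 units right and 3 units down
# of its nominal position, with no rotation or stretching.
correct = make_corrector(
    design_pts=[(0, 0), (200, 150)],
    measured_pts=[(5, 3), (205, 153)],
)
print(correct(105, 78))  # -> (100.0, 75.0)
```

Every subsequent touch is passed through the same correction before the region lookup, which is what makes precise picks possible even on a slightly misaligned or stretched sheet.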
Next, the user is instructed to find a long horizontal strip running along the top edge of the overlay, and then to press each of three short vertical bars that cross the strip, in sequence as they are encountered from left to right. Different confirming tones indicate that each pick has been registered. For each tactile overlay sheet, these bars are in different positions along the line and the computer can determine which plate is mounted by comparing the positions of the bars with a list of all known combinations. At this point, the generic launcher application that has controlled all interactions in the set-up process relinquishes control of the system, as it invokes the specific application files that are needed to work with the current overlay. This process must be repeated every time that a new sheet is placed; however, by choosing the "Expert Mode", an experienced user can carry out the set up in about eight seconds, with minimal audio prompting.
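The overlay-identification step amounts to a lookup against the list of known bar combinations, with some tolerance for imprecise touches. A minimal sketch, with entirely hypothetical overlay names and bar positions:

```python
# Sketch of overlay identification: compare the measured positions of
# the three ID bars along the top strip against a table of known
# overlays, allowing a small tolerance for touch imprecision.
# Overlay names and positions are hypothetical placeholders.

KNOWN_OVERLAYS = {
    "atlas_map_1":  (12, 55, 130),
    "test_sheet_a": (25, 80, 140),
    "test_sheet_b": (40, 95, 170),
}

def identify_overlay(measured, tolerance=5, known=KNOWN_OVERLAYS):
    """Return the overlay whose three bar positions match, or None."""
    for name, positions in known.items():
        if all(abs(m - p) <= tolerance
               for m, p in zip(measured, positions)):
            return name
    return None

print(identify_overlay((26, 79, 142)))  # -> "test_sheet_a"
```

A match tells the launcher which application files to invoke for the sheet now mounted; no match means the sheet is unknown and the user can be prompted to try again.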
The Test Taker
Under funding from the US Department of Education, Touch Graphics and its collaborators, including the University of Arizona College of Education, Department of Special Education, Rehabilitation and School Psychology, and the Center for the Study of Testing, Evaluation and Education Policy at Boston College, have created a new system for standardized tests that relies on the Talking Tactile Tablet as a delivery platform. Students take a test in the following way, as illustrated in the program logic diagram (see figure 4):
1. The user is prompted to press anywhere on the TTT's surface for three seconds to "wake up" the system.
2. The Narrator offers to provide introductory information, and explains that the experienced user can skip this by pressing anywhere to jump to the "expert mode".
3. The user is prompted to press two "set up dots", and then to press the three ID Bars.
4. A tutorial is provided on how to take tests with the TTT; the student is introduced to each element of the Tactile Graphic User Interface, and is asked to indicate that he or she is ready to go forward by pressing each tool or icon as they are described.
5. The Narrator explains that there are three different tests available: Math, Science and History, and explains how the student must press the up and down arrows to scroll through a listing of these, and then press the circle button to select one of them to work on.
6. Next, the user is instructed to select a question to work on by pressing the up and down arrows to move through a list. For each item, the Narrator announces whether that item has already been answered or not. If the item has been answered, the Narrator reminds the student of his or her previous answer. The student presses the circle button to select an item to work on from the list.
7. If the item that the student has selected includes a graphic that does not appear on the currently mounted sheet, he or she is instructed to remove the current sheet and replace it with the required one. Once that is done, the Narrator walks the student through the process of setting up the new sheet.
8. The Narrator reads the question aloud. The user can press the left arrow at any time to hear the question again, or he or she can choose to enter a response. There are four different types of items, and each one calls for a different way of responding:
a. Letter-answer multiple choice. For these items, the student presses the right arrow to move through the answer choices, then presses the circle button to select one.
b. Graphical multiple choice. For some items, an answer choice can be made directly, by pressing a feature of the graphic for three seconds. This is useful for map items, for example, where the student is asked to identify which of a group of dots marks the spot on a map where a particular historical event took place.
c. Single-part open-ended. Some test items call for the student to enter a response directly; for these, the student types his or her answer on the computer's keyboard. A rudimentary screen reader is built into the program that reads back letters and words as they are typed, and permits the student to review what he or she has written prior to pressing the circle button to register the answer and move on to the next item.
d. Multi-part open-ended. A few open-ended items require that the student enter individual answers for up to four interrelated questions. Once the student indicates that he or she wants to work on one of these items, he or she must press the right arrow to move through the parts and then choose one to work on. Otherwise, these items are handled the same way as the single-part open-ended style described above.
9. When an answer choice is registered, the system instructs the student to press the circle button if he or she would like to work on the next question in numerical sequence, or to use the up and down arrows to move to out-of-sequence items.
10. When the final item is reached, the Narrator reads a listing of all the items that have not yet been answered, and the student is free to go back and work on those, or to review already-answered items and consider changing the previous answers.
11. If the student is satisfied with his or her work on that test, he or she can choose to move to a different test. To quit working altogether, the student presses on the plus sign for a count of three to quit the application.
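The item-selection and review behavior in the steps above (announcing whether an item has been answered, replaying a previous answer, and listing unanswered items at the end) can be sketched as a small session object. This is an illustrative model only; the item text, the Narrator's phrasing, and the class design are all hypothetical.

```python
# Simplified sketch of the test-taking session described in steps 1-11:
# each item records its type and the student's answer, the Narrator
# announces an item's status as the student scrolls to it (step 6), and
# the session lists unanswered items at the end (step 10).

class TestSession:
    def __init__(self, items):
        # items: list of (question_text, item_type) pairs
        self.items = [{"q": q, "type": t, "answer": None}
                      for q, t in items]

    def announce(self, index):
        """What the Narrator says when the student scrolls to an item."""
        it = self.items[index]
        if it["answer"] is None:
            return f"Item {index + 1}: not yet answered."
        return f"Item {index + 1}: you answered {it['answer']!r}."

    def record(self, index, response):
        """Register an answer when the circle button is pressed."""
        self.items[index]["answer"] = response

    def unanswered(self):
        """Indices of the items read aloud when the final item is reached."""
        return [i for i, it in enumerate(self.items)
                if it["answer"] is None]

# Hypothetical two-item test.
session = TestSession([
    ("Sample question one", "letter_multiple_choice"),
    ("Sample question two", "open_ended"),
])
session.record(0, "C")
print(session.announce(0))   # -> "Item 1: you answered 'C'."
print(session.unanswered())  # -> [1]
```

A full implementation would add the four response styles from step 8 and the sheet-swapping prompt from step 7, but the bookkeeping above is the core that makes out-of-sequence navigation and end-of-test review possible.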
In November and December of 2003, we ran usability tests at schools in Tucson, Phoenix, Boston and Philadelphia. Preliminary results indicate that the system is usable by a wide range of blind and visually impaired students. The purpose of this first round of tests was only to study the comfort level of students with the system, and to observe their behavior to uncover possible improvements. In the next round of testing, to be held at the same schools in September of 2004, we will record quantitative information about student performance (numbers of right and wrong answers, time spent on tests and on individual items, and other criteria) and compare that with the results of similar tests given to the same students with their choice of traditional accommodations, including Braille copy, audio tape, and a human reader.
This ongoing research suggests that the Talking Tactile Tablet, a new audio/tactile computer peripheral device, may provide a more effective means of testing academic ability and knowledge than existing methods. This development is of special interest now, as standardized assessments carry greater weight in determining educational outcomes than in the past. The researchers aspire to use the results of this project to make the case for establishing the TTT or a similar accommodation as an option in statewide tests and other high-stakes assessments for students who are blind or visually impaired.
Bennett, R.E. (1999). Computer-based testing of examinees with disabilities: on the road to generalized accommodation. In S. Messick (ed.) Assessment in higher education: issues of access, quality, student development, and public policy. Mahwah, NJ: Lawrence Erlbaum Associates.
Hansen, Lee & Forer (2002). A self-voicing test for individuals with visual impairments. Journal of Visual Impairment and Blindness, April 2002. New York: AFB Press.
Landau, S., Russell, M., Erin, J. & Gourgey, K. (2003). Use of the Talking Tactile Tablet in mathematics tests. Journal of Visual Impairment and Blindness, February 2003. New York: AFB Press.
Landau, S. & Gourgey, K. (2001). Development of a Talking Tactile Tablet. Information Technology and Disabilities, April 2001.
Parkes, D. (1994). Audio-tactile systems for designing and learning complex environments as a vision impaired person: static and dynamic spatial information access. In J. Steele and J.G. Hedberg (eds.) Cognitive Mapping: Past, Present and Future. London: Routledge.