2003 Conference Proceedings



Ruth Loebl
RNIB Technology in Learning and Employment
Email: Ruth.Loebl@RNIB.org.uk

Richard Orme
Royal National Institute for the Blind (RNIB)
Birmingham, United Kingdom

At the CSUN conference in 2001, we outlined our approach to practical testing of software and systems for accessibility. This included ideas about testing for compliance with guidelines and standards such as those contained in Section 508, which at that time was not yet in force. In the intervening two years we have taken our ideas further, and we have now developed a software tool which should enable a user to carry out compliance testing according to any relevant standards or guidelines they may choose.

Because the tool is intended to be used by people who have little knowledge or experience of accessibility, disability or assistive technology, our intention was not only to create a practical testing tool, but also to make the process of testing a learning experience for the tester. While using the tool and applying standards, the user should begin to understand why the standards matter and what effect inaccessibility has on various users, with or without disabilities. As a result of this increased awareness, testers might avoid building accessibility barriers into their future design and development work, point out potential problems to others, and incorporate standards into the development of software, websites and other electronic and information technologies.

Having developed a testing concept, we needed to produce some code. In 2001 we had a front-end just to demonstrate the idea, but when the time came to create a prototype, we had to get to grips with the thorny issue of the coding environment. We chose Java!

You may be aware that at present, people using assistive technology have very limited access to Java programs. If the right version of the developer tools is used to create the code, users then need the Java Access Bridge to expose components to assistive technology, and the assistive technology itself must have been explicitly adapted to make use of the Access Bridge. Only a couple of assistive technology developers are currently addressing access to Java. So the choice of Java for our accessibility tool may seem odd.
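For readers unfamiliar with the mechanism: what the Access Bridge relays is the information that components publish through the standard javax.accessibility API. A minimal sketch — illustrative only, not code from our tool — of a Swing component exposing its AccessibleContext:

```java
import javax.accessibility.AccessibleContext;
import javax.swing.JButton;

public class AccessBridgeSketch {
    public static void main(String[] args) {
        // Every Swing component carries an AccessibleContext; on Windows
        // the Java Access Bridge forwards its contents to assistive
        // technology that has been adapted to listen for it.
        JButton save = new JButton("Save");
        AccessibleContext ctx = save.getAccessibleContext();
        ctx.setAccessibleName("Save report");
        ctx.setAccessibleDescription("Writes the current test report to disk");

        System.out.println(ctx.getAccessibleName()); // Save report
        System.out.println(ctx.getAccessibleRole());
    }
}
```

Without such deliberate population of names and descriptions, a screen reader adapted for the Access Bridge still has little of use to announce — which is exactly the kind of barrier our tool asks testers to look for.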

Thinking about the target audience, however, it seems a little less illogical. We want to educate and inform people who aren't already experts on inclusive design and accessibility, and we don't want to limit them to the Win32 platform. Standards and guidelines don't only apply to software, and our tool could be used in any arena where checking against discrete standards or guidelines is appropriate. Most contentious is our assertion that to test for compliance with accessibility standards, the tester has to be able to identify and explore inaccessible aspects of the software, website or device. Insisting that a tool to explore inaccessibility is accessible seemed paradoxical.

One of our aims has been to separate the process of checking from the reference data and the recorded outcomes. We also wanted to give users the opportunity to tailor the reference standards to suit their own purposes. We therefore provide some pre-existing standards as a starting point, but allow users to amend, delete or add standards, so that accessibility and inclusive design testing might even be incorporated into their standard testing schedule.
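To make that separation concrete, here is a hedged sketch — the identifiers and requirement texts are illustrative, not our tool's actual data — of reference standards held as plain, user-editable data, distinct from any checking logic:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class StandardsCatalogue {
    public static void main(String[] args) {
        // Pre-existing standards supplied as a starting point
        // (identifier -> requirement text).
        Map<String, String> standards = new LinkedHashMap<>();
        standards.put("1194.21(a)",
                "All functions executable from the keyboard");
        standards.put("1194.21(i)",
                "Colour is not the only means of conveying information");

        // Users tailor the set to their own purposes:
        // amend, delete or add entries.
        standards.put("HOUSE-01", "Focus indicator visible at all times");
        standards.remove("1194.21(i)");

        standards.forEach((id, text) ->
                System.out.println(id + ": " + text));
    }
}
```

Because the standards are just data, swapping Section 508 provisions for in-house guidelines, or for criteria from an entirely different domain, requires no change to the checking process itself.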

The tool differs from some accessibility checkers in that there is no automated checking process. The user is given the criteria for the testing and has to carry out the process manually. This is partly to avoid the common criticism of an automated tool like Bobby: it may be able to detect that a standard is not met, but cannot distinguish, say, between an appropriately described image and an image labelled with nonsense. The manual process is also necessary because at present there are no automated testing routines universally applicable to the systems the tool is intended to check.

The output from the tool is simply a report of the testing carried out, presented in a repeatable, standardised format. The content is made up of the inclusive design standards chosen by the user, and the notes that the user has made throughout the checking process.
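A sketch of that output, with hypothetical criterion identifiers and notes: the report is simply the chosen standards interleaved with the tester's observations, emitted in the same format on every run:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ReportSketch {
    public static void main(String[] args) {
        // Notes the tester recorded against each chosen standard
        // during the manual checking process (illustrative values).
        Map<String, String> notes = new LinkedHashMap<>();
        notes.put("1194.21(a)", "Pass - all menus reachable from the keyboard");
        notes.put("1194.21(c)", "Fail - current focus not visible in toolbar");

        // Repeatable, standardised format: one line per criterion.
        StringBuilder report = new StringBuilder("Accessibility test report\n");
        notes.forEach((id, note) ->
                report.append(id).append(": ").append(note).append('\n'));
        System.out.print(report);
    }
}
```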

We intend to find a way to distribute the tool free of charge in a downloadable format, as we feel it is more important to see the principles applied in the real world than to prove a commercial concept. To this end, we are looking for sponsors and partners, as well as anyone prepared to help us test the concept and provide feedback to improve the end product.


Reprinted with author(s) permission. Author(s) retain copyright.