EXTENDING AN EXISTING TOOL TO CREATE NON VISUAL INTERFACES
45 rue des Entrepreneurs Charguia II
Tunis Tunis 2035
Day Phone: +21698200097
This paper presents how an existing programming environment can be modified to make it efficient for developing interfaces for blind and partially sighted people.
Non-visual interfaces (interfaces for visually impaired persons, or NVIs) represent a specific type of software development. Creating software for people with visual deficiencies can be achieved in a small number of ways: adapting existing software, using the accessibility features of existing environments, or developing specific interfaces using specific peripherals and interactions. This last option is necessary when existing applications do not meet the specific needs of the persons concerned.
A software developer faced with the task of creating an application for visually impaired persons has to deal with complex peripherals and with different interaction principles such as multimodality. Yet there is no existing tool to make NVI development easier. With the emergence of interfaces for all, I think the most urgent task is to provide the developer community with tools that help them create NVIs.
There are two interesting approaches that inspired my work:
* the extended User Interface Managing System proposed by Stephanidis, which consists in adding a specific toolkit to an existing toolkit (figure UIMS_Ext.JPG);
* the use of generic tools to create NVIs, recommended by Burger.
In addition to its classic functionality, an Integrated Development Environment (IDE) may be optimized by adding the components missing for the development of NVIs.
SPECIFICATION OF THE NON VISUAL IDE
After developing NVIs, mainly in the field of education, it became clear that a standard platform must be completed with the following components:
* a system to deal with specific input/output peripherals;
* a tool to define events and time constraints for events;
* a library of adapted controls.
The IDE chosen is Microsoft Visual C++, which is made up of four components:
* the application framework called AppWizard;
* the MFC class toolkit;
* the resource editor or AppStudio;
* the dialog controller or ClassWizard which associates callbacks to controls.
These components work with an event manager that deals with standard peripherals only. This IDE must be extended as shown in figure MSVC_Ext.JPG. In contrast to Stephanidis's recommendation, my work is not limited to the function toolkit but extends to all the components of the original IDE.
The Extended Toolkit
The original toolkit is completed with functions that permit communicating with non-standard peripherals and implementing new interactions.
I designed abstract data types, which in object-oriented modeling and programming are represented and implemented using the class concept. To manage input/output peripherals, I created a library of non-visual classes that take part in elaborating the non-visual interactions: abstract classes representing each of the possible peripherals, and classes defining communication protocols.
BrailleTerminalType is an abstract data type on which I based the implementation of CBrailleTerminal, an abstract class representing all types of Braille terminals. Specific models of Braille terminals inherit general characteristics from the class CBrailleTerminal and implement their own specific characteristics. Each of the peripheral classes developed implements a basic local event manager.
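The class hierarchy described above can be sketched as follows. CBrailleTerminal is named in the paper; the member functions, the local event queue, and the derived model class are illustrative assumptions, not the paper's actual code:

```cpp
#include <cassert>
#include <queue>
#include <string>

// Abstract class representing all types of Braille terminals.
// (Member names and the derived model below are assumptions.)
class CBrailleTerminal {
public:
    virtual ~CBrailleTerminal() {}
    // General characteristics every Braille terminal must provide.
    virtual int CellCount() const = 0;                 // number of Braille cells
    virtual void Display(const std::string& text) = 0; // write one Braille line
    // Basic local event manager: derived classes push device events here.
    void PushEvent(int code) { m_events.push(code); }
    bool PopEvent(int& code) {
        if (m_events.empty()) return false;
        code = m_events.front(); m_events.pop(); return true;
    }
private:
    std::queue<int> m_events;                          // local event queue
};

// A specific model inherits the general characteristics and
// implements its own: here, a hypothetical 40-cell device.
class CFortyCellTerminal : public CBrailleTerminal {
public:
    int CellCount() const { return 40; }
    void Display(const std::string& text) { m_shown = text.substr(0, 40); }
    std::string Shown() const { return m_shown; }      // for inspection only
private:
    std::string m_shown;
};
```

A concrete driver for a real terminal would replace `Display` with calls to the device's communication protocol class.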
Controls like buttons, list boxes, and menus are used in the design of a GUI. It would be interesting to find matching non-visual objects. I have proposed adaptations for graphical interactive objects. An example of a Braille edit box is given in figure Textbox_adap.JPG. A line of Braille text is displayed in the edit box, and the Braille terminal function keys are programmed to navigate the whole text, while interactive Braille keys allow the user to point at a single character or to make a word selection.
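The windowing behavior of the adapted edit box can be sketched as below: the control holds the full text but shows one Braille line at a time, and the terminal's function keys move that window. The class and method names are assumptions for illustration:

```cpp
#include <cassert>
#include <string>

// Sketch of a Braille edit box (names are hypothetical): one line of
// Braille text is displayed, function keys navigate the whole text,
// and interactive keys point at a single character.
class CBrailleEditBox {
public:
    explicit CBrailleEditBox(int cells) : m_cells(cells), m_top(0) {}
    void SetText(const std::string& text) { m_text = text; m_top = 0; }
    // Function keys on the terminal navigate the whole text.
    void NextLine() { if (m_top + m_cells < (int)m_text.size()) m_top += m_cells; }
    void PrevLine() { if (m_top >= m_cells) m_top -= m_cells; }
    // The single line currently shown on the Braille display.
    std::string DisplayedLine() const { return m_text.substr(m_top, m_cells); }
    // Interactive (cursor-routing) keys point at one character of the line.
    char CharAt(int cell) const { return m_text[m_top + cell]; }
private:
    std::string m_text;
    int m_cells;   // number of Braille cells on the terminal
    int m_top;     // offset of the first displayed character
};
```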
To implement these controls, I made two propositions:
* creating classes implementing the functionalities of adapted controls that can be modified by the developer;
* creating reusable controls.
At the end of this step, we have a personalized and specific version of the MFC and the component library. The resource editor AppStudio integrates the reusable controls.
The Application Framework
I propose to implement an application framework called NonVisualAppWizard for NVIs.
NonVisualAppWizard permits starting an application by offering options to the developer. The options, presented in a sequence of dialog boxes, concern choices relative to the NVI. Once the choices are made, NonVisualAppWizard generates some of the application source code.
The design of NonVisualAppWizard goes through three major steps:
* designing the general options of the application,
* designing the general event manager,
* implementing the template files.
Different dialog boxes present the general options. The exact number of dialog boxes depends on the choices made in the first step, which concerns the I/O peripherals of the application. There is one dialog box per device, permitting the developer to define its specific features (model, configuration, protocol and events).
Figure NVApp1.JPG shows the dialog box proposed after the developer has validated the choices illustrated in figure NVApp0.JPG.
Dialog boxes are designed to describe the different combinations of events allowed depending on the peripherals chosen in the initial steps:
* simple events coming from input peripherals;
* events on output peripherals;
* multimodal input events; some realistic multimodal events are proposed;
* time constraints, entered by the developer, on multimodal events and on some simple events.
The figure NVApp2.JPG shows the definition of events that may come from the Braille terminal.
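The event definitions collected by these dialog boxes could be represented by a simple data structure such as the one below. All type names, fields, and the matching rule are illustrative assumptions, not the paper's actual implementation:

```cpp
#include <cassert>

// A simple event comes from one input peripheral.
struct SimpleEvent { int device; int code; };

// A multimodal event (hypothetical representation) combines two simple
// events from different devices and carries a time constraint entered
// by the developer in the wizard's dialog boxes.
struct MultimodalEvent {
    SimpleEvent first;
    SimpleEvent second;
    int maxIntervalMs;   // time constraint between the two component events
};

// Both component events must match and occur within the time constraint.
bool Matches(const MultimodalEvent& def,
             const SimpleEvent& a, long ta,
             const SimpleEvent& b, long tb) {
    bool sameEvents = a.device == def.first.device && a.code == def.first.code
                   && b.device == def.second.device && b.code == def.second.code;
    long dt = tb - ta;
    if (dt < 0) dt = -dt;
    return sameEvents && dt <= def.maxIntervalMs;
}
```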
The Template Files
The template files are used by the application framework as models to generate source files. There is a template file for each of the application's header and source files, together with a main template file that contains the description of the application framework and controls the generation of the final source files.
Template files consist of conditional instructions, source code, and keywords that are stored in a dictionary; keywords can be instantiated with Boolean or numeric values. These values are taken from the NonVisualAppWizard sequential dialog boxes: each dialog box permits updating one or more keywords in the dictionary, which is then used to generate the source files. For example, by choosing the Braille terminal button in NVApp0.JPG, the keyword BRAILLE_TERMINAL is set to TRUE. Every conditional instruction containing this keyword in the template files is then instantiated to TRUE, and as a result, code is generated that takes the presence of the Braille terminal into account.
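A template fragment in this style might look as follows. This is a hedged sketch: the `$$IF`/`$$ENDIF` directive syntax follows MSVC's custom AppWizard template language, but the file names and identifiers inside the block are assumptions:

```
// Fragment of a hypothetical template source file. When the developer
// checks the Braille terminal option, BRAILLE_TERMINAL is TRUE in the
// dictionary and the enclosed code is emitted into the generated file.
$$IF(BRAILLE_TERMINAL)
#include "BrailleTerminal.h"

// Created by the generated application's initialization code.
CBrailleTerminal* g_pBrailleTerminal = NULL;
$$ENDIF
```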
The Template Event Manager
The template event manager is a simple algorithm that polls all peripherals. Each peripheral class contains a local event manager, which waits for events and puts them in a local queue. The principal event manager reads all the events in the local queues, interprets them to distinguish between simple and multimodal events, and puts them in a general queue.
A rule base is used by the algorithm to help classify and interpret the events. The rule base is generated after the definition of the simple and multimodal events and the temporal constraints. The result of these implementations is a wizard that is completely integrated into MSVC++, as shown in figure NVApp3.JPG. The developer of an NVI has to complete the generated code, especially when dealing with peripherals we did not have the opportunity to consider, but he can always rely on the abstract classes offered, because they give a complete model for peripheral management.
The Dialog Controller
The dialog controller has a tight relationship with the application framework and the toolkit components. It works normally with the controls developed as reusable controls and integrates the classes defined for peripherals or adapted objects.
This experimentation could lead to a project allowing the use of an existing interface manager to implement specific user interfaces. By developing libraries that drive specific peripherals or interactions, and by designing and creating adapted interactive objects, the method is applicable to other kinds of disability and other types of interfaces. My work has gone through four important phases: studying the field of accessibility for visually impaired people and defining needs, specifying and designing abstract data types, implementing libraries and objects, and finally producing solid documentation targeted at developers.
Stephanidis, C, User interfaces for all: new perspectives into Human-Computer Interaction. In C. Stephanidis (Ed.), User Interfaces for All: Concepts, Methods, and Tools, 2001.
Stephanidis, C, Savidis, A, Chomatas, G, Spyridou, N, Sfyrakis, M, Weber, G, Concerted action on technology and blindness. Medical and Health Research Programme of the European Community (CEE), Brussels, 1991.
Burger, D, Improved access to computers for the visually handicapped: new prospects and principles. IEEE Transactions on Rehabilitation Engineering, vol. 2, no. 3, September 1994.
Bouraoui, A, Burger, D, Tactison: a multimedia learning tool for blind children. In Computers for Handicapped Persons, ICCHP 94,
Bouraoui, A, Étude et réalisation d'un éditeur d'interfaces non visuelles multimédias [Study and realization of an editor for multimedia non-visual interfaces], Thesis,
IHM 92, « Quatrièmes journées sur l'ingénierie des interfaces Homme-Machine » [Fourth conference on Human-Machine interface engineering], workshop proceedings, Paris, September 1992.