2003 Conference Proceedings


AMIS: The Open Source Adaptive Multimedia Information System

Hiroshi Kawamura
Markku Hakkinen
Marisa DeMeglio
Information Center
Japanese Society for Rehabilitation of Persons with Disabilities


This paper provides an overview of the concepts and design of the Adaptive Multimedia Information System (AMIS) and the goals of the AMIS Project. The project has developed an open source DAISY playback system that allows easy adaptation and extension to support multiple disabilities.


AMIS [1, 2] is a software application for the playback of natively accessible content in the DAISY Format [3]. DAISY playback includes support for SMIL elements and the Navigation Control Center. AMIS enables accessible presentation through pre-recorded audio, synthesized speech, large print, and Braille renderings.

AMIS employs a flexible XML-based architecture that allows for adaptation of the standard interface to meet the needs of both users and assistive technologies. The mutable application interface allows it to adapt to user preference, content delivery mode, and assistive device capabilities. Users may customize font size, color contrast, spacing, volume, playback speed, and presence/absence of interface regions. Core interface features are derived from both the DAISY playback model and the W3C User Agent Accessibility Guidelines [4]. AMIS XML documents describe application components (controls, content layout regions, dialogs, content renderings) aurally, textually, and visually.

The default application interface allows for visual and aural output, and touchscreen, mouse, and keyboard input. Through the AMIS Plug-in SDK, developers can write interfaces to a variety of assistive devices, which gain access to the same application functionality and content as the native interface. Localization is also straightforward, because every aspect of the interface is customizable: labels on buttons and regions are imported directly from the system's Interface Markup documents. The plug-in architecture allows for the addition of new input methods, accommodating techniques such as IMEs (Input Method Editors, generally used for East Asian languages) and on-screen keyboards. The adaptable interface framework and the content rendering capabilities, coupled with the use of open standards, enable the customization or addition of features to meet a broad range of user requirements.
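The Interface Markup documents lend themselves to a simple illustration. The sketch below is hypothetical — the element and attribute names are invented and do not reflect the actual AMIS schema — but it shows the general idea: button labels are drawn from a markup document, so localizing the interface is a matter of supplying a different document.

```python
# Hypothetical sketch of an AMIS-style Interface Markup document.
# Element and attribute names are invented for illustration only;
# they are not the actual AMIS schema.
import xml.etree.ElementTree as ET

MARKUP = """
<interface lang="en">
  <control id="play">
    <text>Play</text>
    <audio src="play_en.mp3"/>
    <image src="play.png"/>
  </control>
  <control id="pause">
    <text>Pause</text>
    <audio src="pause_en.mp3"/>
  </control>
</interface>
"""

def load_labels(markup):
    """Map each control id to its textual label from a markup document."""
    root = ET.fromstring(markup)
    return {c.get("id"): c.findtext("text") for c in root.findall("control")}

labels = load_labels(MARKUP)
print(labels["play"])  # -> Play
```

A localized interface would simply ship a markup document whose `<text>` and `<audio>` renderings are in the target language, leaving the application code untouched.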

The goals of the AMIS Project focus not only on the availability of a DAISY playback system for the international community, but also on AMIS as a vehicle for technology and knowledge transfer in developing countries. Because the source code is freely available, programmers and technicians can learn about the open standards behind the AMIS software (such as DAISY, XML, and SMIL) and its underlying architecture, and can use that knowledge to adapt the software to local requirements or create enhancements and extensions to the core software. As the AMIS Project continues, training courses for software developers are being planned in a number of developing countries.

Using AMIS

AMIS has been designed to offer easy adaptation to a variety of input and output devices. Using the plug-in architecture, new user interfaces can be easily created to provide custom controls or display styles. Examples of AMIS interfaces developed to date include:

Touch Screen Interface

AMIS has built-in support for touch screen interfaces, and using skins, it is possible to create easy-to-use interfaces for DAISY book reading. Large, easy-to-locate on-screen buttons allow children and adults to control reading activity. An "explain" mode is available, so that users can explore the interface functions without actually activating the controls. The touch screen interface is shown in Picture 1.

Image of girl using the AMIS Touch Screen Interface. She is reaching toward the screen with a big smile on her face, while a JSRPD staff member observes.

Picture 1. AMIS Touch Screen Interface in use.

Large Font Display

Textual information presented during DAISY playback is normally highlighted in the context of the overall publication in the main AMIS display, synchronized with the audio presentation of the corresponding text. Display in context, normally an HTML-style formatted page, may not be appropriate for a user with low vision. The large font plug-in was created to present the current text in a large type font, using high contrast colors. This text is displayed in a floating window that can be resized and moved.

In Picture 2, an AMIS configuration using two displays is shown. A notebook PC is used as the primary interface to AMIS, with a secondary monitor attached and configured to use the "extended desktop" functionality found in many modern PCs. The extended desktop allows the large font window to be dragged onto the secondary monitor and maximized.

Flat Panel Display, showing large text in yellow on black, next to a Notebook PC running AMIS on a table top.

Picture 2. Large Font Display

Game Pad

A traditional keyboard may present challenges to some users. To explore user interface models for children with learning disabilities, a simple control model was developed using an off-the-shelf game controller. An AMIS plug-in was created that maps the joystick and buttons on the controller to allow "reading" of the DAISY book. The controller is shown in Picture 3, and Picture 4 shows it in use.

Display screen showing AMIS, with a game controller below, which has large buttons labelled so that they look similar to those on screen.

Picture 3. AMIS Game Controller Interface

Teenage boy using a game controller to operate AMIS, with an adult observing.

Picture 4. AMIS Game Controller in use.
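At its core, a plug-in of this kind is a mapping from device events to player commands. The sketch below is illustrative only — the button and command names are invented, not taken from the actual AMIS Plug-in SDK:

```python
# Minimal sketch of a game-controller-to-command mapping.
# Button and command names are invented for illustration; this is not
# the actual AMIS Plug-in SDK interface.
BUTTON_MAP = {
    "button_1": "play_pause",
    "button_2": "next_section",
    "button_3": "previous_section",
    "stick_up": "volume_up",
    "stick_down": "volume_down",
}

def handle_button(button, dispatch):
    """Translate a controller event into a player command, ignoring
    any buttons that have no mapping."""
    command = BUTTON_MAP.get(button)
    if command is not None:
        dispatch(command)

# Usage: collect the commands that would be sent to the player.
issued = []
handle_button("button_1", issued.append)
handle_button("unmapped", issued.append)
print(issued)  # -> ['play_pause']
```

Keeping the mapping in a data table rather than in code is what makes it practical to re-label the physical buttons to match the on-screen controls, as in Picture 3.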

Scanning Input

In cases where a simple switch mechanism is required, scanning menus can be used to give users access to program controls. A plug-in was created that allows a user of a sip/puff tube or other switch mechanism to control AMIS and thus read a DAISY publication. An on-screen menu presents the available commands, and a moving highlight indicates which menu choice is available for selection. The menu choices and scan rates are easily defined within an XML file. The scanning display is shown in Picture 5.

AMIS Display with scanning menu shown in lower third of screen.

Picture 5. AMIS with scanning menu.
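The XML-defined menu might be sketched as follows. The file format shown here is hypothetical — invented for illustration, not the actual AMIS configuration schema — but it captures the two pieces the scanner needs: an ordered list of choices and a scan rate for the moving highlight.

```python
# Hypothetical scanning-menu configuration; the element names are
# invented for illustration and are not the actual AMIS file format.
import xml.etree.ElementTree as ET

SCAN_CONFIG = """
<scanmenu rate_ms="800">
  <item command="play_pause">Play / Pause</item>
  <item command="next_section">Next section</item>
  <item command="previous_section">Previous section</item>
</scanmenu>
"""

def load_scan_menu(config):
    """Return the scan interval and the ordered (command, label) choices."""
    root = ET.fromstring(config)
    rate_ms = int(root.get("rate_ms"))
    items = [(i.get("command"), i.text) for i in root.findall("item")]
    return rate_ms, items

def highlighted_item(items, elapsed_ms, rate_ms):
    """Which menu choice the moving highlight is on after elapsed_ms."""
    return items[(elapsed_ms // rate_ms) % len(items)]

rate_ms, items = load_scan_menu(SCAN_CONFIG)
print(highlighted_item(items, 900, rate_ms))  # -> ('next_section', 'Next section')
```

When the user activates the switch, the currently highlighted command is the one dispatched; tuning the `rate_ms` value in the file adjusts the scan speed to the user's response time.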


Braille Display

A Braille plug-in was created for AMIS that allows DAISY publications to be read through a refreshable Braille display. The plug-in, initially designed for the Alva Satellite 544, maps buttons on the Braille display to AMIS commands and allows for easy control of both text-only and audio reading. The Braille display is automatically synchronized with the corresponding audio. While reading a DAISY publication, users may pause the audio to review a particular word in Braille, and then resume listening. This interface opens DAISY to deaf-blind users. Used together with the large font plug-in, it also offers an interesting way for users who are losing their vision to learn Braille, through the synchronized presentation of audio, large font, and Braille. A low vision user is shown using a Braille display with AMIS in Picture 6.

Low vision user using both Braille and Large Font display.

Picture 6. Low vision user using both Braille and Large Font display.

Speech Recognition

When physical controls are not an option, speech recognition can provide an effective control interface. A speech recognition plug-in was created to allow simple voice commands to be mapped to AMIS control functions. Using a standard microphone, the user may speak commands such as "play", "pause", or "set bookmark"; the command words are customized by editing an XML file, making it easy to accommodate different languages and speaker preferences.
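Because the recognizer's output is simply a matched word or phrase, the plug-in's job reduces to a lookup from phrases to commands. A rough sketch, using an invented vocabulary file format (not the actual AMIS schema):

```python
# Hypothetical speech-command vocabulary file; the format is invented
# for illustration and is not the actual AMIS configuration schema.
import xml.etree.ElementTree as ET

VOCABULARY = """
<commands>
  <command name="play"><phrase>play</phrase><phrase>start</phrase></command>
  <command name="pause"><phrase>pause</phrase><phrase>stop</phrase></command>
  <command name="set_bookmark"><phrase>set bookmark</phrase></command>
</commands>
"""

def load_vocabulary(xml_text):
    """Map each spoken phrase to its player command."""
    root = ET.fromstring(xml_text)
    mapping = {}
    for command in root.findall("command"):
        for phrase in command.findall("phrase"):
            mapping[phrase.text.lower()] = command.get("name")
    return mapping

vocab = load_vocabulary(VOCABULARY)
print(vocab["set bookmark"])  # -> set_bookmark
```

Translating the interface to another language, or adapting it to a speaker's preferred wording, then means editing only the phrase list, with no change to the plug-in itself.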


Flexiboard

The Flexiboard [5], developed in Sweden by Handitek, is another powerful tool for extending the AMIS interface. The Flexiboard uses a standard keyboard interface to communicate keystrokes. The keys are not ordinary keyboard keys; rather, they are "overlays" such as pictures or tangible objects. By using an overlay that looks like an object more familiar to the user than a computer (such as images related to reading a paper book), the native accessibility of AMIS and DAISY is coupled with a friendlier computer input method. By using tangible overlays of many different sizes and styles, we can address issues related to motor control, vision loss, and deaf-blindness. The combination of the Flexiboard and AMIS is an example of assistive technologies working together to extend the reach of DAISY.


Conclusion

AMIS provides a flexible, open source platform for creating DAISY reading systems. The plug-in and skins model allows the AMIS user interface to be easily adapted to a variety of user requirements. We look forward to contributions from users, disability researchers, and specialists around the world to create an open library of plug-ins and skins to help bring the potential of DAISY to everyone.


References

1. DeMeglio, M., Hakkinen, M., & Kawamura, H. Accessible Interface Design: Adaptive Multimedia Information System (AMIS). ICCHP 2002, Linz. Springer Lecture Notes in Computer Science.

2. The AMIS Project, http://www.amisproject.org 

3. The Daisy Consortium, http://www.daisy.org 

4. World Wide Web Consortium (W3C), User Agent Accessibility Guidelines, http://www.w3.org/TR/UAAG10/ 

5. Handitek: http://www.handitek.se/text/eng/flexi.htm 


Reprinted with author(s) permission. Author(s) retain copyright.