1994 VR Conference Proceedings



Input Interfacing to the CAVE by Persons with Disabilities

By: Drew R. Browning
Carolina Cruz-Neira
Daniel J. Sandin
Thomas A. DeFanti
John G. Edel
The Design Visualization Laboratory (DVL)
The Electronic Visualization Laboratory (EVL)
University of Illinois at Chicago

Contact:
Drew R. Browning, Assoc. Prof.
Director, Design Visualization Laboratory
UIC School of Art and Design m/c 036
929 W. Harrison
Chicago, IL 60607-7038
E-mail: u23283@uicvm.uic.edu

ABSTRACT

Common VR systems use head-mounted displays that require duplicating user input devices in virtual space for interactive environments. A major advantage of projection-based VR systems such as the CAVE is the user's ability to see his/her own body and real objects in a virtual environment. This paper describes the development of specialized interfaces for virtual environment exploration by people who use wheelchairs. These real, tangible interfaces are intuitive and well suited to wheelchair simulations. Examples include interfaces that match the user's wheelchair dynamics for joystick- or tiller-operated electric wheelchairs. This research is important for applications in which proposed constructed environments are analyzed by users of wheelchairs. Additional applications include wheelchair mobility training and wheelchair controller design.


THE VIRTUAL REALITY CAVE

Virtual reality (VR) is created by a three-dimensional computer graphics system with real-time interactive control and viewer-centered perspective display. VR usually provides a panoramic binocular display with a large angle of view. These features are found in the immersive technologies of head-mounted displays (HMDs), boom-mounted displays and surround-screen projection-based displays.

The Electronic Visualization Laboratory at the University of Illinois at Chicago has developed a new virtual reality interface called the CAVE (CAVE Automatic Virtual Environment). It surrounds the viewer with projected images of a virtual environment. Three rear-projection screens make up three walls of a ten-foot cube that all but disappear when illuminated with computer graphics. A fourth data projector illuminates the floor for complete immersion.

The projectors are high-resolution data projectors that display stereo images on alternating fields. Viewers wear active liquid crystal display (LCD) stereo shutter glasses that direct the alternating fields to the appropriate eyes, while viewer head position is tracked by an electromagnetic sensing device attached to one pair of the stereo glasses. A Silicon Graphics Onyx with multiple processors generates the computer graphics for each projector. The viewer can move around the virtual environment and see his/her own body while interacting with real and virtual objects.


THE REAL WORLD IN THE CAVE

Although projection-based VR and head-mounted displays share the essential features of a VR system, they differ in their ability to connect the user to the real world or to augment reality. At UIC we are developing interfaces to the CAVE that are real, tangible and intuitive [2].


Real Objects and Real Interfaces

With typical head-mounted or boom-mounted displays the viewer is isolated from the real world. To see one's own body in a virtual environment, the body parts must be recreated graphically [5]. A glove input device is often used to control a representation of the hand while interacting with the virtual world. Other real-world objects must also be modeled in computer graphics to be included in the virtual space. These objects might include interface devices such as vehicle controls.

Superimposition of virtual objects and real objects is possible with the use of half-silvered mirrors in head-mounted displays [5]. Because the viewer sees through the virtual objects, this technique is most useful for placing virtual objects in a real environment, as opposed to real objects in a virtual environment. A useful application of this feature would be virtual overlays on real control panels for instructional purposes [7].

The CAVE also allows real-world objects to be combined with a virtual environment, and these objects are unobstructed by the virtual imagery. Real objects can, however, occlude virtual objects when the real object is behind the virtual object's apparent location, as would be the case with virtual overlays. Alternatively, highlights can be employed by virtually outlining a real object, or adjacent graphics and text can provide directions. For example, the sequence of operations in using wheelchair controls could be highlighted with animated symbols and virtual directions. Occlusion also occurs if a virtual object is intended to appear near the center of the CAVE and a viewer or real object is between the virtual object and the screen. Careful planning can avoid these situations.


Real Navigation with Real Objects

For virtual environments that require less than 10 feet of movement to navigate, the user can move as normal in the CAVE. In simulating accessible interior spaces, real interfaces and real objects can be installed in the virtual environment. An example would be the interior of a mass-transit vehicle.

Wheelchair securement devices such as the Seattle Red-belt can be used with a folding transit-style seat to mock up a wheelchair tie down, while the rest of the interior is modeled in VR. The user can analyze reach requirements of the belts as well as clearances in maneuvering his/her wheelchair into the tie-down area. Using real objects, with their ability to have physical feedback and greater detail, a simulation can be made far more accurate.


Virtual Navigation

Larger spaces can be navigated virtually using a joystick interface. For example, a person could traverse the length of a train station platform while checking accessibility features. This joystick interface was built to simulate an electric wheelchair. It can be attached to an existing wheelchair, manual or electric, or to any chair. The physical nature of this joystick interface can make a simulation very compelling. Although a "treadmill" style interface would be most appropriate for manual wheelchairs [14], the joystick approach was convenient and more universal.


Shared or Guided Experiences

The CAVE can augment reality for multiple viewers as well. Although only one viewer is position-tracked, additional viewers need only wear the stereo glasses to share the view. This is very important for collaborative work such as analyzing design models for accessibility or evaluating a disabled client's reach limitations. Since only one person's motion can be tracked, priority must be given to the person who is evaluating distances. For evaluation of a transit model, the tracked glasses can be passed between the person in a wheelchair and the person checking clearances. For training or therapy situations the client can be guided through a session in a truly shared experience with effective communication, not second-hand interpretation. Guided sessions are also important given the potential occurrence of cybersickness and the need for intervention.


Cybersickness and Real World References

Real-world references assure viewers of their balance and equilibrium. Losing a sense of location and orientation in a virtual world can lead to fear in the viewer and potential nausea (cybersickness) [4,6,9,11]. In a projection-based system like the CAVE, disorientation and nausea are less of an issue because the user's view is not isolated. The viewer can be immersed in a virtual environment yet remain conscious of the real-world surroundings and his/her own body.

Cybersickness can be very important in applications involving people with disabilities, particularly disabilities that affect balance and equilibrium. If VR is used for mobility training, the user must be able to cope with the potential isolation of a virtual environment. If users cannot maintain their equilibrium in VR, the training will not only be useless, it could also be harmful [1,12].


INPUT DEVICES FOR VIRTUAL REALITY

Access to virtual reality interfaces by persons with disabilities is similar to the problems of access to computers in general. For people with physical disabilities the input devices of greatest concern have been the keyboard and mouse. A variety of keyboard and mouse aids and substitutes are now available. These devices are typically used for text and numeric data entry and cursor steering. In virtual reality these functions also exist with an emphasis on three dimensional cursor steering or navigation.

Another common adaptive input technique is head control. With this technique an ultrasonic device is strapped to the user's head for cursor steering. Since a position tracker would normally be mounted to the CAVE user's head, head monitoring is automatic. With 3D position-tracking, movement recognition techniques can be used for head movement or wand movement as an alternative input method. Movements can be small and subtle or large and obvious.
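As a rough illustration of movement recognition on tracker data, a short window of head-position samples along one axis can be classified by its peak-to-peak range. This is a minimal modern Python sketch; the axis choice, threshold value and "subtle"/"large" labels are illustrative assumptions, not part of the CAVE software:

```python
def classify_movement(samples, threshold=2.0):
    """Classify a window of head-tracker readings along one axis
    (in inches) as a subtle or large movement by peak-to-peak
    range. The threshold is an illustrative value only."""
    span = max(samples) - min(samples)
    return "large" if span >= threshold else "subtle"
```

A real recognizer would examine direction and timing as well, but even this coarse distinction lets small, subtle motions and large, obvious ones trigger different commands.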


The Digital Wand

Until recently we have used a prototype wand input device that contains a position sensing device and three digital button switches. The position sensing device is the same as the device attached to the stereo glasses for head position monitoring. The three buttons were simply wired in parallel to the buttons of a mouse attached to one of our workstations. In many ways it can function as a "3D mouse" for cursor steering in three dimensions. Navigation is accomplished by pointing in the desired direction of travel while pushing the appropriate button to move. Object manipulation is accomplished by pointing at the object and intersecting it with a virtual wand extension.
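The point-and-push navigation just described can be sketched as a single per-frame update. This is a simplified modern Python sketch, not the CAVE library's API; the function name, angle convention (forward along -z) and speed constant are illustrative assumptions:

```python
import math

def wand_step(position, yaw_deg, pitch_deg, button_down, speed=0.05):
    """Advance the viewpoint along the wand's pointing direction
    while a travel button is held. Angles are in degrees; speed
    is in feet per frame (illustrative values)."""
    if not button_down:
        return position
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    # Unit pointing vector derived from the wand's orientation.
    dx = math.cos(pitch) * math.sin(yaw)
    dy = math.sin(pitch)
    dz = -math.cos(pitch) * math.cos(yaw)
    x, y, z = position
    return (x + speed * dx, y + speed * dy, z + speed * dz)
```

Object manipulation works analogously: the same pointing vector, extended from the wand's position, is intersected with virtual objects.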

Our first wand was quickly constructed of a plastic flashlight case with the switches and position sensor mounted inside. Testing and repairs were difficult because the switches had to be removed each time the sensor was changed. A second wand was designed to allow better access to the parts and to improve the ergonomics. Physically, this wand was constructed of an acrylic armature with a grip that was bent off axis for a more natural handle. The switches were mounted to the acrylic in a row for thumb operation. Because the position sensor works with magnetic fields, it had to be kept away from the switches to avoid interference. Consequently, it was mounted approximately three inches away, at the front end of the wand. This produced an extension that emphasized the wand's pointing function. The potential for interference also prevented us from using metal in the wand structure. Finally, the entire wand was enclosed in a thermoplastic "split case" shell that conformed to the structure.


The Analog Wand

Connecting to the existing mouse interface of a workstation, although expedient, did not allow the more accurate navigation and precise control afforded by analog devices. To remedy this situation we decided to use an IBM PC clone as our input-device interface. By doing this we could take advantage of the wide variety of devices and software available on the market. For example, most PC clones now include a mouse input and game port. Many types of joysticks for flight simulators and other game software exist at low cost. In addition, a wide variety of digital and analog I/O boards are available at reasonable prices.

Using the PC interface also meant that the many adaptive input devices available for persons with disabilities could be used for virtual environments. For example, chin-operated trackballs or foot-operated joysticks can replace typical mouse operations. For text or numeric input, specialized keyboards are available that replace or augment a standard keyboard (expanded keyboards, mini keyboards, key latches, key guards). Even simple switches, placed wherever the person can most easily use them, make many features accessible.

Software is also available that can replace a standard keyboard or make it more usable (character-scanning virtual keyboards, Morse code switch input, key-repeat eliminators, sticky keys, macro keys). Of course, voice input systems hold great potential for controlling all aspects of the virtual reality interface by persons with movement limitations, and they are readily available for the PC at falling prices. Head-control software may also be adapted to the CAVE environment.

Initially the PC communicated with five workstations controlling the CAVE through a shared-memory network. Recently, however, we have replaced these single-processor workstations with a multiprocessor machine, a Silicon Graphics Onyx RE2, which uses internal shared memory. Although we could continue to communicate between the PC and the Onyx using the shared-memory method, we decided in favor of serial communication for reasons of simplicity and cost. Analog devices are connected to the PC through an analog-to-digital converter board installed in the PC. An interface box connects the I/O board in the PC to the input devices; it accommodates multiple input devices and conditions their signals to the I/O board's parameters.
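For flavor, one frame of switch and analog state might be packed into a fixed-size message for such a serial link roughly as below. The wire layout here (one byte of button bits followed by four 16-bit analog channels) is a guessed example, not the lab's actual protocol:

```python
import struct

def pack_inputs(buttons, axes):
    """Pack one frame of input-device state for a serial link:
    one byte of button bits followed by four unsigned 16-bit
    analog channel values, little-endian."""
    bits = 0
    for i, pressed in enumerate(buttons):
        if pressed:
            bits |= 1 << i
    return struct.pack("<BHHHH", bits, *axes)
```

A fixed frame size keeps parsing on the receiving end trivial, which matters at the low baud rates of the period.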


Analog Wand Development

A new wand prototype was developed for the PC interface that includes a thumb-operated force-type joystick along with the tracking device and three switches. This joystick produces two analog values as the user applies force in a particular direction. Force-type joysticks (as opposed to displacement type) are used in flight sticks on military aircraft. These small devices take up little room and are well suited to hand-held devices. The joystick's two dimensions of control can be programmed for any application where continuous variability is appropriate. For example, navigation can now be accomplished by pointing the wand in the direction of travel and controlling velocity with pressure on the joystick.
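The mapping from joystick force to travel velocity can be sketched as follows (a simple modern Python sketch with normalized force inputs; the deadband and speed constants are illustrative assumptions, and a small deadband keeps the viewer from drifting at rest):

```python
def force_to_velocity(fx, fy, max_speed=2.0, deadband=0.05):
    """Map two normalized force readings in [-1, 1] to a travel
    velocity (feet per second, illustrative). Readings inside
    the deadband are treated as zero to suppress sensor noise."""
    magnitude = (fx * fx + fy * fy) ** 0.5
    if magnitude < deadband:
        return (0.0, 0.0)
    return (fx * max_speed, fy * max_speed)
```

Combined with the wand's tracked orientation, this gives continuously variable speed in the pointed direction rather than the all-or-nothing motion of the button-only wand.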


Wheelchair Simulation

Although wheelchair simulations could be performed using the capabilities of the wand joystick, we decided to develop a separate displacement-style wheelchair joystick that can be mounted to the armrest of a chair or wheelchair. Navigation with this joystick can be programmed to accurately simulate electric wheelchairs of many types. For conventional electric wheelchairs, pushing the stick forward produces forward travel and pushing it backward produces reverse travel; pushing the stick further in either direction increases the speed. Left/right movement of the joystick rotates the user relative to the virtual environment as if turning the wheelchair. Simply stated, pushing the joystick in the desired direction of travel turns and moves the wheelchair in that direction at a speed proportional to the displacement of the joystick. Alternatively, an omni-directional wheelchair can be simulated by programming the joystick such that pushing the stick in the desired direction of travel moves the wheelchair in that direction without turning.
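Both mappings just described can be sketched in a single per-frame update (a simplified modern Python sketch; the axis conventions, turn rate and speed constants are illustrative assumptions, not values from the actual simulation software):

```python
import math

def chair_update(x, z, heading_deg, stick_x, stick_y, omni=False,
                 max_speed=0.1, max_turn=3.0):
    """One step of the armrest-joystick simulation. stick_x and
    stick_y are normalized displacements in [-1, 1]. Conventional
    mode: stick_y sets forward/reverse speed, stick_x turns the
    chair. Omni mode: the chair translates in the pushed direction
    without rotating. Forward is -z; units are illustrative."""
    if omni:
        h = math.radians(heading_deg)
        # Translate in the stick direction relative to the heading.
        dx = (stick_x * math.cos(h) + stick_y * math.sin(h)) * max_speed
        dz = (-stick_y * math.cos(h) + stick_x * math.sin(h)) * max_speed
        return x + dx, z + dz, heading_deg
    heading_deg += stick_x * max_turn          # degrees per frame
    h = math.radians(heading_deg)
    speed = stick_y * max_speed                # proportional to displacement
    return x + speed * math.sin(h), z - speed * math.cos(h), heading_deg
```

Swapping between the two modes changes only how the stick vector is interpreted, which is why one physical joystick can stand in for several wheelchair types.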

Other electric wheelchair dynamics can be simulated as well. For example, different wheelchair configurations (front or rear caster design, caster size, etc.) or controller designs (sensitivity, oversteer, understeer, smoothing, etc.) can be programmed into the software [3].
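As one example of a programmable controller parameter, smoothing can be emulated with a first-order filter on the raw stick reading (the coefficient below is an illustrative default, not a value from [3]):

```python
def smooth(previous, raw, alpha=0.2):
    """First-order (exponential) smoothing of a raw joystick
    reading. alpha near 0 is heavily damped; alpha = 1 passes
    the raw input straight through."""
    return previous + alpha * (raw - previous)
```

Sensitivity could be modeled similarly by scaling or shaping the stick displacement before it reaches such a filter.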


Problems

The cables necessary to connect input devices to the computers are a serious problem. The viewer using a wheelchair must be careful not to run them over or become tangled in them. We are currently considering ways to eliminate as many cables as possible with wireless communication such as RF or infrared. Initially this will replace cables to controls other than the position-sensing devices (which will likely depend on manufacturer developments to become wireless). Without eliminating all cables the user will still be tethered, but by minimizing the number of cables we will reduce the weight and ready ourselves for wireless tracking technology.

Another problem for wheelchair users is that the tracking hardware relies on magnetic fields. Large bodies of metal and electrical signals, such as those found on an electric wheelchair, cause interference and non-linearities in the data. We have found, however, that the extended range of newer tracking devices improves this problem considerably.


CURRENT APPLICATIONS

ADA Transportation Compliance

With the passing of the ADA, public transit agencies are obligated to provide access to their services for people with disabilities. Transportation providers must scramble to meet accessibility deadlines that force them to reconstruct facilities, some of which date to the nineteenth century. Current conditions must be evaluated and design strategies developed to fit within today's limited transit budgets. The ADA calls for making "key" stations accessible; these are stations that are either transfer points or see a high volume of traffic. For a transit system to be viable for its users, a large enough percentage of the facilities must be accessible.

The Design Visualization Lab currently has a proposal before the Chicago Transit Authority to develop strategies for making key stations wheelchair accessible. Innovative new ways to raise patrons to platform level at elevated stations need to be devised. The standard method of making a station accessible is to install an elevator, but at many CTA stations this would require widening the platforms and support structure to accommodate the increased width of an elevator. That would mean tearing down private and public structures, making the job prohibitively expensive. By surveying state-of-the-art accessibility equipment and developing new ideas, we hope to solve the problem. The CAVE would be used extensively in this process to view and evaluate models of existing conditions and to simulate new access methods. Tremendous amounts of time and money could be saved in the design and prototyping phases by working the bugs out of potential solutions before they are sent to an engineering firm for final planning.


Mass Transit Interior Design

At UIC we are continuing to develop accessible vehicle interiors. Models of the University campus shuttle bus and various public transit vehicles have been built for evaluation in VR. Bringing these models into the CAVE has allowed us to evaluate interior layouts for wheelchair accessibility. The wheelchair lift area can be inspected for clearance and the stanchions checked for position and height. By having the primary investigator wear the head tracking device, accurate clearance measurements can be made around wheelchairs, real objects and virtual objects.


Instruction for People Using Wheelchairs

VR also makes instruction possible before a person learns to operate a wheelchair. In the safe environment of the CAVE, users can make errors that might be fatal outside of VR. They can also attempt maneuvers they might otherwise avoid for fear of injury, e.g., crossing traffic at an intersection. In the case of teaching people to use a powered chair, much of the fear and danger can be removed from the experience, making the transition easier [10].

The CAVE can also be used for teaching ambulatory persons how people in wheelchairs function in everyday life. Designers can evaluate their ideas before committing to manufacture, and architects can see how much space is required around a chair for ease of use. Employee sensitivity training could be much more effective if participants viewed boarding a bus from the position of a passenger in a wheelchair.


CONCLUSION

The CAVE is an important and effective design tool for evaluating environments and products. The ability to include real-world objects in a virtual environment sets the CAVE apart from other VR systems, which entirely replace the viewer's environment. By allowing more than one person to experience a VR simulation, the CAVE makes possible shared or guided evaluations, which can be highly effective. Joystick navigation in the CAVE is a natural interface for all users, and especially for those who drive joystick-controlled powered wheelchairs; additional adaptive input devices may also prove beneficial to nondisabled CAVE users. Together these features provide a smooth transition to the virtual world. The CAVE is accessible to a larger percentage of the population by virtue of its roll-in construction and real-world interfaces, making it ideal for accessibility simulations.


ACKNOWLEDGMENTS

The CAVE is a research activity of the Electronic Visualization Laboratory of the University of Illinois at Chicago, Thomas A. DeFanti and Daniel J. Sandin, Co-Directors. Support is provided by Argonne National Laboratory and the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign. Major funding is provided by National Science Foundation (NSF) grant #IRI-9213822 and NSF grant #CDA-9303433, which includes support from the Advanced Research Projects Agency (ARPA).


REFERENCES
  1. Biocca, F. Will Simulation Sickness Slow Down the Diffusion of Virtual Environment Technology? Presence, Vol. 1, No. 3, Summer 1992, pp. 334, 337-338.
  2. Browning, D., Cruz-Neira, C., Sandin, D.J., DeFanti, T.A. CAVE, Projection-Based Virtual Environments and Disability. Proceedings, Virtual Reality and Persons with Disabilities, California State University (CSUN) Center on Disabilities, 1993.
  3. Collins, T.J., Kauzlarich, J.J. Analysis of Parameters Related to the Directional Stability of Rear Caster Wheelchairs. University of Virginia Rehabilitation Engineering Center, 1987.
  4. Cruz-Neira, C., Sandin, D.J., DeFanti, T.A. Surround-Screen Projection-Based Virtual Reality: The Design and Implementation of the CAVE. 1992, pp. 2-3, 9, 11.
  5. Cruz-Neira, C., Sandin, D.J., DeFanti, T.A., Kenyon, R., Hart, J.C. The CAVE, Audio-Visual Experience Automatic Virtual Environment. Communications of the ACM, June 1992, pp. 67-72.
  6. DiZio, P., Lackner, J.R. Spatial Orientation, Adaptation, and Motion Sickness in Real and Virtual Environments. Presence, Vol. 1, No. 3, Summer 1992, p. 323.
  7. Feiner, S., MacIntyre, B., Seligmann, D. Knowledge Based Augmented Reality. Communications of the ACM, Vol. 36, No. 7, July 1993, pp. 53-62.
  8. Figueiredo, M., Bohm, Teixeira, J. Advanced Interaction Techniques in Virtual Environments. Computers and Graphics, Vol. 17, No. 6, 1993, pp. 655-661.
  9. Hettinger, L.J. Visually Induced Motion Sickness in Virtual Environments. Presence, Vol. 1, No. 3, Summer 1992, pp. 306-
  10. Inman, D. Oregon Research Institute.
  11. Kennedy, R.S., et al. Profile Analysis of Simulator Sickness Symptoms: Application to Virtual Environment Systems. Presence, Vol. 1, No. 3, Summer 1992, p. 296.
  12. McCauley, M.E., Sharkey, T.J. Cybersickness: Perception of Self Motion in Virtual Environments. Presence, Vol. 1, No. 3, Summer 1992, pp. 311, 313.
  13. Sandin, D., Browning, D. Evaluating the Visual Aspects of Product Design: Simulating Ergonomic Studies From Within Virtual Realities. Manufacturing Research Center at the University of Illinois, May 1991.
  14. Trimble, J. Virtual Barrier Free Design. Team Rehab, Nov. 1992.



Reprinted with author(s) permission. Author(s) retain copyright.