Return to 1998 Conference Table of Contents
Lake Porter and Jutta Treviranus
With the advent of virtual reality technology it is evident that the world of computers is attempting to imitate the real world. In the real world people receive and disseminate information in three-dimensional space. In communicative modes such as conversation, relative position and gestures in three-dimensional space carry large amounts of information. In educational spaces such as museums, different topics and levels of detail are separated by space. In stores, shoppers perceive categories over space.
While graphical user interfaces allow computer users to access information using iconography, a virtual world allows a user to access information by imitating the three-dimensional space that exists in the real world. We refer to this imitation of real space as virtual space. In the real world our primary interface with the physical world is our haptic sense (sense of touch), so to complete the imitation of real space one would expect a haptic interface to be necessary. A haptic interface is a device which allows a user to interact with a computer by receiving tactile feedback. A haptic device achieves this feedback by applying a degree of opposing force to the user along the x, y, and z axes.
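The opposing force described above can be pictured as a simple spring model. The following sketch is purely illustrative (the function name and the spring assumption are ours, not part of any haptic device's API): the device pushes back along each axis in proportion to how far the user's probe has penetrated a virtual surface.

```python
# Hypothetical sketch of haptic force feedback as a spring model.
# "penetration" is how far the probe has pressed into a surface along
# each of the x, y, and z axes; the device returns an opposing force.

def opposing_force(penetration, stiffness=1.0):
    """Return the (x, y, z) force pushing back against the user."""
    return tuple(-stiffness * p for p in penetration)

# A probe pressed 2 mm into a surface along z feels a push back along -z.
force = opposing_force((0.0, 0.0, 0.002))
print(force)
```

With stiffness equal to 1, the returned force is equal and opposite to the displacement, which corresponds to the "solid object" feedback discussed later in the paper.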
A haptic interface would serve to orient users to the location and nature of objects in a virtual space. An orthotic interface, whether embracing the entire upper body or only the tip of a finger, could give the user information about the nature of objects inside the world. A less complicated interface, which conveyed haptic information through a device akin to a pointer, could show the user a tactile map of the area. The focus of this paper will be on the former of these two devices and its potential for operating in virtual space created with the Virtual Reality Modeling Language.
The Virtual Reality Modeling Language (VRML) has been designed to describe dynamic virtual spaces in an extremely efficient format, to facilitate sending these virtual spaces over the Internet. As yet, a user of a virtual space created with VRML is limited to visual and aural interaction. Section four of this paper will discuss ways in which a virtual space created with VRML could send relevant data to a haptic interface.
There are many access issues surrounding the implementation of a haptic interface. The haptic sense could be used to convey information in place of vision or hearing. Conversely, aspects of a virtual space which require haptic manipulation could present barriers to users with mobility impairments. These issues will be discussed in section three.
The haptic sense is unique in that, unlike the passive senses of vision and hearing, action (manipulation) always precedes perception. This means that in order for a haptic interface to be effective certain considerations should be taken in the creation of virtual space.
The creation and exploration of virtual spaces will always be a creative effort relevant to the purpose of the space. The following points are intended as guides to creating virtual spaces with a mind to users of haptic interfaces, not as absolutes to which every virtual space should or will conform.
The nature of the haptic sense means that perceived objects must be within manipulating range, so the position of objects is especially important to the haptic sense. Objects should be placed within manipulating range of a viewpoint (0.5 meters). This will allow users to quickly scan or sweep the area and perceive all relevant haptic data without relying on a visual display. It will also accommodate easy perception of haptic data without requiring the user to change position.
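The 0.5-meter rule above amounts to a simple distance test between a viewpoint and each object. As a minimal sketch (the helper name and coordinate convention are ours), a world-building tool could verify a layout like this:

```python
import math

# Illustrative check, not part of VRML: is an object within manipulating
# range (about 0.5 m) of a viewpoint, so a user can sweep the area with a
# haptic probe without changing position?

HAPTIC_RANGE = 0.5  # meters, as suggested in the text

def within_haptic_range(viewpoint, obj, reach=HAPTIC_RANGE):
    """True if the object can be reached from the viewpoint."""
    return math.dist(viewpoint, obj) <= reach

print(within_haptic_range((0, 0, 0), (0.3, 0.2, 0.1)))  # True
print(within_haptic_range((0, 0, 0), (0.6, 0.0, 0.0)))  # False
```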
Preset positions to which a user can jump will be very important to the user of a haptic interface. As stated in the previous paragraph, such a preset position would bring the user within haptic range of the object(s) to be perceived. Preset positions would also be a great assistance in navigation, allowing a user to move from point to point without moving through a void which contains no haptic data to be guided by. In real space we are in constant contact with the physical world around us; a haptic interface is capable of imitating this contact if the designer makes appropriate considerations.
In the real world people perceive separation of topics, categories, and level of detail over space. It is most intuitive for the user of a haptic interface to be able to perceive these separations in the space they are exploring. For instance, in a space designed to disseminate travel information for Australia, an object in front of the user may be telling the user about tourist sites in Sydney. An object on the user's left may be giving current weather conditions, while on the right a suggested packing list is displayed. For information on Tasmania the user may have to jump to a new preset position.
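The Australia example above can be sketched as a table of preset positions, each surrounding the user with related objects separated by direction. All of the names and categories below are illustrative, not part of VRML:

```python
# Hypothetical layout: each preset viewpoint places related information
# within haptic reach, separated by direction (front, left, right).

presets = {
    "sydney": {
        "front": "tourist sites in Sydney",
        "left":  "current weather conditions",
        "right": "suggested packing list",
    },
    "tasmania": {
        "front": "tourist sites in Tasmania",
    },
}

def jump(preset):
    """Simulate jumping to a preset position: return the objects
    now within manipulating range, keyed by direction."""
    return presets[preset]

print(jump("sydney")["left"])   # current weather conditions
```

Jumping between presets keeps the user in constant contact with haptic data, rather than moving through an empty void.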
A space designed to optimize use for the user of a haptic interface would have some very specific and unique characteristics. The implementation of a haptic interface also affects the way a user would navigate a virtual space.
With the advent of alternative human-computer interfaces such as voice recognition and haptic interfaces, computer users have the option of using the interface best suited to their physical and cognitive needs and preferences. As individualised interfaces become more common, designers will have to start designing for the user instead of for the keyboard and mouse.
A computer user with a sensory impairment could use a haptic interface to receive information otherwise only available to the impaired sense. A user with a vision impairment could perceive shape, and a user with a hearing impairment could perceive the vibrations caused by sound through the haptic interface.
One universal advantage of a haptic interface is the enhancement of telepresence. When communicating, humans send much information through gesture. A haptic interface would allow a user communicating in a virtual environment to make gestures via his or her avatar. VRML does not presently support multi-user environments, but eventually conversing on the three-dimensional web may be a regular component of interacting on the web.
Users who lack fine motor coordination may find inputting information with a haptic device easier. Large targets and control areas would benefit such a user.
Some users who have a mobility impairment may find some tasks difficult or impossible to achieve with a haptic interface. Therefore, it is desirable to have keyboard equivalents for haptic-dependent tasks such as pushing or otherwise manipulating objects.
There are several ways in which data from a world created with VRML could be sent to a haptic device. The main differences between these methods are the amount of haptic control they give the creator, and the amount the VRML language must be modified in order to implement the haptic interface.
The first possible way to implement a haptic interface would be the creation of a haptic node in the VRML language. This node would contain information telling the interface how much relative force the haptic interface should return when an object is touched. The default value for this node would be an equal and opposite force to the force applied by the user; this setting would give the feedback of a solid object without malleability. To add malleability, the designer would specify an opposing force less than the force being applied by the user. To create a rubbery texture, the opposing force would start out less than the user's force and increase over distance until it was equal and opposite to the user's force. This method of adding a haptic interface would require a significant change to the language, but would give the designer maximum control over haptic output.
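The three responses described for such a node can be sketched as one force function. This is an assumption-laden illustration (no such node exists in VRML 2.0, and all names and thresholds here are ours): a solid returns the applied force, a malleable surface returns a constant fraction of it, and a rubbery surface returns a force that grows with penetration depth until it matches the user's.

```python
# Sketch of the force model a hypothetical haptic node might specify.

def haptic_response(applied, material="solid", softness=0.5,
                    depth=0.0, max_depth=0.01):
    """Opposing force magnitude for a given applied force (newtons)."""
    if material == "solid":
        return applied                    # equal and opposite: rigid
    if material == "malleable":
        return softness * applied         # always yields by a fraction
    if material == "rubber":
        # force ramps up with depth until it equals the user's force
        ramp = min(depth / max_depth, 1.0)
        return ramp * applied
    raise ValueError(material)

print(haptic_response(1.0))                          # 1.0 (solid)
print(haptic_response(1.0, "rubber", depth=0.005))   # 0.5 (half-compressed)
```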
A haptic interface could also piggyback on solid properties. Solid properties must be defined in future versions of VRML to facilitate the implementation of advanced behaviours (e.g., the rubberiness of a ball defining how it will bounce). A haptic interface could use these properties to define the tactile characteristics of an object. While this option requires a change to the language, the change would also facilitate another addition to the language. This implementation would allow limited control of haptic characteristics.
Thirdly, a haptic property could be added to the language which would attach tactile data to an object's node. This would involve a small addition to the language and give a designer limited control over the haptic characteristics of an object.
Lastly, an algorithm could be developed to take the visual characteristics of an object and translate them into haptic feedback. This method would require no change to the language, but would give the designer of a virtual world no control over haptic feedback.
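The last approach can be pictured as a mapping from an object's existing visual Material fields to a tactile quantity. The mapping rule below is entirely an assumption of ours, not part of any specification; it simply illustrates how haptic feedback could be derived with no change to the language:

```python
# Illustrative only: derive a stiffness value (0..1) from VRML-like
# Material fields, treating shinier, more opaque surfaces as stiffer.

def stiffness_from_material(shininess, transparency):
    """Map Material fields (each in 0..1) to a stiffness in 0..1."""
    return max(0.0, min(1.0, shininess * (1.0 - transparency)))

print(stiffness_from_material(shininess=0.9, transparency=0.0))  # 0.9
print(stiffness_from_material(shininess=0.9, transparency=0.5))  # 0.45
```

The trade-off noted in the text is visible here: the rule is fixed by the browser's algorithm, so the world's designer has no say in how an object feels.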
A haptic interface would allow the user of a virtual space to perceive physical characteristics of the space and interact with those characteristics. The ability to interact with the physical characteristics of a virtual space enhances the illusion of real space.
The unique nature of the haptic sense, and by extension a haptic interface, means that extensive design considerations must be taken into account to optimize a space for a haptic interface.
The implementation of a haptic interface allows a user inside a virtual space to receive tactile information, supplementing or replacing another, possibly impaired, sense. Users who lack fine motor coordination may also find it easier to use a haptic interface to navigate a virtual space in place of the traditional keyboard and mouse.
The Virtual Reality Modeling Language is a means to transmit dynamic virtual space over the Internet, making communication over great distances in three-dimensional space possible. Naturally conveying the gestures which carry large amounts of information in typical conversation can only be accomplished with the implementation of a haptic interface.
There are several ways to add a haptic interface to a virtual space created with the Virtual Reality Modeling Language. These methods vary in the amount of control they give the designer over haptic characteristics, and the degree to which the language would have to be changed to accommodate the interface. The access and VRML communities will have to decide which method is best for the successful creation of virtual space conducive to navigation and manipulation with a haptic interface.