Return to 1999 Conference Table of Contents
Adaptive Technology Resource Centre,
University of Toronto
Project URL: http://www.utoronto.ca/atrc/rd/vrml/update.html
Presently, Internet- or Intranet-delivered curriculum does not simulate the experience of touching and manipulating objects or environments (referred to as haptics). This restricts the number of subjects that can be taught effectively, and the types of students who can access the curriculum. This paper describes a project that will develop software applications that make it possible to deliver curriculum that can be touched, manipulated and heard over the Internet or an Intranet. Both the necessary software tools and exemplary curriculum modules will be developed. Developments will be based upon the 3D ISO standard VRML, and a haptic API developed by Haptics Technology Inc.
Successful instruction of learners who are blind, or of students with learning disabilities who favour tactile learning styles, is heavily dependent on touch and tactile manipulation of physical objects and models. Dependence on manual manipulation and exploration is most pronounced in topics such as science, math and geography. For reasons of pedagogy, economics, access and efficiency, a rapidly increasing amount of post-secondary, secondary and elementary curriculum is being offered over the Internet, the World Wide Web, or an Intranet. With this transition to electronically delivered curriculum, the modalities of touch and tactile manipulation are lost. From an access perspective there are compelling reasons to add tactile manipulation and 3D audio to education delivered at a distance; fortunately, these additions benefit all learners.
Human beings begin life as multimodal learners. A child's natural instinct is to touch, manipulate, see, smell and even taste a new object. As the child matures, socialization favours some learning styles over others, but recent research has shown that many new concepts and skills are better integrated when more than one sense or modality is engaged in the learning process. No teacher would question the value of hands-on learning or practical demonstration. Our social bias toward the auditory and visual channels has resulted in the well-documented phenomenon of visual and auditory information overload (1).
"Haptics" is a term which encompasses both the sensing and action involved in touching and manipulating. The primary reason the haptic modality is often the preferred mode of exploration is that it is the most active and interactive. Unlike the visual and auditory modalities, it is by its very nature bidirectional and interactive: we manipulate the objects we are sensing in a continuous action, feedback, reaction loop. Thus many people do not feel they have really "seen" an object unless they have handled it and explored it using their haptic sense (thereby contributing to the frustration of many museum and art gallery curators).
Some students learn better when their tactile and kinesthetic senses are engaged in the learning task. Learning styles theory has shown that students have learning style preferences which optimize their ability to obtain and successfully integrate new knowledge (2). A large number of students learn optimally when their tactile sense is engaged and they are able to manually manipulate objects.
This phenomenon is intensified for students with learning disabilities, who may be restricted to one learning style when they are integrating new concepts or processing information (3). For example, some children are unable to count unless they touch the objects to be counted. Special educators devote a great deal of time to converting teaching materials into a tactile medium for students with certain types of learning disabilities, attention deficit disorder, or developmental delay.
Students who are blind do not have the visual channel available and must rely on audio and tactile information. Students who are deaf and blind cannot use either the auditory or visual channel and must rely solely on the sense of touch. The areas of science and mathematics have traditionally been difficult to access for students with visual impairments. Complex and high-tech fields such as Chemistry, Physics, Engineering, Biology, and Mathematics are rife with visually presented concepts and information. Historically, this complex visual information has not been made available for widespread use in a format easily accessible for blind and visually impaired students. This lack of information, in turn, leads to decreased interest in scientific fields by students who are blind (4).
In education, the use of graphics has increased dramatically. Textbooks, especially in the sciences, were once predominantly text with a few diagrams. Now they are predominantly graphics, with text playing a supporting role. The provision of suitable tactile alternatives for the student who is blind is not always easy or straightforward. Presently, in order to provide access to the standard curriculum, physical tactile models or tactile graphics are laboriously created at great expense to teach subjects which involve spatial concepts or physical properties, such as geography, geometry, biology, physics, physiology, or astronomy, to name just a few. A large array of technologies is used to create tactile graphics or models for students who are blind, but commercially available products which address this function share a number of shortcomings.
The greatest disadvantage is that the information presented is not dynamic or flexible: you cannot zoom in or out, the objects do not have behaviours, and the information is not scalable or gradable. These shortcomings are more easily overcome when the information is presented electronically.
Virtual Reality Modeling Language (VRML) is a description language for three-dimensional objects, widely accepted as a 3D standard for transmission of 3D objects and environments over the World Wide Web. Because of its relative compactness, VRML is the primary language used to transmit courseware involving 3D simulation over the Internet, the World Wide Web, or over an Intranet. Presently the VRML standard does not have provisions for haptic rendering or control, nor is it accessible to people with disabilities.
Although VRML does not presently support the haptic modality, devices which display haptic information and allow haptic control are available. Haptic devices may be viewed as computer peripherals forming combined display/input devices. Where computer graphics addresses vision, haptics is concerned with touch and kinesthesia. Haptic devices make it possible for users to "touch," using their own hands and fingers, objects presented on computer displays as if they were real physical objects. This is done by simulating the forces that one would feel when touching a real object and presenting these forces to the user through the force and tactile feedback capability of a haptic device. When done properly, this creates the illusion of "touching" an object.
This project will 1) make it possible to simulate practical demonstration and hands-on experimentation via the Internet, the World Wide Web or an Intranet, and 2) provide equal access to learners with disabilities. This will be achieved through the project deliverables described below.
Unfortunately, VRML does not presently support haptics or physical models. Significant development efforts are required to add haptics and related modeling of solids to VRML. Preliminary studies conducted by Evan Weiss (5) show that three fundamental problems must be addressed in order to incorporate haptics into any VRML environment. First, a method for specifying haptic properties must be developed. These include friction, damping, texture, vibration, etc. VRML does not provide a node for specifying these properties directly; however, VRML does support a PROTO node mechanism which can be used to extend the VRML language.
A multimodal PROTO node will be created that includes the specification of haptic properties. This node will also be used to group, position and orient multimodal information. Next, a representation of the haptic device must be integrated into the VRML world. Finally, the interaction between haptic devices and VRML objects must be specified. On the one hand, a haptic device is a pointing device that is used for pointing, selecting, dragging and changing the VRML world. On the other hand, a haptic device is a display that gives the user a rapid and very intuitive understanding of the VRML world. The VRML language must be extended to handle these complex two-way interactions.
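The kind of PROTO-based extension described above can be sketched as follows. The field names (friction, damping, stiffness) and their types are illustrative assumptions, not part of the VRML97 standard or this project's actual node design; the sketch uses a small Python helper to emit the PROTO declaration text:

```python
def haptic_proto(name="HapticSurface", **fields):
    """Build the text of a hypothetical VRML97 PROTO declaration that
    groups haptic surface properties. The property names and default
    values below are illustrative only."""
    defaults = {"friction": 0.5, "damping": 0.1, "stiffness": 300.0}
    defaults.update(fields)
    lines = [f"PROTO {name} ["]
    for fname, fval in defaults.items():
        # exposedField lets the property be read and written at run time
        lines.append(f"  exposedField SFFloat {fname} {fval}")
    lines.append("] {")
    lines.append("  Group { }  # visual geometry would be placed here")
    lines.append("}")
    return "\n".join(lines)

print(haptic_proto(friction=0.8))
```

A browser extended with a haptic renderer could then read these fields when the stylus contacts the node's geometry, while standard browsers would simply ignore them.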
A generic, robust and modular software driver is required to display and control the extended VRML environments. This system should be device, system and application independent. This would allow different types of haptic devices to be connected to VRML applications, including 2D devices (such as the PenCAT) and 3D devices (such as the PHANToM). The system must also fulfill the real-time requirements of haptic systems while working within the real-time limitations imposed by VRML. TouchKit will act as the base technology for this software driver.
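The device-independent layer described above can be sketched as a small abstract interface. The class and method names here are hypothetical and do not reflect the actual TouchKit API; the point is that both a 2D pen and a 3D arm present the same read/render calls to the VRML side:

```python
from abc import ABC, abstractmethod

class HapticDevice(ABC):
    """Hypothetical device-independent driver interface (not the real
    TouchKit API). Concrete devices such as a 2D PenCAT or a 3D PHANToM
    would each implement the same two calls."""
    dof = 0  # degrees of freedom exposed by the device

    @abstractmethod
    def read_position(self):
        """Return the current stylus position (length == self.dof)."""

    @abstractmethod
    def render_force(self, force):
        """Send a force vector to the device (length == self.dof)."""

class StubPen2D(HapticDevice):
    """A software stand-in for a 2D pen device, used for testing."""
    dof = 2
    def __init__(self):
        self.pos = (0.0, 0.0)
        self.last_force = None
    def read_position(self):
        return self.pos
    def render_force(self, force):
        assert len(force) == self.dof
        self.last_force = tuple(force)

def spring_step(device, anchor, k=1.0):
    """One servo-loop step of a simple spring model pulling the stylus
    toward `anchor`; works for any device regardless of its dof."""
    pos = device.read_position()
    force = tuple(k * (a - p) for a, p in zip(anchor, pos))
    device.render_force(force)
    return force

pen = StubPen2D()
print(spring_step(pen, anchor=(1.0, -2.0)))  # prints (1.0, -2.0)
```

In a real driver the servo loop would run at haptic rates (hundreds of updates per second) while the VRML scene updates far more slowly, which is why the interface must isolate the two timing domains.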
3D simulation courseware will not become popular unless there are tools which allow educators to easily create or modify this courseware. Existing tools are created for programmers with specialized 3D rendering skills. What is needed is a user interface shell which provides an easy-to-use front end to a collection of necessary utilities. This front end must use language and metaphors which are understood by the average educator and provide the choices which the educator wishes to have available.
This project will create such an authoring shell, or front end, using an array of existing tools, APIs and utilities. The shell will be linked to libraries of existing VRML objects, template lesson plans, and interactive exercises or structures into which the educator can plug the desired curriculum content. The interface will be designed in such a way that the educator can create courseware quickly and easily using the provided defaults, or delve further into the choices in order to customize the lesson.
The first exemplary courseware module will teach targeted curriculum in allied health education. In cooperation with the Department of Occupational Therapy and Biomedical Communication at the University of Toronto, curriculum which involves anatomical visualization and teaches palpation will be selected. Targeted learning outcomes will be identified. The specific anatomical structure will be rendered using VRML, the VRML extensions and the enhanced TouchKit. In addition to traditional interactive exercises, the courseware module will include a monitoring or biofeedback capability which allows the student to objectively monitor their actions (e.g., the location and amount of pressure applied) and compare them to the ideal. The relative success of meeting the desired learning outcomes using the courseware will be determined through user trials.
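The biofeedback comparison described above can be sketched as a simple check of the student's probe reading against the ideal. All names, thresholds and units here are invented for illustration and are not taken from the project's design:

```python
def palpation_feedback(pressure, location, ideal_pressure, ideal_location,
                       pressure_tol=0.5, distance_tol=0.01):
    """Compare a student's palpation (pressure in arbitrary force units,
    location as coordinates on the model) against the ideal.
    Tolerances are illustrative defaults."""
    dp = pressure - ideal_pressure
    # Euclidean distance between where the student pressed and the target
    dist = sum((a - b) ** 2 for a, b in zip(location, ideal_location)) ** 0.5
    return {
        "pressure_ok": abs(dp) <= pressure_tol,
        "location_ok": dist <= distance_tol,
        "pressure_error": dp,
        "distance_error": dist,
    }

print(palpation_feedback(2.8, (0.10, 0.05), 2.5, (0.10, 0.05)))
```

The courseware could display these results continuously, so the student sees at a glance whether both the contact point and the applied force fall within the acceptable range.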
The second module will provide geography curriculum to students who are blind. Using contour information, a collection of maps will be rendered in three dimensions. A full array of geographic structures will be mapped to haptic, auditory or combined modalities. Thus, students will be able to feel cities; from the tone which sounds when they encounter a city, they will be able to determine its approximate size. By pressing on a city they will hear its name. The student can choose to zoom into a specific location and query a large database of associated geographic data. If they want information on scale or relative location they can turn on a haptic grid, which may feel like a number of strings or elastics strung over the map. This curriculum will be tested by students at the elementary/secondary level as well as the post-secondary level.
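The mapping from city size to an audible tone could work along the following lines. The population-to-pitch rule (one octave per tenfold increase in population, starting from a base tone) is purely an assumption for illustration, not the project's actual sonification scheme:

```python
import math

def city_tone(population, base_hz=220.0):
    """Map a city's population to an audio pitch: each tenfold increase
    in population raises the tone by one octave. A town of 10,000
    sounds the base tone. This mapping is illustrative only."""
    return base_hz * 2 ** math.log10(max(population, 1) / 10_000)

# a town of 10,000 plays 220 Hz; a city of 1,000,000 plays two octaves up
print(city_tone(10_000), city_tone(1_000_000))  # prints 220.0 880.0
```

A logarithmic mapping like this keeps the full range of city sizes within a comfortable pitch range, so a student sweeping across the map can rank cities by ear without consulting exact figures.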
This project will add haptic and multimodal capabilities to the 3D standard, VRML. It will make VRML courseware accessible to students with disabilities, thereby providing a cost-effective, flexible and powerful method of teaching spatially and graphically based curriculum to students who are blind or students who require tactile input in order to process new concepts. The project will also provide risk-free learning environments for students learning manual skills such as physical therapy or skilled trades.
The author would like to acknowledge the project team and CANARIE Inc. for its financial support.
Brown, J.S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32-42.
Murphy-Judy, K. Learning Modalities, Styles and Strategies. http://www.fln.vcu.edu/Intensive/LearningStrategies.html
Zeichner, K.M. (1995). Educating teachers to close the achievement gap: Issues of pedagogy, knowledge, and teacher preparation. In B. Williams (Ed.), Closing the achievement gap: A vision to guide change in beliefs and practice (pp. 39-52). Washington, DC: U.S. Department of Education, Office of Educational Research and Improvement.
Bumpy Gazette Online, Volume 2, Issue 1, June 1996.
Weiss, E. (1997). The Addition of the Haptic Modality to the Virtual Reality Modeling Language. Master's thesis, MIT.