1995 VR Conference Proceedings



William Meredith
333 West 86th Street
Suite 1901
New York, NY 10024-3112
(212) 873-1813

Executive Vice President,
Veterans Bedside Network


For over 46 years the Veterans Bedside Network has provided recreational opportunities for hospitalized veterans throughout the United States. The organization was founded in 1948 by singer and actress Jean Tighe to help patients with their recuperation. She understood the influence that a person's state of mind has on the healing process, so she created a program to provide interactive recreational therapy. Unlike the USO, which came in and did a song and a dance for the patients, the VBN encourages the patients to do the singing and dancing themselves. Rather than just being a passive audience, the patients become the entertainment. Giving them the opportunity to sing or act helps them attain a better state of mind, helps them forget their pain, and builds camaraderie with the other patients.

Although never specifically quantified, it is apparent to everyone involved that participation in these activities has a positive impact on the patients' demeanor. We have literally witnessed people who, looking ashen and using a walker when they arrived, have left our activities with color in their cheeks, the walker forgotten in a corner. By providing the opportunity to engage in these pleasant diversions, and improving their state of mind, we believe our activities can help speed a patient's recovery.

Administered almost exclusively by volunteers, the VBN, as it is popularly known, originally involved the patients in "sing-alongs" and theatrical events. Over the years, additional technologies became available and were successfully incorporated into the VBN's repertoire. For example, some activities are based on scripts from the popular show M.A.S.H., with patients playing the roles of Col. Potter, Radar, Hot Lips Houlihan and all the other zany characters of the 4077. With a VBN volunteer directing, the patients read their respective parts from the scripts and their performances are tape-recorded along with accompanying sound effects and music, all done live. At the end of the evening, the tape is played back so they can enjoy the fruits of their labors, and have a good laugh at the show.

Television has also become an integral part of the VBN. Some video activities include patients acting in "Blackout" skits, or creating and performing humorous TV commercials for imaginary products. Occasionally, they are given a chance to interact with sports announcers or other personalities who regularly appear on network television. Videotaped live, and later edited, the final product is then returned to the hospital for airing on the in-house, closed-circuit system.


Unfortunately, there is one group that has not usually been included in our activities: the patients in the Spinal Cord Injuries Unit, many of whom are permanently bedridden. The difficulties involved in getting them all together physically in one place, coupled with their limited mobility, often precluded participation. I had always felt that there had to be a better way to provide our services to these patients. After attending conferences on virtual reality in 1990 and 1992, I realized that here was a technology that might eliminate some of these physical barriers. Virtual reality could remove the need for the patients to go to the site of the activity. Instead, the site could be brought to them. And we could give them a chance to do things in the virtual world that were beyond their physical abilities in the real world, particularly competitive sports activities.

During 1993 and the beginning of 1994, I presented this concept to numerous groups in the hope of obtaining funding for a pilot program. While enthusiasm and interest were high, the expense of such an undertaking was, at that time, prohibitive.

Although I had experienced what was considered a state-of-the-art VR system at Jaron Lanier's VPL in 1990, the cost of such a system was well in excess of a hundred thousand dollars. This was obviously beyond the means of our organization, so I began researching methods of achieving the best possible VR experience for the least possible financial investment. To this end, I joined the Virtual Reality Alliance of Students and Professionals (VRASP), obtained access to the Internet and began reading vast amounts of reference material.

Meanwhile, word of my interest reached Mike Abelson, Director of Recreation at the Bronx VAMC. He contacted the VBN office and invited us to outline our ideas to the appropriate hospital staff members. As expected, this initial meeting generated a great deal of interest. We were then asked to address the trimester meeting of those physicians, therapists, nurses and other professionals associated with the SCIU's at Castle Point VAMC and East Orange VAMC. Once again the concept was greeted with excitement and enthusiasm.

Shortly after this presentation, in early 1994, we were given approval to initiate a program using VR at the Bronx VAMC. The only caveat was that the VBN would be completely responsible for everything involved in the program.

Once we were given the green light, we again met with various patrons to request their support. Finally we obtained very limited funding to initiate our program. In fact, this activity was launched with only $5,000. But, by this time, I had accumulated enough information to know that it was not totally impossible to proceed.

While I would have liked to have had very powerful workstations with accelerated graphics capabilities, a number of considerations led us in another direction. First among these was the extremely tight budget. Second, there would be no place in the hospital to leave equipment permanently set up. Third was the desire to eventually take the activity to the patients' bedsides whenever possible. Accordingly, we purchased two 486DX2/66 notebook computers with 4 MB of RAM and monochrome screens. To provide a more useful display, we also purchased two VGA color monitors which could be directly attached to the computers.


To enhance the feeling that one is seeing the virtual world generated by the computer, some apparatus is required that isolates the participant from everyday reality. Usually this is a helmet with integral image displays, headphones and a position tracking device. This combination is collectively called a head-mounted display, or HMD. These systems are usually very expensive and have substantial weight. Considering the potential dangers of placing a heavy HMD on someone with spinal injuries, I opted for a simpler approach using LCD shutter glasses. These glasses contain two individually wired liquid crystal panes. While normally transparent, the embedded liquid crystals align when a voltage is applied, and the pane becomes opaque. This is the same technology seen in handheld calculators, where the digits appear black against a transparent background. Although the view of a virtual world seen through these glasses is not truly immersive, it can be 3-dimensional. LCD shutter glasses have the added advantages of being lightweight and easier to program for, and they eliminate the weight and expense of a head tracking system.

In operation, this method requires the graphics engine to create two images of the presently rendered viewpoint, each offset from the other by the eyes' interocular distance. While displaying each separate image in rapid succession, the LCD shutters are alternately made transparent or opaque in synchronization with the appropriate image. This ensures that the right shutter of the glasses is transparent only when the right eye's image is displayed, and vice versa. Persistence of vision then allows the brain to combine these two images, thus creating a 3-dimensional environment. Unfortunately, the display screens on laptops do not decay fast enough for this trick to work, so the effect can only be seen on the external SVGA CRT monitors.
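The field-sequential scheme described above can be sketched in a few lines. This is a minimal illustration, not our actual rendering code; the interocular distance and refresh rate below are assumed example values.

```python
# Sketch of field-sequential stereo for LCD shutter glasses.
# IOD and REFRESH_HZ are assumed values, not measurements of our setup.

IOD = 0.065          # interocular distance in metres (typical adult value)
REFRESH_HZ = 120     # with alternating frames, each eye sees 60 images/sec

def eye_viewpoints(camera_x, camera_y, camera_z):
    """Return (left, right) camera positions, each offset along the
    horizontal axis by half the interocular distance."""
    half = IOD / 2.0
    left = (camera_x - half, camera_y, camera_z)
    right = (camera_x + half, camera_y, camera_z)
    return left, right

def shutter_schedule(frame_number):
    """Even frames show the left image (left shutter transparent),
    odd frames show the right image (right shutter transparent)."""
    return "left" if frame_number % 2 == 0 else "right"
```

Rendering from the "left" viewpoint whenever the schedule says "left", and likewise for "right", keeps each eye's shutter open only while its own image is on screen.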

As implemented in the software, the synchronization voltages that trigger the shutters appear at a COM port. Because these voltages will not properly toggle the LCDs, additional electronics are needed. I built interface boxes based on a schematic I found on the Internet. Recent developments in head-mounted displays hold the promise that we may soon be able to provide immersive experiences to selected patients. Already there are systems in the 8 oz. range, and prices are dropping rapidly as well.


For manipulation of the objects in these virtual worlds, we use the Mattel PowerGlove, originally created for the Nintendo Entertainment System. Home-brew VR enthusiasts have shown a great deal of interest in these gloves because they are the only sophisticated, low-cost devices of their kind available. For tracking, they use an ultrasonic system to determine their spatial location. Two speakers mounted on the glove and three microphones, usually located near the display monitor, generate this information. Each speaker alternately makes an ultrasonic click which is then detected by the microphones. By measuring the time delay between the click and its reception at each of the three microphones, it becomes a relatively simple matter to calculate pitch, roll, yaw and distance. These values are then applied to the position of a hand generated within the virtual world. This virtual hand appears to follow the movements of the gloved hand. Because this is an acoustic system, unwanted side effects such as jitteriness or jumping can occur, caused by reflections and echoes. Therefore, care is taken to provide a suitably dampened or absorptive environment wherever possible.
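The time-of-flight arithmetic behind this kind of tracking can be sketched as follows. This is an illustration of the principle, not Mattel's actual firmware; the microphone layout and speed of sound are assumed example values.

```python
# Sketch of ultrasonic time-of-flight ranging and planar trilateration.
# Microphone positions and the speed of sound are assumed values.

import math

SPEED_OF_SOUND = 343.0   # metres per second in room-temperature air

def delay_to_distance(delay_s):
    """A click travels at the speed of sound, so the delay between
    emission and reception gives the range to one microphone."""
    return SPEED_OF_SOUND * delay_s

def locate_2d(d1, d2, baseline):
    """Planar trilateration from two ranges: microphone 1 at (0, 0),
    microphone 2 at (baseline, 0). A third microphone, off this line,
    resolves the full 3-dimensional position the same way."""
    x = (d1 ** 2 - d2 ** 2 + baseline ** 2) / (2.0 * baseline)
    y = math.sqrt(max(d1 ** 2 - x ** 2, 0.0))
    return x, y
```

Running this repeatedly, once per click cycle, yields the stream of positions that drives the virtual hand.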

The thumb and first three fingers of the glove each contain a variable resistive strip which measures the amount of flex applied. This measurement is decoded to determine the positions of the individual fingers, and also controls the graphical representation of the hand. As the wearer points or makes a fist, the virtual hand does so as well. By allocating special meanings within the software to certain positions of the thumb and fingers - known as gesture recognition - one can manipulate objects or control events within virtual worlds.
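A minimal sketch of the gesture-recognition idea follows. The PowerGlove reports roughly 2-bit flexion per digit (0 = straight, 3 = fully bent); the thresholds and the gesture-to-action bindings shown here are assumptions for illustration, not our exact mappings.

```python
# Sketch of gesture recognition from four flexion readings (0-3 each).
# Thresholds and gesture names are illustrative assumptions.

def classify_gesture(thumb, index, middle, ring):
    """Map four flexion readings to a named gesture."""
    digits = (thumb, index, middle, ring)
    if all(f >= 2 for f in digits):
        return "fist"    # could be bound to "grab object"
    if index == 0 and middle >= 2 and ring >= 2:
        return "point"   # could be bound to "move forward"
    if all(f == 0 for f in digits):
        return "open"    # could be bound to "release object"
    return "none"
```

The software polls the glove, classifies the current reading, and fires the bound action whenever the classification changes.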


When an object is selected with the glove, our rendering software generates a white wireframe to highlight the selection. This indicator remains whether the object is within view or not. We found that it was difficult for a participant to know whether or not an object was still selected if it was not presently visible. In an attempt to minimize this confusion, we purchased a tactor. This is a small apparatus constructed with a piece of titanium-nickel wire. This wire contracts when an electrical current is applied to it and then returns to its original length once the current is removed. A small lever is connected to one end of this wire. These contractions pull the lever forward causing one end to protrude through a hole in a fixed baseplate. By attaching this small unit to the finger tip of a glove and then sending pulses of current through the wire, the lever taps the participant's finger. We have developed code that makes the tactor pulsate whenever an object is selected. This provides better feedback than a purely visual cue.
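The pulsing decision in our tactor code can be sketched like this. The driver electronics are hardware-specific and omitted; the on/off durations below are assumed example values, not measurements of our unit.

```python
# Sketch of the tactor pulsing logic. Timing constants are assumptions.

PULSE_ON_MS = 40     # current applied: the wire contracts, lever taps finger
PULSE_OFF_MS = 160   # current removed: the wire relaxes to its original length

def tactor_state(object_selected, elapsed_ms):
    """While an object is selected, cycle the tactor current so the
    lever taps the fingertip at a steady rate; otherwise stay off."""
    if not object_selected:
        return "off"
    phase = elapsed_ms % (PULSE_ON_MS + PULSE_OFF_MS)
    return "on" if phase < PULSE_ON_MS else "off"
```

Because the pulsing continues for as long as the selection does, the participant feels that an object is still held even when it has scrolled out of view.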


Not everyone in the SCIU is able to manipulate a glove, so we have had to look for other input devices that suit different patients' abilities. One area that shows great promise is with infrared controllers. We have been using a unit that is designed for remote control of computer-based presentations. It is basically an infrared mouse with some additional functions. This has been an invaluable tool for certain patients.

We also evaluated a system for patients with no ability to move at all from the neck down. An adhesive-backed, reflective dot approximately 1/4 inch in diameter is placed on the patient's forehead. An IR transmitter/receiver sits atop the monitor and tracks the patient's head movements, converting them into equivalent mouse movements. A "puff-and-sip" switch emulates the mouse buttons through a separate IR box. By modifying the software, we were able to provide control of the operator's viewpoint. While this proved a satisfactory solution, the cost of the system is presently beyond our means. The unit we used for testing was provided on a short term loan basis only.


For those patients with the capability, controlling movement within the virtual world is done with a joystick. However, the decision to utilize notebook computers presented a major problem: because there is no gameport, there is no way to connect a joystick. After a bit of research, we found a piece of external hardware that lets a joystick talk to a COM port. Ahmed Shakil, a member of VRASP, wrote source code in C++ that enabled our rendering software to recognize this hardware. After recompiling, we finally had a system that permitted the user to see 3-dimensional objects, to move within the virtual world containing those objects, and to grasp and move them around while receiving tactile feedback information.
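Once the adapter delivers raw axis readings, they still have to be turned into movement. The following sketch shows one common way to do that; the 8-bit axis range and the dead zone are assumptions for illustration, and the adapter's actual serial protocol is not reproduced here.

```python
# Sketch of mapping a raw joystick axis reading to movement speed.
# The raw range and dead zone are assumed illustrative values.

RAW_MIN, RAW_MAX = 0, 255   # assumed 8-bit axis range from the adapter
DEAD_ZONE = 0.1             # ignore small deflections around centre

def axis_to_velocity(raw, max_speed=2.0):
    """Normalise a raw reading to -1..1, apply a dead zone so a
    resting stick causes no drift, then scale to a movement speed
    in world units per second."""
    centre = (RAW_MIN + RAW_MAX) / 2.0
    norm = (raw - centre) / ((RAW_MAX - RAW_MIN) / 2.0)
    if abs(norm) < DEAD_ZONE:
        return 0.0
    return norm * max_speed
```

One such mapping per axis gives forward/backward motion and turning; the dead zone matters for patients whose grip cannot hold the stick perfectly still.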


Our program began in mid October of 1994 in a corner of the physical rehabilitation room at the Bronx VAMC. With the exception of the two VGA color monitors which we leave at the hospital, everything else - computers, PowerGloves, LCD glasses, interfaces, cables, etc. - is brought in and out every time we run the activity. This is partly because of security concerns, and partly due to the lack of available space in the SCIU. Normally, we are there every Tuesday and Thursday for three hours. Because everyone from the VBN is a volunteer, we are not always able to rigidly maintain this schedule. However, in the eight months since its inception, we have logged over 200 patient hours and 300 volunteer hours.


At first, there was only marginal interest in the activity on the part of the patients. This was because previous computer-based activities dealt exclusively with vocational applications such as word-processing and spreadsheets. Apparently there had never been an attempt to use computers recreationally.

Given this indifference, we decided that it would be counterproductive to dive right in to virtual reality applications. Instead, we began with some familiar games such as chess and checkers. Often, it is difficult for patients to play these games on a board without someone else moving the pieces for them. There is also the constant need to find an interested and interesting partner who has the time to play. Making these games available on the computers piqued the patients' interest in a non-intimidating way. This approach not only introduced our activity to the patients, it also re-ignited a long-standing feud between two of the patients over just who was the better chess player!

Initially, there was no formal structure to the program. We simply came as often as we could, unpacked, plugged in, and waited for someone to join us. Because we were in the physical rehabilitation room, we had high visibility. While some patients were playing, others would drift by for a few minutes in between their workouts or therapy sessions. Some made suggestions on the best move, others simply watched silently. Later if a computer was free, they would ask a few questions about what we were doing, or how something worked on the computer. After a few weeks of seeing us on a regular basis, more and more patients wanted to try the various applications. We intentionally used programs that are mouse driven, to make access as easy as possible.


While this was a good beginning, the real breakthrough occurred when I began bringing along an external CD-ROM drive, which enabled us to expand the applications we could provide. Geographical mapping programs and multimedia encyclopedias generated immediate interest. Just rediscovering a street on an electronic map where a patient once lived, or finding a park he used to play in, seemed to relieve the drudgery of the daily routine by giving the patients an opportunity to relive some other time and place.

Over the ensuing weeks, we had people searching for answers to questions about gardening, digging up facts about various presidents or movie actors, and expounding the relative merits of the manned space program. One man wanted to find the location of the airfield in England where his plane crashed while returning from a mission during the Second World War. One session was even spent with patients discussing the salient points of every single breed of dog pictured in an electronic encyclopedia.

Admittedly, none of these applications constitute virtual reality, but, as the patients tried the different programs, they quickly became more comfortable about using computers. We soon had several enthusiastic patients who had never before even touched a computer. As their interest grew, so did the size of the groups participating in the activity and we then began to introduce VR software and hardware.



Our choice of hardware platform determined what VR software we could utilize. Obviously, such tight financial restrictions required freeware or donations wherever possible. This made REND386 a top contender for use in the activity. This public domain rendering engine created by Bernie Roehl and Dave Stampe was really the only software of its kind when we began the project. Because the source code was available, modifications could be made to the software, and the fact that it was free made it invaluable to us.

Joseph Gradecki graciously provided us with his VR chess game, "Mate". This is a commercial product with its roots in REND386 that can either be played over phone lines or directly between two computers. Since not everyone knows how to play chess, I modified "Mate" and developed a checkers game that also operates across a serial link allowing two players to compete. Another REND386-based application we have available is a racquetball game. We are working on expanding it to become a two player tennis game so that patients can compete against one another.

The capacity for interaction in the software has always been a prime consideration of mine. I personally feel that wandering around alone in a virtual world, no matter how photo-realistically rendered, is ultimately unsatisfying. I would much rather interact with other participants, even if this means a less than visually compelling environment. To this end, I was not only seeking the ability to create virtual environments, but to interconnect multiple occupants within these environments.


In addition to REND386, we used two games, Doom and Heretic, to familiarize patients with the idea that a computer could create a non-existent or virtual world to explore. Unlike a typical video game with its flat, 2-dimensional screen image and limited movement, these games create a fully traversable, 3-dimensional environment. Because the worlds are pre-rendered and their graphic resolution is limited, they are capable of very fast screen updates. This means you can move rapidly through the virtual space.

Some of the patients in the SCIU have motorized wheelchairs, and watching them speed through the halls, you'd swear there was only one setting available on their controls - maximum overdrive! This suggested that we trade realism for speed. Providing them with a virtual world that was realistically rendered but moved slowly and jerkily would be extremely boring for them. This choice was further reinforced by the limitations of the available hardware. I was somewhat concerned about the violence in Doom, but was assured that it would not be a problem. After all, these are all ex-military men.


We also selected these games because there are freeware editors available that can be used to design your own layouts. I hoped that if the patients became interested enough to want to make their own worlds, we would have that capability.

We are presently using this software to develop a virtual rendering of the hospital's main floor. A new swimming pool is being completed some distance away from the SCIU, and we want to show patients the various routes available. By using the actual architectural plans of the hospital, we will have a scale layout of the ground floor for the patients to navigate. They can explore different virtual ways to get to the pool before they have to exert the physical effort necessary to get there on their own. We are replacing the textures of the original games to more accurately represent the look and landmarks of the hospital. And of course there are no monsters or enemies lurking in the corners. (Well, not too many!)

We would like to eventually integrate this design with a wheelchair treadmill. Then, as the patient propels himself, his movements on the treadmill will update his position in the virtual world in real time. This will give the patients a feel for their ability to make the actual trip to the pool.
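The coupling we have in mind is simple odometry: each sensed rotation of the drive wheel advances the virtual position by the wheel's circumference. We have not built this yet, so the sketch below is purely illustrative, and the wheel diameter is an assumed value.

```python
# Sketch of the planned treadmill-to-virtual-world coupling.
# The wheel diameter is an assumed value; we have not built this yet.

import math

WHEEL_DIAMETER_M = 0.6   # assumed ~24-inch wheelchair drive wheel

def advance_position(position_m, rotations):
    """Move the virtual viewpoint forward by the real distance rolled:
    rotations times the wheel circumference (pi * diameter)."""
    return position_m + rotations * math.pi * WHEEL_DIAMETER_M
```

Sampling the rotation sensor every frame and feeding the result through this function would keep virtual distance honest with real effort, which is the point of the exercise.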


For many years, the VBN has taken patients from numerous VA hospitals on tours of the USS Intrepid. A decommissioned World War II aircraft carrier, now anchored in the Hudson River, it is part of the Sea, Air & Space Museum. Here again is an activity which usually excludes most of those in the SCIU. To remedy this, we are presently developing a virtual tour of the Intrepid. In order to achieve our goal of multiple participants occupying the same virtual world, we needed software with built-in network and modem support. When you connect two or more computers running this software, each participant sees a graphic representation on his computer of each of the other players, with proper movement and perspective. Using this capability, we plan to take several patients on the tour simultaneously while permitting them to interact with one another in their exploration of the virtual carrier.


Although networked board games and wheelthroughs are fine, they are merely the starting point of our program. Our real focus is on developing interactive sports. We want to ultimately have teams of patients in the SCIU competing against others in the same hospital, and eventually against similar teams in other hospitals that the VBN services. Much like SIMNET, which the military uses for field exercises, we are working to create VETNET. The military was instrumental in creating virtual reality technology. Now we want to use it to assist the military's veterans.

To this end, I have already produced a baseball game with the point of view of every player's position available. Our next goal is to network the bat and ball so one patient can pitch as another patient swings at the ball on two completely different computers. There will be no need for them to occupy the same room, or even the same ward.
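The networking amounts to exchanging small state messages: the pitcher's machine sends the ball's position and velocity each frame, and the batter's machine renders from them. The message format below is an assumption for illustration (the actual wire format of our implementation is still being worked out).

```python
# Sketch of a pitch/swing state message exchange. The field names
# and JSON encoding are illustrative assumptions, not our wire format.

import json

def encode_pitch(ball_position, ball_velocity, frame):
    """Pack the ball's state so the batter's computer can render it."""
    return json.dumps({"pos": ball_position,
                       "vel": ball_velocity,
                       "frame": frame})

def decode_pitch(message):
    """Unpack a received state message on the remote machine."""
    data = json.loads(message)
    return data["pos"], data["vel"], data["frame"]
```

Because only a handful of numbers cross the link per frame, even a modem connection between wards, or between hospitals, could carry the game.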


Even though the patients may be separated in the real world, they will be sharing a virtual world. It is therefore imperative that they be able to speak to one another naturally and in real time. We are presently researching the most cost-effective means of achieving this.


Our overall goals include four levels of network implementation:

  1. networking the bedside - We are presently able to provide our service to selected patients at their beds. Using existing telephone services, we can link two patients. This is sufficiently fast for board games.
  2. networking the ward - This is a longer term goal. Eventually we would like to have an individualized controller/display setup at each bedside in the ward. Whenever the patient feels like participating with others on the network, he simply accesses the system and navigates to his choice of interaction. This will substantially increase the ratio of patients served per volunteer.
  3. networking the entire hospital - A natural outgrowth of the initial work done in the SCIU will be to provide access to other patients. This could be done with terminals in selected dayrooms throughout the hospital.
  4. networking multiple hospitals - A technology transfer from the military and some high speed data installations will be necessary to facilitate inter-hospital sports for team competition. While this is somewhat further down the road, it is the ultimate goal.


While we could have simply said that initiating such a technology-dependent project with such limited resources was impossible, I thought we should at least try. No matter how meager and limited our capabilities, we have seen a tremendous response from the patients to our efforts. From this simple beginning, we hope to find additional means to expand the program and eventually realize our goal of networked team sports played in virtual environments.

For 47 years, we have witnessed the positive impact on patients of previous VBN activities. We believe this newest activity has immense potential to help overcome the unusual hardships encountered by the patients in the Spinal Cord Injuries Unit. Once the idea that computers could be used in purely recreational ways became apparent to the patients, interest soared. Even simple games like chess and checkers held their attention for hours. Although we are only able to provide rudimentary VR experiences at this time, the patients become completely immersed in them. If we can eventually provide experiences with VR that are sufficiently mentally compelling, we think that real-world benefits will follow. While engaging in virtual sports, it is possible that the body may partially accept what the mind experiences. If so, then perhaps this can even lead to physical improvements. As our program expands, we look forward to inviting interested professionals to assist us in quantifying the results of our activities.

In the meantime, by permitting patients to perform actions in a virtual world that are impossible for them in the real world, we expect to at least enhance their self-esteem and improve their mental outlook. Further, by diminishing feelings of isolation and expanding their social contacts through interactive virtual sports and games, we can help them face the challenges of their daily life.


Reprinted with author(s) permission. Author(s) retain copyright.