1994 VR Conference Proceedings


Assistive Cursor Control for a PC Window Environment: Electromyogram and Electroencephalogram Based Control

By: David W. Patmore
University of California at Santa Cruz
Baskin Center for Computing Research

William L. Putnam
Stanford University, Department of Electrical Engineering and
Center for Computer Research in Music and Acoustics

R. Benjamin Knapp
San Jose State University
Department of Electrical Engineering


A controller integrating electroencephalogram (EEG) and electromyogram (EMG) signals is presented. Assistive devices that use bioelectric signals measured from the head are useful to a population of disabled persons, including those with amyotrophic lateral sclerosis (ALS) or spinal injury. Such devices can assist these persons to operate personal computers, allowing access to word processors and other computer tools. The controller uses EMG from facial muscles and alpha wave detection from the EEG to operate a PC mouse. The EEG and EMG signals are recorded and sent to the PC using the BioMuse, a general biological signal processing device. [BioMuse is a trademark of BioControl Systems Inc.]

This system provides a practical tool when combined with the virtual keyboard software package, WIVIK [WiViK is a trademark of Hugh MacMillan Rehabilitation Center]. WIVIK provides the user with a mouse-based alternative to keyboard typing. The results of an informal study where subjects use the cursor controller with WIVIK to operate a word processor are presented.


The need for alternative controls:

An effective interface to a personal computer (PC) via channels other than the hands is desirable to many members of the community. Given a single interface to a PC, the user can control many different applications within the computer and external to it. Persons with limited movement need an interface that makes use of their remaining abilities. In severe cases, a person may be nearly incapable of physical expression: locked-in. Some largely locked-in people suffer from spinal injury or head trauma; others from amyotrophic lateral sclerosis or cerebral palsy. These persons benefit from having as many channels of communication open as possible.

Some existing controls:

Various devices have been developed to open new communication channels to computers. Physical motion detection devices such as chin switches [Note 1] or head gesture recognition [Note 2] have been employed. Eye pointing information has been used to control a computer, both through the electrooculogram [Note 3] and through visually evoked responses [Note 4]. The brain-computer interface has recently attracted attention, with efforts to track the mu "wicket" rhythm [Note 5] and the 40 Hz pre-motor potential [Note 6]. Pattern recognition techniques have been applied to the EMG signal to control prostheses [Note 7] and computers [Note 8].

Bioelectrical Signals:

The electromyogram is the electrical activity associated with neural firings in the musculature. A neural pulse train is transmitted down neurons to the muscle fibers associated with the desired movement. Each neuron innervates several muscle fibers. The EMG signal detected by surface electrodes is a summation over the ensemble of pulse trains associated with the muscle, with a spectrum ranging from DC to approximately 500 Hz [Note 9].

Alpha waves are a resting rhythm of the occipital region of the brain. In appearance, the alpha wave is sinusoidal at approximately 10 Hz. When the eyes are not occupied, for example when they are closed, the magnitude of this rhythm increases and becomes very consistent. When the eyes are occupied (open, for example), the magnitude drops very low and becomes sporadic. This alpha rhythm is found in approximately 75% of the general population [Note 10].



System description:

Data collection and processing are done by the BioMuse, a bioelectric signal processing device. The BioMuse has eight input channels feeding a digital signal processor that supports several different processing routines; for this experiment, an EMG envelope detector and an EEG alpha band detector are used. The BioMuse writes data to a personal computer over a standard serial port, and software on the PC transforms the processed bioelectrical signals into mouse commands.

The EMG signal from each channel is squared and filtered with a 5 Hz lowpass filter. This performs an envelope detection whose output serves as an estimate of muscular activity. Alpha waves are detected using a bandpass filter with a passband between 8 and 11 Hz; the output is squared and low-pass filtered to render the envelope of the signal.
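The square-and-lowpass envelope detection described above can be sketched in a few lines. This is a minimal pure-Python illustration, not the BioMuse DSP code: the sampling rate (2000 Hz) and the first-order IIR lowpass are assumptions, and the alpha detector would be analogous with an 8-11 Hz bandpass stage ahead of the same envelope step.

```python
import math

def one_pole_lowpass(x, fc, fs):
    """First-order IIR lowpass: y[n] = a*x[n] + (1-a)*y[n-1],
    with the coefficient chosen for cutoff fc at sample rate fs."""
    a = 1.0 - math.exp(-2.0 * math.pi * fc / fs)
    y, out = 0.0, []
    for s in x:
        y = a * s + (1.0 - a) * y
        out.append(y)
    return out

def emg_envelope(emg, fs=2000.0, fc=5.0):
    """Square the raw EMG, then lowpass at ~5 Hz, as in the paper;
    the output estimates muscular activity."""
    return one_pole_lowpass([s * s for s in emg], fc, fs)
```

For a steady sine of amplitude 1, the squared signal has mean 0.5, so the envelope settles near 0.5 regardless of the sine's frequency within the EMG band.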

Three channels of communication from the BioMuse are used for this configuration: two EMG envelope detectors and one alpha wave detector. The EMG electrodes are placed on the left and right cheeks of the subject, and the alpha wave electrodes are placed over the occipital region on the skull.

The program that has been created is a prototype for a bio-signal to PC control system. In its complete form, this program will take any bioelectrical signals, apply different pattern recognition algorithms as desired by the user, and translate the results of these algorithms to control of the PC and programs that run in MS-Windows. [MS-Windows is a trademark of Microsoft Corporation.]

The main application developed so far is an interface to the software package WIVIK. WIVIK provides a virtual keyboard that sends "key presses" to any active device that accepts typed data. The user activates a key press by moving the mouse to select a key and clicking the mouse button to "press" it.


The experiment:

The subjects for this experiment were colleagues who had never used this cursor controller before, though some had used the BioMuse in other applications. The subjects were not screened to exclude those with low amplitude alpha waves.


The tests took place in a normally lit room, with no special steps taken to reduce noise or distractions such as telephone rings or passers-by. The subjects sat in a comfortable chair in front of a computer screen. A headband was used to monitor the facial activity and the alpha waves. The subjects moved the computer cursor by twitching the left cheek to go left and the right cheek to go right, and could change to left-cheek/up, right-cheek/down mode by evoking alpha waves. The mouse click was generated by twitching both cheeks at the same time.

Once connected, the system is calibrated by hand on the PC. Each muscle channel has two switches attached to it. The first switch is set to trigger when a medium amplitude signal is received, and the second is triggered and held when a large amplitude signal is received, and released when the signal drops to a lower level. The medium amplitude trigger is used to move the cursor left/right and up/down, so its threshold must be set lower to reduce wear and tear on the user. The trigger-and-hold switch is used as the mouse click, and so must be set high enough to avoid spurious events.
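The two-switch arrangement on each muscle channel can be sketched as a small state machine: a medium threshold fires a discrete move event on each upward crossing, while the trigger-and-hold switch engages at a high level and releases only when the signal drops below a lower level (hysteresis). The threshold values here are illustrative placeholders, not the paper's calibration settings.

```python
class ChannelSwitch:
    """Two switches on one EMG envelope channel (thresholds are
    illustrative, set by hand during calibration in the paper)."""
    def __init__(self, move_thresh=0.2, click_on=0.6, click_off=0.3):
        self.move_thresh = move_thresh  # medium trigger: cursor movement
        self.click_on = click_on        # trigger-and-hold engages here...
        self.click_off = click_off      # ...and releases below here
        self.clicking = False
        self._above_move = False

    def update(self, level):
        """Process one envelope sample; return (move_event, click_held)."""
        # Move event only on an upward crossing of the medium threshold.
        move = level >= self.move_thresh and not self._above_move
        self._above_move = level >= self.move_thresh
        # Click switch with hysteresis to avoid spurious events.
        if self.clicking:
            if level < self.click_off:
                self.clicking = False
        elif level >= self.click_on:
            self.clicking = True
        return move, self.clicking
```

Setting `click_off` well below `click_on` is what keeps a single large twitch from registering as several clicks.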

The alpha wave envelope channel has an axis-mode switch. When the alpha reaches a threshold, the mouse switches between left/right and up/down operation. The alpha switch calibration is set by first having the subjects shut their eyes and relax, then open their eyes, and move their facial muscles. In this manner, the threshold can be set between the maximum alpha value and the maximum system response to muscle artifact.
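The axis-mode calibration above can be sketched as follows. The paper states only that the threshold lies between the maximum muscle-artifact response and the maximum relaxed alpha value; placing it at the midpoint is one plausible choice, assumed here for illustration.

```python
def alpha_threshold(artifact_max, relaxed_alpha_max):
    """Place the axis-mode threshold between the largest system response
    seen during eyes-open facial movement and the eyes-closed alpha peak.
    Midpoint placement is an assumption, not the paper's stated rule."""
    if relaxed_alpha_max <= artifact_max:
        raise ValueError("alpha must exceed artifact for reliable switching")
    return (artifact_max + relaxed_alpha_max) / 2.0

class AxisMode:
    """Toggle between left/right and up/down cursor axes on each
    upward crossing of the alpha envelope threshold."""
    def __init__(self, thresh):
        self.thresh = thresh
        self.horizontal = True   # start in left/right mode
        self._above = False

    def update(self, alpha_env):
        """Process one alpha envelope sample; return current mode."""
        if alpha_env >= self.thresh and not self._above:
            self.horizontal = not self.horizontal
        self._above = alpha_env >= self.thresh
        return self.horizontal
```

Toggling only on the upward crossing means a sustained burst of alpha switches the axis once, rather than oscillating while the envelope stays high.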

After calibration, the subjects were given seven minutes to move the mouse around and practice clicking. During this period, the subjects were encouraged to request adjustments to the mouse settings for better performance.

The test began by having the subjects type the phrase "The quick brown fox jumped over the lazy dog." The subjects were instructed to correct any errors they made using the virtual backspace key in WIVIK. The time taken to complete the sentence was measured.

The second test was to have the subjects navigate through a maze while being timed and checked for errors. The maze had only one path, and so only tested mouse agility, not maze-threading ability.



Results:

Of the five subjects who participated, one had muscle artifact over the occipital region strong enough that, even though he had significant alpha wave activity, spurious axis-mode changes made mouse control unwieldy. The remaining subjects were able to control the mouse with good accuracy, albeit with varying rates of movement. All of the subjects had sufficient alpha wave activity to control the mouse.


Typing times for the sentence varied between 7 minutes 15 seconds and 16 minutes. Navigating the maze took most subjects about one and one-half minutes. The spread in typing time appears to stem from two problems affecting the slowest subject: he had difficulty with "key bounce" (repeated mouse clicks) and had to correct many errors, and he was also unfamiliar with a standard typewriter keyboard, so he took many less-than-optimal steps toward goal keys.

Table 1: Time results for different subjects

  Subject   Typing Time (min)   Typing Rate (keys/min)   Maze Time (min)
  1         8.08                5.7                      1.9
  2         16.0                3.0                      1.5
  3         7.25                6.3                      0.8
  4         11.5                4.0                      1.2
  5         Did not finish      N/A                      Did not finish


Discussion:

This device's typing rate is not as fast as that of some other approaches, such as the eye mouse [Note 11]. Unlike the eye mouse, however, it does not require committing the eyes to the selection task. An experienced user could operate with more tightly adjusted thresholds and would develop faster typing skills. The addition of word prediction and optimized virtual keyboards would also speed up the task of writing.

This system offers flexibility in that the user can select any keyboard grid needed, allowing access to any MS-Windows program that accepts mouse input.

Future Development:

A primary focus for future development of this system is to make it extremely modular and re-configurable. For example, a disabled person should be able to use whatever combination of muscle, eye, and brain signals that is most comfortable to control the mouse position, click, and so on.

This system can be developed further by adding more controls. For example, the authors are developing a MIDI interface to a multimedia sound card; other controls could operate external devices or serve as recording channels for medical data. The BioMuse has previously been demonstrated as an interface to music synthesizers [Note 12], but it can now be a versatile instrument using only a Multimedia PC, which can also be used for other purposes, even simultaneously.


Notes:

  1. Thornett, M.C. Langner, A.W.S. Brown, "Disabled Access to Information Technology -- a Portable, Adaptable, Multipurpose Device," Journal of Biomedical Engineering (1990), Vol. 12, May, 205-208.
  2. A.I. Tew, C.J. Gray, "A Real-Time Gesture Recognizer Based on Dynamic Programming," Journal of Biomedical Engineering (1993), Vol. 15, May, 181-187.
  3. R. Benjamin Knapp, Hugh Lusted, "Biosignal Processing and Biocontrollers," Virtual Reality Systems (1993), Vol. 1 (1), 38-39.
  4. E.E. Sutter, "The Brain Response Interface: Communication Through Visually-Induced Brain Responses," Journal of Microcomputer Applications (1992), Vol. 15, 31-45.
  5. D.J. McFarland, G.W. Neat, R.F. Read, J.R. Wolpaw, "An EEG-Based Method for Graded Cursor Control," Psychobiology (1993), Vol. 21 (1), 77-81.
  6. B. Daviss, "Brain Powered," Discover (1994), May, 58-65.
  7. Daniel Graupe, Javad Salahi, DeSong Zhang, "Stochastic Analysis of Myoelectric Temporal Signatures for Multifunctional Single-Site Activation of Prostheses and Orthoses," Journal of Biomedical Engineering (1985), Vol. 7, January.
  8. William Putnam, R. Benjamin Knapp, "Real-Time Computer Control Using Pattern Recognition of the Electromyogram," Proceedings of the IEEE Engineering in Medicine and Biology Conference, San Diego, CA, 1993.
  9. Marco Knaflitz, Gabriella Balestra, "Computer Analysis of the Myoelectric Signal," IEEE Micro, October 1991.
  10. Gilbert H. Glaser, EEG and Behavior. Basic Books, New York, 1963.
  11. R. Benjamin Knapp, Hugh Lusted, "Biological Signal Processing in Virtual Reality Applications," Proceedings: Virtual Reality and Persons with Disabilities Conference, June 10-12, 1993.
  12. Atau Tanaka, "Musical Technical Issues in Using Interactive Instrument Technology with Application to the BioMuse," ICMC Proceedings (1993), 124-126.



Reprinted with author(s) permission. Author(s) retain copyright.