Go to previous article
Go to next article
Return to 2004 Table of Contents
Katya Hill, Ph.D., CCC-SLP
338 Meadville Street
Edinboro, PA 16412
Evidence-based clinical practice is now the clear expectation of AAC service as articulated in the ASHA Scope of Practice (2001) and other professional standards. The essence of evidence-based practice (EBP) is placing the client's benefits first (Gibbs, 2003). Individuals who rely on AAC have expressed that the most important benefit in using an AAC system is being able to say whatever they want to say as fast as possible. Therefore, AAC practitioners can appreciate that a major component of AAC EBP is the measurement of communication performance in order to provide services that result in maximizing these benefits for clients. Consequently, applying EBP and performance measurement produces the most effective communication for people who use AAC.
Prior to the development of language activity monitoring (LAM) (Romich & Hill, 1999), language sample collection and analysis to support EBP was a burdensome task requiring the written, audio, and/or video recording and subsequent transcription of communication. Even then, time information was generally unavailable, limiting the types of summary measures that could be analyzed to report AAC performance. The introduction of LAM provided a tool to collect quantitative data with ease. The LAM function has become available in three forms: as a built-in feature in many AAC systems, as software for a PC, and as an external add-on device. The recently introduced U-LAM(c) (Universal Language Activity Monitor) now allows language samples to be collected easily from any AAC system that offers voice output.
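As a rough illustration of the kind of data automated logfile analysis works with, the sketch below parses a few LAM-style events into time-stamped records. This is a hedged example: the exact line layout shown (a clock time, an LRM mnemonic, and quoted generated text) is an assumption made for illustration, not the published logging specification (see Lesher et al., 2000, for the actual format).

```python
# Hypothetical LAM-style logfile parser (line format assumed for illustration).
# Assumed layout per line: HH:MM:SS MNEMONIC "generated text"
import re
from datetime import datetime

LINE = re.compile(r'^(\d{2}:\d{2}:\d{2}) (\S+) "(.*)"$')

def parse_lam(lines):
    """Return (seconds-since-midnight, lrm_mnemonic, text) tuples."""
    events = []
    for line in lines:
        m = LINE.match(line.strip())
        if not m:
            continue  # skip headers or malformed lines
        t = datetime.strptime(m.group(1), "%H:%M:%S")
        seconds = t.hour * 3600 + t.minute * 60 + t.second
        events.append((seconds, m.group(2), m.group(3)))
    return events

sample = [
    '09:15:01 SPE "I"',
    '09:15:03 SPE "want"',
    '09:15:06 WPR "coffee"',
]
events = parse_lam(sample)
```

The time stamps are what distinguish logfile data from a conventional transcript: they make time-based measures such as communication rate computable.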
Concurrent with the development of LAM tools, resources to facilitate the transcription of logfiles and the reporting of summary measures were being considered. SALT, the Systematic Analysis of Language Transcripts (Miller & Chapman, 2000), has been the software tool most widely used to support the transcription of language samples based on LAM data (Hill, 2001). ACQUA (Augmentative Communication Quantitative Analysis) is a research tool designed specifically for logfile analysis (Lesher et al., 2000). However, both software applications require separate steps to manually edit or code logfiles for accurate analysis. In addition, neither program generates a final comprehensive report.
PeRT(c) (Performance Report Tool) is PC software available from the AAC Institute that facilitates the analysis and reporting of AAC performance and communication competence. PeRT was developed with the clinician and family member in mind. The initial work on PeRT started in 1999 with a survey of AAC practitioners to determine the types of performance measures considered important for effective intervention. Analysis of the survey data indicated that the most preferred traditional measure of AAC performance was mean length of utterance (MLU), while the two most preferred measures obtained from logfiles were language representation method (LRM) use and communication rate (Hill, 2001). The summary measures selected for PeRT analysis reflect a consensus of clinical measures and AAC stakeholder values.
The first programming of PeRT from the definition proposed by Hill and Romich began in the summer of 2001. The College of Wooster (Ohio) offers computer science students the opportunity to participate in the Applied Mathematics Research Experience (AMRE). Through AMRE, small teams of students, supervised by a faculty advisor, work on a project for an eight-week period in the summer. The AAC Institute contracted with the AMRE program for the development of a shell to support a performance report library (reference) which included PeRT. Work on refining PeRT continued through subsequent summer projects until a version was ready for beta testing (Romich et al., 2003). The current version of PeRT makes it easy to convert logfile data from language activity monitoring into meaningful numbers (Hill & Romich, 2003a). The end result is the generation of the AAC Performance Report of seventeen quantitative summary measures of communication performance.
PeRT opens to a screen with four areas. To begin using PeRT, a previously saved LAM data text file is opened. If the data do not include the optional mnemonic identifying the language representation method (LRM), a default method can be chosen. The LAM data appear in the area on the left of the screen, color-coded according to the LRM. The indicated LRM can be overridden at this point. The AAC Performance Report has two primary sections: utterance-based summary measures and word-based summary measures. The first step in analyzing a language sample is to segment the sample into utterances. Each utterance is highlighted by placing the cursor on the first event of the utterance in the LAM data area and dragging down to the last event of the utterance. The CREATE UTTERANCE button is then clicked, and the utterance appears in the utterance area of the screen. Various notations (errors, bound morphemes, etc.) can be made relative to both words and utterances.
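Once utterances have been segmented, the core summary measures follow directly from the timed events. The sketch below is a hedged illustration, not PeRT's implementation: the helper assumes utterances are already segmented into lists of (time-in-seconds, word) events, and it computes MLU in words rather than morphemes (PeRT's bound-morpheme notations would support a morpheme-based count).

```python
def summary_measures(utterances):
    """utterances: list of utterances, each a list of (seconds, word) events,
    already segmented as described above. Returns (MLU in words, words/minute)."""
    words = [w for utt in utterances for (_, w) in utt]
    mlu = len(words) / len(utterances)      # mean length of utterance, in words
    start = utterances[0][0][0]             # time of first event in the sample
    end = utterances[-1][-1][0]             # time of last event in the sample
    minutes = (end - start) / 60.0
    rate = len(words) / minutes if minutes > 0 else float("nan")
    return mlu, rate

utts = [
    [(0, "I"), (2, "want"), (5, "coffee")],
    [(60, "thank"), (62, "you")],
]
mlu, wpm = summary_measures(utts)
# mlu == 2.5; 5 words over 62 s -> roughly 4.8 words per minute
```

Communication rate here is simply total words over elapsed sample time; clinical definitions of rate may be computed differently.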
Upon completion of the utterance segmentation process and entering the information on the subject, the AAC system, and the language sample collection procedure, the AAC Performance Report can be automatically generated. Additional reports including the complete transcript, vocabulary frequency lists, and a text format report can be generated. Finally, if analyses other than those offered in PeRT are wanted, the complete transcript can be copied and pasted into SALT.
A free Self-Study Program course on the AAC Performance Report and PeRT is available at the web site of the AAC Institute (http://www.aacinstitute.org). Free CEUs are provided to those who complete the course.
The universal availability of LAM logfile data provided by the U-LAM (Hill & Romich, 2003b) significantly increases access to logfiles by AAC practitioners and other stakeholders. This family of tools (LAM, U-LAM, and PeRT) and other resources available at the AAC Institute web site support the expectation that AAC practitioners collect data to measure communication performance. Today PeRT is being used by AAC professionals to support EBP. They are reporting that PeRT is easy to learn, easy to use, and that their frequency of language sample collection and analysis has increased significantly. While professional practice has been enhanced by these tools, the more significant result is the improved communication performance of people who rely on AAC.
American Speech-Language-Hearing Association (ASHA). (2001). Scope of Practice. Rockville, Maryland.
Gibbs, L. B. (2003). Evidence-based practice for the helping professions: A practical guide with integrated multimedia. Pacific Grove, CA: Thomson Brooks/Cole.
Hill, K. (2001). The development of a model for automated performance measurement and the establishment of performance indices for augmented communicators under two sampling conditions. Doctoral Dissertation, University of Pittsburgh, Pittsburgh, Pennsylvania.
Hill, K. J., & Romich, B. A. (2003a). PeRT (Performance Report Tool): A computer program for generating the AAC Performance Report. [Computer software]. Edinboro, PA: AAC Institute.
Hill, K. J., & Romich, B. A. (2003b). U-LAM (Universal Language Activity Monitor): A computer program for collecting AAC language samples. [Computer software]. Edinboro, PA: AAC Institute.
Lesher, G., Moulton, B. J., Rinkus, G., & Higginbotham, D. J. (2000). A universal logging format for augmentative communication. In Proceedings of the CSUN Conference, Los Angeles, CA.
Miller, J.F., & Chapman, R.S. (2000). SALT: A computer program for the Systematic Analysis of Language Transcripts. Madison, WI: University of Wisconsin.
Romich, B. A., & Hill, K. J. (1999). A language activity monitor for AAC and writing systems: Clinical intervention, outcomes measurement, and research. In Proceedings of the RESNA '99 Annual Conference (pp. 19-21). Arlington, VA: RESNA Press.
Romich, B., Hill, K., Ahmad, N., Strecker, J., Gotla, K., & Seagull, A. (2003). AAC Performance Report Tool (PeRT). In Proceedings of the 26th International Conference on Technology & Disability. Arlington, VA: RESNA Press.