Dr. Rosalind Latiner Raby
The samples reflect excellent as well as mediocre past class projects.
The Check-List is to be used for ALL articles that you read in this class.
July 14, 2009
Project-Based Test Preparation: Literature Review
Review of Related Literature
Before the specific project-based units were created, an in-depth analysis of related literature on the topic helped reveal the many typical barriers to student performance on both teacher-made and standardized exams. This knowledge will help guide the creation of the units as well as anticipate extraneous predicaments that may arise throughout the intervention. First to be discussed is that much of the recent literature proposes sound recommendations for test preparation, including reading strategy needs and subject-specific suggestions. Second, there are specific recommendations for minority and learning-disabled students. Lastly, research even analyzes and proposes the best test formats for teachers to consider when constructing their instructional plans. Perhaps most resoundingly, the literature points to a consistent dilemma in students' skills: reading.
Throughout various research and even educational commentary articles, the need for students to have a battery of reading skills is repeatedly emphasized. For example, research shows that students' enjoyment of reading correlates strongly with better test scores (Mucherah & Yoder, 2008). In this same study, student surveys, which included numerous Likert-scale statements regarding students' rationale for perceived enjoyment or choice in reading, showed that self-efficacy in reading was the most potent factor in predicting student success on the Indiana Statewide Testing for Educational Progress (the ISTEP+, which is similar to California's Standardized Tests). This meant students being able to identify themselves as the type of individuals who enjoy reading intrinsically. Further research indicates similar results but goes steps further in linking students' lack of intrinsic reading motivation to deficient reading skills (Greene & Melton, 2007). Greene and Melton summarize the most important factor in their findings on test achievement as students being able to recognize the language and vocabulary specific to each content-area test. Instructing students on how to comprehend typical features of tests, such as passages or poems with line numbers and underlined phrases, was a necessity for student success. Therefore, in their test preparation program, it was vital to intertwine subject content with test-specific reading examples. This particular Virginia-based test preparation plan labeled it "test talk," yet its simplest definition lies in having students understand synonyms. For instance, students had to understand that many tests use a variety of wordings that all simply ask for a passage's "main idea." If students do not have the vocabulary knowledge required to understand what a test question is really asking of them, they are forced to apply other crucial reading strategies, such as the use of context clues. Chaleff and Toranzo (2000) found similar results in their study of deaf children taking the Stanford Achievement Tests (SATs). They noted that students needed systematic instruction on reading vocabulary and contextual-clue analysis, including bringing syntactic knowledge and background or real-life experience into that analysis. With all of these reading-based challenges, it is truly remarkable that recommendations for systematic and thorough test preparation can be made despite the significant number of standards to be taught.
Research shows that students need to learn via whole units and not focus on individual or random items (Samuels, 2008). As an example, Samuels summarized in her article that certain Chicago high schools that attempted to implement college test preparation through rote, individualized, and time-consuming plans actually lowered performance scores. The main reason for the lowered scores was attributed to a lack of analytical and problem-solving skills; students were not building upon their knowledge but merely ended up with a repertoire of similar problems that they could solve correctly, while lacking the skills needed to generalize to newer problems. Therefore, test preparation programs should not merely give students sample questions but teach the concepts needed to grasp those questions in the first place. Additionally, research also suggests a variety of instructional methods for preparing students for high test achievement.
The literature consistently recommends that test preparation be made engaging for students. This includes differentiation for visual, auditory, and kinesthetic learners with emphases on fun, inquiry-based learning, as well as positive reinforcement tactics (Greene & Melton, 2007; Turner & Rios, 2008; Fedore, 2005): all important for designing the intervention test prep class at ODLHA. As an example, Greene and Melton claim that by making their test preparation program active and pleasurable for all learning styles, student scores increased. They took special care to note that although during real exams students are usually sitting with paper and pencil, including movement and entertainment was crucial for engagement and motivation. Similarly, Turner and Rios state that their science content instruction for science-based tests had to delve into inquiry-based activities. For instance, rather than simply reading texts, students needed to complete laboratory quests. These tasks required student motivation to figure out the causes and effects of various experiments in order to actually grasp the problem-solving skills addressed in tests. Their study on biology test performance even noted that gifted students require even more challenging and rigorous hands-on inquiry in order to show increases in test performance. Furthermore, Fedore's progressive research study on de-stressing tests argued that instruction had to include the "3 R's": Rigor, Relevance, and Relationships. This means that students had to be challenged through meaningful curriculum that related to their personal schema. As a result, positive reinforcement techniques such as certificates and dinner rewards for high-achieving students were implemented successfully throughout her study. Entertaining students via teacher-made lip-syncs about test-taking strategies on the actual days of testing decreased student anxiety and ultimately may have played a role in higher scores. She even encouraged her school's administrators to dress up as characters who spouted off various content knowledge (such as formulas or vocabulary words) to amuse students, decrease tension, and still increase student knowledge. Related to Fedore's implication that de-stressing reduces student anxiety is a study by Johnson and Memmott (2006). By analyzing test scores of third and fourth graders at two schools, one of which offered intensive music instruction while the other did not, they concluded that music-instructed students did significantly better on standardized tests. While a handful of extraneous factors could explain the results, they did note specific music classes where clear positive results emerged. For the purpose of this action research project, it may be important to include investigations, assignments, or other projects that include musical knowledge or rhythmic facets. Clearly, differentiation in test preparation is essential.
In addition to differentiation, research points out specifics for the test-preparing instruction of minority and learning-disabled students, as well as test formatting. One study suggests that administrators and teachers must have high expectations for all students, thorough unit plans, and professional development for educators planning test preparation (Thompson, 2007). This research points out that it is crucial, in my own quest toward making effective test prep unit plans, to search for professional development and get principal approval to attend. Additionally, because my test preparation classes will have a high number of learning-disabled students, it will be important to include the suggestions from Mowschenson and Weintraub (2009), whose analysis of a Massachusetts-based tutorial program for students with disabilities showed that these students needed content knowledge before study skills. Also, while the program included a co-teaching model, it was important to include immediate supplements when students struggled with certain concepts. Of particular relevance to my action research, this study shows that the units for the test-taking intervention class will have to be based on differentiated content that subtly intertwines test-taking and study skills. Certainly, the formatting of tests will play a role in the creation of the intervention class's curriculum; both the formatting of multiple-choice tests and students' perception of the type of test they are taking must be addressed (Coyle & Wilson, 1991; Edwards & Wilfred, 2007). Coyle and Wilson found that students must learn discriminating skills (such as the popular process-of-elimination strategy), as well as confidence in their selections. Evidently, while teaching students these skills, teachers must ensure their multiple-choice tests are written correctly so that students are exposed to realistic examples that they may see on standardized tests. Similarly, Edwards and Wilfred found that helping students gain confidence and diminish their mental fear of multiple-choice tests was imperative. After careful comparison of Caucasian and African-American student achievement scores on multiple-choice and constructed-response tests, they found that student perception played a bigger part in performance than educational levels or abilities. Decreasing the fear of standardized tests is something to remember when designing the action research class's curriculum.
Overall, the research shows that instruction on test preparation must address reading skills, be differentiated, include higher-order thinking, and use various methods to decrease test anxiety. These suggestions will be heavily utilized in the creation of the project-based-learning units in the test preparation intervention class.
References
Chaleff, C., & Toranzo, N. (2000). Helping our students meet the standards through test preparation classes. American Annals of the Deaf, 145(1), 33-40.
Coyle, L., & Wilson, T. L. (1991). Improving multiple-choice questions: Preparing students for standardized tests. Clearing House, 64(6), 422-425.
Edwards, B. D., & Wilfred, A. Jr. (2007). An examination of factors contributing to a reduction in subgroup differences on a constructed-response paper-and-pencil test of scholastic achievement. Journal of Applied Psychology, 92(3), 794-801.
Fedore, H. (2005). De-stressing high-stakes testing for NCLB. Principal Leadership, 6, 27-30.
Greene, A. H., & Melton, G. D. (2007). Teaching with the test, not to the test. Education Week, 26(45), 30.
Johnson, C. M., & Memmott, J. E. (2006). Examination of relationships between participation in school music programs of differing quality and standardized test results. Journal of Research in Music Education, 54, 293-307.
Mowschenson, J. J., & Weintraub, R. J. (2009). A new vision of academic support. Phi Delta Kappan, 90(10), 751-755.
Mucherah, W., & Yoder, A. (2008). Motivation for reading and middle school students' performance on standardized testing in reading. Reading Psychology, 29, 214-235.
Samuels, C. A. (2008). ACT test-prep backfiring in Chicago, study warns. Education Week, 27(39), 6-7.
Thompson, G. L. (2007). The truth about students of color and standardized tests. Leadership, 36(3), 22-38.
Turner, M. J., & Rios, J. M. (2008). Determining paths to success: Preparing students for experimental design questions on standardized tests. The American Biology Teacher, 70(3), 140-152.
A Review of the Literature: Fall 2008
At X Middle School there are interventions for students who have scored Far Below Basic and Below Basic on their California Standardized Tests. There are many more students, though, who are "stuck in the middle" at the Basic range and do not reach proficiency. I want to focus on an intervention for students who fall into this category. Hypothesis: By providing students who have scored at the Basic level on their CSTs in Language Arts with an intervention class that combines standards-based teaching with explicit teaching of test-taking strategies, student achievement in Language Arts will improve. Some questions that have surfaced are: What specific strategies are best for test taking? What test-taking strategies do students already know and use? What are students' attitudes and anxieties toward testing? Can students' attitudes toward these high-stakes tests change as a result of this intervention? Are there certain areas in English Language Arts in which students have more difficulty at our site? Are students aware of their scores and levels on the California Standardized Test and in their English Language Arts classes?
The following key words were used to search the literature: test wiseness, test coaching, test taking strategies, testing attitudes, student achieve*, middle school*, and intervention.
Test-taking strategies are defined as techniques that students apply to improve their achievement on exams. Strategies should be applicable across content areas. Test-taking strategies include training on test formats, reasoning and deduction strategies, time management, practice, and coaching (Kretlow, Lo, White, & Jordan, 2008). Glenn (2004) and Hornof (2008) also discuss strategies for building stamina during long and strenuous assessments. Students are taught to be well rested before arriving at school, rest their eyes after completing a section of the test, have a drink of water, and take deep breaths. Interestingly, many of these strategies are already in place at my school during standardized testing. Hornof (2008) prepared students for reading tests by creating a unit based on defining test-specific vocabulary, analyzing past test responses, and uncovering misconceptions. By analyzing how we take tests, we can determine which strategies are best to use and how to teach them to our students.
Two empirical studies used mnemonic devices to improve academic achievement (Kretlow et al., 2008; Ritter & Idol-Maestas, 1986). Ritter and Idol-Maestas (1986) placed 28 middle school students with poor and above-average reading comprehension in the experimental group and another 28 comparable students in the control group. The acronym SCORER stands for: S - schedule your time, C - clue words, O - omit difficult questions, R - read carefully, E - estimate your answer, R - review your work. After instruction in the SCORER mnemonic strategy, the experimental group, especially the poor comprehenders, demonstrated better test scores in both near and far generalization. Near generalization refers to the subject area in which the strategy was taught (in this case, social studies); far generalization refers to transfer to another subject (in this case, science). This enables me to see the benefits of teaching students test-taking strategies that can be extended to their other content areas.
Kretlow et al. (2008) also used a mnemonic strategy, but with students with mild mental disabilities. This study was compelling because of its success. Special education teachers have a wealth of knowledge to contribute to our Language Arts collaborative planning sessions, and many strategies used in special education could be used in general education classes as well. Though the mnemonic device used in this study was more comprehensive than in the first study, students with mild mental disabilities were able to remember and apply it correctly, which leads me to believe I can use something similar with my middle-achieving students. It is important to note that a test-taking strategy is not intended to compensate for gaps in content knowledge or reading ability (Kretlow et al., 2008).
Like Hornof (2008), Klein, Zevenbergen, and Brown's (2006) qualitative study explores "using older test questions that can be modeled and taught through think-aloud protocols so students receive a clear view of how the critical thinker uses strategies to process problems" (p. 154). During the intervention I propose, I can use the released test questions for the CST to model and think aloud how I would approach various questions. Decision-making processes, the idea of pacing, and how to respond to the test as a whole by not becoming flustered over small portions of the test could also be modeled (Klein et al., 2006). Modeling these strategies aloud for students will help give them confidence they may lack about how to approach standardized tests. Allowing them to practice these strategies in a safe environment will also boost their confidence and will hopefully carry across to other assessments and content areas.
A startling study revealed that students used counterproductive strategies more frequently as they got older (Roth, Paris, & Turner, 2000). Metacognition about the effects of strategies, both positive and negative, would benefit students. This is something that I will directly apply to the intervention: raising students' awareness of the effects of both positive and negative strategies will help them differentiate between the two.
High-stakes tests affect the way teachers approach teaching. Teachers need to become learners and review what specific skills it takes to take tests in order to prepare students (Greene & Melton, 2007). McColskey and McMunn (2000) suggest that there are long-term strategies to consider as well:
"Working collaboratively to develop clear instructional goals based on state standards, developing high-quality instructional materials that match the school's goals, supporting collaboration to promote improved student learning through things such as lesson study, and understanding how assessments contribute to a high-quality learning environment."
Though these long-term strategies are beyond the scope of my proposal, they are important to consider as a school. We have begun working on these, but time and funding prevent us from moving forward with all the strategies suggested.
A mixed-methods study examining young adolescents' perceptions of strategies implemented before a state-mandated "high-stakes" test reminds us of the power teachers possess to influence and reinforce, or not, their students' perceptions of school events and initiatives (Hoffman & Nottis, 2008). Students' perceptions of teacher behavior during testing affect students directly. We need to be aware of this in order to positively affect students and help increase their confidence and effort.
Posner (2004) offers a distinctly different opinion about preparing students for standardized tests. He states that we are focused on testing and meeting imposed standards as opposed to developing critical thinking and problem-solving skills. "Could our obsession with standardized tests reduce teaching itself to a simplistic and ultimately ineffective activity that would be amenable to automation?" (p. 751). This leads me to conclude that I should include learning outcomes that require critical thinking and problem-solving skills throughout the intervention class.
References
Glenn, R. (2004, October). Teach kids test-taking tactics. Education Digest, 70(2), 61-63. Retrieved February 8, 2009, from Academic Search Elite database.
Greene, A., & Melton, G. (2007). Teaching with the test, not to the test. Education Week, 26(45), 30. Retrieved February 7, 2009, from Education Full Text database.
Hoffman, L., & Nottis, K. (2008, January 1). Middle school students' perceptions of effective motivation and preparation factors for high-stakes tests. NASSP Bulletin, 92(3), 209-223. (ERIC Document Reproduction Service No. EJ809060) Retrieved February 21, 2009, from ERIC database.
Hornof, M. (2008, September 1). Reading tests as a genre study. Reading Teacher, 62(1), 69-73. (ERIC Document Reproduction Service No. EJ817112) Retrieved February 14, 2009, from ERIC database.
Klein, A., Zevenbergen, A., & Brown, N. (2006). Managing standardized testing in today's schools. Journal of Educational Thought, 40(2), 145-57. Retrieved February 7, 2009, from OmniFile Full Text Mega database.
Kretlow, A., Lo, Y., White, R., & Jordan, L. (2008, September 1). Teaching test-taking strategies to improve the academic achievement of students with mild mental disabilities. Education and Training in Developmental Disabilities, 43(3), 397-408. (ERIC Document Reproduction Service No. EJ804654) Retrieved February 14, 2009, from ERIC database.
McColskey, W., & McMunn, N. (2000, October). Strategies for dealing with high-stakes state tests. Phi Delta Kappan, 82(2), 115. Retrieved February 14, 2009, from Academic Search Elite database.
Posner, D. (2004, June 1). What's wrong with teaching to the test? Phi Delta Kappan, 85(10), 749. (ERIC Document Reproduction Service No. EJ703951) Retrieved February 14, 2009, from ERIC database.
Ritter, S., & Idol-Maestas, L. (1986, July). Teaching middle school students to use a test taking strategy. Journal of Educational Research, 79(6). Retrieved February 14, 2009, from Academic Search Elite database.
Roth, J., Paris, S., & Turner, J. (2000, January). Students' perceived utility and reported use of test-taking strategies. Issues in Education, 6(1,2), 67. Retrieved February 14, 2009, from Academic Search Elite database.
Research Proposal: Decreasing the Digital Divide
(2008)
IV. Research Procedures: Study Design
1. Research Setting
X High School's English Department computer lab received thirty-two new, state-of-the-art computers in March 2008. Title I funding stipulated that teachers receive training before taking classes to this new lab. I elected to conduct the training and to make this my research project. I met with the Title I coordinator and the technology specialist, and we determined that these sessions would meet four times on Saturdays for four hours each in the new computer lab, Room A213. The teachers will be paid training-rate pay for participation, and the presenters will be paid an additional hour's pay for preparation.
In my role as literacy coach, I know from informal teacher conferencing that few of our teachers are comfortable with technology for themselves, and even fewer are comfortable incorporating technology into students' learning experiences. Knowing that Information and Communication Technology (ICT) is vital for our students in this global market, I will design the research to test the correlation of these sixteen hours of teacher training with teachers' ability and desire to introduce their students to these computers and to go further into teaching social networking and innovative uses for these computers.
The impetus for change comes from the California Standards Test (CST) data, which shows that most students in Los Angeles public high schools fit into the definitions of the neglected portion of the digital divide: they are in the lower socioeconomic group, according to Title I definitions, and they are in the ethnic minority groups who statistically have access to fewer computers both at home and at school. To validate this impetus for change, I examined the computer lab sign-up calendar and discovered that very few of our teachers have offered this technology experience to their students.
This will be an ethnographic study, as I will be a participant observer, learning as I go and allowing room in the agenda for unexpected trainings as ideas occur to the designated presenters and to volunteer presenters while we share our areas of skill and understanding.
2. Define Participants.
Choice of target population: During the first week of Spring Semester, the Title I coordinator and the technology coordinator asked me to set up four four-hour Saturday technology training sessions to coordinate with the arrival of our new state-of-the-art computers.
Thus, the population would be thirty-five X High School English teachers, who will be asked to attend four four-hour training sessions on the new computers in the X High School English Department lab. I gained access to this group through my role as literacy coach. I also have referent power in that I have a good relationship with the English Department faculty, and some degree of expert power in classroom technology use. The target population will be those who choose to attend the training sessions; from these I will obtain my sample. Ages range from 26 to 62, the group includes both men and women, and years of experience range from one to thirty. Concerning ethnicity, two are African American, one is Asian, two are Hispanic, and the rest are Caucasian. One is blind.
I introduced this training to the ninth- and tenth-grade teachers in a conversation as a follow-up to a required District literacy training. The eleventh- and twelfth-grade teachers were invited verbally and with flyers. Introductions to the training coincided with the arrival of the new computers.
The particular choice of presenters may produce some unforeseen agendas. The teacher who is blind may face complications in not being able to see the LCD presentations or the computer screen. I also anticipate frustration arising between the experienced computer users and the novices concerning the amount of time spent on each strategy.
V. Strategy for Data Collection and Implications for Action Research
1. I began this project by examining the computer lab sign-up calendar for the past six months and discovered that only eight of our thirty-five English teachers had taken their students to the computer lab, and most of those experiences were merely word processing assignments.
I designed a post-viewing survey of the technology explosion video "Did You Know? 2.0" (Fisch, 2006), with the intent of motivating the teachers to understand the urgency and to feel the vital educational importance of this technology shift. I also hoped teachers would realize their need to become more technologically proficient for the benefit of their students.
Then I conducted a qualitative survey to get the teachers to verbalize their need for this training and to let me assess their attitudes, understanding, and view of the educational ramifications of this global technology explosion. I wrote the first six survey questions in a qualitative free-response format to get the teachers to respond at a gut level to this vital issue. The next question was a tacit question to see if there is a correlation between technology use and a teacher's years of experience. The final three questions were in an open-ended format to show the range of technology uses and skills our teachers have, or the lack thereof. These last three responses guided the design of the training classes. The survey questions are:
Free response qualitative:
1. What is your reaction to the "Shift Happens" video?
2. What scared you about this message?
3. Which statistic shocked you the most?
4. What excites you about this message?
5. How does this trend change the demands on educators?
6. What does the exponential explosion of technology mean in your classroom?
Tacit: 7. How many years' teaching experience do you have?
Free response quantitative:
8. How do you use the computer outside of school?
9. How do you use your cell phone?
10. What kind of academic activities do you require your students to do on computers?
Planning the training sessions
I examined the lab sign-up calendar for the past six months and counted only eight teachers who had offered technology experiences to their students.
I spent many weeks quizzing X teachers, the technology coordinator, and college students about their technology use to get topics for the training. Then I designed a rough draft of the four days' training. I selected trainers from these respondents in addition to myself as trainer.
I sent a sample agenda and questionnaire to these teachers to elicit responses about their desires for an ideal Professional Development day. I received only one response to this. I solicited known technology-using teachers to assist with the training. I then set the tentative schedule and included time for teachers to volunteer their most successful technology experiences (Best Practices).
I will design a longitudinal panel survey by giving technology use surveys at the beginning of each training session, followed by a post-training survey given to the same teachers five weeks after the training.
To know how to pace the sessions, I used a pre-survey to obtain data on teachers' initial familiarity with technology programs. Before completing the design of these surveys, I used member checking by asking the English Department chairperson and one of the frequent technology-using teachers to check the survey questions and give suggestions as to their wording and worthiness. Some questions were qualitative items that asked about the amount of computer use, with three responses on a Likert scale: "Never," "Occasionally," "Often." Other questions were quantitative and elicited a "yes" or "no" response as to whether they had assigned students to use these strategies (triangulation). The first three surveys were paper-and-pencil surveys given as teachers entered the training room. The fourth survey was online due to our increased online abilities.
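As an illustration only, the following is a minimal sketch of how pre- and post-training Likert responses for a single item could be tallied and compared; the item wording, response categories, and sample data are hypothetical and are not drawn from the actual surveys.

```python
# Minimal sketch: tallying pre- and post-training Likert responses for one item.
# The item wording and the sample responses below are hypothetical.
from collections import Counter

CATEGORIES = ["Never", "Occasionally", "Often"]

def tally(responses):
    """Count how many teachers chose each Likert category."""
    counts = Counter(responses)
    return {category: counts.get(category, 0) for category in CATEGORIES}

# Hypothetical answers to "How often do you assign computer-based work?"
pre_training = ["Never", "Never", "Occasionally", "Never", "Often"]
post_training = ["Occasionally", "Often", "Often", "Occasionally", "Often"]

for label, data in [("Pre-training", pre_training), ("Post-training", post_training)]:
    print(label, tally(data))
# A shift away from "Never" toward "Occasionally" and "Often" would suggest
# increased teacher-assigned technology use after the training.
```

A comparison of this kind, repeated five weeks after the training, is what the longitudinal panel design described above is intended to support.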
I also designed a mixed qualitative/quantitative student survey with most questions being quantitative to give teachers a sense of how much technology expertise their students have. This survey contained "yes-no" responses and two qualitative free response questions. This survey allowed triangulation because it also asked for the amount of teacher-assigned technology use, as did the previous teacher survey. I surveyed one 10th grade core English class of twenty-nine students for their computer use. The results showed that our students are rapidly becoming socially technology literate:
Have a computer at home - 96%
Own other technology devices - 100%
Own a cell phone - 76%
Use cell phones for multiple social tasks - 74%
Have a MySpace account - 89%
Research multiple entertainments online - 90%
The academic use of the computer was less well developed:
Used computers in the school lab - 69%
Believe they can determine online credibility
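For illustration, percentages like those reported above can be computed directly from yes/no tallies; the sketch below uses invented counts rather than the actual responses from the surveyed class.

```python
# Minimal sketch: converting yes/no survey tallies into percentages.
# The counts below are invented for illustration, not the actual class data.
def percent_yes(yes_count, total):
    """Return the share of 'yes' responses, rounded to the nearest whole percent."""
    return round(100 * yes_count / total)

CLASS_SIZE = 29  # the surveyed 10th-grade core English class

hypothetical_yes_counts = {
    "Have a computer at home": 28,
    "Own a cell phone": 22,
}
for item, yes_count in hypothetical_yes_counts.items():
    print(f"{item}: {percent_yes(yes_count, CLASS_SIZE)}%")
```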
Check-List for Reading Articles
Able to be summarized succinctly?
Can you define the problem / topic?
Do you believe the significance / rationale?
Has a link been made between the research and furthering the field of education?
Limitations defined?
Is there a hypothesis or research questions?
Hypothesis or research questions clear?
What are the major themes in the articles?
What is the main theoretical foundation?
Has it been explained well how the research fits into the existing research literature?
What research methods / designs were used in the literature?
Was the literature review evident / sufficient?
Data collection methods described?
Data collection methods appropriate?
Site selection described / justified?
Sample selection described / justified?
Issues of reliability and validity discussed?
Issues of generalization discussed?
Data analysis methods described?
Data analysis methods appropriate?
Issues of ethics discussed?
Are implications for action research discussed?
What else should the authors have done?
What would you have missed if they had used a different method?
Do you believe the findings and conclusions that the author(s) provide?