Wilmington University
Practicum III Research Article
Reading Comprehension Instruction for Students with Learning Disabilities
Jordan Barton
1/16/2013
This is a summary of the study “Reading Comprehension Instruction for Students with Learning Disabilities”. The study covered students in grades K-12 with learning disabilities from 1995-2006. The students were put into intervention programs and their progress was studied.
My article is a meta-analysis called “Reading Comprehension Instruction for Students with Learning Disabilities”. This study reviewed and combined findings from forty studies spanning 1995 through 2006. The article was written by Berkeley and Scruggs as a follow-up comparison to an earlier meta-analysis from 1996. The 40 studies included 1,734 students ranging from kindergarten through twelfth grade, all of whom had been identified with a learning disability. During the studies, students were placed into reading intervention classrooms that used different strategies to improve their reading comprehension. The students placed into these intervention programs ranged in age from 4 to 18 but averaged about 12. Students from elementary schools, middle schools, high schools, and even some residential facilities were studied.
These strategies included Questioning/Strategy Instruction, Text Enhancements, and Fundamental Reading Skills Training. Questioning/Strategy Instruction incorporated teaching students to use self-questioning while reading, as well as direct questioning of students while reading. Text Enhancements “enhanced” the text through question placement within the text and the use of graphic organizers and technology. Fundamental Reading Skills Training created an environment with very low student-to-teacher ratios and worked on phonics and other basic reading skills. The intervention classes averaged almost 30 sessions of about 50 minutes apiece throughout the school year.
These studies were measured by their “effect size,” which “measures the impact of reading comprehension interventions” (Scruggs, 2010). There were three types of effects studied: generalization, treatment, and maintenance. Generalization documented how well the students used their reading comprehension skills across all subject areas and types of text. Treatment tested the effectiveness of specific reading strategies on reading comprehension, while maintenance studied the effects of the intervention after it had ended. Effect sizes were measured as follows: d=0.20 indicates a small or low impact, d=0.50 indicates a moderate impact, and d=0.80 or above indicates a large impact (Scruggs, 2010).
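For context, effect sizes of this kind are usually standardized mean differences; I am assuming the conventional Cohen’s d calculation here, since my summary does not spell out the exact formula the authors used. It is the difference between the intervention group’s average score and the comparison group’s average score, divided by the pooled standard deviation:

d = \frac{\bar{X}_{intervention} - \bar{X}_{comparison}}{SD_{pooled}}

Read this way, the overall effect of 0.70 reported below would mean the intervention students scored roughly seven tenths of a standard deviation higher than comparison students.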
With all three types of effects averaged together, the overall effect was 0.70, meaning that the interventions had a noteworthy effect but did not make a large impact. However, when the data were broken down, generalization had the largest impact at 0.75, while treatment and maintenance came in at 0.69. This means that the students were carrying their skills across subjects and using them effectively. Although these numbers seem rather high, they were actually lower than in the 1996 study. Scruggs suggests that the instructors are the reason for the shortcomings of the more recent studies. In the studies covered by the 1996 meta-analysis, many of the intervention instructors were researchers who were well versed and comfortable with the subject, while the newer studies relied more on classroom teachers who had been through training in order to teach the interventions. The researchers tended to have a greater success rate than the teachers, so the studies that used researchers as instructors produced better results. Even though the findings were less impressive than the previous study, all three strategies (Questioning/Strategy Instruction, Text Enhancements, and Fundamental Reading Skills Training) were found to be effective. In conclusion, Scruggs found that the intervention programs did have a significant impact on the reading comprehension skills of the students and were much more effective than regular classroom activities.
After reading this study I feel much more confident about the reading intervention programs at my current school. Previously I had preconceived notions that the classes were borderline useless due to conversations I had with students. The students confessed their boredom throughout the class; however, they never mentioned whether it was effective. Perhaps the moderate improvement scores could be raised through greater student interest. In my experience, students tend to learn more when they are engaged in the topic. Also, I think Scruggs could benefit from student input on her next study. The students could provide insider knowledge of what worked for them and what did not. This insight could help to adjust the strategies used as well as to refine the implementation of current strategies. In addition, I would like to see more specific data on the Fundamental Reading Skills Training programs, like the Dyslexia Training Program. I would also like to see a study done regarding reading intervention class size. Many of the interventions in this meta-analysis had small student-to-teacher ratios. These ratios could very easily skew the data, since students should learn better in a smaller classroom environment. That being said, did the intervention programs truly work, or was it simply the smaller class sizes that improved student performance? Finally, this study spanned 12 years and monitored almost 2,000 students, making its data very reliable.