Nonetheless, many of my latest naysayers have been high school teachers who simply can't believe that such research, which was done mainly in elementary schools, applies to older students and "higher order" skills.
As luck would have it, the DI people helped develop math and science videodisc programs for teaching more advanced skills to older students. The discs were effective instructional tools.
One of the programs was an algebra program with an interesting history. Its 15 initial lessons were based on what the DI people assumed the kids knew by the time they reached algebra. Unfortunately, their estimates were seriously mistaken:
As part of the pre-test for the tryout group, we presented a series of problems that involved simple addition, subtraction, and multiplication of fractions.
When we tabulated the results of the pre-test, we knew that we might be in deep trouble. Of the 32 junior-high kids in the tryout group, one could add fractions with unlike denominators:

2/3 + 3/4 =

One kid in the group could multiply fractions:

2/3 x 3/4 =

Unfortunately, the kid who could add was not the kid who could multiply.
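To make concrete what those two items demand (a worked example of my own, not part of the pre-test write-up): adding fractions with unlike denominators means rewriting both over a common denominator first, while multiplying just goes straight across the numerators and denominators.

2/3 + 3/4 = 8/12 + 9/12 = 17/12

2/3 x 3/4 = 6/12 = 1/2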
They went ahead with the program field test anyway, but soon had to face the grim reality that things weren't going to work out.
[I]t became apparent that the only thing to do with these kids was to bring them back to frame one and teach them about the properties of fractions and basic fraction operations. Their misconceptions were amazing. Some of them could tell whether a fraction was more than one or less than one; however, they didn't seem to understand that the 1 referred to is the same 1 you say when you count: one, two, three ...
We looked at some other "pre-algebra" groups and observed the same problems we saw in our group. So we scrapped the algebra program and started over about ten rungs lower on the academic ladder, with fractions, what they are, and how they work.
So, the program was rewritten before it was released, which is all but unheard of in education. A study was then conducted with the rewritten program; it involved two teachers:
One was a devotee of manipulatives and the NCTM approach. This teacher spent lots of time teaching math -- 1 1/2 hours a day. She gave her students big time homework assignments. She worked until eight every evening preparing for the next day. The other teacher did not believe in homework (yea for her). She spent far less time on teaching math.
The study was great because the kids (6th graders) were matched in performance, and pairs were randomly split between the two classrooms. The resulting classrooms were highly heterogeneous. In the end, the "NCTM" group was slightly (but not significantly) ahead of the video group.
After both groups worked on fractions, decimals, and percents for a semester, they received a three-part test: the first part on tasks and problems that were unique to the video program; the second on the tasks and problems unique to the NCTM program; the third on tasks and problems common to both programs.
The results:
The lower half of the videodisc group outperformed the upper half of the NCTM group on everything. On the items presented only in the NCTM program, the lower half of the video group averaged 65 percent correct; the upper half of the NCTM group averaged 51 percent correct. On those items common to both groups, the lower half of the video group averaged 65 percent correct; the NCTM group averaged 35 percent correct. The upper half of the video group averaged 90 percent and 97 percent on these two parts of the test.
Clearly, the problem of providing effective math instruction has far less to do with kids than with the delivery of instruction.
I'd like to think I've made my point by now about the importance of effective instruction, but our educators won't let a little thing like facts get in the way of their pet biases.
I do enjoy hearing their rationalizations though, so I'm going to stir the pot a little over the course of the next week or so and run through all we learned with these videodisc programs.
Comments:
I've had an interesting experience since moving from Saxon 8-7 to Saxon Algebra 1.
All of a sudden, the Saxon lessons are incredibly easy - so easy that they seem far below "my level." I've begun doing 2 lessons a day because I move through them so quickly.
I'm going to pull out my other algebra books to check whether I have the same feeling about the material in Dolciani or Foerster. I'm guessing I will, but we'll see.
Another clue: Saxon algebra is an "integrated" curriculum, meaning that algebra and geometry are included in the same book.
I've had far less practice on geometry problems, and I continue to find those items fairly challenging.
If this experience holds up, it tells me that a good program including distributed practice doesn't just teach math. Over time it makes math easy.
My neighbor has been having the same experience this year. She's a statistician who spent last year struggling to get her son through the same accelerated math class Christopher is taking this year. It was a nightmare.
Recently she was talking to a friend whose child is in the "regular" math class. The friend said she was happy her child wasn't in accelerated math, because math is hard.
My neighbor's instant reaction was: this stuff isn't hard, it's easy when you have good teaching & enough practice.
She wouldn't have said this last year. Last year she perceived math as being fairly hard for her son, whose favorite subject is social studies. She made a strong distinction, in her own mind, between "math people" and "word people," and her son was a word person.
She still believes in that distinction, but now she perceives math as being easy for word people, too.
The reason math is easy for her son this year is that the 7th grade accelerated math course is virtually identical to the 6th grade course.
Taking the same course two years in a row means one thing: overlearning.
Once you hit the point of overlearning, the material you've overlearned seems super-simple.
I'm getting the feeling that the next material you learn may also seem simpler than it might have, but I don't know.
In any case, I've had superb self-teaching this year using Saxon Math, while Christopher has had miserable teaching at his school.
I've gained all kinds of expertise, while he's just been trying to get through in one piece.
Now math seems easy to me, but still seems hard to him.
Stupid question from a newbie to your blog (probably via eduwonk): What does DI stand for?
Ken is using DI to stand for Direct Instruction, a specific curriculum. Sometimes people use DI to mean differentiated instruction, something totally different.
On occasion, people will speak of what I call a little "direct instruction," which just means a teacher-centered rather than student-centered way of instructing students.
Some people aren't aware that there is an actual curriculum out there called Direct Instruction.
Hope that helps...
SusanS
Oh dear. Let's see...how many utterly moronic, nonsensical, failed "research based" educational save-the-world concepts have I seen go down in flames in the last year alone...?
There is much to be said for real-world experience, common sense, and old-fashioned dedication and competence. Not all research is worthless, but much of what passes for research in education these days certainly is. There is no panacea save competent, dedicated, heads-up teaching.
Mike,
Most of the bad research comes from educators themselves who don't seem to know how to design or conduct a legitimate study.
Legitimate research works for everyone else. I see no reason it can't work for education. There's no reason to ignore research coming from a well designed study.
Not surprisingly, if you just look at the valid research with educationally significant effect sizes, what you see are simply good, competent, heads-up teaching practices. That's why it worked.