**Update:** The scandal grows. It turns out that the Reading First schools in Madison underperformed the non-Reading First schools by almost a quarter of a standard deviation. This is an educationally significant difference. (Clarification: I'm using "Reading First schools" to designate the schools in Madison that were eligible for Reading First funding but didn't get it because Madison decided to use its balanced literacy program instead of a scientifically based reading program.)
In part one of this post, I showed you the embarrassing horror show taking place in the Madison school district, which has hapless first graders guessing at the word "pea" and thinking the word is "pumpkin." As if it weren't embarrassing enough having that displayed in the opening paragraphs of this NYT article, the Madison school district has the audacity to claim that its reading program is actually boosting student performance.
> Call it the $2 million reading lesson.
>
> By sticking to its teaching approach, that is the amount Madison passed up under Reading First, the Bush administration’s ambitious effort to turn the nation’s poor children into skilled readers by the third grade.
>
> ...
>
> Madison officials say that a year after Wisconsin joined Reading First, in 2004, contractors pressured them to drop their approach, which blends some phonics with whole language in a program called Balanced Literacy. Instead, they gave up the money — about $2 million, according to officials here, who say their program raised reading scores.
>
> ...Under their system, the share of third graders reading at the top two levels, proficient and advanced, had risen to 82 percent by 2004, from 59 percent six years earlier, even as an influx of students in poverty, to 42 percent from 31 percent of Madison’s enrollment, could have driven down test scores. The share of Madison’s black students reading at the top levels had doubled to 64 percent in 2004 from 31 percent six years earlier.
>
> And while 17 percent of African-Americans lacked basic reading skills when Madison started its reading effort in 1998, that number had plunged to 5 percent by 2004. The exams changed after 2004, making it impossible to compare recent results with those of 1998.
No, it didn't. It should have taken Diana Jean Schemo, this article's author, about half an hour on the internet to figure out that the "Madison officials" were spinning the "reading scores."
NAEP scores for Wisconsin show that the percentage of proficient students was 34% in 1998, dropped slightly to 33% in 2003, and stayed there in 2005, the last time fourth graders were tested in reading. Students scoring at the basic level dropped from 69% to 67% over this same period. Over the period 1992-2005, the achievement gap between black and white students rose from 28 points to 33 points, and the gap between poor and non-poor students dropped slightly from 28 points to 25 points. So, NAEP shows us that the reading proficiency of Wisconsin fourth graders has remained basically flat since about 2000. (Go here and select Wisconsin as the jurisdiction.)
The Wisconsin Reading Comprehension Test (3rd grade) tells a different story. The percentage of proficient students in Wisconsin rose 22.5 points, from 64.9% in 2000 to 87.4% in 2005. Based on our knowledge of NAEP scores for the same population, we know that this gain was imaginary. What Wisconsin did was some combination of making the test easier and lowering the cut score so that 22.5% more students would be able to pass it. The result is that the mean shifted by about +0.77 of a standard deviation.
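For anyone who wants to check this conversion, here's a minimal sketch of the arithmetic, assuming scores are roughly normally distributed and the cut score is fixed (the helper name `sd_shift` is mine, not anything from Wisconsin's reporting):

```python
from scipy.stats import norm

def sd_shift(pass_rate_before, pass_rate_after):
    """Implied shift of the mean, in standard deviation units, when the
    pass rate moves from pass_rate_before to pass_rate_after.

    A pass rate p puts the cut score at the (1 - p) quantile of the
    distribution, so a change in pass rates is equivalent to the mean
    moving by ppf(p_after) - ppf(p_before), where ppf is the inverse
    normal CDF.
    """
    return norm.ppf(pass_rate_after) - norm.ppf(pass_rate_before)

print(round(sd_shift(0.649, 0.874), 2))  # 0.76 -- the ~0.77 s.d. shift above, give or take rounding
```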
Gather 'round, kids, and learn how Wisconsin made a fool out of the statistically illiterate Diana Jean Schemo.
Here's what I get from comparing the 1998-99 scores to the 2004-05 scores:
In 1998, 64.9% of students statewide passed the exam. In 2005, it was 87.4%: a shift of +0.77 standard deviations (s.d.). Remember, during this same period NAEP scores for the state remained virtually unchanged.
In 1998, 58.9% of students in the Madison school district passed the exam. In 2005, it was 82.7%: a shift of +0.72 s.d. If anything, Madison underperformed the state during this period.
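Plugging Madison's numbers into the same `sd_shift` helper sketched above reproduces the figure:

```python
# Reusing sd_shift from the sketch above.
print(round(sd_shift(0.589, 0.827), 2))  # 0.72 -- Madison's shift, vs ~0.77 statewide
```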
Update Three: From this source, I found out that Madison's Reading First schools (the schools eligible for RF funds, but which never got them because Madison made them use a balanced literacy program) are Glendale, Hawthorne, Lincoln, and Orchard Ridge. As it turns out, the average gain made by these schools was only 21.6 percentage points, or +0.56 s.d. In contrast, the non-Reading First schools gained 25.4 points, or +0.80 s.d. (Again, the raw percentage-point gains are deceptive because the Reading First schools caught the fat part of the curve, whereas the non-RF schools did not.) To put this in perspective, the 0.24 s.d. differential is about the same effect size found in Project STAR (class size reduction), which educators rave about. So, the non-Reading First schools in Madison slightly overperformed the state average while the Reading First eligible schools significantly underperformed it.
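If the fat-part-of-the-curve point isn't intuitive: a fixed shift of the mean moves far more students past a cut score sitting in the fat middle of the distribution than past one sitting out toward a tail, so raw percentage-point gains flatter whichever group started nearest the middle. A quick sketch under the same normality assumption (the starting pass rates here are illustrative, not Madison's actual numbers):

```python
from scipy.stats import norm

def points_gained(pass_rate_before, shift_sd):
    """Percentage points gained when the mean shifts by shift_sd s.d."""
    return norm.cdf(norm.ppf(pass_rate_before) + shift_sd) - pass_rate_before

# The same +0.5 s.d. improvement buys very different point gains
# depending on where the starting pass rate sits on the curve:
print(round(100 * points_gained(0.50, 0.5), 1))  # 19.1 points near the fat middle
print(round(100 * points_gained(0.80, 0.5), 1))  # 11.0 points out toward the tail
```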
I didn't find disaggregated data, so I can't determine if the stats for black students given to Schemo are accurate. But assuming they are, the rise in scores from 31% to 64% represents a shift of +0.86 s.d. That seems highly improbable considering 1) that Madison's overall rise was only +0.72 s.d. (this means that the scores for non-black students in Madison rose only about +0.58 s.d., a significant underperformance compared to the rest of Wisconsin) and 2) that the black-white achievement gap on NAEP actually increased during this period, meaning that black performance actually declined relative to white performance.
Update: Rory finds the disaggregated data. In 2005, black performance in Madison was 57%, which represents a shift of +0.68 s.d., an underperformance relative to both Madison as a whole (+0.72 s.d.) and Wisconsin (+0.77 s.d.).
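Running the black-student numbers through the same `sd_shift` helper reproduces both figures, give or take rounding:

```python
# Reusing sd_shift from the sketch above.
print(round(sd_shift(0.31, 0.64), 2))  # 0.85 -- the ~+0.86 s.d. shift claimed in the article
print(round(sd_shift(0.31, 0.57), 2))  # 0.67 -- the ~+0.68 s.d. shift in the data Rory found
```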
What actually happened was something like this:
Wisconsin raised the mean passing level from level 1 to level 2. The top curve shows black performance. This shift captured the fat part of the curve for black students (31% to 64%); picture level 1 being even further to the left. The blue shaded portion represents the increase in black performance.
Update two: Just in case it isn't clear, Madison's claim that black student performance "doubled" and exceeded the gains made by other students, based on percentage-point changes, is highly misleading and a misuse of statistics. When you measure gains in the more statistically meaningful units of standard deviations, it is readily apparent that black students' gains were no greater than the gains made by other groups. In fact, they appear to be considerably less. The achievement gap remains unchanged or may even have grown. This is what you would expect to happen under a whole language reading program, because the students at the bottom of the curve are the ones most damaged by non-explicit instruction.
The bottom curve shows white performance. The shaded portion between level 1 and level 2 shows the gain. I'd estimate that white performance rose from about 69% to about 90%. (If anyone finds the disaggregated data, let me know.) As you can see, white performance was already past the fat part of the curve: the average white student was passing the exam back in 1998.
Update: White performance in Madison was actually 93%. Ka-ching!
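For anyone who wants to redraw the figures, here's a small matplotlib sketch of what's being described: a normal curve with the old and new passing cuts placed where the reported pass rates imply them, and the apparent "gain" shaded in between. This is my reconstruction of the graphic, not the original; the white panel uses the corrected 93% figure.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import norm

def draw_cut_shift(ax, p_old, p_new, title):
    """Normal curve with the old and new passing cut scores; the shaded
    band between them is the apparent gain in the pass rate."""
    x = np.linspace(-4, 4, 400)
    ax.plot(x, norm.pdf(x), color="black")
    cut_old = norm.ppf(1 - p_old)  # cut score when p_old of students pass
    cut_new = norm.ppf(1 - p_new)  # easier cut, with p_new passing
    band = (x >= cut_new) & (x <= cut_old)
    ax.fill_between(x[band], norm.pdf(x[band]), color="tab:blue", alpha=0.4)
    ax.set_title(title)

fig, (top, bottom) = plt.subplots(2, 1, figsize=(6, 6), sharex=True)
draw_cut_shift(top, 0.31, 0.64, "Black students: 31% to 64% passing")
draw_cut_shift(bottom, 0.69, 0.93, "White students: 69% to 93% passing")
fig.tight_layout()
plt.show()
```

Note how the black students' band straddles the middle of the curve (the fat part) while the white students' band sits farther out, which is exactly why the percentage-point gains aren't comparable.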
One thing is for certain, Madison's balanced literacy program did not cause the imaginary gains made by Madison school children.
Update: Mark Liberman of Language Log chimes in:
> DeRosa's argument seems pretty persuasive to me. If he's right, then I'd consider a different evaluation -- was Schemo bamboozled by the Madison school authorities, or did she pick Madison, and spin the story as she did, in order to make an essentially dishonest point, suggesting that the mean old federal bureaucrats are trying to stop the dedicated local educators from continuing to use the methods that are helping their children so much?

I certainly don't want to discount the possibility that Schemo was a willing dupe. It could very well be that she went looking for a story and Madison told her what she wanted to hear.
18 comments:
http://data.dpi.state.wi.us/data/graphshell.asp?Group=Race/Ethnicity&GraphFile=GEDISA&DETAIL=YES&Grade=4&SubjectID=1RE&EligibleOnly=NO&Level=ALL&WOW=WSAS&ORGLEVEL=DI&FULLKEY=02326903ZZZZ&DN=Madison+Metropolitan&SN=Show+Schools
I think this is the disaggregated data you were looking for...
Thanks, Rory.
Exactly as I predicted.
White scores in the low 90s.
Black scores at mid 50s instead of 64%.
That's consistent with a .77 s.d. shift.
I'll update later.
From the article:
"Deborah C. Simmons, who helped write the guide, said it largely reflected the available research, but acknowledged that even now, no studies have tested whether children learn to read faster or better through programs that rated highly in the guide."
Ah... PFT never happened.
Liz from I Speak of Dreams.
Thanks for the excellent smackdown, Ken.
Two related posts, one from Language Log and one from me:
Language Log: Reading Corruption
A Bitter and Sweet Story: Wisconsin Boy's Unremediated Dyslexia.
And Liberman quotes you in an update:
DeRosa's argument seems pretty persuasive to me. If he's right, then I'd consider a different evaluation -- was Schemo bamboozled by the Madison school authorities, or did she pick Madison, and spin the story as she did, in order to make an essentially dishonest point, suggesting that the mean old federal bureaucrats are trying to stop the dedicated local educators from continuing to use the methods that are helping their children so much?
That's a nice use of diagrams - it really makes it easy to show the flim-flam that's occurring. If it's all right with you, I'd like to use that diagram next year when I talk about misleading statistics with my students.
Well how about this... At least one person on the Madison School Board already knows the gains are imaginary.
Here is a letter that Ruth Robarts of the Madison Board of Education wrote to their local paper.
http://www.schoolinfosystem.org/archives/2004/12/index.php
Thanks to Jason Shepard for highlighting comments of UW Psychology Professor Mark Seidenberg at the Dec. 13 Madison School Board meeting in his article, "Not all good news on reading". Dr. Seidenberg asked important questions following the administration's presentation on the reading program. One question was whether the district should measure the effectiveness of its reading program by the percentages of third-graders scoring at "proficient" or "advanced" on the Wisconsin Reading Comprehension Test (WRCT). He suggested that the scores may be improving because the tests "aren't that rigorous".
I have reflected on his comment and decided that he is correct.
Using success on the WRCT as our measurement of student achievement likely overstates the reading skills of our students. The WRCT---like the Wisconsin Knowledge and Concepts Examination (WKCE) given in major subject areas in fourth, eighth and tenth grades---measures student performance against standards developed in Wisconsin. The more teaching in Wisconsin schools aims at success on the WRCT or WKCE, the more likely it is that student scores will improve. If the tests provide an accurate, objective assessment of reading skills, then rising percentages of students who score at the "proficient" and "advanced" levels would mean that more children are reaching desirable reading competence.
However, there are reasons to doubt that high percentages of students scoring at these levels on the WRCT mean that high percentages of students are very proficient readers. High scores on Wisconsin tests do not correlate with high scores on the more rigorous National Assessment of Educational Progress (NAEP) tests.
In 2003, 80% of Wisconsin fourth graders scored "proficient" or "advanced" on the WKCE in reading. However, in the same year only 33% of Wisconsin fourth graders reached the "proficient" or "advanced" level in reading on the NAEP. Because the performance of Madison students on the WKCE reading tests mirrors the performance of students statewide, it is reasonable to conclude that many of Madison's "proficient" and "advanced" readers would also score much lower on the NAEP. For more information about the gap between scores on the WKCE and the NAEP in reading and math, see EdWatch Online 2004 State Summary Reports at www.edtrust.org.
Next year the federal No Child Left Behind Act replaces the Wisconsin subject area tests with national tests. In view of this change and questions about the value of WRCT scores, it's time for the Board of Education to review its benchmarks for progress on its goal of all third-graders reading at grade level by the end of third grade.
Ruth Robarts
Member, Madison Board of Education
End Quoted Letter
death by data
wait!
I don't follow the latest.
The schools that are using SBRR programs are 1/4 standard deviation below the schools that aren't using SBRR programs?
Right, is the last update misreported or are you conceding that the reading first schools are doing worse?
Sorry. The Reading First schools are the ones that were eligible for funding but didn't get it because Madison opted to teach them using whole language.
oh, thanks!
I was all set to get this whole thing out to the Irvington Parents Forum when suddenly you seemed to be saying that Reading First schools were doing worse!
Are you going to follow up with an explanation of the 1/4 standard deviation??
(Not that I need it, but I'm interested.)
Ken,
The disaggregated data Rory linked to is not for the old 3rd grade reading test under discussion. It was given for the last time in the spring of 2005.
The data Rory links to is for the grades 3-8, 10 state achievement tests administered in the fall. Grades 3, 5, 6, and 7 were added to this assessment beginning the fall of 2005.
Not saying the aggregate and disaggregated performance would necessarily be different, but this is a different assessment.
Best...
Stiles is correct:
In the NYT article it says:
"And while 17 percent of African-Americans lacked basic reading skills when Madison started its reading effort in 1998, that number had plunged to 5 percent by 2004. The exams changed after 2004, making it impossible to compare recent results with those of 1998."
Also, the Wisconsin website says a lot about this issue.
I took the WRCT data from 2005 in Madison and compared them to the 2005 WKCE data. The numbers line up pretty well, but not exactly:

| Category | WRCT | WKCE |
| --- | --- | --- |
| Advanced | 42.4% | 42% |
| Proficient | 40.3% | 37% |
| Basic | 8.1% | 13% |
| Below basic | 1.5% | 4% |

(Note: I combined two WKCE categories for the below basic row.)
Seems like Wisconsin is doing everything in its power to keep us from comparing numbers.
The point should be made that despite the numerous different ways of measuring data, Wisconsin still uses the current numbers to define the achievement gap and the proficiency of its students.
Not only do their own numbers fail to correspond with each other; none of their numbers have ever correlated with the one outside reference we have, which is the NAEP test.
Also, one other note: the article uses data from the WRCT test, which ended after 2005. The article states the test changed after 2004, but the test actually changed after 2003.
Ok... my brain hurts.
I believe the WRCT data from the 2004-2005 school year is still comparable to the 1998-1999 data. Even so, the 2003-2004 data is similar.