February 13, 2008

Where's the so-called improvement?

I recently wrote that:

Student achievement has not improved during the past 50 years during this rapid increase of wealth.

Stephen Downes responded:

This makes you possibly the only person in the world to believe this. 50 years ago, students were not even finishing high school. A tiny fraction attended university. Many were still illiterate. To say that student achievement has not improved over the last 50 years is a fabrication on a monumental scale.

I doubt I'm the only one to believe this, but even if I were, it doesn't necessarily mean that I'm wrong.

Let's take a look at some data and see who has the better position.

Decent longitudinal data is hard to come by in education. About the best we have is the longitudinal NAEP data going back to the early 70s.

Let's look at the performance of 17 year olds in reading.

No statistically significant improvement. Across the board. Nine-year-olds have seen a small increase in recent years, but it's yet to be seen whether those gains will translate into real gains by 17-year-olds.

Scores in math tell a similar story; there has been some slight improvement, but it does not appear to be educationally significant.

Now let's take a look at SAT scores.

Notice the large drop between 1963 and 1980, in particular the pre-1971 drop, since 1971 is where the NAEP data begins. This represents a drop of nearly a standard deviation.

Now, I understand the selection bias issue here, and the great expansion and demographic shift of students taking the SAT. Analysts claim that these factors might account for up to 75% of the drop. That still leaves 25% of the drop unexplained. We're still in drop territory; no one seriously argues that there really was an improvement that was washed out by all the new students. Most telling is that this substantial drop in scores also occurred at the top of the scale, where the influx of demographically shifted students should not have affected the scores as much. This is evident in the SAT recentering conversion that took place in 1995:

Where the solid black line is above the dotted line students received bonus points. Since the solid line is above the dotted line across the entire range, all students received a bonus, even the kids with scores above 650, i.e., the elite students. (Math scores remained steady at the top, except for scores above 750 which showed a slight drop.)
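The "still leaves 25%" arithmetic above is easy to make concrete. A minimal sketch, using hypothetical round numbers (the ~90-point drop and ~100-point standard deviation are assumptions for illustration, not figures from the post):

```python
# Hypothetical figures for illustration only; the post does not give
# exact numbers. Suppose the 1963-1980 SAT decline was ~90 points on a
# scale whose standard deviation is ~100 points.
total_drop_points = 90.0
sd_points = 100.0

# Analysts' upper-bound estimate: selection effects explain up to 75%.
explained_share = 0.75

residual_points = total_drop_points * (1 - explained_share)
residual_sd = residual_points / sd_points

print(residual_points)  # 22.5 points of the drop left unexplained
print(residual_sd)      # 0.225 standard deviations
```

Even granting the analysts their most generous estimate, roughly a fifth of a standard deviation of decline remains on the table.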

The point of the SAT scores is that no one seriously argues that there's been an improvement in student achievement, except maybe Stephen Downes.

I doubt that even Stephen believes his own rhetoric at this point. That's why he argues that "A tiny fraction attended university," as opposed to arguing that they actually graduated from university, or graduated from university with a degree that the job market actually values, as opposed to a degree from some newfangled major which might, at best, get you a job as my secretary, so long as you learned how to type. He's also reduced to arguing that "Many were still illiterate." It seems that the NAEP data refutes that claim, and I'd be surprised if other literacy measures tell a different story, but I'll let Stephen dig up that data should he choose to maintain this argument.

Oops: I forgot to address this argument: "50 years ago, students were not even finishing high school." Finishing high school was far more of an accomplishment fifty years ago, when there was little or no social promotion. Seat time is not a valid substitute for academic achievement, in my opinion, and the data shows that academic achievement has not improved. The drop-out rate has gone down a bit, but the data before the 70s is spotty at best and, according to Jay Greene, is inaccurate afterwards.


Stephen Downes said...

The reading tests and the SAT tests have in common the property of measuring only those who take the test.

If you measure only the successful students, you are eliminating all the low-achievers: the drop-outs and failures, which were much greater in number 50 years ago.

Never mind your reading study (which is neither linked nor referenced), nor the SAT tests (and the unnamed 'analysts' who 'explain' the drop in scores).

Look at actual achievement statistics, not biased tests. Look at high school completion rates, for example. Take a look at figure 1 in this document:
Except for the expected dip during the Bush years, the chart shows a constant improvement in the graduation rates of Americans.

How can you look at that and say there has been no increase in achievement?

You look at the graduates, see that they averaged 70 percent (or whatever), and say nothing has changed. I count the graduates, see there are twice as many now as 50 years ago, and say everything has changed.

Which one is the *real* achievement, and which one is just playing with statistics?

Finally, the paragraph beginning "I doubt that even Stephen believes his own rhetoric at this point" doesn't make any sense.

A tiny fraction of people went to university in 1958. Many more people go to university today. That is an example of increased educational achievement. So I don't see how this refutes what I said.

As for literacy, you write: "It seems that the NAEP data refutes that claim," that more people were illiterate in 1950. As though a test of 17-year-olds in school could show that. The claim that "many were still illiterate" stands, as the tables here show:

The link I just referenced also contains the reading scores and the SAT scores you mention in the article. I don't blame you for not providing the URL - nobody would have gotten past the first few tables - the ones that report educational attainment data.

What do they say? In 1960, 41 percent of 25-year-olds had graduated from high school. By 2000, that number was up to 84 percent. And in 1960, 7.7 percent had completed 4 years of college, while in 2000, that number was over 25 percent.

These statistics are so striking, and so widely known, that I cannot help but question why someone would say no increase in educational achievement happened during that period. It seems to me to be sheer denial of the obvious. What's the point of even trying to argue with someone who ignores data more striking than a slap in the face? I can't believe a whole post was dedicated to propagating such nonsense.

KDeRosa said...

Did you actually read the post, Stephen?

The reading tests and the SAT tests have in common the property of measuring only those who take the test.

I'm sure you know that this is called selection bias, a point I mentioned and accounted for in the post.

Also, the NAEP data is for a representative sample of all students. With respect to the drop-outs, you'll want to note that the 13-year-olds also had flat scores, and most eventual drop-outs were still in school at that age.

Sorry about the missing links. See here.

Look at actual achievement statistics, not biased tests. Look at high school completion rates,

I also addressed this point in the post. Two words: social promotion. As you can see from the NAEP tests (which I'll continue to contend are unbiased until you can provide support for your bias contention), reading ability (and math ability) didn't improve, despite all these new high school graduates. Possible cause: graduation requirements were made easier and/or eliminated. Students can now graduate from high school as long as they show up.

Which one is the *real* achievement, and which one is just playing with statistics?

I say yours is the one that is playing with statistics for the reasons I've given.

The claim that "many were still illiterate" stands, as the tables here show

Your link(s) do not provide longitudinal data for literacy.

Graduating high school is not a proxy for being literate or educated. Your claim remains unsubstantiated.

Why don't you also point out that grades are on the rise? Students today receive much better grades than 50 years ago. Ipso facto, they must be smarter and better educated.

Really, Stephen. I expect better arguments from you. You are quickly descending into hackery.

KDeRosa said...

I actually agree with something Kozol has said:

Grade level completion does not equal grade level competence.

Better pack it in, Stephen.

KDeRosa said...

Let me give you a hand with the supporting evidence, Stephen:

The mean prose literacy score of:

26-35 year olds was 275 (Level 2), which ranked them 11th (tie) out of 19 countries,
36-45 year olds was 284 (Level 3), which ranked them 5th,
46-55 year olds was 277 (Level 3), which ranked them 3rd, and
56-65 year olds was 266 (Level 2), which ranked them 2nd.
(Sum, 2002, p16, Table 8)

Looks like all those geezer high school drop-outs aren't so illiterate after all. Or, all those new high school graduates aren't as literate as you think.

It's like shooting fish in a barrel.

Anonymous said...

Oh yeah, everything has changed.

50 years ago illiterates didn't graduate from high school.

Now they do. They even go to college. Universities now teach remedial reading and elementary school arithmetic.

That's progress for you.

Unknown said...

Ken, Stephen: I want to thank you guys for the point/counter-point discussion.

I have two kids who, given thoughtful instruction, are very capable of learning almost anything, but are unfortunately ill-served by our school (even though it is one of the best school districts in California).

Because of our specific situation, I tend towards Ken's description of the world of education. It's really great to hear Stephen's alternate view. It absolutely convinces me that Ken is on the right track.

Anonymous said...

Well, I hate to stir the pot (ok yes I *love* to stir the pot) but two things. First, "high school completion" as a measurement is as valid as measuring what percentage of the population has a passport. Too many districts have turned it into little more than a certificate of attendance for it to have any meaning whatsoever. Furthermore, places like Houston ISD have proven that this number is easily manipulated. If anything, improved graduation rates are a symptom of a better economy rather than better education: it means young people didn't have to drop out to support their families.

Second, it is a fairly well documented fact that the actual literacy rate -- the rate of people who can read -- declined markedly between World War 2 and the Korean War, and fell still further by the time the Vietnam War rolled through the draft pool. Lest anyone accuse me of innumeracy, I am aware that WW2 ended over 60 years ago. I consider it a valid baseline if we are going to sling around and refute statements about literacy and student performance during the last 50 years.

Oh, and the purpose of the GED was to prove that those soldiers coming home from WW2 (who maybe were too busy supporting families or serving their country to graduate) knew everything they would have learned in high school. Can we put a stake through the heart of "graduation rates" as a measure of anything whatsoever?

KDeRosa said...

My understanding is that the Army has had a floating standard when it comes to their entrance exam, which is mostly an IQ test. In times of peace the bar is set low. In times of war, the bar gets lowered in the event additional soldiers are needed. This, I believe, will affect literacy rates.

Anonymous said...

"My understanding is that the Army has had a floating standard when it comes to their entrance exam, which is mostly an IQ test. In times of peace the bar is set low."

Um ... I think the bar is mostly set today to exclude the bottom 30% or so by IQ.

And, yeah, this will be dropped if there is a serious need for manpower. One issue in today's army is that the "grunts" need to use some pretty sophisticated stuff and use it well. I'm not sure that the bottom 20% can do what the army needs done...

-Mark Roulo

KDeRosa said...

Plus, they frequently get themselves killed and break equipment at higher rates than the more cognitively capable soldiers. But sometimes you need boots on the ground, like now in Iraq. I just read somewhere that they've been more lenient to attract enough soldiers to sign up.

Anonymous said...

One thing about NAEP assessments--they might be unbiased, but they're generally worthless. Their "reading" tests are actually writing tests, for example, and they don't measure a thing accurately.

I come down somewhere in the middle on this. I would argue that our top 10-15% of students are learning more in high school than ever before--not because of an improvement in education, but because of increased competition in high school.

This is demonstrated by the increase in students taking *and passing* the AP tests. I suspect they had the ability in the past, but it wasn't considered necessary. Again, this isn't anything to do with the improvement in education, but it's worth mentioning.

SAT scores don't show much there, because SAT scores are normed to the population. So yes, the recentering made scores much higher, but it doesn't mean that students were doing more poorly than in the past.

For example, I got a 730 on the Verbal when I took the test in late 1979. At the time, I distinctly remember that my score put me in the 99+th percentile--that is, fewer than 1% of the population got a 730 in 1979.

Today, my 730 translates to an 800. But an 800 is still a score that fewer than 1% get on the test.

So I don't think the recentering argues your case one way or the other.
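Cal's point - that recentering moves the number but not the rank - can be sketched. Assuming (hypothetically) normal score distributions with the same spread before and after recentering, an old-scale 730 and a recentered 800 sit at the same percentile; the means and SD below are illustrative round numbers, not actual College Board norms:

```python
from math import erf, sqrt

def share_at_or_above(score, mean, sd):
    """Fraction of test takers at or above `score`, assuming a normal
    distribution (an approximation; real SAT norm tables are discrete)."""
    z = (score - mean) / sd
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

# Hypothetical norms: pre-1995 verbal mean ~430, recentered mean ~500,
# SD ~110 in both eras. (Illustrative values, not College Board figures.)
old_share = share_at_or_above(730, mean=430, sd=110)  # old-scale 730
new_share = share_at_or_above(800, mean=500, sd=110)  # recentered 800

# Recentering shifts the scale, not the rank: both scores sit the same
# number of SDs above their era's mean, so both mark the same percentile.
print(old_share == new_share)  # True
print(old_share < 0.01)        # True: top 1% under either scale
```

On these (assumed) numbers, the conversion changes the label attached to a given rank, which is why recentering alone can't settle whether the top performers got better or worse.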

My own sense is that we are educating the top 10-15% much better than ever before--too much so. Many of these kids work harder in high school than they ever do in college, where the classes have often been "dumbed down" to the lower overall level of achievement.

The middle 50, say, is getting about the same education as always. If there's a group that is arguably getting a worse deal, it's the lower half of this group. They are allowed to slide because if they don't go to college, there's no real point in performing well in high school. It's a meaningless indicator.

The bottom group has never been educated well, as you've pointed out.

KDeRosa said...

Cal, there was once a time (prior to 1963) when 800 or thereabouts did represent the top 1%, and your 730 represented something less. Then this top 1% of the curve stopped performing so well and dropped down to a score of 730, where it remains today. It's just been recentered back to an 800, but performance is still depressed. The top 1% does not perform as well as it once did.

Anonymous said...

But going *that* far back, you are (as you acknowledge) testing very different populations.

So the top 1% *of SAT testers* became a very different group.

For example, the top 1% of SAT testers in 1980 may very well have been the top 1% of all high school students.

In 1963, the top 1% of SAT testers may have been the top .01% of all high school students. Had we tested all high school students in 1963, we may have gotten a similar top 1% as we did in 1980.

I'm not really disputing your main point. I just don't think SAT scores prove that particular point.

KDeRosa said...

I understand the infirmities of the argument. I didn't mean to imply any definitiveness.

But my understanding is that most of the demographic shift and expansion of test takers occurred prior to 1963 before the drop in scores.

One day I'll locate the cite.