There is a lot of stuff we don't know -- a lot of stuff without firm scientific support. Yet in many of these areas we encounter zealous advocates who either believe we know much more than we do or are confused about what we actually know.
The Big Bang Theory doesn't explain what caused the universe to come into being.
How did the universe come into being? There is plenty of good observational evidence for the Big Bang theory. But that doesn't explain how the universe came into being in the first place. What happened before the Big Bang? Science is unable to describe the universe before the end of the Planck Epoch (when the force of gravity separated from the electronuclear force). Currently, we don't know what caused the Big Bang or how the universe came into being.
The Theory of Evolution doesn't explain how life originated.
How did life start? There is plenty of observational evidence for the theory of biological evolution. But that doesn't explain how life came into being in the first place. What happened before there were organisms? Science is unable to explain how the first organisms came into being. There is no scientific consensus on how life began. Currently, we don't know how life began.
(FYI: Intelligent Design is one argument for how the universe and life began. It has about the same scientific support as any other argument for how the universe and life began. That is, none. Of course, Intelligent Design isn't exactly scientific. But then again science hasn't provided any answers yet either.)
The Theory of Global Warming is infected with politics for the time being.
In a few decades we might know whether the current scientific consensus and environmental hysteria comport with the data. To the extent there is a consensus, the science remains shaky--far shakier than what we know about the origins of the universe and of life. Far shakier than the consensus scientists would like you to believe. (I think many of them don't even understand why "consensus science" isn't actually science.)
We don't know how to reliably educate low-IQ/low-SES children
The science is very thin on improving student achievement outside of the elementary school years. Most theories aren't even based on actual testing of an intervention. Most are based on observations of correlational data on broad proxies for variables believed to affect education (poverty, teacher efficacy, availability of free lunch, availability of health insurance, and the like), not so much on actual interventions designed to improve or ameliorate these variables.
So what is government policy, like Race to the Top, based on?
Sunshine and lollipops mostly.
November 19, 2009
November 18, 2009
Your tax dollars hard at work
Virginia is going to analyze why there is a disproportionately low representation of minority students in gifted education.
"Virginia is proud of both the high standards of our educational system and the wealth of diversity in our communities," Governor Kaine said. "As we continue to improve on our gifted education programs in particular, it's critical we assess any disproportionate barriers to enrollment so we can ensure students of all backgrounds have the opportunity to participate."
Data reported by school divisions to VDOE show that while African-Americans make up 26 percent of the statewide student population, only 12 percent of students identified as gifted are black. Hispanics make up nine percent of the student population and five percent of students identified as gifted.
Gee, what could possibly be the reason behind this disparity?
Well, let's see. Giftedness is largely determined by performance on IQ tests. So, let's take a look at the relative performance between whites and blacks on IQ tests. Better yet, let's break performance out by socio-economic status.
The chart shows a 10 to 16 point gap between white and black performance on IQ tests (that's about a standard deviation) across the SES spectrum. Blacks in the highest SES decile perform about as well as the 50th percentile White (5th decile). Raising the SES of Blacks wouldn't close the IQ gap even if there were a causal connection.
Mightn't this explain all or most of the discrepancy? It took about ten seconds of googling. Virginia needs about a year and a half to find a more politically correct explanation.
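For the arithmetic-minded, here's a minimal sketch of the conversion from IQ points to standard deviation units. It assumes the conventional IQ scale (mean 100, SD 15); the 10-to-16-point gap is the one quoted above.

# Back-of-the-envelope: express the quoted 10-16 point IQ gap in standard
# deviation units. Assumes the conventional IQ scale with SD = 15.
IQ_SD = 15  # assumption: the tests behind the chart use an SD-15 scale

for gap_points in (10, 16):
    print(f"{gap_points} IQ points = {gap_points / IQ_SD:.2f} SD")
# prints roughly 0.67 SD and 1.07 SD, i.e. "about a standard deviation"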
The study - which is being conducted with technical assistance from the Regional Educational Laboratory Appalachia - will be completed by Spring 2010.
There is a similar IQ gap between the performance of Asians and Whites which also explains why Asians are disproportionately represented in gifted classes. Hopefully, the study will address that problem as well.
Must be discrimination. Some virulent form of discrimination that's only present in the nasty U.S. And Toronto too. In fact I can't find a single country in which these same IQ gaps aren't present and don't manifest themselves on achievement tests (which are actually IQ tests. Shhhh don't tell anybody). So maybe that theory doesn't hold up.
In any event Erin Dillon from The Quick and the Ed is looking for a solution.
Racial disparities in gifted vs. regular education classes seemed obvious enough to me when I attended public schools in Virginia. One can only hope that this study will put some momentum behind addressing those disparities.
How exactly does one address those disparities?
November 13, 2009
Pot. Kettle. Bracey.
The last and thankfully final Bracey Report attempts to analyze the research support underlying the following three assumptions about how to reform education.
1. High-quality schools can eliminate the achievement gap between whites and minorities.
2. Mayoral control of public schools is an improvement over the more common elected board governance systems.
3. Higher standards will improve the performance of public schools.
As worded, the answer to all three questions is that the research is insufficient. But notice that for questions 2 and 3 any amount of improvement will do, while for question 1 only improvement that will "eliminate the achievement gap" counts. Such improvement would have to be on the order of a standard deviation increase in non-Asian minority performance with no increase in white performance. A tall order indeed. In fact, such a tall order that no in-school or out-of-school intervention has ever achieved such results for the general population--not even the ones that Bracey supported and touted in this very report.
This is Bracey at his most dishonest--glaringly dishonest. Bracey had an agenda and he didn't mind bending the facts to fit his preferred outcome. He wasn't an honest researcher and this will be his lasting legacy.
A more honest researcher might adopt a more neutral standard of achievement such as "an increase in the performance of all students by an educationally significant amount (0.25 standard deviation)." That would be a laudable goal and also would serve to reduce the achievement gap. It's also the generally accepted standard in education research.
Under such a standard, Bracey would still get to criticize mayoral control and higher (national) standards as not having a sufficient research base; however, he'd have to acknowledge that there is a sufficient research base for higher-quality schools under this standard, at least in the elementary school years.
Another problem with Bracey's reports is that they are peppered with his own assumptions about how to reform education that don't have a sufficient research base or are contradicted by the data. Here are a few.
Students attending American schools run the gamut from excellent to poor. Well-resourced schools serving wealthy neighborhoods are showing excellent results. Poorly-resourced schools serving low-income communities of color do far worse. (p. 2)
Schools serving low-income communities of color tend to have resources above the median school.
I said above that if there are to be more high-quality schools (or at least, "high-quality" schools in terms of high or rising test scores), they will have to be developed in low-income neighborhoods. (p. 3)
Bracey implies that schools in higher-income neighborhoods are doing a fine job educating low-income students. They aren't.
Before taking up the question of whether schools alone can remedy the achievement gap for poor children, we have to ask what is known about the effect of poverty on children. What are some of the out-of-school factors that contribute to poor children’s lower performance? (p. 4)
None of the studies Bracey directs us to, especially Berliner's, is capable of establishing the causal link that Bracey implies. Bracey then proceeds to give us a few pages of various ailments and problems associated with poverty and attempts to draw a bleak picture of poverty's causal effect on student achievement. He has to resort to anecdote because the data paint a much less bleak picture. Poverty, or more accurately low socio-economic status (SES), is correlated with low student performance. But the amount of variance in student performance attributable to variations in SES is only about 18%. That means that 82% of the variance is attributable to non-SES factors. Bracey knows or should have known this, but misrepresents the data anyway.
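For readers who want the arithmetic behind that claim, here's a minimal sketch. The 18% figure is the one cited above; the implied correlation is simply its square root, not a number Bracey reports.

# Sketch: the share of variance a correlation explains is r-squared.
# Working backwards from the 18% figure cited above:
import math

variance_explained = 0.18                  # share of achievement variance tied to SES (cited above)
implied_r = math.sqrt(variance_explained)  # correlation that would produce it
print(f"implied SES-achievement correlation: r = {implied_r:.2f}")                # ~0.42
print(f"variance attributable to non-SES factors: {1 - variance_explained:.0%}")  # 82%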
These disadvantages all operate to attenuate achievement in schools. The question is, can “high-quality” schools alone offset them? (p. 7)
Bracey then looks at one ham-fisted study, Harlem Promise Academy, as a refutation. He ignores the other studies which have shown results larger than the 0.18 standard deviation gap attributable to poverty effects.
Bracey does a better job with the mayoral control and high standards issues. But the problem is that on the poverty/SES issue Bracey's non-research-based views are no better than those of the proponents of mayoral control and high standards.
Pot and Kettle meet Bracey.
November 5, 2009
Today's Quote
The combination of some data and an aching desire for an answer does not ensure that a reasonable answer can be extracted from a given body of data.
-- John Tukey (1986)
Another Defense of SLA?
Joe Bires of Ed Tech Leadership offers a defense of the SLA position paper assignment:
Your post should be titled “Why you should never publish anything on the Internet (or anywhere for that matter)”. The minute you publish something you open yourself up for attack, not just feedback.
I have no problem with your critique of Chris’s assignment as Chris is a professional putting his work out there for you to critique. I don’t agree with most of your comments, but I do feel the assignment wasn’t as clear as it could have been.
However, the lack of a rubric being posted with this assignment with clear standards makes critiquing the students writing as you do unfair. Look at your post again and code every time you put down students directly or indirectly. Clearly while you may not have wished to put down students, that is exactly what you are doing.
I believe the overarching purpose of posting these essays was to solicit feedback on the students’ ideas and not the written form those ideas took. When you look at student work and give feedback it is different than looking at the work of adult professionals and you must confine your feedback about students and their work to the scope of the assignment’s parameters (no matter your opinion of those parameters).
Your post reminds me of the quote:
“The turtle only makes progress when he sticks his neck out”
- James Bryant, educator
To conclude, if turtles took your advice on producing first drafts they would make zeroth progress because the feedback they would receive would discourage their further effort.
Downesian nemesis TracyW provides a good rebuttal to Joe's "open yourself up to attack" and "the value of feedback" arguments, which all censorship advocates should read. I'll move on to the remainder of Joe's argument.
As a preliminary matter I note that Joe offers no defense to the students' writings and the deficiencies I noted. Is there a valid defense? I don't think there is one, but I'm happy to entertain that argument should some brave soul desire to make it.
Also, what is the point of this:
I don’t agree with most of your comments, but I do feel the assignment wasn’t as clear as it could have been.
Allow me to offer a similar rebuttal: I disagree with Joe's comments. And, now we've entered the surreal realm of a Monty Python skit.
What is the point of offering an opinion without substantiating it? This is exactly what the SLA students did. So, I'm guessing that when Joe says the assignment wasn't clear, his implicit premise is that the assignment was so vague that it was acceptable for the students to just offer up a series of unsubstantiated opinions. Joe then moves the goalposts again when he next argues that since the "overarching purpose" of the assignment was to solicit feedback, the written form of the students' work should not be judged.
This is the Tom Hoffman diminished expectations counter-argument. Here is Joe's argument made explicit.
Joe, like Tom before him, is claiming that Chris' assignment was "vague," and that the words Chris used in his instruction meant something other than their ordinary and customary meaning as I claimed. The actual assignment should be understood to be commensurate with the scope of the resulting work product of the students. The students didn't support their opinions; therefore, the assignment did not require them to provide such support. The students' work was riddled with grammatical and usage errors and did not conform to the standard essay format; therefore, Chris' rubric could not have been concerned with this aspect of the students' work. Moreover, my standard is an adult professional standard, not a high school grade standard.
Let's recast Chris' assignment to make it both crystal clear and in accordance with Joe's suggested rubric.
You are to write a two page opinion paper creating your vision of what school should be. The main purpose of the paper is to solicit feedback from your fellow students. You are not required to provide support and substantiation for your opinions, even though this will hinder the reader's ability to understand the basis of your opinion (to understand why you think the way you do) and will diminish the quality of the feedback. Following the standard essay format of introduction, body, and conclusion is also optional. In fact, presenting your opinion in a logical order is not required, nor do you have to separate your ideas into traditional paragraph format. Also, although the goal of the assignment is to solicit feedback, it is not important for you to communicate your opinions to the reader coherently enough to be readily understood. Therefore, standard grammar, usage, and spelling rules need not be adhered to. Lastly, you may keep your papers real by peppering them with colloquialisms and other informalities typically associated with spoken language.
Your paper should consider the following points:
...
According to Joe, this was really Chris' assignment and the students' papers should be critiqued accordingly. As a result, the students' papers are in compliance with the assignment and my critique is off-base.
It should also be clear why Chris chose to use the term "position paper" instead of all this clarified verbiage.
Never mind that the term "position paper" is not only not vague, but also has an established meaning, and that the interpretive rule of contra proferentem (against the one bringing forth) dictates that we should use this established meaning.
Let me also suggest that this is an argument that progressive educators are better off not making in a public forum if they wish to achieve any credibility with the public.
November 4, 2009
D-ed Reckoning Enters Edu-blogger Hall of Fame
Beloved Uncle Jay and crazy Aunt Jay are picking the best education blogs of 2009. But that's not important right now; what is important is:
Valerie will select ten and I will select a different ten for our 2009 list of 20 best blogs, which we hope to post by December. You will note a list of eleven blogs on the left side column of this blog [Ed. -- including yours truly]. They are previous winners of this incredible honor, and so will remain posted there forever and are not eligible for the new list. I plan to add my ten selections to that left hand column, and make each annual contest a search for blogs we have not celebrated before.
I read that "posted forever" and "not eligible for the new list" as being the equivalent of being inducted into the edu-blogger hall of fame. Woo hoo. And, I haven't even retired yet. Although, I think I can safely start coasting now.
Here are my fellow inductees with my brief commentary.
1. EdTech Assorted Stuff -- Redundant. There are better education technology and progressive education blogs (odd that those two always seem to go together).
2. Board Buzz -- Not interesting. Doesn't hold my attention. The fatal flaw of many blogs run by large organizations looking to avoid controversy.
3. The Core Knowledge Blog -- Always interesting. Avoids hackery (See Edwise below). Pondiscio is a real blogger and Core Knowledge is smart enough to allow him to express his opinions.
4. Eduwonk -- Andy has good insider stuff. Sometimes too elliptical and insider for us outsiders. Doesn't blog often enough.
5. EdWize -- Often borders on hackery, no doubt due to union affiliation. Otherwise, not enough non-hack stuff to keep me interested.
6. Gotham Schools -- Good reporting on New York City stuff. Skoolboy was a good blogger at the now defunct eduwonkette. But, for some reason this blog hasn't made the cut to be a regular read. Maybe it's a signal to noise problem.
7. Joanne Jacobs -- Still going stronger than ever after all these years. The only real journalist of the lot. Has mastered the blog format.
8. Schools Matter -- The Bill Maher of edubloggers, if Maher were humorless and cut and pasted most of his material. The commentary is entirely predictable, conclusory, and never supported.
9. Susan Ohanian -- Is this even a blog? Redundant with School Matters. The opinions are identical. Pick either. Or better yet pick neither.
10. This Week in Education -- Best all-around education policy blog. Russo rounds up the news, is interesting, includes actual reporting, offers opinion, gets insider scoop, and consistently gets my first name wrong. Would be improved by jettisoning the dull Thompson.
Now that I've pulled a Michael Jordan-like acceptance speech, allow me to redeem myself by commenting on all the things I do wrong. My posts are sporadic and far too long, I take long breaks when I don't feel like writing (the problem of doing it for free and bereft of an academic sinecure), I don't edit or proofread enough, I'm too lazy to spellcheck regularly, and I'm often too abrasive.
I'm sure there are lots more which my critics should merrily point out in the comments.
November 3, 2009
A defense of SLA?
Marcie Hull, SLA's tech coordinator, commented on my posts on SLA. It's a serious comment and deserves a serious answer. (I've taken the liberty of cleaning up Marcie's lengthy comment, which was typed on an iPhone.)
Numbers... Seems to me a silly tradition to measure a persons abilities. Why not watch them or give them excellent mentors, while listening to them and being sincerely interested in what they are saying.
In this case the numbers are the percentages of students who fall into each of the four proficiency categories on Pennsylvania's simplistic state assessment. So, in this case the numbers represent objective data on student performance. What's wrong with objective data? (Dick Shutz's objections, which I acknowledge, notwithstanding.)
The purpose of my post was to show the problem with the numbers which showed that all SLA 11th graders were proficient or above in writing. The actual student writing samples told a very different story. I also notice that you did not defend the quality of the students' writing. That is telling.
Numbers also seem to be a way to keep out students with many different learning styles, therefore keeping the old elite system in a safe place, far away from creativity.
Actually, I think that SLA's selective admission system does the bulk of the work of keeping out "students with many different learning styles" and keeping the elite system in place. Not that there's anything wrong with providing appropriate opportunities for the academically meritorious students.
Knowing the rules to break the rules is an old idea. We need more rule breakers if we want to see quick change.
If we want more rule breakers, then it follows from your own argument that we want more people who know the rules in the first place. Did the SLA student writers know the rules of grammar, usage, and argumentation? Are they in a position to break the rules even though they haven't learned them yet? It appears that SLA isn't providing the world with more rule breakers, just people who don't know the rules.
You are right schools need to change. We are attempting that change, 3 years is a very short time to be judged upon. Come see us in 5.
Schools don't need just change; schools need effective change. And, caring is overrated. What these kids really need is effective teaching. So, is the teaching at SLA effective? Not based on the examples of student learning that I've seen so far. We'll see in two more years.
Better yet come to the school and see what caring for a student can do for their academic success.
That's such a 20th century mindset. I made a 21st century visit. Isn't that what it's supposed to be all about?
And isn't that the way s brain works if your needs are met you are able to intellectualize? Look at Maslow, that is what I follow.
Yes, let's look at Maslow the man; the pyramid, not so much.
Sir, don't you see enough negativity in this world of education why would you pick out our community that strives for rigor and happiness?
Because Chris posted (and good for him for doing so) these examples of student work and asked for constructive criticism. Transparency is a 21st century virtue. Although, quite honestly, I don't think SLA is quite ready yet for 21st century transparency.
What is it that you are expecting from a brand new school?
At a minimum to live up to the things it claims to be doing in its family Handbook. Is it OK to expect less?
What was your school like? Did it compare to your standards or did you make your own?
My high school was a traditional high school with all the faults and problems of traditional education. Sadly, I do not believe that SLA has improved on the failings of the traditional model. The "changes" SLA has made are superficial with respect to student learning, like many reform models that have preceded it.
I came out of one of the best high schools in the country in 1991. What scares me, even with all the money in that district not all learners were given equal opportunities to learn. They were tracked low and forgotten about and all the concentration was put on the students that could score high in math & science. Some of them are still living in their parents basements.
Agreed. This is a problem. But is the problem the high school's for not being able to deal with under-prepared students or the elementary and middle schools that under-prepared them? And, let's just limit the discussion to the mountain of kids at the margin who came to school on a regular basis and who do not possess a cognitive impairment.
So, I ask you what do these numbers mean? What do they mean to parents? What do they mean in higher education?
The numbers mean that Pennsylvania has set the bar way too low. Proficient students under Pennsylvania's standard apparently lack many skills foundational to writing ability.
Parents should be aware of this problem.
The numbers are also not reliable indicators that these students are ready for the workload inherent in most institutions of higher education. Higher education has been foreclosed to many of these students. Sadly, they don't know this yet.
Also, what kind of conversation or dialog do you want to get into with the staff at SLA? What is your purpose for reporting this? Anyone can cause a conflict.
The real conversation you need to be having is with your students, not with me. I'm just shining the light on the problem. No one from SLA or elsewhere has defended SLA on the merits yet. I have heard some excuse making. Your argument seems to be that good intentions are good enough. I reject that opinion out of hand.
I would rather talk about solutions, kids & alternative types of instruction so that all student can feel success.
There isn't a single solution listed in your comment.
And what good is feeling success? Isn't it more important to be a success?
We are an easy target, seems to me you are upset about something and you are hiding behind this blog post instead of just writing your beef. Where's the beef?
Chris asked for comments. I provided comments. You don't seem to agree with my comments, but you aren't exactly defending the students' work either. Take a position and defend it with supporting evidence. Making vague excuses and offering unsupported opinion is not a position.
hmmm...
I think I finally figured out why SLA students weren't able to write a position paper.
A Tale of Two Cities
Now that Canada's largest school district, Toronto, has taken the bold step of reporting its testing data for racial and economic subgroups, we are able to compare its achievement gaps with the achievement gaps of other large cities that also report achievement data for subgroups.
Let's compare Toronto to Philadelphia and see who has the larger achievement gaps. Let's see how Toronto, with its compassionate and generous Canadian-style social policies, compares to Philadelphia, with its backward and stingy U.S.-style social policies.
Better yet, I'm not going to tell you which city is which; see if you can guess from the achievement gaps, which I've given as fractions of a standard deviation.* Negative numbers indicate that the subgroup performed worse than whites; positive numbers indicate that the subgroup performed better than whites.
City #1
Black-White Achievement Gap: -0.85
Hispanic-White Achievement Gap: -0.73
Asian-White Achievement Gap: +0.28
City #2
Black-White Achievement Gap: -0.73
Hispanic-White Achievement Gap: -0.70
Asian-White Achievement Gap: +0.14
Which city is Philadelphia and which is Toronto?
Bonus Question: What does this analysis suggest for improving results and policies drawn from international comparisons between diverse countries like the U.S. and more racially homogeneous countries like those in Europe and Northeast Asia?
*What I did was to convert the reported percentile scores into z-scores (a more accurate way of analyzing normally distributed data) for each subgroup for math and reading combined (sixth grade for Toronto and fifth grade for Philadelphia). Then I calculated the difference between the z-score of each of the Asian (East), Black, and Hispanic (Latin) subgroups and the z-score of the white subgroup. A difference of 0.25 is considered to be educationally significant. A difference of 0.75 is considered to be a large difference in the social sciences; most interventions are not capable of remedying an effect size of this magnitude.
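For anyone who wants to replicate the footnote's method, here's a minimal sketch in Python. The percentile inputs are made-up placeholders, not the actual Toronto or Philadelphia figures, and scipy is assumed for the inverse-normal transform.

# Sketch of the footnote's method: convert each subgroup's percentile score
# to a z-score, then express its gap relative to whites in SD units.
# The percentiles below are hypothetical placeholders, NOT the real data.
from scipy.stats import norm

percentiles = {                 # combined math + reading percentile, by subgroup
    "White": 0.62,
    "Black": 0.33,
    "Hispanic (Latin)": 0.36,
    "Asian (East)": 0.71,
}

z = {group: norm.ppf(p) for group, p in percentiles.items()}  # percentile -> z-score

for group in ("Black", "Hispanic (Latin)", "Asian (East)"):
    gap = z[group] - z["White"]  # negative = below whites, positive = above
    print(f"{group}-White gap: {gap:+.2f} SD")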
November 2, 2009
Canada: not the educational mecca we've been led to believe
Canadian edu-pundits have been leading us to believe that Canada's lefty social policy programs have nearly eradicated both income and racial inequality and have led to an educational mecca in which achievement gaps are no more.
The data indicates otherwise.
The Toronto District School Board commissioned a nice little study breaking down student achievement by race, parental education, and income. Before the study you would have been hard pressed to find student achievement data broken down this way. You'd think they were trying to hide something.
Guess what? They were. They were hiding giant achievement gaps.
Here's a nice little table showing student performance for third and sixth grades broken down by race.
Holy Moly! Look at them gaps. I'd like to say something snarky like "What is this Mississippi?" but that would be an insult to Mississippi. I know Canada has a shameful history of slavery they don't like to advertise, but I didn't realize it was this bad.
Notice how (East) Asians perform above whites, who perform above Hispanics (Latin American), who perform above blacks. What a coincidence. That's how it plays out in the US too. Who would have thunk? Also, notice how the gaps grow as the students go from third to sixth grade. Another coincidence?
Now let's look at the break down by income.
First of all, I find this entire table shocking. I thought Canada was some sort of Marxist paradise, but look at that staggering income inequality. And would you believe that the kids of the people who have higher incomes (before those incomes get redistributed away) show higher achievement as well? Makes me want to rethink that whole correlation vs. causation thing I always rail against.
Let's look at another shocking table. Parental education vs. student achievement.
Would you believe that kids whose parents went to college outperform the kids whose parents only completed elementary school? Shocking. I thought "free" health care and school lunches solved this problem in Canada.
I simply can't wait until Stephen Downes tries to spin this data away.
The man can't keep them down
While reviewing the 2008-2009 Pennsylvania state assessments used for NCLB, I noticed something that shouldn't surprise regular readers of this blog:
Asians utterly dominate at the top of the performance distribution in reading and math.
There are 43 data points for racial subgroup performance for the 11th grade with ninety percent or more of students scoring proficient or above. This represents roughly the top 2% for school-level breakdowns.
28 were Asian (65.1%)
11 were white (25.6%)
2 were Hispanic (4.7%)
2 were black (4.7%)
And bear in mind that in many schools there are not enough Asian students to trigger NCLB's reporting requirements. Same for Hispanics.
Let's remove the subgroups that come from selective schools, such as magnet schools and other public schools that have admission requirements. This should leave us with only general admissions schools.
We are left with 28 data points.
24 were Asian (85.7%)
3 were white (10.7%)
1 was Hispanic (3.6%)
0 were black (0%)
I don't understand why the man isn't able to keep the Asians down, like he continues to do with blacks and Hispanics. Even more disturbingly, he's keeping whites down as well. But, I thought the man was white. What gives?
If we drop down to the 80% level we have 138 data points. This represents about the top 6% of performance. Here's what we have.
64 were white (46.4%)
59 were Asian (42.8%)
4 were Hispanic (2.9%)
11 were black (8.0%)
By my count there were at least 25 schools in which a white subgroup was reported, but no Asian subgroup. In most, if not all, of those schools Asians would have at least equaled the performance of whites. Hispanics suffer from the same effect due to their low numbers, as do blacks to a lesser extent. Taking all this into account, it's easy to see that Asians once again dominate.
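For transparency, here's a minimal sketch reproducing the percentage arithmetic from the raw counts reported above; the only assumption is that each list is read as a simple tally of subgroup data points.

# Reproduce the subgroup shares quoted above from the raw counts in this post.
counts_90 = {"Asian": 28, "White": 11, "Hispanic": 2, "Black": 2}         # schools at the 90% level
counts_90_general = {"Asian": 24, "White": 3, "Hispanic": 1, "Black": 0}  # selective schools removed
counts_80 = {"White": 64, "Asian": 59, "Hispanic": 4, "Black": 11}        # schools at the 80% level

def shares(counts):
    """Return each subgroup's share of the data points as a percentage."""
    total = sum(counts.values())
    return {group: round(100 * n / total, 1) for group, n in counts.items()}

print(shares(counts_90))          # {'Asian': 65.1, 'White': 25.6, 'Hispanic': 4.7, 'Black': 4.7}
print(shares(counts_90_general))  # {'Asian': 85.7, 'White': 10.7, 'Hispanic': 3.6, 'Black': 0.0}
print(shares(counts_80))          # {'White': 46.4, 'Asian': 42.8, 'Hispanic': 2.9, 'Black': 8.0}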
This little exercise is hardly scientific, but it does show how ridiculous the whole "racial discrimination" excuse for poor black and Hispanic subgroup performance really is.
School of the Future Crashes and Burns
As I predicted three years ago, Philadelphia's ultra-expensive, ultra-high-tech, and ultra-ironically-named School of the Future (SOF) has crashed and burned. Educators and edu-journalists believed otherwise. They thought that SOF would revolutionize education by teaching 21st century skills.
What could possibly go wrong? There was lots of technology. Great facilities. Community involvement. Lots of money being thrown around. There would be discovery learning and lots of inquiry. In short, there was a naïve over-reliance on all the accoutrements of education that are irrelevant to student outcomes. In fact, some of them are downright toxic.
But things have gone horribly wrong. Take a look at last Spring's PSSA scores (Pennsylvania's easy state assessment) for 11th grade students at the School of the Future.
Percent scoring proficient or above in Math (state average)
All students: 7.5% (55.6%)
Black Students: 6.8% (28.3%)
Poor Students: 7.8% (35.3%)
Special Ed Students: 0% (14.6%)
Percent scoring proficient or above in Reading (state average)
All students: 23.4% (65.2%)
Black Students: 24.5% (38.5%)
Poor Students: 23.5% (44.4%)
Special Ed Students: 0% (20.2%)
What is even more horrifying is the percentage of students performing at the below basic level. Bear in mind that getting to Basic level in Pennsylvania requires performing only slightly better than chance.
Math
All students: 74.1% (24.9%)
Black Students: 73.8% (50.2%)
Poor Students: 74.5% (42.1%)
Special Ed Students: 100% (69.5%)
Reading
All students: 49.5% (18.8%)
Black Students: 48.0% (39.6%)
Poor Students: 58.8% (34.3%)
Special Ed Students: 91.7% (61.5%)
95% of the students at the School of the Future are Black and 47.2% get free or reduced-price lunches, a proxy for poverty. The school is a general admission school that was paid for, staffed, and operated by the Philadelphia public school system. The project organizers aimed to create a model that could be replicated easily in other districts.
So what went wrong, besides the obvious?
Pretty much everything according to this June article in eSchool News.
1. The curriculum planning committee was staffed by naive fools.
"We naively thought, I guess, that by providing a beautiful building and great resources, these things would automatically yield change. They didn't," said Jan Biros, associate vice president for instructional technology support and campus outreach at Drexel University and a former member of the SOF Curriculum Planning Committee.
2. The school got the equivalent of Microsoft Bob, instead of Windows 7.
Microsoft made it clear at the SOF's inception that it would not be overseeing the school's operation; instead, it would lend its initial expertise, provide basic professional development, and then leave the success of the school up to its leaders.
Although the technology itself was not supposed to trump basic classroom practices, Microsoft and the school's planners had decided not to allow the use of textbooks or printed materials; instead, all resources were located online through a portal designed by Microsoft.
Yet educators frequently encountered problems accessing the internet, because the school's wireless connection often would not work.
"This vital part of the school's technology was never stable and robust enough to make it dependable," said Biros. "There was no safety net, and it seemed like a great leap of faith--faith that these teachers, amidst so many new circumstances, would be able to develop curriculum almost on the fly and store and distribute it electronically."
3. The Philadelphia public school system doesn't know how to run a modern IT department.
The district's IT staff had responsibility for the network, but according to Biros, there was not an IT employee on site, and when problems occurred they were not fixed promptly. There also was no dedicated technical support.
"I don't think the district was ready to handle the development of Microsoft's technology and portal. The district is also Mac-based and not PC-based, which caused a lot of technical issues ...," said Patrick McGuinn, assistant professor of political science at Drew University.
4. Just handing out laptops turns out not to be an educational panacea.
Another problem was that the students--most of whom came from poorer families and neighborhoods--could not use or maintain their laptops properly. Students were either afraid to take their laptops home for fear of theft, or they didn't know how to access all the programs on the machines.
5. The realities of project-based education bit them in the ass.
"The lack of standardized grades made it hard to relate student progress to parents.... There is no clear definition of what project-based learning exactly is and how that can be step-by-step implemented in the classroom. Student remediation also didn't fit with the project-based collaboration model."
At one point during the discussion, an audience member asked: "All of your resources are online, and educators have to access [them] through this portal. However, your educators don't know how to work the technology. So, exactly what did the teachers teach in class? What were the students learning?"
"Well, honestly, I'm not exactly sure," replied Biros.
In the absence of real leadership, and because no community partnerships had formed, the SOF started to adopt more traditional district assessments and classroom practices. [Ed. -- isn't this always how it turns out]
6. Students didn't like going to the school.
"Truancy picked up, and we were not prepared to handle it."
Perhaps an increase in truancy wouldn't have been such a large problem, except many of the educators hired were not well-versed in dealing with at-risk students who were required to participate in project-based learning.
7. The teachers' union proved to be a menace.
Although Microsoft and the SOF based hiring decisions on Microsoft's Education Competency Wheel, which, according to the company, is "a set of guideposts for achieving educational excellence that centers on identifying and nurturing the right talents in a district's employees, partners, and learners," the SOF had to go through the Philadelphia's Teachers' Union to hire its educators.
The process, said Biros, "was intended to facilitate hiring the best faculty possible with objective consideration; [but] the reality of the union constraints within the district effectively eliminated that outcome. Because of the district's human resources policies and union regulations, most of the applications received were from current district teachers looking for new assignments. We were not recruiting from a pool of any and all teachers interested in applying to SOF."
******
In short pretty much everything went wrong and everyone blamed everyone but themselves for the problems.
But, the real problem is that you can bet that no one will learn from the failure of SOF. You can bet that all those education technology bloggers won't address the problems of SOF that reveal to gaping holes in their vision of the wonders of education technology. You can bet that all the poverty racers won't confront the failure of a mass infusion of money on student outcomes. Will the union apologists address the real problems caused at SOF by their beloved unions? And what about the progressive educators whose project-based curricula never live up to expectations in urban schools?
SOF is a microcosm of every dopey education reform that has come down the pike. Oversell the expectations; ignore the predictable outcomes.
Here's how Philadelphia's Mayor Street sold SOF:
Now you know the outcome. Half the students are performing at the below basic level in reading, three-quarters are doing the same in math.
What could possibly go wrong? There was lots of technology. Great facilities. Community involvement. Lots of money being thrown around. There would be discovery learning and lots of inquiry. In short, there was a naïve over-reliance on all the accoutrements of education that are irrelevant to student outcomes. In fact, some of them are downright toxic.
But things have gone horribly wrong. Take a look at last Spring's PSSA scores (Pennsylvania's easy state assessment) for 11th grade students at the School of the Future.
Percent scoring proficient or above in Math (state average in parentheses)
All students: 7.5% (55.6%)
Black Students: 6.8% (28.3%)
Poor Students: 7.8% (35.3%)
Special Ed Students: 0% (14.6%)
Percent scoring proficient or above in Reading (state average in parentheses)
All students: 23.4% (65.2%)
Black Students: 24.5% (38.5%)
Poor Students: 23.5% (44.4%)
Special Ed Students: 0% (20.2%)
What is even more horrifying is the percentage of students performing at the below basic level (again, state averages in parentheses). Bear in mind that getting to the Basic level in Pennsylvania requires performing only slightly better than chance.
Math
All students: 74.1% (24.9%)
Black Students: 73.8% (50.2%)
Poor Students: 74.5% (42.1%)
Special Ed Students: 100% (69.5%)
Reading
All students: 49.5% (18.8%)
Black Students: 48.0% (39.6%)
Poor Students: 58.8% (34.3%)
Special Ed Students: 91.7% (61.5%)
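For anyone who wants the gaps in one place, here's a quick back-of-the-envelope sketch that tabulates how far the School of the Future's below-basic rates sit above the state averages. The figures are just the ones transcribed above from the PSSA results; nothing here comes from an official data feed.

```python
# Back-of-the-envelope comparison of SOF below-basic rates to the state averages.
# Figures are the 2009 PSSA percentages listed above, transcribed by hand --
# an illustrative sketch, not an official data pull.

below_basic = {
    "Math": {
        "All students":        (74.1, 24.9),
        "Black students":      (73.8, 50.2),
        "Poor students":       (74.5, 42.1),
        "Special ed students": (100.0, 69.5),
    },
    "Reading": {
        "All students":        (49.5, 18.8),
        "Black students":      (48.0, 39.6),
        "Poor students":       (58.8, 34.3),
        "Special ed students": (91.7, 61.5),
    },
}

for subject, groups in below_basic.items():
    print(subject)
    for group, (sof, state) in groups.items():
        gap = sof - state  # percentage points above the state's below-basic rate
        print(f"  {group:<22} SOF {sof:5.1f}%  state {state:5.1f}%  gap {gap:+.1f} pts")
```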
95% of the students at the School of the Future are Black and 47.2% get free or reduced-price lunches, a proxy for poverty. The school is a general-admission school that was paid for, staffed, and operated by the Philadelphia public school system. The project organizers aimed to create a model that could be replicated easily in other districts.
So what went wrong, besides the obvious?
Pretty much everything, according to this June article in eSchool News.
1. The curriculum planning committee was staffed by naive fools.
"We naively thought, I guess, that by providing a beautiful building and great resources, these things would automatically yield change. They didn't," said Jan Biros, associate vice president for instructional technology support and campus outreach at Drexel University and a former member of the SOF Curriculum Planning Committee.
2. The school got the equivalent of Microsoft Bob instead of Windows 7.
Microsoft made it clear at the SOF's inception that it would not be overseeing the school's operation; instead, it would lend its initial expertise, provide basic professional development, and then leave the success of the school up to its leaders.
Although the technology itself was not supposed to trump basic classroom practices, Microsoft and the school's planners had decided not to allow the use of textbooks or printed materials; instead, all resources were located online through a portal designed by Microsoft.
Yet educators frequently encountered problems accessing the internet, because the school's wireless connection often would not work.
"This vital part of the school's technology was never stable and robust enough to make it dependable," said Biros. "There was no safety net, and it seemed like a great leap of faith--faith that these teachers, amidst so many new circumstances, would be able to develop curriculum almost on the fly and store and distribute it electronically."
3. The Philadelphia public school system doesn't know how to run a modern IT department.
The district's IT staff had responsibility for the network, but according to Biros, there was not an IT employee on site, and when problems occurred they were not fixed promptly. There also was no dedicated technical support.
"I don't think the district was ready to handle the development of Microsoft's technology and portal. The district is also Mac-based and not PC-based, which caused a lot of technical issues ...," said Patrick McGuinn, assistant professor of political science at Drew University.
4. Just handing out laptops turns out not to be an educational panacea.
Another problem was that the students--most of whom came from poorer families and neighborhoods--could not use or maintain their laptops properly. Students were either afraid to take their laptops home for fear of theft, or they didn't know how to access all the programs on the machines.
5. The realities of project-based education bit them in the ass.
"The lack of standardized grades made it hard to relate student progress to parents.... There is no clear definition of what project-based learning exactly is and how that can be step-by-step implemented in the classroom. Student remediation also didn't fit with the project-based collaboration model."
At one point during the discussion, an audience member asked: "All of your resources are online, and educators have to access [them] through this portal. However, your educators don't know how to work the technology. So, exactly what did the teachers teach in class? What were the students learning?"
"Well, honestly, I'm not exactly sure," replied Biros.
In the absence of real leadership, and because no community partnerships had formed, the SOF started to adopt more traditional district assessments and classroom practices. [Ed. -- isn't this always how it turns out?]
6. Students didn't like going to the school.
"Truancy picked up, and we were not prepared to handle it."
Perhaps an increase in truancy wouldn't have been such a large problem, except many of the educators hired were not well-versed in dealing with at-risk students who were required to participate in project-based learning.
7. The teachers' union proved to be a menace.
Although Microsoft and the SOF based hiring decisions on Microsoft's Education Competency Wheel, which, according to the company, is "a set of guideposts for achieving educational excellence that centers on identifying and nurturing the right talents in a district's employees, partners, and learners," the SOF had to go through the Philadelphia Teachers' Union to hire its educators.
The process, said Biros, "was intended to facilitate hiring the best faculty possible with objective consideration; [but] the reality of the union constraints within the district effectively eliminated that outcome. Because of the district's human resources policies and union regulations, most of the applications received were from current district teachers looking for new assignments. We were not recruiting from a pool of any and all teachers interested in applying to SOF."
******
In short, pretty much everything went wrong, and everyone blamed everyone but themselves for the problems.
But the real problem is that you can bet no one will learn from the failure of SOF. You can bet that all those education technology bloggers won't address the problems at SOF that reveal the gaping holes in their vision of the wonders of education technology. You can bet that all the poverty racers won't confront the failure of a mass infusion of money to improve student outcomes. Will the union apologists address the real problems their beloved unions caused at SOF? And what about the progressive educators whose project-based curricula never live up to expectations in urban schools?
SOF is a microcosm of every dopey education reform that has come down the pike. Oversell the expectations; ignore the predictable outcomes.
Here's how Philadelphia's Mayor Street sold SOF:
"You won't be able to say, 'I didn't have the computers. I didn't have the technology. I didn't have the teachers. I didn't have mentors,' because the young people who go to this school will be in the premier educational environment in the entire country, maybe even in the entire world," Street said. "So the bar for you is raised."
Now you know the outcome. Half the students are performing at the below basic level in reading, and three-quarters are doing the same in math.
November 1, 2009
Some Perspective
Two posts ago we looked at the writing of some students (mostly seniors) at Philadelphia's Science Leadership Academy magnet school. I was none too impressed, to put it mildly. And no one has stepped up so far to defend the actual writing ability of the students. Time will tell.
Now pretend you are setting proficiency standards. Where would these students fall on an advanced, proficient, basic, and below basic scale for this assignment (the subject matter of which admittedly was overly difficult, although the task of writing a position paper was not)?
Have you determined your standards yet and the approximate percentage of students falling within each category? Don't read on until you have.
Now where would you predict the commonwealth of Pennsylvania would set its standards? What percentage of students would be advanced? What percentage would be proficient? Basic? Below basic?
Last April, 11th graders took Pennsylvania's writing exam. Here is how the SLA students fared:
Advanced: 28.4%
Proficient: 71.6%
Basic: 0%
Below Basic: 0%
The 2009 PSSA Writing School Level Proficiency Results, p. 266.
Your eyesight isn't failing you. 100% of SLA 11th graders were proficient or above in Pennsylvania's writing assessment.
Bear in mind that Pennsylvania falls somewhere in the middle of the states in terms of how its PSSA exam stacks up against the NAEP in Reading and Math. See Figures 2 and 3. Sadly, 11th graders don't take the NAEP writing exam; only 8th graders do. 36% of 8th graders were proficient on the NAEP writing assessment.
Sherman Dorn is right to point out that the NAEP cut scores have been set arbitrarily. However, whenever I look at actual NAEP test questions, or at how students perform on state tests whose standards are low relative to NAEP's, it's hard to conclude that arbitrary necessarily means too high. If anything, it's just the opposite.
Update: Some more perspective. Here's the breakdown of SLA's 11th graders based on the writing test statistics: 36.2% are white, 49.1% are black, 15.3% are Asian or Hispanic, and 30.1% are economically disadvantaged. Also, students need to apply and be accepted to SLA. Here are the admission criteria:
Criteria: Admission to SLA is based on a combination of a student interview at the school with a presentation of completed work, strong PSSA scores, As and Bs with the possible exception of one C, teacher or counselor recommendation, and good attendance and punctuality. Interested families must contact the school to set up an appointment for an interview. SLA will not initiate the interview process with families.

Just in case you thought SLA was teeming with abjectly impoverished inner-city kids with abusive parents and poor language skills who really don't want to be in school anyway.