January 30, 2009
As usual, Pennsylvania has released its data in a way that makes it all but unusable for analysis purposes. Basically, they just tell us the percentage of students in each district who took remedial courses, the number of courses taken, and the amount it cost. Not very helpful. But not to worry, fearless readers: I did the heavy lifting of putting the data into my all-purpose Pennsylvania database of education statistics and managed to run a few regressions.
It should be noted that there are selection bias issues out the wazoo because PA failed to disaggregate any data. So take that as a large warning in interpreting the data.
Anyway, off we go.
The first regression I ran compared the percentage of adults with bachelor's degrees in the district with the percentage of students needing remediation. Adult education level is a proxy for student socio-economic status (SES) (and parental/student IQ). The thought is that students with higher SES levels should be better prepared for college-level work. Let's see if that hypothesis holds up. (Note that I've indicated the Philadelphia School District as the large red triangle. And note that Philadelphia generally appears above the regression line, indicating that its actual remediation rate is generally higher than its predicted rate.)
No, the hypothesis doesn't hold up. Less than 1% of the variance in remediation rates is associated with adult education level. We would probably have gotten a better result if we knew the actual parental education level of the students who were in need of remediation, since the parental education levels in my database are for every adult in the district (not just parents, and not just parents of remedial kids).
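For readers who want to run this kind of check on their own numbers, here's a minimal sketch of the variance-explained calculation behind these regressions. The district figures below are invented for illustration (the real PA data isn't reproduced in this post); the r² is just the squared Pearson correlation.

```python
# Variance explained (r^2) from a simple two-variable regression.
# District numbers below are hypothetical, for illustration only.
def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

pct_adults_ba = [12.0, 18.0, 25.0, 31.0, 40.0, 55.0]  # % adults w/ BA (made up)
pct_remedial  = [33.0, 29.0, 36.0, 30.0, 34.0, 31.0]  # % remedial (made up)

r = pearson_r(pct_adults_ba, pct_remedial)
print(round(r ** 2, 3))  # share of variance in one associated with the other
```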
Now let's check if the percentage of students receiving free and reduced lunches (another proxy for poverty and low-SES) correlates with the percentage of remedial students.
Nope. Once again the variance explained is less than 1%. I was somewhat surprised at this outcome. Low-SES students are generally underperformers, so I was expecting to see districts with high numbers of low-SES students have higher remediation rates. But that wasn't the case. Maybe the selection bias effects are showing up here in that the low-SES students may be applying to college at lower rates. This analysis would have benefited from the demographic data of the actual remedial students. I'm certain PA has this data. Release the data, PA!
Now let's see if school expenditures make a difference in remediation rates.
Not really. Again the variance explained is low -- 3.6%. If anything, school expenditures are correlated with remediation rates in the wrong direction: the more a district spends, the higher the percentage of remedial students it gets. Instead of dreaming up various explanations for this outcome, let's just keep it simple and say that school expenditures don't seem to have much of an effect on remediation rates.
Now let's look at whether there is a correlation between the percentage of students failing PA's NCLB test (the easy minimum-skills PSSA test) and the remediation rate.
(Note: the x-axis caption is incorrect. It should read: % failing state test)
Again, the variance explained is somewhat low at 8.7%. Schools with higher PSSA pass rates produce slightly lower remediation rates. I guess that's not too surprising. Students who pass the PSSA probably know more and as a result are less likely to require remediation. This is another analysis that would have benefited greatly from knowing the percentage of remedial students that passed or failed the PSSA exam.
These results aren't terribly interesting since they basically are just showing low correlations. But the following regression is a bit of a doozie.
Let's take a look at the percentage of non-white students and the remediation rates. Note that in PA, the number of Asian and Native American students is very low, so you can read non-white as mostly black and Hispanic.
Hello! Finally a decent correlation. 25% of the variance in remediation rates is associated with race. The more black and Hispanic students in the district, the higher the remediation rate.
I'm wondering why the PA Dept. of Ed didn't highlight this inconvenient piece of data. PA schools continue to do a poor job preparing black and Hispanic students for college. PA schools with high percentages of black and Hispanic students fared poorly on remediation rates. Yet, oddly, schools with high numbers of low-SES students, regardless of race, didn't fare much worse than schools with low percentages. And spending does not seem to reduce remediation rates; if anything, the more schools spend, the higher the remediation rates.
I'm not sure what to make of all this, but PA could have done a much better job analyzing and presenting the data. But I suppose it's not in its best interest to do so.
January 27, 2009
Based on Bill Gates's 2009 Annual Letter, it's apparent that trend is going to continue in the near future.
First, Bill sets a silly goal:
Our goal as a nation should be to ensure that 80 percent of our students graduate from high school fully ready to attend college by 2025.
The country's drop-out rate is presently higher than 20%. This means that essentially every student who now completes high school would have to be capable of doing college-level work.
Now consider this: Pennsylvania just found out that a third of its students who enrolled in a state school or community college were not prepared for college-level work. Those are the kids who thought they knew enough to go to college. How about the ones who knew better than to apply in the first place? About half the students know better than to even try.
Bill just upped the ante big time on NCLB's lofty goals. It's one thing to claim that you'll get 100% of students to a loosely defined standard (i.e., "proficiency") based on a standard made up by each state, measured with a testing instrument devised by each state (the fox guarding the hen house), and with cushy safe-harbor provisions. But it's another to set a high, easily measurable standard that can readily be verified (hello, remediation rates).
And one more thing, Bill. At least have the good sense to give us a cut point that will not result in a large achievement gap between the races/classes. An 80% cut point will leave us with a 23% achievement gap absent our finding the educational magic that allows the low group to gain relative to the high group. Good luck with that. The cut point has to be at least 95% to get the achievement gap down to a politically acceptable level.
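The arithmetic behind this point can be sketched with two normal curves. Everything in the sketch below is my own assumption, not anything from the letter: two normally distributed groups separated by one standard deviation, a 70/30 population split, and a single pass/fail cut point on the readiness scale.

```python
from math import erf, sqrt

# Toy model of the cut-point arithmetic. All parameters are assumptions:
# a 1 SD gap between groups and a 70/30 population split.
def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def pass_rate(mean, cut):
    return 1.0 - phi(cut - mean)

def overall(cut, w_high=0.7, gap_sd=1.0):
    """Population-wide pass rate at a given cut point."""
    return w_high * pass_rate(0.0, cut) + (1 - w_high) * pass_rate(-gap_sd, cut)

def group_gap(cut, gap_sd=1.0):
    """Difference in pass rates between the two groups."""
    return pass_rate(0.0, cut) - pass_rate(-gap_sd, cut)

# Scan for the cut points that yield 80% and 95% overall readiness.
cuts = [c / 1000.0 for c in range(-4000, 1000)]
cut80 = min(cuts, key=lambda c: abs(overall(c) - 0.80))
cut95 = min(cuts, key=lambda c: abs(overall(c) - 0.95))

print(round(group_gap(cut80), 2), round(group_gap(cut95), 2))
```

With these made-up parameters the gap at an 80% overall target comes out noticeably larger than at a 95% target; the exact percentages shift with the assumed gap size and population mix, which is presumably how one lands on figures like the 23% above. The qualitative point is what matters: the gap only shrinks when the bar is set where nearly everyone passes.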
Next, Bill asks an easily answered rhetorical question:
Unlike scientists developing a vaccine, it is hard to test with scientific certainty what works in schools. If one school’s students do better than another school’s, how do you determine the exact cause?
There are basically two effective ways to do this:
Type 1 approach. The most obvious way would be to ask and check. Because successful applications have been created by design, ask the designer of the successful program what the variables are that the design controls and why. The answers imply tidy, controlled experiments that involve systematic investigation of the designer’s assertions.
Type 2 approach. A related approach would involve fully implementing a successful instructional program and then systematically altering the details of it (one at a time, while trying to maintain the others as they were in the original program). The changes would be correlated with changes in student performance. If no difference results from a change, the dimension that was changed does not function as a variable (at least within the range of variation observed). A manipulation that results in improved student performance identifies a variable that was not well designed by the original program. A change that results in inferior student performance identifies a variable that was designed better by the original program than it was in the modified program. This approach would need clear descriptions of what constituted improved performance. Efficiency is an important variable. If the change resulted in improved performance but required three times the instruction of the original, the rubric for judging efficiency would have to compute the ratio of improved performance over the time to arrive at a reasonable overall judgment of the net “desirability” of the change.
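A toy sketch of the Type 2 bookkeeping, with every number invented: each variant alters one dimension of a hypothetical program, and the net "desirability" of a change is judged as performance gain per unit of relative instructional time, per the efficiency rubric above.

```python
# Hypothetical ablation ledger for the Type 2 approach. Program dimensions,
# scores, and hours are all invented for illustration.
def desirability(perf_gain, time_ratio):
    """Performance gain per unit of relative instructional time."""
    return perf_gain / time_ratio

baseline = {"score": 72.0, "hours": 40.0}
variants = {
    # Worse at equal time -> the dimension removed was a well-designed variable.
    "drop_guided_practice": {"score": 68.5, "hours": 40.0},
    # Better, but at triple the instructional time -> efficiency must be weighed.
    "add_extra_review":     {"score": 74.0, "hours": 120.0},
}

for name, v in variants.items():
    gain = v["score"] - baseline["score"]
    ratio = v["hours"] / baseline["hours"]
    print(name, round(desirability(gain, ratio), 2))
```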
Of course, this isn't the way we currently evaluate research. But that's a whole 'nother problem.
Lastly, Bill tells us how his foundation is all ready to jump to the next poorly researched education fad: teacher effectiveness.
It is amazing how big a difference a great teacher makes versus an ineffective one. Research shows that there is only half as much variation in student achievement between schools as there is among classrooms in the same school. If you want your child to get the best education possible, it is actually more important to get him assigned to a great teacher than to a great school.
Whenever I talk to teachers, it is clear that they want to be great, but they need better tools so they can measure their progress and keep improving. So our new strategy focuses on learning why some teachers are so much more effective than others and how best practices can be spread throughout the education system so that the average quality goes up.
Bill's rocking a dead baby with this new teacher effectiveness initiative of his. Five minutes on
Here's some free advice, Bill. Next time some education expert claims they know what they are talking about, you need to follow Ronald Reagan's advice -- trust, but verify.
This brings us full circle back to the need for ascertaining what works in education research:
Type 1 approach. The most obvious way would be to ask and check. Because successful applications have been created by design, ask the designer of the successful program what the variables are that the design controls and why. The answers imply tidy, controlled experiments that involve systematic investigation of the designer’s assertions.
Never trust what anyone in education tells you.
January 26, 2009
The competition challenges middle school students to design a city of the future with a focus on water conservation, reuse, and renewable energy. The students use the game SimCity (Deluxe 4) to help them build their three-dimensional models to scale. They have a semester to dream up and then construct their miniature cities entirely out of recycled materials. Supposedly, this inspires them to consider engineering as a profession.
Let's see what we get after all that creativity.
When it comes to the perfect place to live, [L.U.R.E., a sprawling metropolis set in southern New Mexico] would seem to have all bases covered.
School is free for everyone, brought into individual homes via a holographic teacher. Nearly everyone in town is gainfully employed as an engineer.
Mountain goat racing and sand surfing satisfy a yen for sports and leisure. And if, for no apparent reason, you need a getaway, there's the Space Shuttle Gilligan to whisk you on a four-month vacation to the moon...
L.U.R.E's coffee shop was housed in a Starbucks Frappuccino cup; office buildings were fashioned from paper towel rolls.
I'm sure there was some creativity in selecting a frappuccino cup as the coffee house building, but I'm wondering how that creativity generalizes to a domain outside of coffee cup miniature modeling. And try not to think about what the city's brothel was constructed from.
And what's up with the animal abuse? Enslaving our mountain goat brethren for our personal amusement seems a bit cruel. I'm surprised PETA didn't protest this event. Now that might have shown some real-world problem solving and creativity in how to defuse a PR nightmare without resorting to the firehoses.
I'm also wondering how many vacation goers would be willing to fly in the Space Shuttle. I'm guessing that the students hadn't heard of the Space Shuttle's propensity for blowing up unexpectedly. Hope they got insurance for that one.
At least the teacher hologram initiative shows creativity. A hologram of a person trying to convey information, or, say, plans, to others is something I've never seen before. I wonder where they got that idea.
Look, I'm sure the kids had a lot of fun. But where's the educational value?
Here's what the kids were supposed to learn about:
The National Engineers Week Future City Competition offers students a resourceful way to learn about engineering.
- Learn how engineers turn ideas into reality.
- Develop a project plan to guide team activities.
- Use SimCity™ software to design their city.
- Build a city model using recycled materials.
- Work as a team under the guidance of an engineer and a teacher.
- Demonstrate writing skills by composing an essay on an engineering design problem.
- Enhance communications skills through a team presentation.
Did the students really learn how engineers turn ideas into reality? Is a miniature model reality? If so, then aren't my kindergartner's pictures (media: crayons and construction paper) teaching her the same thing, just without the fancy labels?
This is not how engineers turn an idea into reality. It doesn't seem to me that the students needed to know any actual engineering or any engineering constraints to construct their models. So, this is how a non-engineer turns ideas into reality. And I'm not sure this exercise, in any way, generalizes to any real-world situation.
I suppose the kids did learn how to play SimCity. Videogames 101. That's what kids need -- more time playing videogames. I'm sure SimCity is a neat program, but it's not exactly a precursor to AutoCAD or other real-world construction/drafting programs.
And how does building a model out of recycled materials generalize to building real stuff with recycled materials? Someone explain that to me.
The rest of it can be summarized as "learning how to work in a group." Something that our educators think students need a lot of practice doing for the real world. Apparently, lazy students need to refine their shirking skills from a young age and the more capable students need to understand the hell that awaits them in the real-world as they are expected to carry the load of the shirkers and share the credit.
All kidding aside, what does participation in this project actually teach that generalizes to anything else in a different domain? To the extent the students learned any generic creativity, explain how that creativity might generalize to a domain requiring knowledge the student doesn't have.
January 24, 2009
There are two kinds of special families: those with twins and those with adoptees. If you want to disentangle the effects of nature and nurture, one approach is to compare identical twins, who share all of their genes, to fraternal twins, who share only half. Another approach is to compare adoptees to members of their adoptive families. If identical twins are more similar than fraternal twins, we have strong reason to believe that the cause is nature. If adoptees resemble members of the families they grew up with, we have strong reason to believe that the cause is nurture.
By using — and refining — these twin and adoption methods, behavioral geneticists have produced credible answers to the nature-nurture controversy. To put it simply, nature wins. Heredity alone can account for almost all shared traits among siblings. "Environment" broadly defined has to matter, because even genetically identical twins are never literally identical. But the specific effects of family environment ("nurture") are small to nonexistent. As Steven Pinker, a professor of psychology at Harvard University, summarizes the evidence:
"First, adult siblings are equally similar whether they grew up together or apart. Second, adoptive siblings are no more similar than two people plucked off the street at random. And third, identical twins are no more similar than one would expect from the effects of their shared genes."
The punch line is that, at least within the normal range of parenting styles, how you raise your children has little effect on how your children turn out...
Recent scholarship does highlight some exceptions [but] the fact remains that people tend to greatly overestimate the power of nurture.
If family environment has little effect, why does almost everyone think the opposite? Behavior geneticists have a plausible explanation for our confusion: Family environment has substantial effects on children. Casual observers are right to think that parents can change their kids; the catch is that the effect of family environment largely fades out by adulthood. For example, one prominent study found that when adoptees are 3 to 4 years old, their IQ has a .20 correlation with the IQ of their adopting parents; but by the time adoptees are 12 years old, that correlation falls to 0. The lesson: Children are not like lumps of clay that parents mold for life; they are more like pieces of flexible plastic that respond to pressure, but pop back to their original shape when that pressure is released.
Bleak news indeed for the SES warriors.
This is why we see substantial IQ gains for some preschool programs whose effects fade by the end of elementary school. This is also why time should not be wasted in elementary school doing "developmentally appropriate" nonsense. There is a brief window of opportunity that needs to be taken advantage of.
January 23, 2009
I think we all agree that schools should be using current technology and teaching students how to use current technology. Kids should know how to use computers, and the internet, and various software packages, etc. I won't go on ad nauseam here; there are lots of education blogs that specialize in using technology.
Actually, what they often do is oversell the academic benefits of this technology. I love the irony of all these smart and knowledgeable people writing entries for other smart and knowledgeable people to read, telling how the only thing holding back dumb people from educational attainment is better access to technology. As if the ditch-digger down the street who can barely put together a coherent thought is capable of getting as much out of the internet as they do. There's a large gulf between those with only a little bit of knowledge and those with a lot, and google isn't capable of bridging it.
But let's not fight about that. There'll always be other posts for that. Let's find something I think we all agree upon: that schools are abusing the call for 21st century skills to draw attention away from the bad job they are currently doing.
Take, for example, today's top story in Massachusetts' Standard Times about Fairhaven High School's new plan to make itself into a 21st Century school, which promises to "equip students with the necessary knowledge and skills to be successful in today's world." As usual, the devil is in the details.
Schools seem to follow the same script. First they wipe the slate clean and give the impression that they were doing a marvelous job back in the 20th century, but that simply isn't good enough anymore.
"Even though we're facing these difficult economic times, our commitment to kids has to stay strong," said Fairhaven High School Principal Tara Kohler, who presented the plan to the School Committee last week.
"What we've always done isn't good enough anymore."
Was it ever good enough? I seriously doubt it. We call this change for the sake of change or, more specifically, change for the sake of not establishing a longitudinal track record of bad data. It's much harder to hit a moving target. And what's the real difference between yesterday's fad and today's? Very little.
After the slate is wiped clean, the new plan is presented in incomprehensible edu-jargon designed to sound much more impressive than it actually is.
The district's plan is centered on five goals:
* Ensuring all students have access to a quality education.
* Preparing students for a 21st-century job market and a global economy.
* Improving students' transitions to high school, thus increasing graduation rates.
* Increasing opportunities for students to take college courses and participate in internships or other school-to-career activities.
* Improving access to technology.
What the hell does this even mean? How is the school going to transform itself into a 21st Century school? Here's how.
While the high school has already made some progress toward these goals — a new computer lab was installed earlier this year and a new transition program for incoming freshmen was implemented — there is still a lot that can be done, according to Ms. Kohler.
They're installing a new computer lab.
That is so 1980s.
There's other assorted nonsense in the school's plan (foreign languages, assorted "green" nonsense, "Virtual High School" online courses, and creating an iPod mobile lab), but none of it is on par with the stuff the edu-tech bloggers get all giddy about.
But don't worry about that; the school is now a 21st Century school.
Sam Dillon of the NY Times breathlessly reports some real educational magic today:
[The] performance gap between African-Americans and whites on a 20-question test administered before Mr. Obama’s nomination all but disappeared when the exam was administered after his acceptance speech and again after the presidential election.
The inspiring role model that Mr. Obama projected helped blacks overcome anxieties about racial stereotypes that had been shown, in earlier research, to lower the test-taking proficiency of African-Americans, the researchers conclude in a report summarizing their results.
This one was so hot the Times couldn't wait for peer review. No need to waste time with that. The Obama effect is for real and it must be reported right this second.
In fairness, Dillon does mention that the study had not undergone peer review, but he provides only an incomplete and misleading account of the prior research on stereotype threat that might have tipped readers off as to the dubiousness of this latest study.
Here are the money grafs from the Wikipedia entry on stereotype threat:
Furthermore, while Sackett et al. do not dispute the fact that stereotype threat has a real, measurable effect on test scores, they posit that in the part of the experiment where Steele and Aronson removed the stereotype threat, the achievement gap which did remain correlated closely with the existing African American - White achievement gap on large-scale standardized testing such as the SAT. In their own words:
Thus, rather than showing that eliminating threat eliminates the large score gap on standardized tests, the research actually shows something very different. Specifically, absent stereotype threat, the African American-White difference is just what one would expect based on the African American-White difference in SAT scores, whereas in the presence of stereotype threat, the difference is larger than would be expected based on the difference in SAT scores.
In subsequent correspondence between Sackett et al. and Steele and Aronson, Sackett et al. wrote that "They [Steele and Aronson] agree that it is a misinterpretation of the Steele and Aronson (1995) results to conclude that eliminating stereotype threat eliminates the African American-White test-score gap."
In the past, researchers have been able to depress scores by introducing a stereotype threat (basically, the researchers told the test subjects that they belonged to a group of dummies). Removing the threat only brought scores back up to historic averages. They have not, however, been able to actually increase scores above historic averages.
This new research, in contrast, supposedly shows that test scores can be increased by such a large amount (somewhere between 0.5 and 1.0 standard deviations) as to wipe out the achievement gap that exists between blacks and whites. Educationally speaking, that's a giant effect size and a truly unprecedented result (if true). Indeed, the Obama effect must be extraordinary to achieve such a result.
It's basically the educational equivalent of cold fusion. The Times apparently forgot about that lesson in journalistic humility. And so much for acknowledging legitimate opposing viewpoints that might cast some doubts on these extraordinary, unprecedented findings. Get a load of the expert the Times dredged up, replete with some nice spin supplied by the Times.
“It’s a nice piece of work,” said G. Gage Kingsbury, a testing expert who is a director at the Northwest Evaluation Association, who read the study on Thursday.
But Dr. Kingsbury wondered whether the Obama effect would extend beyond the election, or prove transitory. “I’d want to see another study replicating their results before I get too excited about it,” he said.
Kingsbury wants to see replication -- a perfectly reasonable response. And the Times spins a potential failure to replicate as possibly being caused by "transitory" effects. The expert provides no indication that the results might be in doubt (in fact, he praises the study as "a nice piece of work") and the Times provides no indication that any doubt exists.
The impression I get from the Times article is that the Obama effect is for real, pending peer review, but might fade due to its transitory nature. The Obama legacy is already being written.
Keep your eyes on test results this summer. If we are to believe the Times the achievement gap should be eradicated due to the Obama effect. NCLB will turn out to be a smashing success. And, the world will be a happier place. Unless those nasty transitory effects dash our hopes once again.
January 22, 2009
RWP also has a post demonstrating the same thing with not one, not two, but three business problems.
And Pondiscio has, I believe, the best post of the week, showing how much of President Obama's inaugural speech you missed out on if you lacked the needed historical and literary content knowledge.
For all you connectivists out there I see a pattern emerging.
To paraphrase Edward G. Robinson -- "Where's your google now, nyahhhh?"
January 21, 2009
Education has long been enamored with higher order or inference questions. While educators are correct in calling for numerous inferential items, they must realize that large doses of inferential items will not necessarily improve students' inference skills, especially for instructionally naïve students. The problem occurs when the inference items assume knowledge and skills the students do not have; and yet, teacher guidance is not provided. For example, consider the inference item: "As the location of the subatomic particle becomes more precise, what would you infer about its momentum?" Or consider this example: "When John walked out onto the street, he nictitated rapidly." Where do you think John has been? These are inference items, but working many items similar to these would not improve an average adult's skill in drawing inferences. Similarly, exposing students, especially low-performing ones, to inference items is not sufficient. The items must be carefully selected and sequenced, and careful instruction must be provided. Students must know relevant vocabulary, assumed relationships, and how to draw inferences if practice exercises are to be helpful.
Direct Instruction Reading, 4th edition, p. 237, n.1.
I'm wondering what the readers' reactions are since the opinions seem to span the spectrum.
Here's the problem (with my selection of 38.5 m as the distance from the kick point to the cross bar):
A ball is kicked from a point 38.5 m away from the crossbar. The top of the crossbar is 3.05 m high. The ball leaves the ground with a speed of 20.4 m/s at an angle of 52.2º to the horizontal. (The usual assumptions apply: uniform earth gravity, no drag or wind, the ball is a point.)
a. By how much does the ball clear or fall short of clearing the crossbar?
b. What is the vertical velocity of the ball at the time it reaches the crossbar?
Here's a graphic representation of the problem:
We know that the ball is kicked at an initial speed of 20.4 m/s at an angle of 52.2º. As such, the ball will be travelling both upward (vertical component) and to the right (horizontal component). Gravity will also be acting on the ball, pulling (accelerating) it back down to earth.
So, our problem is governed by the "what goes up, must come down" law of motion. In our case the ball gets kicked. It goes up. Then it comes back down. In between the going up and the going down parts of the trip, the ball reaches its maximum height.
We can easily calculate the maximum height (vertical distance travelled), the time it takes the ball to reach this maximum height, and the horizontal distance the ball travels during this time. Once we know these values we can determine where the crossbar is in relation to the maximum height reached by the ball. Does the ball pass the crossbar while it's going up or while it's going down? Both possibilities are shown in the picture.
Once we figure this out, then we'll worry about solving the rest of the problem.
So let's get to it and use some 21st Century skills. But, notice the amount of domain specific critical thinking that had to be done beforehand.
Let's google "projectile motion calculator" and select the first link.
Pay Dirt -- a projectile motion calculator presented as an HTML form. Filling out an HTML form seems to be a good example of a 21st Century skill, so let's use it.
The first two input fields call for an initial velocity (in m/s) and an angle (degrees). We know those values -- 20.4 and 52.2. Let's enter those values and select Horizontal Component of initial velocity, Vertical Component of initial velocity, Max Height, Time to reach max height, Time of flight, and Range of flight as the things we want the calculator to calculate. The more the merrier. Computation time in the 21st century is cheap and in this case nearly instantaneous.
Here's what the calculator spits out:
Horizontal Component of initial velocity: 12.503 m/s
Vertical Component of initial velocity: 16.119 m/s
Max Height: 13.256 m
Time to reach max height: t = 1.645 s
Total Time of flight = t = 3.290 s
Total distance travelled = 41.131 m
Wow. That saved me quite a bit of effort remembering and applying algebra, trigonometry, and physics. I think I like these fancy 21st Century skills.
Now let's put our critical thinking caps back on and make some sense of these numbers.
The calculator told us that the total time of flight (3.290 s) is exactly double the time it took to reach the maximum height (1.645 s), i.e., it takes the same amount of time for the ball to travel up as it does to travel back down. This means that the horizontal distance covered in reaching the maximum height is half of the total distance travelled: 41.131 m ÷ 2 = 20.5655 m.
Since the crossbar is 38.5 m away from the starting point and the maximum height is reached at 20.5655 m, the ball reaches its maximum height before it reaches the crossbar. So, the ball will be on its way back down to earth when it crosses the crossbar. This condition is shown in the picture as the crossbar on the left-hand side.
At the maximum height the ball is 13.256 m high, and the crossbar is only 3.05 m high. The question is whether the ball will still be above the crossbar when it reaches the crossbar's horizontal distance.
First we have to figure out how much farther the ball has to travel horizontally between the maximum height (20.5655 m) and the distance to the crossbar (38.5 m). This is simple subtraction (38.5 − 20.5655): 17.9345 m. The ball has 17.9345 m left to travel horizontally before it reaches the crossbar.
The calculator tells us that ball is travelling at a horizontal speed of 12.503 m/s.
I'm hoping that most people know that distance travelled = speed × time travelled, or time travelled = distance travelled ÷ speed. In our case, the time it takes to traverse those 17.9345 m is (17.9345 m ÷ 12.503 m/s) 1.434 s.
Now the question is how far will the ball fall down during those 1.434 s.
That seems like real physics to me. So let's see if we can find another calculator to do the dirty work. As luck would have it, the second link of our Google search has just what we need. This page has a bunch of different projectile motion calculators. Again, as luck would have it, the top one, "freefall," is the one we need (though you probably wouldn't know that unless you knew some physics).
The freefall calculator asks only for a time of travel. We just calculated that time as 1.434 s. Punching that number in, we find out that:
The vertical speed of the ball is 14.0532 m/s downward and the ball travels 10.076 m. So the ball falls 10.076 m from its max height of 13.25 m, which means the ball is (13.25 m − 10.076 m) 3.174 m above the ground when it reaches the crossbar. The crossbar is only 3.05 m high, so the ball just barely clears the bar.
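The whole clearance check can also be done end to end in a few lines. This is a hedged sketch, not one of the calculators from the post; it assumes my chosen X = 38.5 m, the launch values from the problem, and g = 9.8 m/s². Carrying full precision instead of the rounded intermediate values gives a clearance of about 0.12 m and a downward speed of about 14.1 m/s, consistent with the hand-worked numbers:

```python
import math

# Problem inputs (X = 38.5 m is my chosen value; g = 9.8 m/s^2 is assumed)
x_bar = 38.5   # horizontal distance to the crossbar, m
h_bar = 3.05   # height of the crossbar, m
v0 = 20.4      # launch speed, m/s
g = 9.8
theta = math.radians(52.2)

vx = v0 * math.cos(theta)   # horizontal velocity (constant in flight)
vy = v0 * math.sin(theta)   # initial vertical velocity

t_bar = x_bar / vx                          # time when the ball is over the bar
y_bar = vy * t_bar - 0.5 * g * t_bar**2     # ball height at that moment
clearance = y_bar - h_bar                   # positive means the ball clears
v_vert = vy - g * t_bar                     # vertical velocity (negative = down)

print(f"height at crossbar: {y_bar:.2f} m")
print(f"clearance: {clearance:.2f} m")
print(f"vertical velocity: {v_vert:.2f} m/s")
```

Note the slight rounding differences: working from the calculator's truncated intermediates gives 3.174 m and 0.12 m, while the full-precision version gives 3.17 m and about 0.125 m. Either way, the ball just barely clears.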
Now we have all the information we need to answer the problem.
Answer A: The ball clears the bar by (3.17 m - 3.05 m) 0.12 m.
Answer B: The ball's vertical velocity is 14.1 m/s downward according to the calculator.
We can check our answer by filling in our numbers in the "will it clear the fence" calculator toward the bottom of the page. The answer does indeed check out.
Done and done.
We'll discuss the results in another post.
January 20, 2009
[R]eading strategies, “child-centered” teaching and differentiated instruction are joined at the hip. When you have heterogeneously grouped classes, all reading self-selected books based on individual interest and reading level, then instruction can no longer be about the text. It has to be about a generalized skill that can be applied to any text. Or any child.
Today's predominant balanced literacy instruction precludes teaching content knowledge.
In fact, I'd add the following codicil.
Whole language, reading strategies, “child-centered” teaching and differentiated instruction are joined at the hip. When you have heterogeneously grouped classes, all reading self-selected books based on individual interest and reading level based on readability formulas, then instruction can no longer be about the content and decodability of the text. It has to be about a generalized skill that can be applied to any text. Or any child. And, the instructional advantages conferred by using phonics are greatly diminished.
Let's call it the Pondiscio-DeRosa rule of reading instruction.
In the episode Lisa decides she wants to learn how to tap dance despite not being very good at these kinds of athletic endeavors. Lisa signs up for classes at a dance studio run by Little Vicki, a former child star along the lines of Shirley Temple. Little Vicki personally teaches the children's tap dancing class.
Vicki: Now, the key to great dancing is one word: tappa-tappa-tappa.
[the children exchange looks]
Vicki [demonstrating]: Tappa-tappa-tappa.
[the children try]
Many tappa-tappa-tappas later, the class progresses nicely. Then Vicki notices someone off the beat. Vicki has Lisa dance by herself. She loses her balance and falls over backwards into a potted plant. The class laughs at her.
Vicki: Children, stop it! For all you know, she has a medical condition.
Vicki: I see....
Lisa continues to have trouble staying upright. Lisa blames it on the floor and moves to one end of the room, where she falls down again. Little Vicki dismisses the class for five minutes to talk to Lisa.
Lisa: What am I doing wrong, Little Vicki?
Vicki: Well, you're falling a lot. Maybe you should work on that.
Lisa: Yeah, well, no offense, but maybe I need a little more instruction than just "tappa-tappa-tappa".
Vicki: Why, back when I was your age, I had 43 movies under my belt, and I had to do it without tappa-tappa-tappa. I would've killed for tappa-tappa-tappa.
Lisa: Sorry, I'm just frustrated.
Vicki: Well, you'll never save Grandpa's farm with that attitude! You've just got to turn that frown upside-down!
Vicki: That's a smile, not an upside-down frown! Work on that, too!
Much of what goes on in elementary education is just a bunch of tappa-tappa-tappa. "Try reading this non-decodable children's book. Can't do it? Then try looking at the pictures or the first letter of each word for cues. Still can't do it? Then it's time to call in the reading specialist."
Here's how the segment ends.
After class Lisa goes home and dejectedly walks through the kitchen, still wearing tap shoes:
Marge: There's our Broadway baby!
Marge: Hey, dig that crazy rhythm!
Lisa: I'm just walking. Listen, I know I said I wanted to be a dancer, but--
Marge: And you WILL be a dancer!
Homer: Look at you, all sugar and spice instead of equations and test tubes. [puts Lisa on his lap] You're Daddy's precious dancing queen.
Marge: And you look adorable!
Homer: Now, honey, what were you trying to say before we kept interrupting with our loving proudness?
Marge: Yes, our tiny tapper, what was it you were going to say after "I wanted to be a dancer, but"?
Lisa: But, I just... [looks at her proud parents] ...need more practice! See ya! [exits the kitchen, with her shoes tapping]
Homer: Oh, what's that awful sound?
Marge: The furnace?
Lisa: It's me!
Homer & Marge: Aww....
January 17, 2009
Starting in the late 1960s, Siegfried Engelmann led a government-sponsored investigation, Project Follow Through, that compared nine teaching methods and tracked their results in more than 75,000 children from kindergarten through third grade. It found that the Direct Instruction (DI) method of teaching reading was vastly more effective than any of the others for (drum roll, please) poor kids, including black ones.
This is true as far as it goes, but it's only part of the story: the results from PFT were a bit more conclusive than McWhorter lets on. Here's how Zig describes the results in chapter 5 of his last book.
The evaluation had three categories: basic skills, cognitive (higher-order thinking) skills, and affective responses.
The basic skills consisted of those things that could be taught by rote—spelling, word identification, math facts and computation, punctuation, capitalization, and word usage. DI was first of all sponsors in basic skills...Only two other sponsors had a positive average. The remaining models scored deep in the negative numbers, which means they were soundly outperformed by [the control group].
DI was not expected to outperform the other models on “cognitive” skills, which require higher-order thinking, or on measures of “responsibility.” Cognitive skills were assumed to be those that could not be presented as rote, but required some form of process or “scaffolding” of one skill on another to draw a conclusion or figure out the answer. In reading, children were tested on main ideas, word meaning based on context, and inferences. Math problem solving and math concepts evaluated children’s higher-order skills in math.
Not only was the DI model number one on these cognitive skills; it was the only model that had positive scores for all three higher-order categories: reading, math concepts and math problem solving. DI had a higher average score on the cognitive skills than it did for the basic skills...
Not only were we first in adjusted scores and first in percentile scores for basic skills, cognitive skills, and perceptions children had of themselves, we were first in spelling, first with sites that had a Headstart preschool, first in sites that started in K, and first in sites that started in grade one. Our third-graders who went through only three years (grades 1-3) were, on average, over a year ahead of children in other models who went through four years—grades K-3. We were first with Native Americans, first with non-English speakers, first in rural areas, first in urban areas, first with whites, first with blacks, first with the lowest disadvantaged children and first with high performers.
You see the problem? DI wasn't just effective with poor/black students; it was effective with all students. In the long run this success is going to militate against eliminating achievement gaps, though a relative long-term reduction might be possible.
DI isn't exactly complicated: Students are taught to sound out words rather than told to get the hang of recognizing words whole, and they are taught according to scripted drills that emphasize repetition and frequent student participation.
On a superficial level, DI does not appear to be complicated. But DI requires a lot more than "sounding out words" and "scripted drills" to work effectively; these are superficial features of DI. Someone with a deep understanding of DI would focus on other features, such as mastery learning.
Matt Yglesias basically agrees with McWhorter's somewhat superficial analysis but cautions:
A word of caution I would offer is that the rhetoric in the column seems, in my view, to oversell this fix. I think it’s important not to set people up to believe that some proposed change is a silver bullet when that just sets the stage for a potential future backlash. Based on what we know, it would be much better in general—and especially for poor kids—to do more direct instruction.
Yglesias is also confused. Like McWhorter, he identifies phonics as the primary component of DI's success. This isn't accurate. Moreover, McWhorter is talking about Direct Instruction while Yglesias is talking about direct instruction; the two are not the same. McWhorter is overselling DI a bit, but he's also underselling it. So Yglesias's concerns are only partially probative here. Yglesias's concern about a “potential future backlash” appears to be based on the following:
Even the most egalitarian countries have statistically meaningful achievement gaps, and the United States is far from being the most egalitarian country.
Believing that egalitarian policies, or the lack thereof, are somehow the cause of achievement gaps is a good example of what you get when you use correlational studies without understanding the underlying issues. Just because poverty and educational achievement are correlated, and egalitarian policies and achievement are also correlated, does not mean that egalitarian policies cause increased student achievement. See La Griffe's latest analysis for why this is not so.
Yglesias does appear to stumble upon the right answer in the end though:
There’s no “solution” to the general existence of achievement gaps. There are, rather, policies that can be effective in narrowing them and this is one.
Even though the rest of his comment is wrong. Funny how that works sometimes.
January 16, 2009
The Philadelphia Inquirer reports that:
Just one in three low-income students eligible for free or reduced-price breakfasts got those meals in Philadelphia schools during the 2006-07 school year, according to a national report released yesterday.
Since the 2006-07 school year, however, the district has worked to improve the situation, a district spokesman said. But problems persist in the surprisingly complex and nuanced world of school breakfast.
So now serving breakfast is too complex and nuanced for schools to deliver.
Remember the hullabaloo a few months back regarding community schools and the desire for schools to take over a whole panoply of social services?
Now we find out they can't even serve breakfast effectively. And, breakfast is very similar to lunch, something they've been doing for a long time.
And I don't want to hear that the schools don't get enough subsidy for these meals. We went to Disney World last month, where the Disney Dining Plan cost $9.99 for children and included a counter-service lunch, a full table-service restaurant dinner, and a snack worth about $3.50 that could be almost any food product sold by Disney, including many breakfast items.
Contrast this to the french fries served by my high school whose box read "no nutritional value but edible." Mmmmm-mmmm.
January 14, 2009
Can you think critically in a domain in which you do not have a deep understanding?
Some in the edusphere think you can.
I think lack of domain knowledge will seriously handicap your ability to think critically in that domain.
Instead of arguing and accomplishing nothing, why don't we collect some data?
And I urge everyone in the edusphere who has an opinion in this area to think up their own example.
Below is a simple physics problem that requires some domain knowledge in physics, algebra, and arithmetic to solve. And, unless you've seen a problem very similar to the one below, I think you'll need a somewhat deep understanding to come up with a solution.
At least before the 21st century and the internet.
I've taken 2 years of physics (one in high school and one in college) and a semester of dynamics. I have some domain knowledge in physics. But, I took these courses over 20 years ago and I haven't had much need to solve these kinds of problems since then. However, I still remember how to solve these problems with a little prompting because I solved a few hundred similar problems two decades ago. It's in long term memory at this point.
And, indeed, I was able to solve the problem in about five minutes with a pencil, paper, and a calculator with trigonometric capabilities.
Then I turned to the internet and tried to solve it using knowledge I could extract therefrom. Not surprisingly, the internet offered a cornucopia of information that greatly assisted me and made the job of solving the problem much easier. I didn't have to use any algebra, nor did I have to do any manual computation. And the tools I found had brief descriptions of the underlying physics, so I'm thinking a non-expert might be able to use them without really knowing or understanding the underlying physics all that well.
Again, I was able to solve the problem using the internet in about five minutes. But, again, I have domain knowledge.
In theory, the problem can be solved using general critical thinking skills and a basic understanding of how the world works if one has access to the internet.
Let's put that theory to the test.
Here is a simple physics problem that any high school student taking physics is capable of solving. You've probably taken high school physics, but you probably don't remember much of it if you didn't go on to take more physics in college.
See if you can use your critical thinking skills to solve it. Use your 21st century skills to help you solve it. Here it goes.
A ball is kicked from a point X m away from the crossbar, where X is a number between 38 and 39 that you select. The top of the crossbar is 3.05 m high. The ball leaves the ground with a speed of 20.4 m/s at an angle of 52.2º to the horizontal. (The usual assumptions apply: uniform earth gravity, no drag or wind, and the ball is a point.)
a. By how much does the ball clear or fall short of clearing the crossbar?
b. What is the vertical velocity of the ball at the time it reaches the crossbar?
Just pick a random number between 38 and 39 and off you go. (I did this so that everyone would get a different answer).
It may be that I picked too simple a problem. (I always underestimate the amount of math that non-science/engineering majors know, and I didn't want to make the problem too hard.) But I think what the problem will demonstrate is that without domain knowledge in physics you will be overwhelmed by the problem and will have much difficulty doing all the steps needed to answer the questions.
(I did not find any tools that allowed you merely to plug in the numbers and pop out an answer. That would not require any critical reasoning.)
Leave your answers and comments in the comments of this post. Don't forget to tell us what number you selected (between 38 and 39).
And for all of you edusphere bloggers: why don't you use your specific domain knowledge to come up with your own critical thinking problems to test and challenge your readers, and your own opinions on critical thinking and 21st century skills?
Update: No fair using pre-21st-century skills such as asking someone else for help, for a complete solution, or to learn the specific domain knowledge needed. I wouldn't have been able to choose such a simple problem if I allowed that, and I certainly could have found a more difficult problem for which you could not have learned the domain knowledge in a reasonable amount of time.
Don't spend more than about 20 minutes or so solving the problem.
Also, don't give away too much information in your comments. We don't want other readers using your methodology to find their own solutions. But do tell us the amount of domain knowledge you think you possess. And do tell us if you can't solve the problem within the constraints.
I will make a new post in a few days where you'll be able to leave more detailed information as the discussion develops.
January 13, 2009
If anything, the new model has some potential advantages in keeping students fully engaged during class. But that advantage is contingent on the student being capable of being kept engaged, which requires that the student prepare for class by reading and thinking about the material beforehand. This is not something the non-physics M.I.T. students are willing to do, at least according to the article.
I just don't see how an unprepared student can effectively participate in a problem-solving activity with only a few minutes of teacher talk.
Here's a good example of what I'm talking about. Projectile motion is one of the first and easiest topics in introductory physics. See if you can read the explanation and then solve the problems in real time. Now go find five friends who don't know physics and see if it helps when you try together. The example provides a good introductory explanation, and I'm sure someone knowledgeable in physics, like a grad student, would help out a bit. But I'm thinking it would be a lot more helpful if you were presented the initial explanation first, then given a few hours to wrap your head around the material and try your hand at solving a few problems. Then you would know exactly what you knew and what you needed more help with. That's when the grad student comes in handy. And that's why the traditional way physics is taught works well for those who do the work. It also tells us that there are no shortcuts, no matter what M.I.T. and the NYT want you to believe.
Introductory Physics is typically a four-credit class (actually, two semesters of four credits each). This means there are four hours of classroom instruction each week. Typically, these courses are followed by (at least) a two-credit lab course in which students take what they've learned and conduct highly scripted physics experiments.
Here's how Introductory Physics is traditionally presented in a typical school week.
1. Student reads the next section in the textbook.
2. Professor explains the section, provides his insights, and works some problems from the section. (one hour)
3. Student attempts to solve the assigned problems from the section, either alone or in a study group.
4. Grad Students work with students in small groups reviewing the problems to ensure the student understands the material. (one hour)
5. Repeat steps 1-4 for next section. (two hours)
That's how it's supposed to work. In practice, it usually goes a little differently.
1. Student fails to read next section or doesn't understand next section.
2. Professor reviews exactly what is written in the textbook without providing any insight that the student could have gotten by reading the book on his own.
3. Student fails to do some, many, or all of the assigned problems either through lack of understanding or laziness.
4. Grad student reviews problems and student copies answers.
Note that the student is supposed to be spending at least eight hours a week studying and solving problems outside of the classroom. Some students front-load the work, doing it before the material is presented and the problems are reviewed, as in the first example. Other students back-load the work, doing it after the material is initially presented and the problems solved, as in my second example. Most students, however, fall somewhere in the middle.
This is basic direct instruction using worked problem examples. Research shows that it is an effective way to teach novice students, and, by definition, students taking an introductory Physics class are novice students. These students have a long way to go before they are experts in Physics. Using worked problem examples is less effective with experts and non-novices with considerable domain knowledge. The difference is domain knowledge. The experts have it; the novices do not, at least not yet.
In fact, they won't have it the following semester either, when they take Physics lab. Undergraduate Physics lab is closer to baking a cake from scratch than it is to real science. It is a highly scripted affair because the students do not yet know enough physics, or how to conduct a real experiment on their own. They are novice scientists, and the lab provides another opportunity to follow worked problems. In this case, the worked problems are the scripted experiments.
I think now we have enough background knowledge to make some sense out of the Times article.
For as long as anyone can remember, introductory physics at the Massachusetts Institute of Technology was taught in a vast windowless amphitheater known by its number, 26-100
The physics department has replaced the traditional large introductory lecture with smaller classes that emphasize hands-on, interactive, collaborative learning. Last fall, after years of experimentation and debate and resistance from students, who initially petitioned against it, the department made the change permanent. Already, attendance is up and the failure rate has dropped by more than 50 percent.
Right off the bat, I find it hard to believe that all four hours of classroom time in Introductory Physics took place in a large amphitheater. Are there any science/engineering majors out there, especially those who attended M.I.T., who were taught like this? Undergraduate physics instruction is all about learning how to solve basic physics problems. By necessity, this involves the student working hundreds of problems over the course of the semester, on his own or with a study group. There's no getting around that fact. Even the most direct instruction of courses requires that the student work the problems on his own, following an introduction by an expert (the professor) and concluding with a review of the problems with an expert (a grad student or the professor). This requires motivation on the part of the student. And that appears to be a problem at M.I.T., as you'll soon see.
Also note the petitioning of the students and the failure of the Times to get to the bottom of it. We'll get to that later.
The traditional 50-minute lecture was geared more toward physics majors, said Eric Mazur, a physicist at Harvard who is a pioneer of the new approach, and whose work has influenced the change at M.I.T.
“The people who wanted to understand,” Professor Mazur said, “had the discipline, the urge, to sit down afterwards and say, ‘Let me figure this out.’ ” But for the majority, he said, a different approach is needed.
I think Professor Mazur is delusional. Physics forms a critical foundation for most students learning a hard science or engineering. Subsequent courses will build off what is learned in introductory physics, and the physics problems will be revisited and expanded upon often in subsequent years. So a student who does not possess the urge to put in the hard work necessary to learn physics is in for a rude awakening sophomore year. The years of coddling in high school are over; now is the time for real work.
“Just as you can’t become a marathon runner by watching marathons on TV,” Professor Mazur said, “likewise for science, you have to go through the thought processes of doing science and not just watch your instructor do it.”
That's stating the obvious now, isn't it? And I find it hard to believe that M.I.T. students were merely watching their instructor solve problems for four hours every week in a large amphitheater and not actually solving their own problems. I'm sure somewhere along the line students were being assigned problems to work out of class, and that some time in class was spent reviewing those problems and their solutions. Are we to believe that only the physics majors were doing their homework?
Then we have this non-sequitur.
In an article in the education journal Change last year, Dr. Wieman noted that the human brain “can hold a maximum of about seven different items in its short-term working memory and can process no more than about four ideas at once.”
“But the number of new items that students are expected to remember and process in the typical hourlong science lecture is vastly greater,” he continued. “So we should not be surprised to find that students are able to take away only a small fraction of what is presented to them in that format.”
What does this have to do with anything in this article? The magic number 7 is a problem under both the old way and the new way at M.I.T. Either way, the students are presented with more than they can absorb; that's why they take notes and write stuff down. Students really bump up against the short-term memory problem when they try to solve the problems before they have learned the underlying material. The new way of teaching doesn't fix that problem. The only thing that fixes it is lots of practice solving problems. Are the students getting more practice under the new system? Let's see.
At M.I.T., two introductory courses are still required — classical mechanics and electromagnetism — but today they meet in high-tech classrooms, where about 80 students sit at 13 round tables equipped with networked computers.
Instead of blackboards, the walls are covered with white boards and huge display screens.
Why are journalists such suckers for bright lights and fancy gizmos? I've yet to see any of this technology used in a way that is pedagogically superior to a blackboard and a slide projector.
Circulating with a team of teaching assistants, the professor makes brief presentations of general principles and engages the students as they work out related concepts in small groups.
Teachers and students conduct experiments together. The room buzzes. Conferring with tablemates, calling out questions and jumping up to write formulas on the white boards are all encouraged.
This is the money graf. Here's where we find out that M.I.T. hasn't really done away with the lecture; they've just shuffled the chairs. Instead of an hour of lecture in a classroom followed by an hour of problem solving in small groups with grad students, M.I.T. now has the professor lecture for a short period of time, then the students solve problems for a short period of time with the help of the professor and grad students in the same room, repeating until the class is done. What's the difference?
I don't see the advantage, except maybe that the lazy students are being forced to do the work under the watchful eyes of the instructors instead of copying the problems they should have worked out before a later problem solving period. But, since they're working in groups now, there's no guarantee that they're not free-riding off of their neighbors instead of free-riding off of the grad student in the separate recitation period.
What the article describes is exactly what was going on in our problem solving classes with our grad student after our lecture. The only change is that we had blackboards.
M.I.T. hasn't done away with the lecture; they've merely rearranged it in a way that is no sounder in a cognitive science sense than it was before. What M.I.T. is doing is providing another year of coddling. It's also still direct instruction. (Though I can't wait to see how Stephen Downes is going to try to spin it.)
And, the students aren't experimenting, they are solving problems. There is a big difference.
“There was a long tradition that what it meant to teach was to give a really well-prepared lecture,” said Peter Dourmashkin, a senior lecturer in physics at M.I.T. and a strong proponent of the new method. “It was the students’ job to figure it out.”
Our professor gave some really good lectures and then he ran one of the problem solving sessions. Is there a difference?
Apparently the problem is really an attendance problem.
John Belcher, a space physicist who arrived at M.I.T. 38 years ago and was instrumental in introducing the new teaching method nine years ago, was considered an outstanding lecturer. He won M.I.T.’s top teaching award and rave reviews from students. And yet, as each semester progressed, attendance in his introductory physics courses fell to 50 percent, as it did, he said, for nearly all of his colleagues.
“M.I.T. students are very busy,” Professor Belcher said. “They see the lecture as dispensable, that is that they can get it out of a book more efficiently than getting up, getting dressed and going to lecture.”
After three years, Professor Belcher had had enough. “I had poor attendance, and was failing 10 to 15 percent, and grading the tests and shaking my head in despair about how little was getting across,” he said. “And this is a subject — electromagnetism — that I love.”
Here's the thing. The problem-solving sessions are critical to success; the lectures less so, provided the textbook presents the material well and the student reads it beforehand. This is especially true if the professor is a bad teacher. Under the new system the students are forced to endure the gas-bag "initial presentations," i.e., mini-lectures, to get to the problem-solving part.
Maybe that's why the students are petitioning. The silly lectures are no longer optional for the student. That's why attendance is up. Here's another reason why attendance is up:
Unlike in the lectures, attendance counts toward the final grade, and attendance is up to about 80 percent.
I suppose the clickers don't hurt.
One of the newer professors, Gabriella Sciolla, who arrived in 2003, was teaching a TEAL class on circuits recently. She gauged the level of understanding in the room by throwing out a series of multiple-choice questions. The students “voted” with their wireless “personal response clickers” — the clickers are essential to TEAL — which transmitted the answers to a computer monitored by the professor and her assistants.
“You know where they are,” Professor Sciolla said afterward. She can then adjust, slowing down or engaging students in guided discussions of their answers, as needed.
Lecturing in 26-100, she said, she could only look out at the sea of faces and hope the students were getting it.
Unless they had clickers, that is; if they had clickers in the lecture hall, the professor would get the same feedback.
What I see here is a distinction without a difference. The learning is no more active than it was under the old system.
What is left unexplored by the Times is why the students were protesting. Students are no fans of boring lectures, and I'm sure under the old system there was plenty of in-class problem solving and opportunity for feedback. Under the old system students were supposedly left on their own to solve difficult physics problems; now they get to do the same thing in a high-tech classroom with all their classmates and lots of teaching assistants milling around. So why the protests?
I suspect once we learn why, we'll get a better idea of the problems of the new system at M.I.T.
January 9, 2009
BTW, Willingham also has a new article on memory in the new American Educator which you should also read.
And would you believe that I actually beat the inestimable Core Knowledge Blog in posting the video? Take that, Pondiscio.
January 8, 2009
Not surprisingly, popular spelling instruction practices are based on flimsy pseudo-science:
One common perception we have encountered is that visual memory, analogous to taking a mental picture of the word, is the basis of spelling skill. Teachers often tell us that they teach spelling by encouraging whole-word memorization (e.g., using flashcards and having students write words 5 or 10 times) or by asking students to close their eyes and imagine words. We’ve encountered this perception that spelling relies on visual memory so many times that we became curious about when and how it originated—after all, it’s a far cry from Webster’s spellers. We traced it back to the 1920s: one of the earliest studies to stress the role of visual memory in spelling was published in 1926, and it found that deaf children spelled relatively well compared with normal children of similar reading experience.4 Based on this study, and the perception that the relationship between sounds and the letters that spell them is highly variable, many people concluded that learning to spell is essentially a matter of rote memorization. Thus, researchers recommended that spelling instruction emphasize the development of visual memory for whole words
Right, let's have them visually memorize whole words because there couldn't possibly be any other helpful information they could use. Apparently not; it's a common belief that English is a highly irregular language. The article lays that trope to rest.
This is a question we hear often. If English spelling were completely arbitrary, one could argue that visual memorization would be the only option. However, spelling is not arbitrary. Researchers have estimated that the spellings of nearly 50 percent of English words are predictable based on sound-letter correspondences that can be taught (e.g., the spellings of the /k/ sound in back, cook, and tract are predictable to those who have learned the rules). And another 34 percent of words are predictable except for one sound (e.g., knit, boat, and two). If other information such as word origin and word meaning are considered, only 4 percent of English words are truly irregular and, as a result, may have to be learned visually (e.g., by using flashcards or by writing the words many times).
Far from being irregular and illogical, to the well-known linguists Noam Chomsky and Morris Halle, English is a “near optimal system for lexical representation.”
There are three types of information that, once learned, make spelling much more predictable: (1) word origin and history, (2) syllable patterns and meaningful parts of words, and (3) letter patterns.
It doesn't take a rocket scientist to figure this out. Research shows that children misspell irregular words more often than regular words. That should have been a good indication that visual memorization might not have been the best way to go.
The other thing is: wouldn't using word origin and history, syllable patterns and meaningful parts of words, and letter patterns to spell words involve using and practicing critical thinking skills -- dare I say 21st Century skills -- rather than brute memorization? Just sayin'.
The article is a good read. The only weak part is when the authors make some untested recommendations as to how they think spelling should be taught. At best, these recommendations are representative examples of what might possibly be good practice once someone takes the time to develop and test a suitable instructional sequence. But, that work has not yet been done and the authors are a wee bit overconfident that their recommendations will be effective.
January 7, 2009
Granted, the 21st-century skills idea has important business and political advocates, including President-elect Barack Obama. It calls for students to learn to think and work creatively and collaboratively. There is nothing wrong with that. Young Plato and his classmates did the same thing in ancient Greece. But I see little guidance for classroom teachers in 21st-century skills materials. How are millions of students still struggling to acquire 19th-century skills in reading, writing and math supposed to learn this stuff?
There are ways, some teachers tell me. Tim Burgess, a physics and chemistry teacher in Alabama, said he tried coaxing students to think for themselves. He laid out clues and let students sort them out together -- and it worked. "Suddenly, it became clear how 21st-century thinking was far more important than the mounds of content we were expected to force-feed our victims (I mean students)," Burgess said.
The 21st Century skills movement is nothing more than an excuse for continuing not to teach content under the mistaken belief that if you teach students how to think (i.e., how to learn how to learn), content becomes irrelevant given internet access.
Unfortunately that's not the way it works. Critical thinking skills are domain specific. If you want to think critically about the American Civil War you unfortunately need to know a lot of stuff about American history, European history, military history, the American Civil War itself, and lots of other boring stuff like that.
This doesn't necessarily mean that students need to spend lots of time memorizing minutiae, but they should at least know enough general knowledge to be able to pass those silly internet tests that embarrassingly show that today's (and yesterday's) students don't, in fact, know this stuff. There must be some sort of mental framework in place for Google or Wikipedia to be useful.
Instantaneous access to information doesn't guarantee that one will know what to do with the information after it's located.
Although, I think there is one useful 21st Century skill that students should be taught: how to set the clock on their VCRs to lose that technological incompetence badge of shame:
Or maybe not.
(That lame ending joke is actually a good example of what I'm talking about. It depended upon my knowing a few pieces of minutiae: 1. that there is an annoying blinking HTML tag (something I've known for some time) and 2. that VCRs are no longer being manufactured (something I learned last week). And it depended on my being able to quickly retrieve those facts in real time as an example of obsolete skills, the importance of knowing facts, and critical thinking (the ability to synthesize those facts to make the joke) to end the post. The other point is that a good comedian should never have to explain his jokes. I leave it up to you to deconstruct that one in the comments.)
Update 1: Apparently, IE doesn't properly display the BLINK HTML tag. That's probably a good thing. Use your imagination.
Update 2: Willingham beat me to the punch. "But these 21st-century skills require deep understanding of subject matter, a fact that these reports acknowledge, albeit briefly. As I have emphasized elsewhere, gaining a deep understanding is, not surprisingly, hard. Shallow understanding requires knowing some facts. Deep understanding requires knowing the facts AND knowing how they fit together, seeing the whole. It’s simply harder. And skills like “analysis” and “critical thinking” are tied to content; you analyze history differently than you analyze literature, a point I’ve emphasized here. If you don’t think that most of our students are gaining very deep knowledge of core subjects—and you shouldn’t—then there is not much point in calling for more emphasis on analysis and critical thinking unless you take the content problem seriously. You can’t have one without the other."
January 5, 2009
Here's how gifted education is supposed to work according to the statute:
1. Student is identified as being gifted, i.e., an IQ of two standard deviations above the mean (with some leeway which allows schools to fudge the results a bit for students just missing the cutoff).
2. The gifted student's present level of educational performance is then determined to see where the student is academically. For example, a third grade student might be reading on a fifth grade level and doing math on a fourth grade level.
3. Then the student's instruction is supposed to be specially designed, i.e., individualized, to meet the needs of the student.
4. Annual goals (what the student is supposed to learn this year) and short term learning objectives (the steps the student is to take to learn the goals) are then developed.
5. And the whole plan is memorialized in a written document (GIEP) which must be approved by the student's parents.
That's how things are supposed to work in theory. In actuality, things typically work a little differently. Here's how it works in practice in most school districts:
1. Student is identified as being gifted.
2. School district recommends that the student participate in its gifted pull-out program which typically entails "enrichment" not acceleration.
3. Student receives some "differentiated" school work (i.e., semi-random worksheets) in class (because the courts have determined that a gifted pull-out program is not sufficient by itself).
4. Fuzzy goals and learning outcomes are listed in the student's educational plan which are typically subjective, unquantifiable, and/or untestable.
5. Plan is presented to the student's parents for approval without informing them that the district's recommendation is merely a preference and that other options are available to the student.
I'd characterize this as the school's way of discharging the regulatory burdens of providing gifted education with the minimal amount of work and the minimal amount of additional academic expectations. Instead of the student's needs being paramount as intended by the law, the district's administrative convenience is paramount.
As a parent of a regular education student you basically have no say in how your child is educated in the public school system. You don't agree with the school's choice of fashionable curriculum? Too bad; move to a new school district. But once your child is identified as gifted (or "special" at the other extreme) they become statutorily protected. Now the parent does have a say. But unfortunately, most parents willingly (if perhaps unwittingly) sign away this right as soon as they accept the district's recommendation which is, as I described above, specifically designed to appear to be doing something for the student without doing much of anything or being responsible for doing or accomplishing much of anything.
The school's favorite way of accomplishing this goal is to specify academic "enrichment" for the student. So, what is enrichment? It's one of those education weasel words. It could mean almost anything. But I think my definition of enrichment is a good functional definition:
Enrichment is not acceleration.
That cuts right to the chase. If the student is receiving enrichment, he's not receiving acceleration. He might be learning more, but that "more" being learned isn't the stuff needed to make it to the next level.
Let's say the gifted student is capable of learning 50% faster than the regular education instructional pace. This means that in two years the student is capable of learning three years of academic content. If the student was in third grade and was being accelerated, he'd be ready to tackle sixth grade level work by the end of fourth grade (2 years). However, if the student were being enriched, he would likely only be prepared to do fifth grade work at the end of fourth grade.
Maybe an illustration would help.
The first three light blue ovals represent how much the regular student needs to learn. The light green circles represent how much our hypothetical gifted student learns in a given year (150%) in an enrichment program. The gifted student is clearly learning a lot more than the regular student for the three years of grades 3-5 depicted. At the end of those three years, however, the student still is only prepared to do sixth grade work.
Let's contrast this with an acceleration program.
The student has learned the same amount of material, but the learning is focused in the direction of what the student needs to know to progress through the grades. The result is that after the same three years of learning, the accelerated student is ready to do work at grade 7.5 instead of grade 6 as in the enrichment example above.
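For the numerically inclined, the arithmetic behind my illustration can be sketched in a few lines. This is just my hypothetical example made explicit (the 150% pace, the grade-3 starting point, and the function names are my own assumptions, not anything from the statute):

```python
# Hypothetical sketch of the enrichment vs. acceleration example above.
# Assumes a gifted student who learns at 150% of the regular pace,
# starting out ready for grade-3 work.

PACE = 1.5         # assumed learning rate relative to the regular pace
START_GRADE = 3.0  # ready for grade-3 work at the start
YEARS = 3          # the three years (grades 3-5) in the illustration

def enrichment_readiness(start, years, pace):
    # Under enrichment, the surplus learning is "sideways": only one
    # grade level of on-track content is covered per year, regardless
    # of pace, so the pace argument has no effect on grade readiness.
    return start + years * 1.0

def acceleration_readiness(start, years, pace):
    # Under acceleration, all learning is on-track content, so grade
    # readiness advances at the student's full pace.
    return start + years * pace

print(enrichment_readiness(START_GRADE, YEARS, PACE))    # 6.0 -> grade 6
print(acceleration_readiness(START_GRADE, YEARS, PACE))  # 7.5 -> grade 7.5
```

The same total amount is learned in both cases; the difference is entirely in how much of it counts toward grade-level progression.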
Acceleration seems, at least to me, to be the preferred course of action for the gifted student. School districts, however, don't see it this way. The vast majority of schools only want to offer enrichment pull-out programs for their gifted students. Why do you suppose this is so?
I think the reason is that there's increased accountability in accelerating the gifted student. In my example, the gifted student should be ready to do sixth grade level work by the end of two years instead of three. If the student isn't ready then something has gone wrong and the student hasn't learned what he was supposed to. Someone is going to be blamed, and who wants that aggravation, especially considering these are the kids who should be coasting through the system and taking up less of the teacher's time, allowing her to focus on the other kids.
The other reason is that acceleration programs present administrative challenges for the school since these gifted kids will have to be separately tracked or perhaps taught in a different grade for some subjects.
Nonetheless, the statute clearly places the student's needs above the administrative problems of the schools, so this last factor shouldn't be an issue in theory. In practice, you know it is. This is a monopoly we're dealing with and monopolies don't care about their customers -- where else are they going to go? And who cares anyway, the same amount of tax dollars are still going to flow into the coffers every year.