Much of the discussion around teacher quality focuses on value-added data and the claim that a teacher's effectiveness can be determined from a single test score.
More recently, a study by researchers at Harvard has drawn wide attention because it purports to show that replacing a bad teacher with a good one has a significant lifetime impact on students' earning potential.
Unfortunately, it seems none of the media fawning over this study know how to use a calculator.
So, I break it down here:
This is the study that keeps getting attention around teacher quality and student earning potential. It was even mentioned in President Obama’s State of the Union back in 2012. It keeps getting cited as further evidence that we need to fire more teachers to improve student achievement.
Here’s the finding that gets all the attention: A top 5 percent teacher (according to value-added modeling or VAM) can help a classroom of students (28) earn $250,000 more collectively over their lifetime.
Now, a quarter of a million sounds like a lot of money.
But in their sample, a classroom was 28 students, so that works out to $8,928.57 per child over their lifetime. That's right: not $8,928.57 more per year, but $8,928.57 more over their whole life.
For more math fun, that's $297.62 more per year over a thirty-year career with a VAM-designated "great" teacher vs. with just an average teacher.
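The arithmetic above is easy to check for yourself. Here is a short Python sketch using the study's reported figures (the 30-year career length is the assumption used in this post, not a number from the study):

```python
lifetime_gain = 250_000   # collective lifetime earnings gain per classroom (study's headline figure)
class_size = 28           # students per classroom in the study's sample
career_years = 30         # assumed length of a working career

per_student_lifetime = lifetime_gain / class_size
per_student_per_year = per_student_lifetime / career_years

print(f"Per student, over a lifetime: ${per_student_lifetime:,.2f}")  # $8,928.57
print(f"Per student, per year:        ${per_student_per_year:,.2f}")  # $297.62
```

Rounded to the nearest cent, the per-student lifetime figure is $8,928.57 and the per-year figure is $297.62.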
Yep, get your kid into a high value-added teacher’s classroom and they could be living in style, making a whole $300 more per year than their friends who had the misfortune of being in an average teacher’s room.
If we go all the way down to what VAM designates as “ineffective” teaching, you’d likely see that number double, or maybe go a little higher. So, let’s say it doubles plus some. Now, your kid has a low VAM teacher and the neighbor’s kid has a high VAM teacher. What’s that do to his or her life?
Well, it looks like this: The neighbor kid gets a starting job offer of $41,000 and your kid gets a starting offer of $40,000.
Wait, what? You mean VAM does not do anything more than that in terms of predicting teacher effect?
And so perhaps we shouldn't use value-added modeling for more than informing teachers about their students and their own performance, treating it as one small tool as they seek to continuously improve practice. One might even mention a VAM score on an evaluation, but one certainly wouldn't base 35-50% of a teacher's entire evaluation on such data. In light of these numbers from the Harvard researchers, that seems entirely irresponsible.
Perhaps there’s a lot more to teacher quality and teacher effect than a “value-added” score. Perhaps there’s real value added in the teacher who convinces a struggling kid to just stay in school one more year or the teacher who helps a child with the emotional issues surrounding divorce or abuse or drug use or any number of other challenges students (who are humans, not mere data points) face.
Alas, current trends in "education reform" are pushing us toward more widespread use of value-added data, using it to evaluate teachers and even publishing the results.
I can just hear the conversation now: Your kid got a "2" teacher; mine got a "4." My kid's gonna make 500 bucks more a year than your kid. Unless, of course, the situation is reversed next year.
Stop the madness. Education is a people business. It’s about teachers (people) putting students (people) first.
I'm glad the researchers released this study. Despite the researchers' spurious conclusions, the study finally tells us that we can and should focus less on a single value-added score and more on all the inputs at all levels that impact a child's success in school and life.
As Kentucky considers teacher evaluation "reform," policymakers should use caution in deciding what role (if any) value-added scores will play in new evaluations.
For more on Kentucky education politics and policy, follow us @KyEdReport