
When Reporters Use (S)extrapolation as Sound Bites

Earlier today on Twitter, Annie Lowrey and I had a brief exchange (or an exchange of tweets) about the column inches she used for the Hanushek-like extrapolation in Friday’s New York Times story on the Chetty, Friedman, and Rockoff value-added measure paper.1 Of the story’s 1,142 words, about 15% were spent on the following passages:

All else equal, a student with one excellent teacher for one year between fourth and eighth grade would gain $4,600 in lifetime income, compared to a student of similar demographics who has an average teacher….

In the aggregate, these differences are potentially enormous. Replacing a poor teacher with an average one would raise a single classroom’s lifetime earnings by about $266,000, the economists estimate. Multiply that by a career’s worth of classrooms.

“If you leave a low value-added teacher in your school for 10 years, rather than replacing him with an average teacher, you are hypothetically talking about $2.5 million in lost income,” said Professor Friedman, one of the coauthors….

“The message is to fire people sooner rather than later,” Professor Friedman said.

There are two reasons why those passages concerned me. One was the policy frame the paper’s authors accepted from Erik Hanushek: the main way you improve teaching is by firing people. But more relevant to the reporting, Lowrey was spending a measurable amount of space on the weakest part of the paper.

I understand the instinct of reporters to look for implications, especially when reporting on research. Who cares that CFR created a new algorithm to test whether a teacher’s value-added measures might be an artifact of student assignment within a school? My professional judgment is that the algorithm is much more important than the (s)extrapolation in section 5 of the paper. But it is very hard work for a reporter to explain that significance, and when researchers such as Erik Hanushek or Chetty, Friedman, and Rockoff give you a hook, a strong sound bite is hard to resist.

I don’t think Chetty, Friedman, and Rockoff wrote the paper with section 5 as the hook for reporters’ interest — if they had, it wouldn’t really be section 5. But Friedman certainly went there in the interview, and Lowrey used it. The same space could have been used to get Jesse Rothstein’s views on whether the paper addressed the potential bias of value-added measures. While Lowrey quotes Rothstein, it’s on a different point entirely. If you’re writing a story for the New York Times, for goodness’ sake, don’t talk about the research equivalent of the Kardashians when there’s more substantive material available! Lowrey’s a good reporter on the whole, but in this case I cringed at the waste of space on (s)extrapolation.

For another view on the same question, here’s Bruce Baker (who lays more of the blame on Chetty and Friedman):

These two quotes by authors of the study were unnecessary and inappropriate. Perhaps it’s just how NYT spun it… or simply what the reporter latched on to. I’ve been there. But these quotes in my view undermine a study that has a lot of interesting stuff and cool data embedded within.

One more bit of perspective: the job of a reporter is compounded by a history of researchers’ odd attempts to quantify the unquantifiable, most obviously with the practice of cost-benefit analysis (how much is a 46-year-old professor’s life worth?). And in education, there are various studies that make reasonable but fragile assumptions, whether you’re talking about cost-effectiveness analyses such as Clive Belfield and Hank Levin’s work on various interventions or the whole practice of meta-analysis. So what can reporters do when trying to explain the significance of new research, without getting trapped by a poorly supported sound bite?

  • If a claim could be removed from the paper without affecting the other parts, it is more likely to be a poorly justified (s)implication/(s)extrapolation than something that connects tightly with the rest of the paper.
  • If a claim is several orders of magnitude larger in scope than the data used for the paper (e.g., taking data on a few schools or a district to make claims about state policy or lifetime income), don’t just reprint it. Give readers a way to understand the likelihood that the claim is unjustified (s)extrapolation.
  • More generally, if a claim sounds like something from Freakonomics, hunt for a researcher who has a critical view before putting it in a story.

Notes

  1. If you are curious about my substantive views of the paper, see my comments, and I also highly recommend the notes of Bruce Baker.

Related posts:

  1. Can reporters raise their game in writing about education research?
  2. Hechinger Institute hypes the obvious — this is a role model for reporters??
  3. Columnists exist to make the rest of us look smart

This blog post has been shared by permission from the author.
Readers wishing to comment on the content are encouraged to do so via the link to the original post.
Find the original post here:

The views expressed by the blogger are not necessarily those of NEPC.

Sherman Dorn

Sherman Dorn is the Director of the Division of Educational Leadership and Innovation at the Arizona State University Mary Lou Fulton Teachers College, and editor...