If this is March, it must be Charter School Propaganda Celebration Month!

A couple of weeks ago, we had some of Boston's white-shoe wonders threatening a civil rights suit against the charter school cap. Everyone who wasn't laughing was taking the proposal with all the seriousness it deserved. Even Captain Charter himself, Paul Grogan, didn't seem to take the threat seriously. Anyhoo, that was last week. There is a week and a half left in March.
This week brings the release of CREDO's Urban Charter School Study. Picked up by the State House News Service and turned over to the Globe for stenographic treatment, the study was incarnated as a headline. Studies like this are less about the facts than the press release, don't you know? Reading a study like this is like reading a contract for cell phone service: you'd like to know the details, but it's really hard to understand. In the end, you sign on the bottom line. And in news, nine times out of ten, the bottom line is the headline. A headline that doesn't even mention the study.
The Globe doesn't have what it takes to read and analyze the study (truth be told, it's pretty complicated for any lay person), and instead of calling someone about the quality of the research, the Globe calls all the usual people and types who haven't read the study but certainly have an opinion on charter schools. BTU President Richard Stutman challenges the assumption of the study: that charter school kids are equivalent to public school kids. Mark Kenen, who chairs the Charter School Cheerleading Squad, has his obligatory cheer: "These are historic achievement gains. Charters are providing a blueprint for success." Cap the article off with a sensationalized headline like "Boston's charter schools show striking gains: Test scores surpass traditional public schools, counterparts nationwide." It's no wonder reading the news leads us to know more about less.
The Globe may not be churlish enough to point out that CREDO, the organization responsible for publishing the study, is part of the Hoover Institution, but I am. (Not surprisingly, CREDO brags about its connection to Stanford University, but it is an arm of the Hoover Institution.) And the Globe may not be perceptive enough to question the report's claim that "The strides at Boston charter schools—in both math and reading—equaled what students would have learned if they had been in school hundreds of additional days each year." But I am. I can't do the math, but I know when a little skepticism is called for. I would have called Daniel Koretz, Harvard School of Ed professor and expert on this kind of thing. He might have said something about linear relationships between variables and fallacious extrapolation. Things that might have shed some light on this headline-generator of a report. Alas, I'm too ill at these numbers to argue them. And I don't have the hours to pore over the report to do a job the Globe could have done with a couple of phone calls.
Aside from the plenitude of charter school boosters and the standard stenography, there's not much of anything online about CREDO's Urban Charter School Study Report. Some informed googling on my part, however, turned up some useful information from the National Education Policy Center at the University of Colorado (another potential article source):
there are significant reasons for caution in interpreting the results. Some concerns are technical: the statistical technique used to compare charter students with “virtual twins” in traditional public schools remains insufficiently justified, and may not adequately control for “selection effects” (i.e., families selecting a charter school may be very different from those who do not). The estimation of “growth” (expressed in “days of learning”) is also insufficiently justified, and the regression models fail to correct for two important violations of statistical assumptions. However, even setting aside all concerns with the analytic methods, the study overall shows that less than one hundredth of one percent of the variation in test performance is explainable by charter school enrollment. With a very large sample size, nearly any effect will be statistically significant, but in practical terms these effects are so small as to be regarded, without hyperbole, as trivial.
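For the statistically curious, NEPC's last point is worth a back-of-the-envelope check. The sketch below uses hypothetical numbers of my own choosing, not figures from the CREDO report: if charter enrollment explains less than one hundredth of one percent of score variation (an r-squared under 0.0001), then with a sample in the hundreds of thousands, even that trivial effect sails past the usual significance threshold.

```python
import math

# Hypothetical numbers for illustration only. NEPC's point: charter
# enrollment explains less than 0.01% of test-score variation.
r_squared = 0.0001          # upper bound on variance explained
r = math.sqrt(r_squared)    # a correlation of about 0.01
n = 1_000_000               # a very large (assumed) sample size

# t-statistic for testing whether the correlation differs from zero:
# t = r * sqrt((n - 2) / (1 - r^2))
t = r * math.sqrt((n - 2) / (1 - r_squared))
print(round(t, 1))  # about 10.0 -- far beyond the ~1.96 cutoff for p < .05
```

So a relationship explaining a ten-thousandth of the variation comes out "statistically significant" anyway, which is NEPC's point about large samples: significance is not the same thing as a meaningful effect.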
To be clear, this summary describes NEPC's review of CREDO's Charter School Study of 2013. Yet the Urban Charter School Report uses the same methodology: the same sketchy "virtual twins" comparison; the same questionable estimation of growth in "days of learning." The results may differ from those of two years ago, but it will take better-trained minds than mine to untangle all of that. And by then, the headlines will have already had their effect.
But hey, no worries. It's March. And that means Charter School Madness.