The media has a hard time reporting on data. They often don’t know what it means or how significant it is. Blame them for people taking Vitamin E because there was research that said it was good for you, until there was research showing too much Vitamin E might actually be bad for you. In its gatekeeping role, the media rarely does well by its readers when it comes to research. It does even worse when it comes to education.
The Boston Globe has what has to be one of the shallowest, most banal pieces ever produced using MCAS scores. I guess it’s a plug for the paper’s real estate section. I clicked on it because I’m interested in education and how it’s reported on. Here it is:
Living near a successful school district is often a perk—if not a necessity—for home buyers. But how do you determine a district’s quality? There are several factors to consider. We chose to look at towns and cities where students excelled on the 2012 MCAS, whose results were released on Sept. 19.
Many of the highest-scoring schools were charters. Several were from outside the Boston area. Take a look at which towns and cities were home to the highest-scoring schools, then check out a property in each one.
Ranking things from highest to lowest is the crudest form of statistics, the form most likely to lack significance and most likely to lead to wrong conclusions. The Globe’s piece chooses 25 places readers might consider living based on the percentage of kids passing the MCAS test at a school in that community. The Globe may call its piece “Towns with high-performing school districts,” but in many cases a district is represented by a single school. Is Boston a high-performing school district? According to the title it is, though only MATCH Charter School and some other charter schools are mentioned. Wayland, Weston, and other public school powerhouses don’t make the list.
One reason may be that public schools with students in out-of-district placements have those kids counted against their pass rate. My high school, for example, had less than 100% of its students pass. I was concerned until my principal told me that the students who had not passed the test don’t actually attend my school. They are sophomores from my district in outside placements. Pass or fail, my school has had no effect on them. MCAS just charges us with those students, though we don’t educate them. One hundred percent of last year’s sophomores who actually attended my school passed MCAS. I couldn’t care less whether East Longmeadow gets listed by the Boston Globe as a desirable community to live in, but listing Greenfield as one would come as a surprise to many in our neck of the woods.
Some readers might be perplexed to find some of the communities on the list presented as places people might consider moving to. I don’t know any towns in Eastern Massachusetts well enough to pass judgment on them, but I’ve been led to believe that New Bedford isn’t considered the most desirable place to live. I’ve heard that Hull has become a bedroom community for Boston commuters. What about Duxbury? Medford? Harvard? I’m not being a snob here. I live in a working-class small town. But would I move to New Bedford because one of its charter schools has a 100% pass rate in one MCAS subject?
Maybe I’m taking this too seriously, but I clicked on this Boston.com slideshow because I thought it would contain at least a modicum of useful information. Instead, it turned out to be an advertisement for the real estate pages, something I didn’t realize until I looked more closely at the thing.
When it comes to education, the media has it rough. Instead of research, they usually receive raw or slightly processed data that must be mulled over and reported on with little help from the people who produce it. The College Board, for example, makes it a point to release SAT scores packaged for publication, though if you read “A Note on the Use of Aggregate SAT Data” on page 12 of Guidelines on the Uses of College Board Test Scores and Related Data, you find out that SAT scores in the aggregate, such as by community or school, are largely meaningless.
There are some useful ways to use MCAS data, but the Globe hasn’t found one here.
oceandreams says
The use of MCAS scores to promote real estate decisions puts terrible pressures on teachers to simply teach to the test to maximize scores instead of educating students. IMO among those this harms are the brightest kids, who are prepped and drilled for this test during times when they could be learning other things.
The only useful way to measure educator quality with this test would be to weight scores based on socioeconomic predictors. A teacher whose students perform better than expected given their socioeconomic backgrounds is more likely to be preparing students well than a teacher whose students perform worse than predicted, regardless of which scores are higher.
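To make that weighting idea concrete, here is a minimal sketch (all school names and numbers are invented for illustration): fit a simple least-squares line predicting a school’s mean score from a socioeconomic proxy such as median family income, then compare schools by their residual, how far above or below the prediction they land, instead of by raw score.

```python
# Hypothetical illustration: ranking schools by residual from a
# socioeconomic predictor rather than by raw score.
# Data are invented: (median family income in $1000s, mean test score).
schools = {
    "School A": (150, 92),
    "School B": (40, 78),
    "School C": (150, 85),
    "School D": (40, 70),
}

incomes = [v[0] for v in schools.values()]
scores = [v[1] for v in schools.values()]
n = len(incomes)

# Ordinary least-squares slope and intercept, computed by hand.
mean_x = sum(incomes) / n
mean_y = sum(scores) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(incomes, scores))
         / sum((x - mean_x) ** 2 for x in incomes))
intercept = mean_y - slope * mean_x

# Residual = actual score minus the score predicted from income alone.
residuals = {name: y - (intercept + slope * x)
             for name, (x, y) in schools.items()}
```

In this toy data, School B posts a lower raw score than School A (78 vs. 92) but beats its income-based prediction by more (residual 4.0 vs. 3.5), which is exactly the comparison being described: outperforming expectations, not topping the raw ranking.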
I’d wager there’s a pretty strong correlation between school scores and students’ family income; in fact, moving to a town like Weston or Wellesley for the “good schools” is probably as much about moving to a wealthy community where your kids can be around students from achievement-oriented families as it is about any other factor.
As for the specific Globe piece, I didn’t see it but it sounds kind of silly as you describe it. Seems like what they want to be talking about in large communities is specific schools, not necessarily entire communities. Ah, if only we had more time to teach statistical analysis (among other things) in schools…
Mark L. Bail says
My department chair was sent out to Weston years ago to see what they were doing to get kids to score so high on MCAS. The answer was nothing much. The students had chosen their parents well.
Christopher says
I think MCAS scores can tell you something about a district, but some things have to be held constant for the variables to mean anything. This data should therefore be reported district by district rather than by cherry-picking schools within districts, and personally I would have stuck with fully public schools rather than using charters for this exercise at all. Because of the correlations mentioned above, scores may well be a shorthand way of gauging other factors. It reminds me of when I was looking at colleges: I found that Division I status, while technically an athletic designation, was very often indicative of academic caliber as well.