Schworm’s angle on the study, that Massachusetts public schools are failing the Commonwealth and its students, is reflected in the article’s third paragraph:
The study raises concern that the state’s public schools are not doing enough to prepare all of their students for college, despite years of overhauls and large infusions of money.
The report didn’t seem to raise concerns when it was released in February, but Schworm has plenty of sources expressing concern, ranging from Higher Education Chancellor Patricia Plummer to future Secretary of Education Paul Reville. I’ve written about Plummer’s problems interpreting research here, but aside from the three-strikes-and-you’re-in approach to the MTEL, which seems to have come from Reville, I haven’t written much about him. His contribution to the Schworm article, however, demonstrates a similar lack of research literacy:
“We’re hopeful high schools will regard this [study] very seriously. This tells us that higher standards are necessary. We’re not fully preparing students for non-remediated college work.”
Mr. Reville seems to be implying that public high schools are not doing their work. Reville, however, ignores several issues, including the limitations of this particular study, the transition from high school to college, and the role of standards-based education in improving learning. The position of Secretary of Education will no doubt be a challenging one, requiring leadership, management, and political skills, but it should also be an educational one, one that not only directs the Commonwealth’s educational bureaucracy but also works to educate the public.
With this mission in mind, here is how Reville and his colleagues might have spoken about the Massachusetts School-to-College Report, High School Class of 2005:
1. Acknowledge Statistical Limitations First. The Globe article ignores one important limitation of the study. As the report itself states, “students who attended private Massachusetts high schools and students who attended a Massachusetts public high school and enrolled in a private or out-of-state postsecondary institution are not included in this report.” How many students are not included? About 40,000. Almost 60,000 students graduated from Massachusetts public high schools in 2005, and this study tracks only the one-third who attended public higher education in Massachusetts. Some of the rest certainly went to private colleges and universities. Others may have entered the military. Still others entered the work force. Others may be living on the street for all we know. Counting only students who enter public colleges from public high schools skews the percentages.
Here’s what one high school’s students might look like:
Let’s consider a hypothetical graduating class of 100 students. Eighty-five of them decide to go to college; fifteen enter the military or the workforce. Of the 85 who pursue higher education, 27 attend private colleges, leaving 58 to enter public colleges. Suppose 37% of these 58 students take a remedial college class. How many of the original graduating class of 100 take a remedial college class? Twenty-one or twenty-two.
If these numbers were real, one-fifth of this graduating class would take one (or more) remedial classes in college. Is that significant? It could be. But it’s a lot lower than the 37% average reported in the Globe. Percentages for different high schools could vary widely. An affluent high school, for example, sends more students to private colleges. Less affluent high schools send a disproportionate number of students to public colleges. The state-wide average, therefore, provides a poor picture of what’s happening in individual high schools.
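To make the arithmetic explicit, here is a minimal sketch of the hypothetical above; every number in it is invented for illustration, and none comes from the actual report:

```python
# Hypothetical numbers only; none of these come from the report itself.
class_size = 100         # graduating class
public_college = 58      # graduates who enroll in MA public colleges (the tracked group)
remediation_rate = 0.37  # share of tracked students taking at least one remedial class

remediated = public_college * remediation_rate  # about 21.5 students
share_of_class = remediated / class_size        # about 0.21

print(f"Remediated students: {remediated:.1f}")
print(f"As a share of the tracked group: {remediation_rate:.0%}")
print(f"As a share of the whole graduating class: {share_of_class:.0%}")
```

The same 37% shrinks to roughly 21% once it is spread over everyone who walked across the stage, which is exactly why the statewide figure says so little about any one school.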
2. Are remedial college classes after high school a problem? Criticizing high schools over college readiness assumes that every high school student should be socially, emotionally, and intellectually ready for college after 13 years of schooling. There is no psychological reason to assume this. The number of years spent in American high schools is the result of culture and tradition, not research. In some countries, students have 14 years of schooling. Others turn non-college-bound students out of high school at 16. Why shouldn’t some students take remedial courses in college if they are prematurely forced out by an educational system that arbitrarily tries to cut them off after 13 years of school?
Of those taking remedial college courses, 65% have learning disabilities. Half are lower-income. More than half are students of color. Half have limited English proficiency. All of these students deserve a higher education, but given their particular challenges, it should be expected that some of their high school education may need to be extended into the college years. Why shouldn’t colleges fulfill this role?
3. Standards Don’t Always Mean Standards. To those unfamiliar with the nuances of educational doublespeak, a standard is an ideal, something to be met, a basis for evaluation. Few rational people would argue against encouraging our students to work to a high standard. Yet the conversational use of the word “standard” is not the same as an educational standard. In education, there are no standards without curriculum frameworks and standardized tests. One can certainly argue that standards-based learning and standardized tests are the way to lead students to achieve at a high standard, but the point is hardly a given. When Reville tells us that higher standards are necessary, he’s talking about more MCAS, not necessarily better education.
The research on the effectiveness of standards-based education is just starting to come in. Critics have pointed out for years that preparing for standardized tests increases the amount of time students spend learning test-taking strategies and decreases the time they spend on critical thinking. Here in Massachusetts, Ed Moscovitch, one of the original education reformers, admitted that the underlying assumption of reform (more specifically, of standards-based education) was wrong:
the reform law was based on the premise that teachers and principals knew what to do but for some reason weren’t doing it; embarrassing them through low MCAS scores, while decreasing their enrollments through school choice, would somehow get them in gear. This fundamental premise was mistaken.
At the end of eras, one response to problems is (even) more of the same; the other response is to try something new. Mr. Reville’s small quote in the Globe article suggests he’s invested in the former rather than the latter. If educational improvement requires thinking outside the box, we have a standard problem.
Mark, your post is a very clear, cogent explanation of part of that complicated process of improving education. Thank you for putting this forward for people to digest.
I’m especially interested in your third point. As you note, a lot of educational improvement is now obsessed with the state standards. Establishing a competency level for each grade level and then testing to see whether the students have mastered the standards for their level sounds like a straightforward strategy.
Like teachers across the state and nation, my eighth grade English Language Arts (ELA) colleagues and I have spent a lot of time reading and analyzing the scores and released questions (standardized tests with answer keys from previous years) with the hope of identifying weaknesses in our curriculum or in our approaches to instruction. This process results in what a lot of people refer to as “teaching to the test.”
Some of our preliminary insights regarding the MCAS for middle school English:
- Some standards, although they are specified in the frameworks, have never been “tested” by MCAS. I don’t recall ever seeing “Making Connections” (standard 9) or “Analysis of Media” (standard 26) as the intended focus of a released question. When you look at the released questions, you can quickly identify “super standards” that come up over and over again: “Understanding a Text” (standard 8) and “Nonfiction” (standard 13) are DOE’s greatest hits. So guess where teachers are encouraged to focus their efforts? It may appear that a Massachusetts education will produce a well-rounded graduate, but a standardized test may actually undermine the broad and idealistic standards in the DOE frameworks.
And think about this: the seventh and tenth grade ELA tests include a test of composition skills; the eighth grade ELA test includes only literature interpretation. A lazy or pragmatic ELA teacher, worried about the threat of linking test scores to merit pay, might be tempted to focus only on literature in eighth grade and to neglect any effort toward improving writing skills. That would be wrong, wouldn’t it?
- Although the released questions purport to be testing one standard, it’s not always clear what separates a given question from a similar question purporting to target a different standard. I have seen questions about a poem listed as a test of “Poetry” (standard 14), but, aside from the text on which they are based, they seem to me no different from questions testing standard 8, “Understanding a Text.” In other cases, similar questions claim to test skill with “Nonfiction” and “Understanding a Text,” but the only substantial difference between the two questions is that one specifies the lines in the excerpt that the student should interpret, and the other asks the student to interpret the whole excerpt (is that an oxymoron?).
- We’re always teaching with previous tests in mind. At our school, students did not do well on poetry in previous years. So we put a lot of emphasis on poetry this year. Based on what I glimpsed over the shoulders of students as I monitored the room during the test, there was almost no poetry on the test this year. I feel like our students are Jack in the story, whose mom told him to carry coins in his pocket to get them home safely. Unfortunately, Jack is given a pound of butter as pay for his work the next day. He dutifully stuffs the butter into his pockets and arrives home with melted butter and messy pockets.
The solution, of course, is to help students to know how to carry each item home safely and logically. I don’t think the MCAS is helping us reach that goal.
I’ve been working on teaching MCAS stuff since the mid-’90s. I started out with workshops, corrected the long composition a couple of years in a row, and have taught at least one sophomore class since then.
With 15 years in, I’ve come to the conclusion that for most kids, the English test is primarily a waste of time. The results are next to useless for teachers. When looking at the open-response questions, for example, you can’t tell whether the kid didn’t understand the task or couldn’t communicate it successfully in writing.
The context vocabulary questions usually lack context. Unlike SAT sentence completions, these questions are taken from actual passages and usually lack clues that would let students figure out the word from context. Results are probably skewed toward measuring a kid’s vocabulary rather than their ability to figure out a word’s meaning.
Another unhelpful type of question requires students to make inferences. The ability to make inferences is critical, but I’ve yet to find a way to teach kids how to do this on their own using a test.
Most of the skills that are supposedly tested on the MCAS are valuable, but the test does a poor job of providing feedback to improve instruction. I don’t think we’re getting away from standards/testing anytime soon, but research is starting to show that the effects are wearing thin.
Mark
Feedback from experienced colleagues has been a great source of solace to me as I navigate this NCLB world. I’m very grateful to know that more seasoned teachers than I are seeing flaws in this methodology!
1. Acknowledge Statistical Limitations First.
The article didn’t hide the fact that the study looked only at graduates of MA public high schools who attended MA public colleges. Better to study a full system than to try including private schools and colleges, which may or may not participate. That would have left us with incomplete data and truly would have been a limitation.
The scope of the study was how well MA public high schools prepare students for college work. I don’t see the need to state again what is already made clear in the article. For all you know, Reville may have described the scope of the study, but the reporter didn’t include the statement because that fact was already covered in the article twice.
2. Are remedial college classes after high school a problem?
Yes! These students have invested 13 years in their schooling. If they aren’t prepared to take on college-level work, then we have failed them.
We did this first by buckling under and lowering the passing score on the MCAS. We then failed them again by graduating them without the necessary skills to succeed. We weren’t forced to push them out of the system after 13 years. We could have kept them another year or two, and they could have had more tuition-free education.
Let’s not distract the discussion with talk of “social and emotional readiness.” The remedial courses these students have to pay for aren’t there to teach social skills. They are there to give them the academic tools they need in order to handle college work. If colleges can get these students up to speed in a semester or two, what is preventing high schools from doing the same?
I really take exception to your listing the usual suspects — low SES, English proficiency, race — as reasons why we should expect these students to need remedial work. My reading of the Globe article is that these aren’t excuses. Indeed, anytime you have evidence of instructional problems and assign causes other than instruction, you tend to try to solve social issues rather than academic ones.
Fine, if you can demonstrate how social-based reforms have improved student achievement, I’d like to know. But don’t keep using these external factors as excuses. I’m sorry to be so blunt, but it’s insulting.
These low expectations gave fuel to the standards and accountability reforms. So, if you don’t like the heat, you shouldn’t have fueled the fire.
3. Standards Don’t Always Mean Standards.
A standard isn’t an ideal; it’s a definition of the real requirements needed to meet a goal.
With respect to the standard needed to take on college-level work, you talk as if the MCAS is the only standard. You completely omit the fact that passing the MCAS is only one component of graduation requirements and that all others are set locally. I think the point of the article was that all the other non-MCAS parts of education need to be examined and improved. Would you not agree?
Feel free to pick up any points I’ve dropped in a separate reply. I’ll be glad to respond.
-Mb
Concerning the Report & Conclusions: After acknowledging the limitations of the study, which delimit its scope, you say, “The scope of the study was how well MA public high schools prepare students for college work.” (If this is a typo, I apologize for refuting it.) This statement is factually incorrect. The scope of the study was how well MA public high schools prepare students for public college work. Any conclusions drawn about all Massachusetts public high schools are beyond the scope of the data.
It is misleading and unfair to use 37% to stand for the entire population of Massachusetts public high school students when a sizable yet unidentified percentage go to private or out-of-state colleges. 197 students graduated in Wayland; only 33 of them, about 17 percent, went to Massachusetts public colleges. We can safely assume that far more than that went to college. Care to guess their destination? They aren’t included in the 37%.
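A similar back-of-the-envelope sketch for the Wayland numbers (purely illustrative: the 37% is the statewide average among tracked students, not a figure reported for Wayland):

```python
# Illustrative only: how little of one graduating class the report's 37% can describe.
graduates = 197        # Wayland's class of 2005
tracked = 33           # graduates who enrolled in Massachusetts public colleges
statewide_rate = 0.37  # statewide remediation rate among tracked students

tracked_share = tracked / graduates    # about 17% of the class
remediated = tracked * statewide_rate  # roughly 12 students, if the statewide average applied

print(f"Share of the class the report tracks: {tracked_share:.0%}")
print(f"Hypothetical remediated students: {remediated:.0f} "
      f"({remediated / graduates:.0%} of all graduates)")
```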
Concerning Instructional vs. Environmental Issues:
Instructional issues are instructional issues. Definitely address them. Do all you can do. That was the point of the first part of my post. I said nothing about ignoring instructional issues. That’s always part of the job. You seem to imply, however, that because we can’t change the environmental factors affecting students, instruction alone must be able to make disadvantaged kids reach academic standards. To be blunt, that’s illogical. It’s quite possible that those kids won’t ever reach the academic standards set by MCAS. That doesn’t mean we don’t do our best to make it happen, but there’s a difference between ideals and reality.
Social reforms improving achievement? How about the social capital that helps the many academically successful students achieve? Take a look at which communities have the lowest percentage of students taking remedial work in college. Take a look at their average SAT scores. Take a look at their MCAS scores. There’s a strong correlation between demographics and student achievement. What’s insulting about that?
Mark
Concerning the Report & Conclusions:
Yes, this was a typo, and I concur with your correction of it. I don’t find anywhere in the Globe article that conclusions were drawn beyond the scope of the report.
37% doesn’t stand for the entire population, nor is it asserted to do so. The report is limited (purposefully so?) to the public education systems which we control. Our schools are graduating these students. Our colleges are accepting them. But the expectations don’t match.
What we’ve been trying to accomplish in MA is to have a high school diploma signify college readiness. I think the more important question is whether that is an ideal we are committed to. If it is, then morphing college into “high school plus” isn’t going to get us there. I would much prefer to see more students retained at grade level; if it takes them an extra year or two to acquire the necessary skills, so be it.
Concerning Instructional vs. Environmental Issues:
If I was only implying it, then that was my mistake. I should have stated it more clearly:
We can’t change a child’s race, how much money their parents earn, or whether or not they are native English speakers. But we can surely change the instructional methods we use. If they are not effective, then we should be trying something else.
So-called instructional reforms of the last decade have focused on building self-esteem, class size reduction, and even such ridiculous nonsense as looking at brain scans to see what activity is happening during certain tasks. That’s all well and good, but if it doesn’t lead to increased learning, then the problem goes back to instruction.
There is a correlation between SES and student achievement. That in and of itself is not a social reform. It is only a statement of fact. What I found insulting was this assertion: “…given their particular challenges, it should be expected that some of their high school education may need to be extended into the college years.”
This is nonsense. If students were better prepared by the time they reached high school, they would be better prepared by the time they left. What can we expect? We can expect better and so should students.
My “guess” is that gains in achievement will remain limited until social/environmental factors change. I don’t believe in a magic bullet for educational ills, though, I take it, you believe direct instruction provides one. My belief, which I most emphatically don’t claim to be “the truth,” is that there is no single correct method for teaching or instruction. Communities differ, as do students’ needs.
As far as I can see, our disagreement boils down to two issues:
1. Social/environmental Factors and Educational Achievement
In a perfect world, all kids would start out with the same amount of social and cultural capital. They don’t. My suburban students don’t come to school after their brother has been shot and killed in a gang fight. They aren’t homeless. Almost all of my students lack even a passing acquaintance with the legal system, and, unlike their urban peers, those who do get involved with it receive innumerable breaks. In short, I don’t understand why this is insulting, to you or to these students. As a teacher and a citizen, I believe we owe it to these students to push them as far as we can and to work as hard as we can to get them there. I think we both agree with this last sentence. My question for you: To what degree do social/environmental factors affect educational achievement? Do they not matter?
2. The efficacy of standards. Underlying the standards and accountability movement is the idea that students and schools can be driven to improve with the threat of punishment. There may be some motivational benefits to MCAS, but they are not limitless, and, as recent research from AERA suggests, we’ve probably seen the end of test-driven improvements. Implied in this conception of motivation is the assumption that social factors in learning are minimal or don’t matter. Also implied in the concept of standards is the idea that the state, the same state that can’t manage its very limited mass transit system, can effectively reform education, which is infinitely more varied and complex. This was part of my original objection to Reville’s comments: he seems to think that the answer has been discovered.
Mark
P.S. I have to go teach right now so I probably won’t post until tomorrow.
I don’t believe DI is magic. But I do believe in the results of “the largest educational experiment ever conducted. It involved over 200,000 students and 22 sponsors of different approaches for how to teach at-risk children in grades kindergarten through 3. The 178 communities that implemented the different approaches spanned the full range of demographic variables (geographic distribution and community size), ethnic composition (white, black, Hispanic, Native American) and poverty level (economically disadvantaged and economically advantaged).”
So, while you may guess until the cows come home, others are actually looking at the data and doing what has been proven to increase academic achievement of all kids.
I taught high school in a poor neighborhood in So. Cal. I taught elementary school in Lynn. I also taught in Marblehead.
In my experience, effective teaching transcends demographics. I didn’t know about DI when I was teaching — and DI doesn’t exist for my subject area — but the more I learn about its techniques, the more they echo the subject-matter training I received at IU.
We had a lot of drill on technique. Why? Because the people who were teaching us knew what worked in the classroom.
You ask, “To what degree do social/environmental factors affect educational achievement? Do they not matter?”
I’m sorry if I’m coming on strong about the low expectations being insulting, but at one of my first jobs — the high school gig in So. Cal. — I offered to teach Calculus. The principal just laughed and said “these kids” didn’t need it.
When I taught, I didn’t set expectations based on a student’s SES; I set expectations based on their acquired skills and the rate of their learning. Heck, I worked with one student whose mother and sister died on the boat ride from Vietnam. It’s a disservice to define these kids by their life circumstances. If anything, school should be a refuge, not a reminder.
We both know that there is a correlation between low SES and low academic achievement. If knowing about that correlation does nothing to eliminate it, then no, it doesn’t matter.
being offensive.
You write, “So, while you may guess until the cows come home, others are actually looking at the data and doing what has been proven to increase academic achievement of all kids.”
You’re missing my point. I say I “guess” because I don’t know. I try to differentiate between established fact, opinion, and speculation. I consider that an attempt to be intellectually honest. I’m not interested in demagoguing the issue.
You’ll forgive me, I hope, for not reading up on scripted learning. I’ve read some of Slavin’s stuff, but in spite of your interesting points, I’m not interested in digging into a section of research that has limited application.
You also write, “If knowing about that correlation does nothing to eliminate it, then no, it doesn’t matter.”
This is illogical. Here’s a more concrete analogy: I have a class full of kids who run. A sub-group of kids, I’ve noticed, run more slowly than the rest. It turns out each of these kids has one leg shorter than the other. According to your logic, this disability doesn’t matter because the knowledge does nothing to eliminate it?
Social and environmental factors correlate with academic achievement. I assume that there is not just correlation, but causation. As a teacher, I may consider these factors, but they don’t affect how I teach my students. As a researcher, it’s important for me to understand how social and environmental factors affect achievement. This knowledge can be instrumental in developing programs and policies.
Mark
When I was hired to teach music at a poor high school in California, I didn’t have a full schedule of music classes. So, having also been certified to teach math, I offered to start a pre-Calculus and Calculus course. The principal laughed and told me “these students” didn’t need those classes.
So forgive me if I get upset by low expectations of poor students.
I’m not saying you are prejudiced. However, we look at the same data, the correlation between low SES and low achievement, and draw two different conclusions. You look at that data and conclude “it should be expected that some of their high school education may need to be extended into the college years.” I look at that data and conclude we are not using effective teaching methods.
Then I offer you evidence from the largest and most expensive federally funded experiment in education ever conducted, and you dismiss it as “a section of research that has limited application” or by saying that was when educational research was in its “infancy.” If that data is too old for you, how about taking a look at the results coming out of Gering, Nebraska? I’m sorry, but I find your unwillingness to read or view information about this research unforgivable.
Yes, these programs are used in elementary and middle schools, but doesn’t it make sense that if students learn more in the early grades, they will be better prepared for high school and, subsequently, college?
Let’s go back to your running analogy. You observe that students with one short leg run more slowly than students with two legs of equal length. You “assume that there is not just correlation, but causation.”
The social-reform method would go like this:
These students are running slower because they have one short leg. Let’s make their legs both the same length.
One problem with this — other than the fact that making both legs the same length may not actually be possible — is that even with two legs of the same length, the students still may not run faster. Another problem is that considerable resources are given to “lengthening legs,” which in no way benefits slow runners with two legs of equal length. Nor does it benefit average or fast runners.
The direct method would identify and improve the requisite qualities needed for fast running:
- muscle strength
- cardiovascular efficiency
- stride length
- balance
It would group students according to their abilities in these skills. It would apply proven ways to improve these requisite skills. The focus would not be on changing the students but on changing the instruction.
Until we improve instruction, it is insulting to assume people with one short leg will always need remediation to participate in athletic events. True, they may never be as fast as professional athletes, but I am certain that the fastest of them will far outpace the average and slow runners in the general population.
In previous incarnations, I unexpectedly found myself training recent college graduates to function in an office.
More than one could not write a letter that made sense. One recent graduate did not appear to know the difference between a noun and a verb, did not know the difference between “its” and “it’s”, and could not absorb the correction that he had misused a word (“exemplify”) over and over again in letter after letter. SES was not a factor for these young people.
Don’t even get me started on work ethic and integrity.
So, yes, public schools may be sending students forward before they are “ready”, but colleges are not necessarily correcting the problems. It appears to me that you are optimistic when you say:
“I would much prefer to see more students retained at grade level; if it takes them an extra year or two to acquire the necessary skills, so be it.”
If you were successful in training them, this is evidence that it can be done.
Is this sort of workplace training appropriate? No.
Are these the sorts of topics that should be the primary focus of a college education? No.
Should these skills be acquired before a student graduates from high school? Absolutely.
It is a truly sad commentary that you think it optimistic to believe students would acquire these basic skills if public schools retained them a year or two longer (extending their time in public schools to 14 or 15 years). However, it does underscore my other point: unless you change the instruction, you will not change the outcome.
There are no state-imposed standards for colleges. Who decides that the colleges and universities are doing their jobs?
What are the standards for deciding that a student needs remediation? My guess is that they differ for every public college. What exactly is their “standard” for needing college remediation?
I remember a student (not mine) who sued our high school because Westfield State College rejected her for having a single low-level English class on her transcript. Only WSC had this bizarre criterion. Our attorney went on to show that other state colleges didn’t use it. Neither did Harvard, for that matter. The suit was lost, but my point is that colleges and universities can be rather capricious in their requirements. Why assume they are all reasonable?
Good point,

Mark