First, for giggles, let’s remember what the acronym MCAS stands for — Massachusetts Comprehensive Assessment System.
That’s right. A two-hour exam administered in March or May is supposedly “Comprehensive” — accurately summing up a year or more of education. To name one case, the 160 hours of world geography instruction students receive are assessed in a sit-and-spill exam that takes less than two hours. The state would like you to believe that this is not only a decent assessment, but that it is “comprehensive”.
So let’s see about the, ahem, system. If interested, check out the Massachusetts curriculum frameworks, the documents that supposedly tell us what to teach. This “system” breaks history down into a highly detailed, nearly day-by-day listing of events. In ELA, there are about a dozen parallel strands (good luck trying to synthesize that document!). In math, we speak in vague generalities, such as urging teachers to explain how to chart a graph, something that can require a month of teaching. This is in no way a “system” if each subject is organized so wildly differently.
I’m okay with something actually “comprehensive” if there are resources to make it happen. The chuckleheads at the state enjoy offloading the work of the DOE onto individual teachers, as with the MCAS-Alt. Rather than design a smart, useful system that combines requirements into a small set of documents, they propounded layer upon layer of requirements, then told overworked special ed. teachers to follow them with little guidance. I maintain that a survey of the schools of Massachusetts would show at least 85% of them out of compliance with special education law. I place the responsibility for this on the arcane laws, not on the hardworking heroes in special ed, the toughest job in teaching today.
Exams are no more reflective of today’s world than is requiring girls to learn needlepoint, and boys how to operate a lathe. Nice things to be able to do, but not demonstrations of skills or knowledge essential for life as a productive member of society and a thoughtful citizen. I’ve had to take an exam in a vocational setting three times, and two of those were teaching tests. I’ve had to do research a lot, work collaboratively a great deal, produce a finished product (such as, oh, the time I earned my teaching license), and organize spontaneously generated ideas in almost every job in my short life, nearly every day, but I have sat for only three exams.
In the age of Google, having children memorize arcane trivia is pointless unless they plan on going on Jeopardy. (When is the last time you calculated a complimentary angle, differentiated between ancient Babylonian social classes, or listed prepositions? When is the last time you needed to know that trivia right now, with no thirty seconds to spare to hit the Internet or a dictionary?) We don’t have time to teach them the skills they need because we’re supervising their memorization of information they don’t need, crammed into a year robbed of at least 12 teaching days by these very same MCAS exams.
If we are going to insist on a system whereby state bureaucrats (please, no more private companies making bucks off of our children) judge students’ performance, here is what I’d like to see:
Each student must complete two assessments per term in each academic class. These assessments will increase in complexity as the student ages. The DOE establishes reasonable and clear requirements, adding components such as requiring that a certain number use technology, that some include a significant cross-curricular component, that one be a traditional sit-and-spill, etc. Copies of the assessment document, instructions, and rubric for the first half of the year’s assessments are due to the DOE by September 15th, and for the second half by December 15th.
Then in May, the state identifies which three assessments in each discipline will be evaluated in each district. If any assessment proposed by a school is found unsatisfactory, it will not be considered for state evaluation. If the state cannot identify three assessments that meet its criteria, the school is put on notice. Two years in a row on notice, and the school is forced to administer a standardized test.
DOE employees compare the finished product(s) to the requirements and to the grade the student received. This allows them to gauge student achievement while also calibrating educator standards. Students who consistently underperform expectations are put on the same warning system that we currently have.
Of course, I realize the major fault. There are serious control issues that I’d be interested in others’ input to resolve. Doing this away from the DOE’s eyes may make the end product suspect, and would leave the system open to cheating far more blatant (though barely more pervasive) than what we have now. That may be a price worth paying to prepare students for this millennium rather than for days gone by.
Also, I realize this is ambitious. I realize many school districts will just opt for more kill-and-drill testing that prepares students to be citizens of 18th century America. Fine — that’ll keep the educational parasite industry in business.
I realize that it would take certain individuals backing up their meaningless talk with money and political capital. It means cutting the money flow to Measured Progress, Inc. and hiring people for the DOE.* It means people at the DOE who know the difference between education and teaching, possibly turfing out the employees they have who see the inside of a classroom about as often as I see the inside of the Space Shuttle.
But that is what I’d like to see. You?
*To be more specific about my beef with Measured Progress — here is the true nature of the scam. They pull in thousands administering the exams, but that isn’t the real money. The cash that pays for those executive jacuzzis comes from the out-of-state folks they hire as “consultants”. These consultants, in exchange for a hefty check from schools, are sent out to hold “professional development” seminars that combine outdated educational theory with tips on how to instruct students to beat the tests that their employer writes.
joeltpatterson says
The part where teachers within a district design an assessment to meet assigned criteria is, I think, the key part of this. Corporations and education bureaucracies fetishize the “controlled experiment” parts of the MCAS: make sure test proctors read the same script before the test, make sure kids answer the same questions. This might assure that students are judged against the same standard, but that is different from assuring that a student’s understanding is actually being tested. The de-emphasis of understanding leads to math questions like “What is the median of the data listed below?” A simple mechanical question that students both fluent and not-exactly-fluent in English can respond to. But that question doesn’t explore whether the kid really understands it. In the real world, a person should know how politicians talk about means instead of medians to lie with statistics. Note that one can’t know this unless one already understands the difference between a mean and a median.
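To make the mean-versus-median point concrete, here is a minimal sketch (the income figures are invented for illustration) of how a single outlier drags the mean while the median stays put:

```python
from statistics import mean, median

# Hypothetical household incomes on one street (made-up numbers).
incomes = [30_000, 32_000, 35_000, 38_000, 40_000, 41_000, 1_000_000]

print(f"mean:   {mean(incomes):,.0f}")    # about 173,714 -- pulled up by the one outlier
print(f"median: {median(incomes):,.0f}")  # 38,000 -- what the middle household earns

# A politician quoting the mean can call the street prosperous;
# the median describes the typical household.
```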
Relaxing control over the questions themselves would let teachers and DOE employees devote more time to criteria that measure understanding.
joeltpatterson says
Those Frameworks are so disjointed in listing topics to be taught that they don’t help teachers see the big ideas. The Pre-Calculus curriculum exemplifies missing the forest for the trees: vectors, complex numbers, and trigonometry all get taught without a big idea to unify them. (The big idea is that numbers have directions as well as sizes, and operations have directions, too. For instance, multiplication and division rotate numbers. Vectors and trigonometry let you understand how to use numbers that point off-angle from the usual negative-positive number line.)
Our education might improve more if the state had educational professionals focus more on what we should be teaching and less on what can written into a test.
raj says
…I earned the highest score possible on the Advanced Placement math exam in 1967, at the ripe old age of 17, and I cannot figure out what your parenthetical phrase means. Numbers don’t have directions. Vectors can be visualized as having directions, but they aren’t required to have directions, and, in some applications, vectors (which are essentially a form of tensors) don’t have directions at all.
joeltpatterson says
It ties together geometry with algebra. Numbers have directions. Negative real numbers are oriented 180 degrees from positive real numbers (which have a zero degree orientation). Imaginary numbers have an orientation at a right angle to the reals.
Multiplication entails combining the rotation of two numbers: two positives multiplied means combine their zero degree rotations–so the product has a (zero+zero deg) positive orientation. Two negatives means combining 180+180 degrees, so the product has a 360 degree (positive) orientation. This is why the square root of a positive can be negative or positive. And the square root of a negative number would need to have a 90-degree orientation (that way if you do 3i times 4i, you get -12, which will have a 180-degree orientation). This is a geometric way to look at DeMoivre’s Theorem. And if you start seeing i as a “1 with a right angle rotation,” it can start to make visual sense as to what happens when you do powers to imaginary exponents.
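A quick sketch of that rotation picture, using nothing but Python’s standard cmath module (illustrative only): multiplying complex numbers multiplies their sizes and adds their angles, which is exactly the 3i times 4i example above.

```python
import cmath
import math

def polar_deg(z):
    """Return (size, direction in degrees) of a complex number."""
    r, theta = cmath.polar(z)
    return r, math.degrees(theta)

a = 3j           # size 3, pointing at 90 degrees
b = 4j           # size 4, pointing at 90 degrees
product = a * b  # sizes multiply (3*4 = 12), angles add (90 + 90 = 180)

print(polar_deg(a))        # (3.0, 90.0)
print(polar_deg(b))        # (4.0, 90.0)
print(polar_deg(product))  # (12.0, 180.0), i.e. -12, pointing "backwards"
```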
But maybe you’re confused by my choice of words. One can say, “vectors can be visualized as having directions in some applications.” Or the way I see it, numbers have directions as well as sizes (absolute value), and in some applications, you don’t care about the directions–they are extraneous. Or maybe the dimensions of the vector represent something besides directions, like different categories. Semantics get tricky with abstract notions.
Back to the “big idea,” in teaching math it helps if a broader narrative is used to tie together details.
raj says
…I don’t particularly care for it. I’m familiar with the complex plane (x axis for real components, y axis for imaginary components) but that’s about it.
Funny story that your comment regarding “direction” reminds me of. In my legal accounting course, the instructor quipped that, on a balance sheet, accountants distinguish between assets and liabilities as follows: assets towards the window, liabilities towards the wall. Directionality? Most assuredly.
raj says
…I have always considered the various branches of mathematics to be a set of tools. If that is a broader narrative, that’s fine with me. I’m not sure why a broader narrative linking the details of the branches is necessary.
john-howard says
you mean kind of like how the Celtics are leading 75 to 58, but because of the direction of the numbers, they still lose the game? Dude, everyone knows that.
MCAS question: is the MCAS supposed to evaluate schools or teachers or students? Random sampling is a pretty valid way to measure large populations. Does every student get the same test? If each student gets 50 random questions out of, say, 1000 questions that get asked that year around the state, out of, say, 10,000 questions from the curriculum that might get asked, it would be pretty comprehensive overall, and also a pretty accurate measure.
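As a rough illustration of that sampling idea (the pool size and mastery rate here are hypothetical, not anything from the actual MCAS), here is a sketch of how tightly a 50-question random draw estimates a student’s true command of a 1000-question pool:

```python
import random

random.seed(0)

POOL_SIZE = 1000     # questions asked statewide that year (hypothetical)
SAMPLE_SIZE = 50     # questions any one student sees
TRUE_MASTERY = 0.70  # fraction of the pool this student could answer correctly

# Mark which pool questions the student "knows".
known = set(random.sample(range(POOL_SIZE), round(TRUE_MASTERY * POOL_SIZE)))

# Simulate many 50-question draws and see how the observed scores
# scatter around the true mastery rate.
scores = []
for _ in range(10_000):
    draw = random.sample(range(POOL_SIZE), SAMPLE_SIZE)
    scores.append(sum(q in known for q in draw) / SAMPLE_SIZE)

scores.sort()
print(f"true mastery: {TRUE_MASTERY:.0%}")
print(f"middle 90% of observed scores: "
      f"{scores[500]:.0%} to {scores[9499]:.0%}")  # roughly 60% to 80%
```

Averaged over a whole school the luck of the draw washes out, but an individual student’s score can swing by ten points or so either way purely from which questions came up, which matters more if the test is judging students than if it is judging schools.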
john-howard says
wow, that’s really freaky, the C’s are actually leading 75 to 58 with 30 seconds left in the third. I’m watching the late night rerun and I know we won already, but I didn’t see the game, so that’s pretty freaky that they actually hit that score, isn’t it. I’m going to enjoy watching them survive this one.
stomv says
I certainly know all of the ideas you’ve mentioned, but I hadn’t thought about them that way in a long time (ever?). What’s nice about it is that, like nearly all tools that map to spatial geometry, these ideas map past three directions, even if your brain (and mine) can’t wrap around it.
So, in a quaternion a + bi + cj + dk, you can still think about it as rotations. Nice!
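For the curious, a minimal sketch of that four-part arithmetic in plain Python (no libraries; the multiplication rules are Hamilton’s i² = j² = k² = ijk = −1):

```python
def qmul(p, q):
    """Hamilton product of quaternions given as (a, b, c, d) = a + bi + cj + dk."""
    a1, b1, c1, d1 = p
    a2, b2, c2, d2 = q
    return (a1*a2 - b1*b2 - c1*c2 - d1*d2,
            a1*b2 + b1*a2 + c1*d2 - d1*c2,
            a1*c2 - b1*d2 + c1*a2 + d1*b2,
            a1*d2 + b1*c2 - c1*b2 + d1*a2)

i = (0, 1, 0, 0)
j = (0, 0, 1, 0)

print(qmul(i, i))  # (-1, 0, 0, 0): i*i = -1, just like the ordinary imaginary unit
print(qmul(i, j))  # (0, 0, 0, 1):  i*j = k
print(qmul(j, i))  # (0, 0, 0, -1): j*i = -k; order matters once you leave the plane
```

That non-commutativity is what you would expect from rotations in three dimensions: rotating about two different axes in different orders lands you in different places.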
That being said, I always thought about pre-calc as a way to pick up all of the neat tools needed for calculus and advanced math that didn’t fit really well anywhere else. Things like:
* Factorial
* Combinations and permutations
* Summation notation
* Product notation
* Continued fractions
* Infinite series
* Limits
These are all tools that come in handy in calculus, but don’t seem to have a natural place in algebra, algebra II, or geometry.
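A tiny illustration of a few of those tools side by side, using only Python’s standard math module (the series is just an example chosen for its tidy limit):

```python
import math

print(math.factorial(5))  # 5! = 120
print(math.comb(10, 3))   # "10 choose 3" = 120 combinations
print(math.perm(10, 3))   # ordered arrangements = 720 permutations

# Summation notation as a partial sum: sum of 1/2**n for n = 0..19.
partial = sum(1 / 2**n for n in range(20))
print(partial)            # 1.9999980926513672

# The corresponding infinite series has a limit of 2;
# the partial sums creep up toward it.
```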
goldsteingonewild says
Thoughtful post. You open up a whole range of issues. I appreciate your putting forth a different vision. I’m heading out to watch our team play some hoops. But my first reaction —
In addition to your concerns about the tests themselves, you seem troubled by the traditional content taught day-to-day in most schools. Fair?
I.e., you wouldn’t favor any sort of assessment that measures “basic facts” that you can look up? Or am I misstating?
raj says
…When is the last time you calculated a complimentary angle
Which angle would be the one that was complimenting the other?
Sorry, couldn’t resist*.
*for the math challenged, and to forestall a flame war, it is the complementary angle. But I got a chuckle out of the comment. Examples:
Mr. Angle1 to Mr. Angle2, you are very nice. Mr. Angle2 to Mr. Angle1, you are very nice, too. They are very complimentary of each other.
Mr. Angle1 to Mr. Angle2, you’re pi radians out of phase with me. You’ll cancel me out. They are very complementary of each other.
End of math lesson.
raj says
…When is the last time you…listed prepositions?
I don’t bother listing prepositions, although I know what they are–in English. But, just to let you know, learning a foreign language, even at an advanced age, can do wonders for one’s grammar. I started going to Goethe-Institut Boston (170 Beacon Street) in my 40s. The grammar instruction (Hochdeutsch grammar is considerably more complex than that of American English) was quite instructive, and helped me reflect on my knowledge of American English grammar.
Just last night, my spouse–who was born in Germany, but who is an American citizen, were laughing about the following. I don’t recall what triggered it. But we were laughing about the fact that, in German grammar, one could write a single sentence, that began with
“He”
then followed with
“who was born in ….(irgendwo–somewhere)”
then followed with
“who worked at–or was educated at” such and such a place
then followed with
“who won a Nobel prize for discovering this that or the other”
then followed with
“who was married to this, that or the other person, and they had several children, all of whom were Nobel prize winners”
“just died.”
All in a single sentence. It was hilarious.
Attention to grammar is useful, but it can be overdone.
joets says
“Just last night, my spouse–who was born in Germany, but who is an American citizen, were laughing about the following.”
Sorry, had to point it out.
However, I agree, raj! I didn’t know about the passive voice and many other grammatical nuances until I started learning German, too.
joeltpatterson says
There are so many “basic facts” that could be taught, that if you start testing (and therefore emphasizing) the basic facts, you run out of time to assess things like organizational thinking and critical thinking. If you devise an assessment well, the student will have to provide some basic facts they know as part of an answer that also demonstrates higher-level thinking.
Say the question, instead of specifically addressing ancient Egyptian social classes, asked something more like, “Pick 2 countries in other historical periods and compare their social class systems to social classes in modern America.”
shack says
I gave my 7th grade students a mid-term that included a number of vocabulary words they had encountered in their reading, looked up in a dictionary and used in a sentence. As I looked over the results of the test, it was clear that even the brightest students were guessing when asked to list the part of speech (noun, verb, adjective, etc.). When I mentioned to a more experienced colleague that it looked as if I would need to return to the subject of grammar before the end of the semester, she said, “Don’t bother. It’s not on the MCAS.”
And she’s right. “Parts of speech” is a 6th grade topic; we had reviewed it in 7th grade (alongside using the terms in regular vocabulary assignments). If they don’t have it now, the students will have to absorb it on their own (or, for the academically strong students, learn it later in a foreign language class, as Raj points out).
Another anecdote: I assigned kids to look through books to find examples of similes. One girl brought in the example of “squealed like a stuck pig.” I asked whether she knew what that meant, and she admitted that she did not. She was aghast when I told her that it had to do with slaughtering an animal. Another boy brought in, “like throwing Brer Rabbit into the briar patch.” I applauded him for bringing in a literary allusion as well as a simile and he stared blankly. The class had never encountered Brer Rabbit (even though folk tales are in the DOE frameworks for Middle School ELA).
In other words, the students were fulfilling the assignment by scanning texts until they found the word “like” or “as,” and writing down the words surrounding this key component of a simile without taking the time to understand the meaning of each simile itself. (Nurse! Bring me an IV of Brer Rabbit! Stat!)
Perhaps NCLB or the MCAS or the DOE is not to blame that students are not learning either substance and meaning or “basic facts,” as you put it. Too bad we have to spend so much time addressing the assessment instead of teaching children how to learn and to love learning.
sabutai says
Long comp Tuesday and Thursday for us…
shack says
This is my first time.
After I walk the dog on this snowy Sunday morning, I am going to sit down with the 76 page, “Test Administrator’s Manual: Grades 4 and 7, March – April 2007, ELA Composition, Composition Make-Up, and Reading Comprehension.”
I’m also going to make cookies and go out to buy a ton of gum. My homeroom has made it clear that they expect food and gum if we want them to do well on the test. They have also said that they don’t do prewriting (rough drafts, for BMG readers who are not hip to the lingo).
Instead of reading a test administrator’s manual, I would rather be reading Konigsburg’s “The View from Saturday” or Hiaasen’s “Hoot”, the two books I think my classes will be reading within the next week or two. First things first.
My school is doing Comp Tuesday (I guess all Mass. 7th graders are) and Literature on Wed., Thurs AND Fri. Eight class periods. Oy. I’m sure there’s a logic to this.
Good luck to you and your kids, too.
sabutai says
GGW, I don’t object to the measurement of “basic facts”. However, so many assessments measure something far beyond “basic facts” and drift into arcane trivia.
For my students, I call basic facts “facts that every educated American is expected to know.” For example, the fact that the ancient Egyptians had a pantheon of gods typical of animist systems, albeit with a heavy emphasis on death. I think that is knowledge worth carrying in one’s head. The fact that Isis, the Egyptian goddess of the afterlife, was the sister of Osiris, the Egyptian god of the same, is not. The MCAS is just as likely to ask the second question.
Worse still, a student can do wonderfully on the MCAS by answering that question without having the faintest clue what it says about Egyptian culture. That is the type of skill that we teach despite the MCAS, rather than because of it.
goldsteingonewild says
I think we agree that arcane trivia is not something we’re after.
However, I’m not sure you’re fairly characterizing the MCAS history section.
There are a bunch of sample questions here (pdf file) so readers can make up their own minds.
First, I thought there was a fair number of “interpretation questions” (i.e., a map is given, kid must draw conclusions) and some essay questions (“Pick Columbus or Magellan: Describe what this explorer was seeking on his voyages, and Identify and describe what this explorer discovered on his voyages.”).
Second, the “fact-only” questions were, I think, reasonable.
Grade 5 (American)
Grade 7 (World)
Grade 10 (American)
I wasn’t cherry-picking any particular questions. There were a few seemingly more “essential” and a few seemingly more trivial. Again, I encourage folks to read themselves and decide.
Of course it’s not necessary to know any single one of these facts.
But if a kid knows, say, less than 30% of them….I think a) he’s going to have a bunch of “cultural literacy” problems, and b) his ability to read things — newspapers, books, even blogs — will be compromised, and with it the likelihood of becoming a college grad.
What say you?
joets says
what’s the answer to the grade 10 question? I figure B or D.
shack says
A number of questions on the MCAS strike me as having more than one correct answer. The point in those cases is not really to think or to find a good answer, the point is to guess what DOE wants you to answer. I found this to be true of the MTEL (teacher education licensure test) as well.
goldsteingonewild says
I’m not sure there were desegregated units in WW2. Af-Ams served in all-black units, like the Tuskegee Airmen.
joeltpatterson says
In 1948, so that would be after WW2.
But that’s the point: does missing that question really imply the student didn’t have a good 10th grade education in civil rights?
There are too many “basic facts.” Test-writers make you answer questions on the basic facts they choose out of this huge pool.
sabutai says
Heck, throw out the curricula and have students use the Dictionary of Cultural Literacy as a 12-year textbook 🙂
But in seriousness, is it essential to know what the Lyceum is? I’ll be honest here — I received a 95% on the history teacher exam, and I had to look it up. Perhaps because I learned it, got it right on a test, and forgot it. I know you were picking at random, but it seems that with many of these curricula we’re in such a hurry to teach the high school trivia that students perforce forget the important things — such as in the grade 4 question.
You referenced in another comment the fact that schools graduated functionally illiterate students. They still do. Students memorize the batch of information they need to pass this year’s MCAS, and then toss it out for the next batch.
kai says
A very thoughtful post, and one I am glad you wrote. A few questions and thoughts for you:
You say you have only taken tests in a vocational setting three times and that they don’t demonstrate that you can use the skills learned in class in a meaningful way in real life. Fair enough. How then do you assess whether your students are learning or not throughout the year? Don’t you test them, or do you have some sort of alternative assessment system?
Wouldn’t requiring each of the 300+ school districts in this state to develop their own comprehensive assessment systems place a huge strain on them? Where is the funding for this whole new class of administrators going to come from? Wouldn’t it also create a whole slew of new employees at the DOE to study each of these 300+ individual systems?
Shouldn’t both of the assessments be due in to the State by Sept 15th, and a response from the DOE due by Dec 15th? If it takes 3 months to respond to the schools then you could have a situation where the school isn’t finding out its system isn’t up to snuff until March, at which point 80% of the school year has gone by.
On an off note, I watched Half Nelson last night. It shows Ryan Gosling as a drug-addicted 8th grade teacher in the Bronx. It’s gritty but well done, and one teachers would appreciate.
sabutai says
One of the joys of blogging is having to answer detailed questions about a scheme you pulled out of your donkey in the last day or two. But they are fair questions…
First, you ask “how then do you assess whether your students are learning or not throughout the year?” To be clear, we do not do that right now in Massachusetts — we only learn if students learned last year. The MCAS does not measure this, and the closest thing is an unfunded DOE mandate to administer one of a selection of standardized tests in the beginning of the year. Of course, these other tests use a different methodology so comparisons are ludicrously invalid. But the DOE demands that is how time and money is to be spent.
Second, you ask whether requiring each of the 300+ districts to develop its own assessment system would place a huge strain on them.
Well, the emergency brake is always going to the standardized test if you’re out of ideas — just keep killing the kids with MCAS. However, every teacher uses non-MCAS assessments. If the teachers of a department get together and share and modify those assessments to cover more information if need be, they’re done. I don’t really see why the administrators would need to be involved in this in a significant way.
And yes, the DOE would need to spend some of their yearly subsidy to Measured Progress on studying these assessments, as well as some of their budget on punition and inspection.
About the dates, I’m open to suggestions on that. I was trying to be aware that it may be tough to plan an assessment that far in advance.
I never saw Half Nelson, but any teaching movie that doesn’t have Belushi carrying a baseball bat will be second best to me 🙂
kai says
When I asked how you measured your kids’ progress through the year, I meant you. In your classroom, in your school, how do you tell if your students are learning or not? I imagine you give them tests and quizzes. I’m trying to get at the difference between the tests you give them in class and the MCAS test.
Assuming a district does come up with its own assessment system, don’t we run the risk of falling back into social promotion? What’s to prevent a system from passing a kid who really didn’t do well on their assessment? Or would you send the completed assessments in to the DOE and have them grade them? In that case, would they be able to accurately grade 300 individual systems that they didn’t have a hand in creating?
For your department, what would you create as an ideal assessment?
sabutai says
Kai, well, we’re at the 2/3 mark of the school year and I’ve administered three sit-and-spill exams (tests and quizzes). The rest have been some form of project-based learning. Some examples include PowerPoint shows on Nubia where students taught bits of the curriculum to their classmates, an oral report where they anthropomorphized various parts of the Egyptian agricultural system (one of my students actually fractured her wrist while acting out the part of threshed grain), an illustrated brochure welcoming any prehistoric people transported to the future that explains where they came from and where they’re going, and essays explaining why Hammurabi’s Code and/or the Ten Commandments should or should not be enforced as law today. This allows more opportunities for students who excel at social learning, oral presentation, group work, technological work, synthesizing knowledge, etc. — all the skills the MCAS ignores. The sit-and-spill exam is a minor weapon in assessing learning, but try telling that to the state.
Given that the DOE is only looking in-depth at 3 of 8 assessments from the year, I’d imagine they could at least assess the accuracy of teacher grades, if not grade the assessments themselves. It might be difficult, but I say it’s worth it.
As for ancient history, I think some of the examples above could be good starting points for ideal assessments. Given that 8 are available, we could mix ’em up well.
goldsteingonewild says
I respectfully disagree, hopefully in the most conversational tone you can imagine, b/c I share your deep interest in this subject.
Let’s just look at Grade 10 exams in reading. A kid reads a passage, and a typical question is “What is the main idea of this passage?” How is that valueless?
Again, here’s a sample (pdf). I encourage readers to look for themselves.
A kid just needs to earn a low score to pass on ELA. Most of the kids who flunk it are really, really weak readers.
Before MCAS, high schools — particularly urban ones — graduated a fair number of kids who bordered on illiterate. They couldn’t read their diplomas.
Rep. Jeff Sanchez of Boston was in a meeting about two weeks ago with a dozen parents. One woman said to him:
Pre-MCAS, some schools simply passed illiterate or barely literate kids — like this woman — to the next grade.
That’s why Ted Kennedy helped co-author No Child Left Behind — with its requirements for annual standardized testing — and why he and Congressman George Miller (D) will lead the fight to get it re-authorized.
They believe, as do I, that for all the admitted imperfections in any form of testing — medical boards (multiple choice little bubbles), CPA and bar exams, et al. — standardized assessments have value. Kids who fail MCAS get help they would never otherwise get. Sometimes that help itself is quite useful, sometimes the school is so bad that the “help” is pointless drilling….hence the need for…..well shall we say……OTHER, um, schools 🙂
I think the sort of assessment you describe above can be GOOD. Even terrific. Remember, MCAS is mostly the “low bar.” It’s ADDED to whatever bar is set by teachers and schools….
sabutai says
GGW, I agree that picking the main idea out of a passage is a key skill. I just disagree that a high-pressure environment is the way to measure it. We both know legions of students whose test results do not reflect their abilities because of test-anxiety issues. That’s one reason I’d like to see alternate assessments expanded.
I can’t say why Kennedy supported this bill. It may well be because he feels that punishing schools for passing students who aren’t ready will work. On the other hand, what do you suppose we do with a student who is on his/her 3rd trip through seventh grade, and has no appearance of caring? Keep them back until they’re driving to middle school?
goldsteingonewild says
However, I don’t see MANY kids who are REALLY FLUENT who still fail MCAS.
My sample size is about 300 kids whom I’m pretty closely familiar with, the majority of whom arrived at our high school having failed either the math or English middle school MCAS, or both.
But since the passing score, as you know, is 20 out of 80, I think even the “test-anxiety” kids can safely build up the algebra, geometry, essay-writing, and paragraph-reading skills to reach the 40-out-of-80 range, and then they have enough cushion.
The other thing — we require all our seniors to take a humanities class at Boston University. Final exam pressure there is even higher (because, unlike the MCAS, those exams are timed!)
While some people (like my wife) do not like high-stakes tests, I believe learning to face high-pressure situations is a useful part of school — certainly possible to overdo, but not without merit.
2. Respectfully, I’d suggest Kennedy supports transparency and accountability because he watched the 80s and 90s — when teacher salaries went up, class size went down, special ed spending rocketed, and the achievement gap didn’t budge.
3. I think you’re right to pose the 16-year-old in 7th grade question. Nobody has easy answers. Surely I don’t.
I have two thoughts:
a) I’d re-ask the question this way. Is it better to hold him back, or better to pass him along and then have him get a diploma, whereupon the “real world” show him where he really stands, and meanwhile to have several kids as “collateral damage”…ie, those kids who decided to NOT try hard b/c they realized that everyone gets promoted to the next grade?
b) This kid is one reason where school choice, while not a panacea, is useful. I don’t want to introduce it into a thread about MCAS, but it is one component of my answer to your very legit question.
sabutai says
I’d re-ask the question this way. Is it better to hold him back, or better to pass him along and then have him get a diploma, whereupon the “real world” show him where he really stands, and meanwhile to have several kids as “collateral damage”…ie, those kids who decided to NOT try hard b/c they realized that everyone gets promoted to the next grade?
That is an incomplete phrasing of the dilemma. It’s not a matter of holding that student back, but a matter of severely downgrading the educational experience of the 26 students in the class with him/her. I’ve yet to talk to a student who says that they believe they’ll be promoted anyway…they just don’t care.
goldsteingonewild says
We agree that the MCAS question, as posed by the Good Rep Sciortino, needs to be about more than just the kids who fail, but about whether standards and accountability help or hurt all the kids — the “net effect.” We disagree about whether the net effect is pro or con, from our own observations.
One problem is that real persuasive data on this stuff, to support either side, is hard to come by.
Here’s a fairly neutral summary of the relevant research.
The link, from the Center for Public Education, has links to 30 pretty good empirical studies.
shack says
You say:
Kids who fail MCAS get help they would never otherwise get.
Open response question: Legislators, parents and teachers would like struggling students to receive educational assistance. Does the MCAS effectively identify barriers to learning? Do needy students receive “help” as a result of taking the MCAS? Define “help.” (Is it the same thing as “corrective action”?) Provide specific examples in your answer. Stray marks outside of the box will not be considered.
goldsteingonewild says
Shack,
Here’s a resource that can answer your question quite well. It’s a long powerpoint called ANATOMY OF SUCCESS: LESSONS FROM SCHOOLS ON THE PERFORMANCE FRONTIER. I think you’ll like it.
Ed Trust is a liberal think tank in DC devoted to closing the Achievement Gap. They describe certain schools which a) use state test data to identify strugglers, b) respond to the data both in the form of changing what happens in the classroom (teaching, curriculum) and before and after school (tutoring).
And because it’s powerpoint, there are no stray marks outside the box!
GGW
goldsteingonewild says
Sabutai, your NCAA picks thus far looking good!
mbair says
I can’t comment on these specifics, but I can give you my take on MCAS Math.
I’m thinking of dusting off the Pentel 0.7mm mechanical pencil, powder blue, for another guerrilla-style Math Camp this spring. I like to volunteer and help a junior or senior get their diploma. I can only comment on the Math portion of the test. I think the test is fairly hard, but what I find is that kids who have a lot on the ball still fail repeatedly.
Many times I find that these kids are so traumatized by the math experience that it’s hard to get them to just start using the frickin’ pencil. Sometimes they don’t want to even copy the problem down and re-write the thing without any changes to get started. They’re so lost and fearful, depressed and humiliated.
They’re also not “problem children” or anything like that – at all. Just: Good Kids Bad at Math. Many of the students I’ve come across in the last five years are good enough to get C’s in their regular math classes because they try really hard and put in the extra effort. Teachers don’t want to fail them over and over when the kids are putting in an A for effort. Teachers also lack the time to sit with them one on one for the instruction they need. Then the kids get to the test in 10th and get blown away. It’s sad.
sabutai says
Frankly, the great long-term legacy of No Child Left Behind is to teach students to hate learning, and especially to hate math and English.
joets says
They probably already hated it. I can’t think of anything people in that age group, myself included back then, didn’t hate.