Wednesday, November 22, 2006

A "B" in class, but an "F" on the big test?

"Sylvia James hardly considers herself clueless in mathematics. After all, she finished sixth grade with a B-plus in the subject and made the Honor Roll, which she saw as a victory in a challenging year of fraction conversion and decimal placement.

But what happened when she took the state math test? She flunked it."

So begins an article, "Those Who Pass Classes But Fail Tests Cry Foul," by Ian Shapira, which appeared in yesterday's Washington Post.

Here's the situation:

Many students in the Washington region [and one assumes other parts of the US as well - ed.] are suffering from academic split personalities. Driven by the federal No Child Left Behind law and tougher state diploma standards, the testing blitz has left these students in a curious limbo: They pass their classes with B's and C's yet fail the state exams.

These cases surface frequently, with one local high school reporting, for example, that a quarter of students in beginning algebra passed the course but failed the state test.

So what's happening in schools that could create a situation where a child can pass the classes, but fail the test developed to determine whether or not students have mastered the material and skills deemed necessary? According to the WaPo:

"Students and teachers offer an array of explanations for why test scores sometimes fail to match up with grades. Some students don't take the exams seriously. Some freeze up. Still others trip over unfamiliar language. And teachers sometimes are not prepped in what the exams cover, especially when the tests are new. Occasionally, some school officials suspect, classes aren't rigorous enough to prepare students adequately."

Ken DeRosa (on a white-hot streak of excellent posts in D-Ed Reckoning) thinks the problem is much larger, and he lets it fly in "You've Been Flim-Flammed:"

"Sylvia dear, I have bad news, you've been lied to. Bamboozled. Your well-meaning teachers are pretending to teach you sixth grade math, but they're not. They're teaching you fourth grade math, maybe even third grade. They're probably not even doing a very good job either. Worse still, they're covering their incompetence by giving you high grades. It's a scam from top to bottom."

Over at Education Sector, Andrew Rotherham also wonders whether the story dug deep enough into the dilemma of passing grades and failing test scores:

"It's a complicated issue and the story doesn't do it justice. Of course there are going to be students who don't test well, that's a pretty minor issue that garners headlines but is dealt with relatively easily in public policy through a meaningful appeals system that takes into account multiple measures. Hardly front page news.

Where Shapira falls down is by not engaging on the larger question about whether teacher grades are the best indicator of student learning. He's got anecdotes, but on this one there is also data. Grades are surely one important indicator, but the best or only one?

The question of grades versus test scores really boils down to that question, what sort of external benchmarks do we want in a public system like ours? Right now, standardized tests, which help provide information...are the worst way to do that, except for all the others."

And Kevin Carey at the Quick and the Ed weighs in:

"The only mystery here is why everyone in the article is being so circumspect about something that should be pretty obvious: states create standardized tests because local schools, when left to their own devices, don't always hold students to high enough academic standards...

...How do we know this? Because every measure of what students who have graduated from public schools actually know and can do shows deep deficiencies. According to the National Assessment of Educational Progress only 59% of seventeen year-olds can perform "moderately complex" procedures in math. 40% of all college students are forced to enroll in at least one remedial--that is, high school-level--course. 43% of all adults score at only the "Basic" level or below on a test of literacy. Etc., etc."

What does all of this mean for a school like JIS, which doesn't fall under the provisions of NCLB (or any other external accountability framework)?

It means JIS should be especially mindful of the problems both with standardized tests and with internal assessments as methods of measuring its educational program's success. Neither is perfect. But it's very easy to fall into the trap of believing that internal assessments (like the unit tests, papers, and projects that our students produce) are the best measure when we're talking about accountability.

More on the subject tomorrow...
