Friday, June 20, 2008

CATS Task Force Primer – Writing Tests

The “we-must-test-writing” fanatics in Kentucky have a new credibility problem.

Two years ago the SAT college entrance test added a mandatory writing sample to the traditional multiple-choice tests in verbal and math skills. The first study of whether that change improved the SAT’s prediction of college success is now available. It looks like the very time-consuming SAT writing element adds virtually nothing, as the New York Times reports.

Of course, the Times has been having credibility problems of late, so we went to the College Board’s Web site to see if the newspaper got it right (the Board creates the SAT). This time, the Times is on target. The College Board itself admits, “The results show that the changes made to the SAT did not substantially change how predictive the SAT is of first-year college performance.”

So, here is a question for those folks who are about to sit down and debate possible changes to our CATS school assessments. We know writing is important. But if the people at the College Board did not come up with a writing test that adds value to the multiple-choice parts of the SAT, why do we in Kentucky think that the excessive time and cost required to test written answers is an essential part of our assessment program? The issue isn’t the need for writing; it is whether we can effectively assess and score written answers in CATS in a way that adds accuracy and value to the overall assessment scores. The College Board just found out it is not getting much from evaluating writing, and I remain unconvinced that Kentucky gets much value from the huge amount of time and money spent on written answers in CATS, either.
