KERA Theories That Just Didn’t Work
Friday, March 27, 2009

The dust is settling from the final signing of Senate Bill 1 (SB-1), which marks the well-deserved end of Kentucky’s second unsuccessful attempt to incorporate Progressive Education Theories into a valid assessment program. The demise of the CATS assessments, which never changed enough from the original KIRIS assessment to be successful, reminds me of other ideas that, back in the 1990s, we were told “research shows” work best for kids.
Here is a short list of some of those “research shows” ideas that real experience proved were off the mark.
1) Multi-age Classrooms
Technically, the multi-age classroom law is still on the books, but an intentional legal loophole added in 1998 means no one bothers to observe it. An interesting back-and-forth on why multi-age didn’t work out is going on over at the Prichard Blog, with two folks who usually fuss about the Bluegrass Institute challenging each other instead. It’s worth a read.
2) Writing Portfolios
SB-1 also marks the demise of the dubious idea that writing portfolios, which are a great instructional tool for writing, could be forced down every teacher’s throat by putting them into the state’s accountability system.
3) Math Portfolios
Anyone remember those? Writing about math instead of doing math was such a bad idea that these were gone within four years.
4) Performance Events
These died in part due to extraordinarily horrible management. One of the basic rules of large-scale test creation is that you need to do trial runs with the test questions. But our testing contractor got cheap and stopped proofing the performance events. The events they foisted on Kentucky the next year were so incredibly hard that the middle school results were totally unusable. This KERA darling crashed around 1996.
5) Multiple Choice Questions
The folks who set up Kentucky’s testing program back in 1992 hated multiple-choice questions. They thought these led to overly scripted teaching and that multiple-choice questions could not test students’ higher-order thinking skills. Thus, while the old KIRIS initially included multiple-choice questions, they were not counted. In fact, if memory serves, the multiple-choice questions were dropped from the test entirely for a time after 1994. That misguided decision didn’t last long, though. By the time CATS came along, multiple-choice questions were back, and this time they counted.
One point to this reminiscing is that education research doesn’t have a stellar record for accuracy. As we gin up the third attempt to create a good school assessment program in Kentucky, the people doing that work will need to keep a wary eye on the things they are told “research shows.” You see, in education, what passes for research often doesn’t show anything.
By the way, what’s your recollection of other “research shows” education “stuff” that didn’t work in Kentucky’s reform? Comments welcome.
2 comments:
Very insightful and true. The ideal approach would be to study successful approaches elsewhere (why reinvent the wheel if we don’t need to?), but if need be, create a new model ONLY after it is thoroughly vetted by ALL those with a stake in it. “Buy-in” from those of differing philosophies is critical and will lead to a better-performing product.
I agree with Richard Dawahare. Kentucky should look for answers that are working elsewhere. In fact, that has been a Bluegrass Institute theme since we started back in 2003.
There has been an excess of "not invented here syndrome" ever since KERA began, preventing our education leaders from doing what Richard recommends.
Hopefully, with a chance to do CATS over and get it right, some of that outward-looking mindset will finally take hold here.