Comments on Bluegrass Policy Blog: "Kentucky's Educators May Get to Do Some Explaining" (feed maintained by Kelly Smith)

Anonymous (April 16, 2009, 1:10 PM):

Please be aware that NCLB requires a state to define <I>proficient</I> as meeting grade-level expectations. Moreover, the state test must assess only the grade-level content listed in the state standards for that grade. On the other hand, NAEP achievement levels are not bound by grade-level content; in fact, a <I>NAEP proficient</I> reader must know many terms and phrases that are above grade level. Both of these <B>facts</B> are clearly documented, with sources from the U.S. Department of Education and the National Assessment Governing Board, in the paper you deem "not convincing."

Anonymous (April 16, 2009, 12:56 PM):

"The real point here is that Anonymous doesn’t want to admit that there is NO excuse for NAEP to show a DECREASE in proficiency rates while CATS shows an INCREASE."

Hardly! The real point is that it makes little, if any, sense to become excited by, or even interested in, comparisons showing that the percentage of students performing at a classroom B+/A- level or higher is decreasing (NAEP results) while the percentage performing at a classroom C-/C level or higher is increasing (state test results).

Richard Innes (April 7, 2009, 4:27 PM):

RE: Anonymous, April 7, 2009, 2:53 PM

I have indeed seen this "explanation" before. It's not convincing.

For our other readers, this paper was written by an employee of the Idaho State Board of Education. At the time it was written, Idaho was under fire because its state assessment's definition of "Proficiency" was, like Kentucky's, far below what the NAEP deemed acceptable performance meriting use of the term. Certainly, the author cannot be considered unbiased.

More to the point, the paper cited by Anonymous suffers from some incredibly bad logic. Among other issues, it implies that the NAEP achievement-level scores (which include the score of "Proficient") were somehow derived without any consideration of the grades of the students involved. That is patent nonsense. In fact, the cut scores for the NAEP achievement levels are developed separately for each tested grade; the scores are most definitely tied to specific performance expected in specific grades.

There is more wrong with this paper, but it isn't worth going into.

The real point here is that Anonymous doesn't want to admit that there is NO excuse for NAEP to show a DECREASE in proficiency rates while CATS shows an INCREASE. But that is exactly what happened, as I point out with the graph in the main post. This diverging trend is evidence of severe test inflation in CATS grade 8 reading. Most intelligent people understand that.

In addition, there is no excuse for the grading scales in CATS to get progressively easier over time, something I have solidly established happened, as shown here: http://www.freedomkentucky.org/index.php?title=CATS_Academic_Test_Inflation

Kentucky education's ten-year courtship with an inflated, misleading assessment that didn't tell us the truth was inexcusable, pure and simple. And those who continue to defend CATS in the face of overwhelming evidence to the contrary, such as the just-released bad news about our latest college remediation rates (http://bluegrasspolicy-blog.blogspot.com/2009/04/more-strong-evidence-senate-bill-1-was.html), are badly out of touch with reality.

Anonymous (April 7, 2009, 2:53 PM):

You've seen the explanation before. Check it out again; maybe it will stick this time:

<I>An explanation for the large differences between state and NAEP "proficiency" scores reported for reading in 2005</I>

It is available online: http://www.eric.ed.gov/contentdelivery/servlet/ERICServlet?accno=ED497395