Nonsense Kentucky education rankings appear again like a bad ghost
Sunday, July 10, 2011

Surprise: UK’s economists are pushing the latest fairy tales
The Lexington Herald-Leader just announced a reincarnation of a totally bogus education ranking system originally fabricated by the Kentucky Long Term Policy Research Center (KLTPRC).
According to the article, the ‘new’ rankings are loaded with many of the very same statistical gaffes we found in the old KLTPRC reports.
For example, the UK study dares to rank scores from the ACT college entrance test for all 50 states. That is a serious statistical mistake, one that students who have just completed a freshman college statistics course should easily recognize.
The problem: in most states the ACT is taken only voluntarily, by students applying to colleges that want it rather than the SAT. Because most East and West Coast schools prefer the SAT, and because students who don’t plan to go to college don’t take the test at all in most states, the percentage of high school graduates who take the ACT varies enormously from state to state. The latest available data, for 2010, shows that in Maine only 10 percent of high school graduates took the ACT, while in Kentucky 100 percent of our graduates took it because the state now requires the ACT of all students.
Any comparison of such wildly different test samples, especially when the Maine sample was totally voluntary and in no way resembles a statistically valid random sample, is total nonsense.
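To see why, here is a quick simulation sketch in Python. The numbers are made up, not real ACT data, and the sketch makes the deliberately simple assumption that the voluntary test-takers are the most college-bound students in the state:

    import random

    random.seed(42)  # reproducible illustration

    def simulate_state(participation, n_students=100_000):
        # Identical underlying student populations in every simulated state,
        # roughly scaled to the ACT's 1-36 composite score range.
        scores = [min(36, max(1, random.gauss(20, 5))) for _ in range(n_students)]
        scores.sort(reverse=True)
        # Self-selection: only the most college-bound students take the test.
        takers = scores[: int(n_students * participation)]
        return sum(takers) / len(takers)

    print("State A, 10 percent tested:", round(simulate_state(0.10), 1))
    print("State B, 100 percent tested:", round(simulate_state(1.00), 1))

Both simulated states have exactly the same students, yet the low-participation state posts an average several points higher, purely because of who chose to take the test.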
In fact, if we do a somewhat better comparison, looking only at ACT scores for those states where a very high percentage of graduates took the test (96 percent or more participation), we get the following ranking graph.

[Graph: 2010 ACT composite score rankings for states with 96 percent or more participation]
Somehow, that doesn’t look too impressive, does it?
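For readers who want to reproduce that kind of comparison themselves, here is a sketch using pandas. The figures below are illustrative stand-ins I made up for this example; the real numbers come from ACT’s published state profile reports:

    import pandas as pd

    # Illustrative stand-in figures, not actual 2010 ACT results.
    act = pd.DataFrame({
        "state": ["Kentucky", "Tennessee", "Illinois", "Maine"],
        "avg_composite": [19.4, 19.6, 20.7, 23.4],
        "pct_tested": [100, 100, 100, 10],
    })

    # Keep only states where nearly every graduate took the test.
    comparable = act[act["pct_tested"] >= 96]
    print(comparable.sort_values("avg_composite", ascending=False))

Maine, with its high average built on a small, self-selected 10 percent of graduates, drops out of the comparison, and the remaining states can be ranked on something much closer to an apples-to-apples basis.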
By the way, years ago when the Kentucky Long Term Policy Research Center first started ranking ACT scores for all the states, we called ACT, Incorporated, to find out what they thought about this. ACT told us it strongly discourages such inappropriate rankings precisely because participation varies so widely from state to state.
I guess the UK crowd never talked to ACT about the character and limitations of this data. I think most Statistics 101 courses go into precisely those sorts of mistakes.
The news article also says that the new rankings include state-to-state comparisons of high school dropout data. That is absolutely unacceptable.
Kentucky’s dropout and graduation rate reporting was officially audited in 2006 and found to be seriously inaccurate, creating an overly rosy picture of what is really happening in our schools.
Even though it is now 2011, the Kentucky Department of Education has yet to release any dropout and graduation rate data based on the more solid information that will eventually come from its recently introduced, high-quality student tracking system. So the inflated graduation rates and the unreliably low dropout rates continue in the latest data.
But Kentucky is way behind in getting accurate dropout and graduation rate data. Most other states have had high-quality reporting for some time now, and they are reporting accurate, and usually much lower, graduation rates with correspondingly higher dropout rates.
So, we again have a huge statistical ‘apples-to-oranges’ mess, with inaccurate, picture-inflating data from Kentucky matched up against much more truthful reporting from elsewhere. No wonder we look good, but this comparison isn’t trustworthy. Armed with the facts, a Statistics 101 student would turn up his or her nose.
There are a lot more problems with trying to simplistically rank scores from the National Assessment of Educational Progress, often called the NAEP. Clearly, the people who created this report didn’t read my freedomkentucky.org Wiki item, “The National Assessment of Educational Progress.”
If they had, they would have discovered things like this telling map, which treats the latest available NAEP Grade 8 Math data like the statistically sampled data it is.

[Map: 2009 NAEP Grade 8 Math scores for white students, state comparisons after accounting for sampling error]
Once the statistical sampling errors in the NAEP are considered, only one state in the country, West Virginia, had white student scores that were lower than Kentucky’s in the 2009 NAEP Grade 8 Math Assessment. By the way, whites comprise about 85 percent of Kentucky's students.
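For anyone who wants to run that kind of check themselves, here is a minimal sketch of the significance test such comparisons require. The scores and standard errors below are hypothetical; NAEP publishes the real ones with every release:

    import math

    def significantly_different(score_a, se_a, score_b, se_b, z=1.96):
        # NAEP scores are estimates from student samples, so each comes
        # with a standard error. A difference between two states is only
        # meaningful if it exceeds the combined sampling error at the
        # chosen confidence level (here, 95 percent).
        se_diff = math.sqrt(se_a ** 2 + se_b ** 2)
        return abs(score_a - score_b) > z * se_diff

    # Hypothetical: a 3-point gap with roughly 1-point standard errors.
    print(significantly_different(282, 1.1, 279, 0.9))  # True, a real difference
    # A 1-point gap with the same errors is statistically indistinguishable.
    print(significantly_different(280, 1.1, 279, 0.9))  # False, just noise

Checks like this are why the map looks so different from a naive ranking: many small score gaps between states are statistically indistinguishable from zero.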
Now, how’s that again about Kentucky’s rankings against other states?
Why would professors from the University of Kentucky’s school of economics resurrect such an inaccurate ranking system?
I wish I knew.
If you want to know more about what is wrong with this new report, just enter the term “KLTPRC” in the search feature of this blog to pull up a number of articles that explain the problems in detail.
Labels: accountability, education, education fads
2 comments:
And the education establishment stood in unison and cheered 'go UK'! Another report that shows just how well our educators can spin data to favor the status quo: no change, no reform, no nothing, because everything is progressing quickly down the yellow brick road. Or is it incompetence?
Why don't the UK profs and KY education leaders come out in the open like big boys and girls and answer the challenges posed by Richard Innes?
Apparently he hasn't got the playbook where everyone gets along and everything is wonderful.
Good for Mr. Innes. At least one person is fighting for the kids to get a good education in this state. The first step is to look in the mirror and see the real results.
There's one. Are there two?
RE: Anonymous July 10, 2011 9:08 PM
Thank you for the nice words of support.
I wish it wasn't necessary to do the fighting, though.