Wednesday, August 17, 2011

Southern charter school state handily beats Kentucky in new ACT test results

The new ACT results are out, and I posted a blog entry earlier today with the news that Kentucky did edge up slightly in the rankings among those states where ACT testing is essentially universal.

But a really big story concerns how Kentucky stacked up against the hurricane-wracked schools in Louisiana. Louisiana rebuilt its post-Katrina school system largely by harnessing the power and flexibility of charter schools to move ahead much more quickly than traditional, regulation- and union-constrained public schools ever could.

Here is our comparison graph again of states where ACT testing is virtually or totally universal.

[Graph: 2011 ACT Composite Scores for states where ACT testing is virtually or totally universal]

Note that Louisiana, one of the nation’s strongest charter school states, and also a Southern state, outscored Kentucky by 0.6 point for the 2011 high school graduates’ ACT Composite Score. Here is a further breakdown of ACT Composite Scores by race for the two states:

[Graph: 2011 ACT Composite Scores by race for Kentucky and Louisiana]

In particular, notice that blacks in Louisiana outscored Kentucky’s blacks by a full point on the ACT Composite Score. That is a notable difference.

However, the situation for whites is even more striking. Kentucky’s whites languished behind Louisiana’s whites by a whopping 1.6 points! That is a very large difference in state-to-state comparisons.

Keep in mind, both states tested all of their graduates, so these comparisons are quite appropriate.

When you consider the challenges Louisiana has faced, this is clearly rather remarkable performance – for them. Good job, Louisiana!

The situation shows that Kentucky could make stronger educational gains, faster, if it would finally establish a good charter school program.

Data sources: ACT Profile Reports for Kentucky and Louisiana, online here.

22 comments:

Anonymous said...

For school year 2006-07 the average ACT score for Louisiana was 20.1 (100% tested), and for school year 2010-11 the average ACT score was 20.2. Kentucky is making slightly more improvement than Louisiana. Charter schools do not appear to be having the impact that you are reporting. Prior to Katrina, when students were self-selecting for the ACT, there were thousands more students taking the ACT in Louisiana. By non-research-based reports, many of the students that did not return to the state were poor kids. You need to report facts, not your bias opinions.

Anonymous said...

Your conclusions are WRONG! Looking at the ACT data for the last 10 years, the improvement in LA ACT scores is due to the post-Katrina diaspora; N.O. lost 30% of its population according to the 2010 census. Along with KY's 100% participation in ACT testing in 2009, the data correlates with the resulting scores. That is, LA lost ~10% of their poor students* and gained 0.3 on the ACT scores; KY tested the 30% of their poor students and lost 1.0 on the ACT scores. The LA ACT scores post-Katrina are trending flat and show no help from charters, and a CREDO report on LA charters says that kids in charters are hurt by long-term attendance, what they call a regression toward the mean. Charter schools bring the poor students up a bit, but bore the bejesus out of the smart kids. You're guilty of confirmation bias, looking for data to fit your own prejudice. (*Poor students as in poor academically, though free or reduced-price lunch rates can correlate with a school's academic performance.)

Richard Innes said...

RE: Anonymous August 18, 2011 9:32 AM

Sorry, but you got your facts completely wrong.

Louisiana only tested 79% of its graduates in 2007 and did score a 20.1. In the same year, Kentucky tested a virtually identical 77% of its graduates and got a notably higher 20.7.

In 2011, Kentucky and Louisiana both tested 100% of their graduates, but Kentucky's ACT Composite Score was only 19.6 and Louisiana's was 20.2.

Due to the very close percentages tested in both years, it probably isn't unfair to conclude that Louisiana went from 0.6 points behind Kentucky to 0.6 points ahead of Kentucky, a remarkable improvement on this 36 point assessment.
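
For readers who want to check the arithmetic, here is a minimal sketch in Python using only the Composite scores and participation rates quoted above; it is an illustration of the comparison, not an extract of the full ACT files.

```python
# Minimal sketch: the Kentucky-vs-Louisiana ACT Composite gap in the two years
# when both states tested a similar share of graduates (figures quoted above).
data = {
    2007: {"KY": (20.7, 77), "LA": (20.1, 79)},    # (Composite, % of graduates tested)
    2011: {"KY": (19.6, 100), "LA": (20.2, 100)},
}

for year, states in data.items():
    ky_score, ky_pct = states["KY"]
    la_score, la_pct = states["LA"]
    gap = round(ky_score - la_score, 1)  # positive means Kentucky ahead
    print(f"{year}: KY {ky_score} ({ky_pct}%), LA {la_score} ({la_pct}%), KY minus LA = {gap:+.1f}")

# Expected output:
# 2007: KY 20.7 (77%), LA 20.1 (79%), KY minus LA = +0.6
# 2011: KY 19.6 (100%), LA 20.2 (100%), KY minus LA = -0.6
```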

The non-rigorous reporting on poverty you cite is also interesting, and wrong.

The NAEP offers good insight into the real poverty situation in Louisiana. It allows us to determine the percentage of students in poverty in the grade 8 reading assessments of 2003 (the most recent prior to Katrina) and 2009 (the most recent available).

In 2003, free and reduced-cost lunch eligible students in Louisiana amounted to 50 percent of all the students there.

In 2009, the poverty rate based on lunch eligibility rose significantly to 62 percent.

Your anecdotal reports do not jibe with facts from NAEP. There is more poverty in Louisiana today than pre-Katrina.

All of this just adds more weight to our very good case that Louisiana is making notable progress against Kentucky, and charter schools certainly seem a likely part of the explanation.

I think most of our readers will be able to see where the bias in this issue lies. We do work in facts, not "non-research" reports that are often highly biased. I'd watch out for those in the future.

By the way, in the interests of education, the word, the way you are using it, should be "biased," not "bias."

Anonymous said...

Here are the facts:
Composite
2006-07 20.1
2007-08 20.3
2008-09 20.1
2009-10 20.1
2010-11 20.2

Richard Innes said...

RE: Anonymous August 18, 2011 9:55 AM

(Note: The anonymous post may be an extension of the post from Anonymous August 18, 2011 9:32 AM)

First, please read my comments in response to Anonymous August 18, 2011 9:32 AM above.

To reiterate the key points, NAEP data shows Louisiana's (LA) school system has notably more poverty now than pre-Katrina. Do you really think you can make a case that only the smart among those poor came back to LA after Katrina?

Also, comparing the 2007 and 2011 ACT data, as the other anonymous commenter chose to do, there actually is a very good case in the real data (which does not show Louisiana testing 100% of its graduates on the ACT in 2007, as anonymous incorrectly claims) that LA made some pretty significant progress, relative to Kentucky, from two years after Katrina to the present.

At least in comparison to Kentucky, Louisiana is most definitely not trending flat in the post-Katrina period. They came from well behind us (0.6 point behind in 2007) to well in front (0.6 point ahead by 2011).

I must point out that your difficult-to-follow discussion isn't very clear in some areas, but you do claim that LA scores post-Katrina are trending flat. Sadly, you forgot to look at the participation rates, which are an integral and important part of any scores analysis with the ACT.

As recently as 2009, LA only tested 89 percent of its graduates and scored a 20.1. Two years later, the state tested 100% and the scores WENT UP to 20.2. To increase both participation and scores is extremely difficult.

Kentucky didn't do that. In fact, between 2008 and 2009, the first year 100% of Kentucky's grads took the ACT, the scores dropped not by 1.0 point, but by 1.5 points. On the 36-point ACT, that is a notable difference.

I really don't know where you are getting your data, but it can't be from the ACT files.

Next, let's talk about the CREDO report on Louisiana. On page 4 that report says:

"The results suggest that new charter school students receive no significant impact on learning in reading or math compared to their counterparts in traditional public schools. In subsequent years, charter school students have an initial gain in both reading and math from charter school attendance compared to their counterparts in traditional public schools and this impact stays positive and significant through the fourth year of attendance and beyond."

Please tell me how you can possibly construe that to mean that students who attend charters for longer periods of time in LA charters are harmed by the experience.

In fact, if you actually read the CREDO report on LA (available here: http://credo.stanford.edu/reports/LA_CHARTER%20SCHOOL%20REPORT_CREDO_2009.pdf), it overwhelmingly rates LA charters as better performers in virtually every area examined.

So, let's talk a bit about biases.

You misrepresented findings in CREDO, got numbers and trends in numbers wrong and failed to consider changes in participation rates along with changes in scores.

If you had a more open attitude about charters, I think you would start to see there is a lot of strength in them, strength Kentucky badly needs.

But, keep your questions coming. The LA versus Kentucky trend from 2007 to 2011 is really rather remarkable, and it will make a great main blog post.

Anonymous said...

You need to know that there is more than one person signing in as anonymous.

You are just wrong. Period. And your 5 readers should be aware of it.

Logan said...

I'll trust Mr. Innes' data. He has a proven record of accuracy regarding education and testing data.

He is also not afraid to identify himself in the conversation.

Richard Innes said...

Re: Anonymous August 18, 2011 12:19 PM

Here are the COMPLETE FACTS:

Composite
2006-07 20.1 with 79% Taking
2007-08 20.3 with 88% Taking
2008-09 20.1 with 89% Taking
2009-10 20.1 with 98% Taking
2010-11 20.2 with 100% Taking

Are you really trying to claim this is flat performance when participation in Louisiana rose 21 percentage points during this period?

Come on!
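
For anyone who wants to see the two trends side by side, here is a minimal sketch that tabulates the Louisiana figures from the table above with their year-over-year changes; the numbers are only the ones cited in this thread.

```python
# Sketch: Louisiana ACT Composite and participation, with year-over-year changes,
# using only the figures from the table above.
louisiana = [  # (graduating class, ACT Composite, % of graduates tested)
    (2007, 20.1, 79),
    (2008, 20.3, 88),
    (2009, 20.1, 89),
    (2010, 20.1, 98),
    (2011, 20.2, 100),
]

for (y0, s0, p0), (y1, s1, p1) in zip(louisiana, louisiana[1:]):
    print(f"{y0} -> {y1}: score {s1 - s0:+.1f}, participation {p1 - p0:+d} points")

first, last = louisiana[0], louisiana[-1]
print(f"2007 -> 2011 overall: score {last[1] - first[1]:+.1f}, "
      f"participation {last[2] - first[2]:+d} points")
```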

Richard Innes said...

RE: Anonymous August 18, 2011 12:41 PM

Sorry, but "You are just wrong, period," won't work with the vast majority of our readers. They are used to hearing that sort of argument from their kids.

At BIPPS we deal in factual, data-driven discussions.

Please do keep reading us, however. That way, you will learn how to dig hard for good data and then use that information to develop the sort of understanding and insight that almost all our readers do appreciate and value.

Anonymous said...

It's cherry picking time, all the time, on this blog.

Richard Innes said...

RE: Anonymous August 18, 2011 5:24 PM

At BIPPS we like factual, data-driven discussions.

You have offered some arguments against progress in Louisiana, and I have shown how they are all incorrect.

How is that cherry picking?

Ranting about imaginary biases won't impress many of our readers.

To succeed here, you have to come up with some solid, fact-based arguments that hold water. Stomping off while muttering about supposed cherry picking that isn't actually happening won't do it.

Richard Innes said...

RE: Anonymous post 20 Aug 11 at 1:42 PM

I don't know why this post isn't showing up in Blogger. I got a copy from the Blogger mail alerting service, but it is missing here.

If you want to pursue this, please repost. I do have answers for you.

Richard Innes said...

RE: Anonymous post 20 Aug 11 at 1:42 PM and several subsequent attempts.

Something in your posts is not being accepted by Blogger. I separately tried to upload your interesting 4-part commentary and Part 1 would not load for me, either.

Not sure if this is a length issue or a problem with the use of the hyperlink to your tables, but I cleared out parts of the message that did post because they are confusing without the rest.

I'll try to break your message out into more parts and do some other things to see if I can finally get Blogger to accept this.

Richard Innes said...

Here is the link to the table discussed in the Anonymous posts below. It will be followed by Anonymous' four-part message (I hope).

http://bit.ly/qynBc9

Richard Innes said...

Post 1 (Reposted by Innes Due to Blogger Problem with Anonymous’ Inputs):
FYI: Post 2 is the one that begins with “In my table from the above link . . .”

First off, it is confirmation bias, a term from cognitive science; Google it.

Here is the data* I'm using, same as yours: (Find Link to Table in preceding message from Richard Innes)

My table is the ACT website data, with the population tested in parentheses. I'm not only looking at the years 2011 and 2007; I'm looking at all the years.

I am arguing that your reported gain of LA over KY in 2011 is totally due to the 100% participation of KY students beginning in 2009: a 28-percentage-point swing in the testing population, from 72% to 100%, with a concomitant drop of 1.5 in the composite ACT score.

The LA ACT scores are trending flat; you argue that they are actually trending up because participation is going up, and those now taking the ACT are those most likely to be academically challenged, which would tend to pull the average score down. You attribute this positive trend to charters.

If I argue that KY's drop in scores in 2009 is due to increased participation sans charters, then you can argue that LA's flat trend with increased participation is due to charters. As we can see, the post-Katrina exodus of 2005-2006 saw an 11% drop in participation and a 0.1 rise in raw score.** So we are BOTH saying participation up/scores down and participation down/scores up. Capisce!
(Continued in next post)

Richard Innes said...

Post 2 (Reposted by Innes Due to Blogger Problem with Anonymous’ Inputs):
(Continued from previous post)
In my table from the above link, the difference between the KY and LA scores from 2008 to 2009 is a 1.3 swing, from a negative to a positive measure in LA's favor (see the Diff. 2009-2008 column). LA's participation in those years is 88% in 2008 and 89% in 2009; the score went down 0.2 with 1% more participation, so I'm calling those tested student populations same-same. The KY tested populations between 2008 and 2009 are NOT the same.

As a scientist, I call that 1.3 swing in the KY-LA comparison scores between 2008 and 2009 a BLIP in the data. I'm trained to zero in on BLIPS and then distinguish them from noise. I would ask myself what caused the BLIP, and whether the BLIP correlates with other measurements I have at hand. We have both argued that participation has a negative correlation with ACT scores. I'm arguing the BLIP is the result of KY testing all our students, and that it colors the rest of the comparison between the KY and LA test scores.

All of KY's loss versus LA happened in a one-year period, the same year that the LA testing population was essentially the same. From this you would be arguing that charter schools showed the bulk of their demonstrated gains in that one-year period. And your argument also follows that LA charters made some of their gains in 2009-2010, because participation rose 9% while the score stayed the same.
(Continued in next post)

Richard Innes said...

Post 3 (Reposted by Innes Due to Blogger Problem with Anonymous’ Inputs):
(Continued from previous post)
Education improvement is considered to be a slow, incremental process. Any pulses in the test scores in high-stakes testing are considered to be an indication of cheating. I'd be calling the Louisiana Attorney General if I saw a true 1.3 swing from year to year in the LA ACT test scores. But that is not what we see; what we see is a 1.3 swing against the KY ACT test scores. Again, as a scientist, I would ask myself: what are the LA test scores and what are the KY test scores in the years in question? Then I would ask myself what changed in those years. Answer: the KY test scores, because of the change in testing population, not some LA charter miracle.

From my table (see link in the earlier message from Richard Innes), I'm confident that most people will see the BLIP in the 2008-2009 KY ACT test score data and agree with my conclusion. Now, going forward with 100% testing in both KY and LA, if we see the test scores diverge, i.e., LA scores get better and KY scores essentially flatline in our charter-poor Kentucky Home, then you can fire up the press releases, blog the blogosphere, Like on Facebook, and Tweet to your heart's content about the LA charter revolution. Today, not so much.

*Data source: http://www.act.org/news/data.html

**Raw Score is the state score minus the ACT national score. Since the national score changes from year to year, to normalize year-to-year comparisons of a single state's score you have to remove the variability of the national score from the state score. If the ACT national score were the same from year to year, you could compare state scores without the math.
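
To make the footnote's normalization concrete, here is a minimal sketch; the Louisiana scores come from the table posted earlier in this thread, while the national Composite values are hypothetical placeholders, since the national figures are not quoted here.

```python
# Illustration of the "Raw Score" normalization described in the footnote:
#   normalized state score = state Composite - national Composite for that year.
# NOTE: the national values below are hypothetical placeholders, not actual ACT figures.
state_composite = {2010: 20.1, 2011: 20.2}     # Louisiana, from the table earlier in this thread
national_composite = {2010: 21.0, 2011: 21.1}  # hypothetical national values, for illustration only

for year in sorted(state_composite):
    normalized = state_composite[year] - national_composite[year]
    print(f"{year}: {state_composite[year]} - {national_composite[year]} = {normalized:+.2f}")
```
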
(Continued in next post)

Richard Innes said...

Post 4 (Reposted by Innes Due to Blogger Problem with Anonymous’ Inputs):
(Continued from previous post)
Robert Innes quote: "Due to the very close percentages tested in both years, it probably isn't unfair to conclude that Louisiana went from 0.6 points behind Kentucky to 0.6 points ahead of Kentucky, a remarkable improvement on this 36 point assessment."

What tortured double negation is that? Try "isn't fair" or "is unfair," or were you trying to say "is fair" with that construction? I think it would be fair to say, as I have above, that a 1.3 swing in scores in a one year period would be cause to call the forensic accountants in Louisiana. With governors like Huey Long, Earl Long and Edwin Edwards, I'm sure they've got a few on staff.

Oddly enough, according to the state-specific report on Louisiana charters on the CREDO website, students that stay in charters perform worse than students that stay in TPS. From page 12 of the study, the following: "At the same time, the analysis showed they performed significantly worse with the following groups of students: Reading - Retained Students." That's where I got the idea that students that stay longer in charters do worse than kids in traditional public schools. RIF: Reading Is Fundamental.

And yep, I've got an obvious bias against charter schools and you've got an obvious bias for charter schools, even steven I'd say. Next time up, I might talk about selection bias, Google it! I wouldn't want any "selection biased" awkwardness between us. ;-)

Richard Innes said...

For readers who have chosen to follow us this far, the four-part post above was submitted by an anonymous individual. He or she was unable to get Blogger to accept it, probably because of the link to the table, which for unknown reasons Blogger would not accept from the commenter but finally did accept from me. I'll respond to this after Anonymous gets a chance to check it for any errors that this convoluted posting exercise might have created.

Richard Innes said...

OK, “Anonymous” has not made any corrections to his/her four-part post, so I will proceed with the understanding that it posted without any errors.

Part 1 of my reply

Anonymous’ discussion is laced with a misunderstanding of ‘Real Performance Changes’ on the ACT versus ‘Score Increases.’ The problem is that any assessment of real performance changes on the ACT MUST consider both changes in scores AND changes in participation rates.

For example, while Kentucky’s ACT Composite Score dropped rather dramatically between 2008 and 2009 when 100% testing started, the state's overall ACT performance actually may have made a small improvement. In 2008, only 11,113 Kentucky graduates met the ACT Benchmark Score for math, while in 2009, 11,938 students did. There is no way to know for sure whether performance did increase, however, because the sample of graduates tested in 2008 was dramatically different from the group tested in 2009. The 2008 cohort was not a random sample and probably consisted disproportionately of better-performing students.
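
To illustrate the point about changing samples, here is a small, purely hypothetical sketch of how moving from a self-selected testing group to universal testing can raise the number of students clearing a benchmark even while the average score falls; none of the scores or cohort sizes below come from actual ACT data.

```python
# Purely hypothetical illustration: expanding the tested group can raise the count
# of students meeting a benchmark while lowering the average score.
self_selected = [24, 22, 20, 19, 18]               # hypothetical scores of volunteers under partial testing
added_under_100pct = [21, 20, 17, 16, 15, 14, 13]  # hypothetical scores of newly tested students

BENCHMARK = 19  # hypothetical benchmark score

def summarize(label, scores):
    meeting = sum(1 for s in scores if s >= BENCHMARK)
    print(f"{label}: average {sum(scores) / len(scores):.1f}, meeting benchmark: {meeting}")

summarize("Self-selected testing", self_selected)
summarize("Universal testing", self_selected + added_under_100pct)

# The average falls (20.6 -> 18.2) while the benchmark count rises (4 -> 6),
# so scores alone cannot tell us whether overall performance improved.
```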

This is really very simple – comparative state performance on the ACT cannot be judged by scores alone unless both states test similar percentages of students (and even then, when the percentage is less than 100%, there could be biases in the two states' results that put one of those states at a disadvantage).

I thought Anonymous had figured that out when he/she talked in Part 1 about both scores and participation rates as important.

But, that logic doesn’t hold in Anonymous’ discussion of the more recent data for the years when Kentucky had switched to 100% testing of all graduates.

Anonymous tries to create an image that all of the Kentucky vs. Louisiana ACT performance change occurred between 2008 and 2009. That's not true. The scores don't tell the whole story.

In fact, between 2009 and 2011 Louisiana’s test scores went up by 0.1 point while participation also rose notably from 89% to 100%. This is a period of scores up, participation up, to use the terminology Anonymous used in Part 1. That is clearly a further increase in overall performance for Louisiana, which conflicts with Anonymous’ assertion in Part 2 that all of Kentucky’s loss versus Louisiana occurred in just one year. That assertion is clearly incorrect.

The problem is that no one knows how to weight the relative importance of changes in scores versus changes in participation rates to develop a solid, numerical description of ACT "performance" instead of looking only at the score changes. That means the only reasonably close comparisons we can make between Kentucky's and Louisiana's true ACT performance occur in those years when the participation rates are relatively close. For the period from 2003 to 2011, those years are 2006, 2007 and 2011. In 2006 Louisiana scored 0.5 point lower than Kentucky, in 2007 it changed a bit to 0.6 point lower, and when we again got a fair comparison in 2011, things had flip-flopped dramatically, with Louisiana scoring 0.6 point ahead of us.

All the attempts to compare Kentucky and Louisiana across the 2008 to 2009 period are seriously undermined by the very simple fact that in those years the participation rates in both states varied dramatically, hopelessly confounding any simplistic scores comparison attempts.
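
One way to formalize the rule of comparing only years with similar participation rates is sketched below; the tolerance is an arbitrary illustrative choice, and only 2007 and 2011 are included because those are the years with both scores and participation rates quoted in this thread.

```python
# Sketch of the comparison rule described above: treat the KY-vs-LA score gap as
# meaningful only in years when both states tested a similar share of graduates.
records = {  # year: {state: (ACT Composite, % of graduates tested)}, figures quoted in this thread
    2007: {"KY": (20.7, 77), "LA": (20.1, 79)},
    2011: {"KY": (19.6, 100), "LA": (20.2, 100)},
}

TOLERANCE = 5  # max difference in participation, in percentage points (arbitrary choice)

for year in sorted(records):
    ky_score, ky_pct = records[year]["KY"]
    la_score, la_pct = records[year]["LA"]
    if abs(ky_pct - la_pct) <= TOLERANCE:
        print(f"{year}: comparable ({ky_pct}% vs {la_pct}%), KY minus LA = {ky_score - la_score:+.1f}")
    else:
        print(f"{year}: participation too different ({ky_pct}% vs {la_pct}%) to compare scores")
```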

Richard Innes said...

Part 2 of my reply

By the way, Anonymous engages in a truly off-target discussion by saying that rapid changes in scores are considered evidence of cheating. There is no evidence of widespread cheating here (there WAS cheating in Kentucky in Perry County in 2010, but the number of students involved was too small to swing statewide scores). The actual changes in scores over time for both states, except for Kentucky's one-year DROP (which clearly would not be evidence of cheating), are incremental. There is no reason to suspect cheating in the actual scores. The big relative change in scores over time between Kentucky and Louisiana clearly came from other factors. Even mentioning cheating in this discussion is nonsense.

In any event, Anonymous' entire discussion relies too heavily, for the post-2008 period, on a scores-only examination. He/she ignores the equally important issue of participation. I expect newspaper reporters to make those sorts of mistakes. I don't think "scientists" should do so (of course, I've not met any real scientists who don't sign their names to their work, either; that's also a part of doing real science).

I’ll add a few more comments in a following post about Part 4 of Anonymous’ discussion.

Richard Innes said...

Part 3 of my reply

This refers to Part 4 of Anonymous’ post.

I accept the criticism about the double negative. Like many in blogging, I don’t always check everything with the same degree of attention that I use in more formal writing. I guess that goes as well for those who think my name is “Robert.” It’s “Richard.”

Anyway, to revisit what I said above, your attempt in part 4 of your post to relate a cheating claim to what actually happened to ACT results for Kentucky and Louisiana is ridiculous. The fact that you keep harping on this is, however, very good evidence of your strong bias against charter schools, which you finally admit. That bias isn’t scientific, is it?

That bias is also reflected in your continued cherry picking of the extensive overall findings in CREDO’s Louisiana report. Did you even read the full report?

The CREDO Louisiana report’s summary says:

“With the students they have enrolled, Louisiana charter schools provide significantly better results for the following groups of students:

Reading
All Students
Students enrolled for 2 years
Students enrolled for 3 years
Students enrolled for 4 or more years
Blacks
Students in poverty
Students in the lowest starting deciles
Students in the highest starting deciles

Math
All Students
Students enrolled for 2 years
Students enrolled for 3 years
Students enrolled for 4 or more years
Blacks
Hispanics
Students in poverty
Students in the lowest starting deciles
Students in the highest starting deciles


At the same time, the analysis showed they performed significantly worse with the following groups of students:

Reading
Retained students

For the remaining groups in the analysis, there was no discernable difference between charter school and traditional public school performance.”


Does that sound like CREDO overall found poor performance in Louisiana charters?

Interested readers can see for themselves and decide whether Anonymous is fairly characterizing the overall sense of CREDO's report or is just cherry picking.

http://credo.stanford.edu/reports/LA_CHARTER%20SCHOOL%20REPORT_CREDO_2009.pdf