Wednesday, April 15, 2009

KLTPRC Fabricating Rankings – Again

Two years ago, before the demise of our old Blog, I wrote about a dubious “Policy Notes #23” release from the Kentucky Long Term Policy Research Center (KLTPRC). The KLTPRC was making a pretty squirrely analysis of some data, trying to evaluate KERA by ranking some of Kentucky’s education data against that of other states.

One of my favorite examples from that analysis was when the KLTPRC had the audacity to include a ranking of Kentucky’s dropout rates less than half a year after those rates had been officially audited by the Kentucky Auditor of Public Accounts and found to have considerable errors.

The KLTPRC also audaciously ranked Kentucky’s ACT college entrance scores against those of all the other states. I knew this was bogus and even checked with ACT, Inc. to see what they had to say about such rankings. The ACT people actively discourage them because ACT participation rates vary dramatically from state to state.
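
To make the participation-rate problem concrete, here is a small illustrative sketch. The two “states” and all the numbers in it are entirely made up, and it assumes, for the sake of argument, that in a low-participation state it is mostly the stronger, college-bound students who sit for the ACT. Even with identical underlying student populations, the low-participation state appears to “outrank” the other.

```python
# A hypothetical illustration (made-up numbers, not real ACT data): why average
# ACT scores cannot be ranked across states when participation rates differ.
import random

random.seed(1)

def mean_score(participation_rate, n_students=10000):
    # Assume both "states" draw students from the same ability distribution,
    # mapped onto an ACT-like 1-36 scale.
    scores = [min(36, max(1, round(random.gauss(20, 5)))) for _ in range(n_students)]
    # Assumption: in a low-participation state, mostly the stronger,
    # college-bound students take the test.
    scores.sort(reverse=True)
    takers = scores[: max(1, int(n_students * participation_rate))]
    return sum(takers) / len(takers)

print("State A, 100% tested:", round(mean_score(1.00), 1))
print("State B,  20% tested:", round(mean_score(0.20), 1))
# State B looks several points "better" purely because of who takes the test --
# the selection effect that makes raw state-to-state rankings misleading.
```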

But, never mind. When you want to puff up our education system, pesky little issues like gross misuse of ACT scores and blithe ranking of known bad data don’t matter.

Well, now that the legislature has just thrown out our CATS assessment for cause, I guess we need to be told to feel good about Kentucky’s education system, because here comes the KLTPRC once again with Policy Notes #27.

Most of the same old problems still exist in this new report, right down to the misuse of the ACT and the same bogus dropout rate numbers.

This new report does use somewhat later data for some of the individual statistics (not all, though – when current data is not available, this research crowd has no problem going back in time, sometimes as far back as 2005, to come up with numbers to make us look good).

There was even a new twist. The proficiency rate for 4th grade reading on the National Assessment of Educational Progress (NAEP) wasn’t 34 percent in 2007; it was 33 percent. Is this just a typo, or did that error slip into the KLTPRC’s actual calculations? I don’t know.

I can’t even access the list of sources used in this latest report, because that link isn’t working either.

I suppose that is about par for this course.

Anyway, I wrote a pretty extensive critique of the problems I saw in that earlier KLTPRC report. Here is the latest version, written about nine months after Policy Notes #23 was released. With very little change, I could reissue it to cover Policy Notes #27, but why bother?

9 comments:

Anonymous said...

It's interesting how you are willing to cite ACT regarding the appropriate interpretation and use of ACT scores. Interesting, because you seem so unwilling to cite the NCLB office in the U.S. Department of Education or the National Assessment Governing Board regarding the appropriate interpretations and uses of state test scores and NAEP scores. Go figure.

Richard Innes said...

Anonymous 1:31, you lost me. I have frequently cited cautions from the federal authorities on NAEP about interpreting scores for states over time or across states.

What are you talking about?

Anonymous said...

What am I talking about? Well, maybe it's just that you seem to pick and choose the "cautions" that serve your agenda and to ignore guidance that does not. For example, your state vs. NAEP reports compare the percentage of students proficient or above on a state test (i.e., the AYP measure required by NCLB) to the percentage of students Proficient or above on NAEP. This in spite of the fact that the NAEP Validity Studies Panel made it clear in 2004 that “NAEP’s percent At or Above Basic is the most directly comparable statistic for confirming state AYP results."

Richard Innes said...

Anonymous, please cite that report.

Without seeing the report, I suspect what the panel might have been getting at was that the watered down way states were scoring their tests did indeed equate to NAEP Basic, not NAEP Proficient. But, that does not make NAEP Basic an appropriate target.

Consider the NAEP definitions for Basic and Proficient:

"Basic denotes partial mastery of prerequisite knowledge and skills that are fundamental for proficient work at a given grade."

"Proficient represents solid academic performance. Students reaching this level have demonstrated competency over challenging subject matter."

(From NAEP 2007 Reading Report Card)

If the NAEP Validity Studies Panel were unwise enough to claim that "partial mastery" is good enough, I think they would have a real credibility issue themselves. I suspect you are taking whatever they wrote out of context, but I'll reserve opinion until you provide a citation to the report.

Anonymous said...

[1] Here's something from the NVSP report that you want me to cite.

In selecting a gap performance measure, comparability with the AYP statistic is more important than correlation. Adequate yearly progress is already defined within the Act based on the percentage of scores exceeding the basic proficiency level. The basic proficiency level corresponds roughly to the percentage below basic on the NAEP scale. Therefore, of the various statistics that might be used for measuring a gap on the NAEP scale—proportion at or above the basic, proficient, or advanced achievement level, or mean standardized score—the proportion at or above the basic achievement level will both have the greatest correlation with the adequate yearly progress statistic and also be the most directly comparable. Since gaps and AYP measure different performance objectives (equality vs. absolute improvement), it follows that using the same basic statistic to measure each would simplify both interpretation and the presentation of results. [Page 12]

Mosquin, P., and Chromy, J. (2004). Federal sample sizes for confirmation of state tests in the No Child Left Behind Act. Washington, D.C.: American Institutes for Research, NAEP Validity Studies Panel. Retrieved April 24, 2009, from: http://www.air.org/publications/documents/MosquinChromy_AIR1.pdf

[2] The panel recognized that states must implement testing programs and standards to meet AYP as stipulated by NCLB. I wonder if you do. A state had to earn a mark of "YES" on the peer review team's checklist as to whether “The proficient achievement level represents the attainment of grade-level expectations for that academic content area.” If the answer was not YES, the state faced NCLB financial sanctions. Period.

[3] The policy definition you quote notes that Basic denotes partial mastery of prerequisite knowledge and skills that are fundamental for proficient work at each grade. Language from the framework for the 2007 reading assessment clarifies “partial mastery of prerequisite knowledge and skills” for Basic at the fourth grade: Fourth-grade students performing at the Basic level should demonstrate an understanding of the overall meaning of what they read. When reading text appropriate for fourth graders, they should be able to make relatively obvious connections between the text and their own experiences and extend the ideas in the text by making simple inferences.

Language from the framework for the NAEP 2009 reading assessment clarifies "mastery of challenging content" for Proficient. “Proficient readers,” it says, “will have sizeable meaning vocabularies, including knowledge of many words and terms above grade level.”

Clearly, this language from the NAEP reading frameworks indicates that NAEP Basic represents an estimate of “grade-level expectations,” and that NAEP Proficient demands some above-grade-level knowledge and skills.

For sure, NCLB requires a state to define proficient as meeting grade-level expectations on state content. There can be no doubt that using a state’s NAEP Proficient score to confirm a state’s NCLB proficient score would most likely result in mistaken and misleading conclusions.

[4] I know you have never been impressed by what the National Assessment Governing Board has published about its test and about the achievement levels it established a decade before NCLB became law, but let's try again:

Notice that there is no mention of “at grade level” performance in these achievement goals. In particular, it is important to understand clearly that the Proficient achievement level does not refer to “at grade” performance. Nor is performance at the Proficient level synonymous with “proficiency” in the subject. That is, students who may be considered proficient in a subject, given the common usage of the term, might not satisfy the requirements for performance at the NAEP achievement level. Further, Basic achievement is more than minimal competency. Basic achievement is less than mastery but more than the lowest level of performance on NAEP. Finally, even the best students you know may not meet the requirements for Advanced performance on NAEP.

Loomis, S.C., and Bourque, M.L. (Eds.) (2001). National Assessment of Educational Progress achievement levels 1992-1998 for reading. Washington, D.C.: National Assessment Governing Board. Retrieved April 24, 2009, from http://www.nagb.org/publications/readingbook.pdf

[5] Basic denotes partial mastery of prerequisite knowledge and skills that are fundamental for proficient work. Or, in other words: a grade of C- to B denotes partial mastery of prerequisite knowledge and skills that are fundamental for B+ to A work.

Richard Innes said...

OK, now I know which reports Anonymous is talking about.

Let’s take the Mosquin and Chromy report first. That was item [1] in the comment.

This report was titled “Federal sample sizes for confirmation of state tests in the No Child Left Behind Act.” As the title implies, the key issue is the adequacy of NAEP sample sizes for confirming the trends in the AYP gaps, say between Blacks and Whites, on state tests. The report does not look at the validity of comparing what states call Proficient to what NAEP calls Proficient.

I don’t think Anonymous really understands what the Mosquin and Chromy report is about. Mosquin and Chromy’s point that Basic, rather than Proficient, correlates better with what state tests count as meeting their definition of AYP appears to have some merit, but those authors do not claim that NAEP Basic is the right target for states to use. In essence, they just confirm that NAEP Basic is pretty close to what most states are using for their watered-down definition of “Proficient” for NCLB purposes.

Another note: Anonymous quotes Mosquin and Chromy saying, “Adequate yearly progress is already defined within the Act based on the percentage of scores exceeding the basic proficiency level.” That sentence mixes two key words, “basic” and “proficient,” in a misleading way. The NCLB act does not call for states to meet a “basic proficiency” level. In fact, the term “basic proficiency” does not appear anywhere in the act. NCLB calls for states to meet the “Proficient” level, period.

One final point: in Anonymous’ first comment, he or she claims that I am “unwilling to cite the NCLB office in the U.S. Department of Education or the National Assessment Governing Board regarding the appropriate interpretations and uses of state test scores and NAEP scores.”

Now, it turns out the evidence Anonymous cites comes from a different organization, one that starts out in its report making it clear that the opinions therein have not necessarily been approved by either NAGB or NCES.

Regarding item [2] in the latest Anonymous comment, the entire peer review process for NCLB has been severely criticized because of the actual state programs that resulted. There was no standardization, not even in obvious areas like a common formula for computing graduation rates. Furthermore, many organizations, both liberal and conservative in orientation, have roundly criticized not just the low level of performance that many states adopted as Proficient-caliber work, but also the variation in those performance levels.

To be fair, NCLB was new territory when those standards reviews took place, and the US Department of Education did a poor job of leading the effort (witness the chaos in the graduation rate reporting). There is now evidence that the department is learning from those earlier missteps, and states are being told that more stringent rules are coming. For example, states currently face a mandate to get on board with honest, and uniform, graduation rate reporting before the 2010-2011 school year closes.

Item number [3] in Anonymous’ reply has two problems. First, the comments made by Anonymous actually undermine his or her case dramatically. If the frameworks really had been revised as severely over such a short period as Anonymous indicates, the stability of the NAEP itself would be under serious threat.

Second, I don’t think Anonymous summarizes the 2009 Reading Framework correctly. Pages 43 to 47 of this document (http://www.nagb.org/publications/frameworks/reading09.pdf) need to be consulted to see what the real difference between Basic and Proficient is supposed to be. Here are a few points:

The document repeats the basic definitions for NAEP Basic and Proficient that I quoted in an earlier comment. It then amplifies this for fourth grade as follows:

“Grade 4 students at the Basic level should be able to: • Find the topic sentence or main idea. • Identify supporting details. • Identify author’s explicitly stated purpose. • Make simple inferences.”

“Grade 4 students at the Proficient level should be able to: • Identify author’s implicitly stated purpose. • Summarize major ideas. • Find evidence in support of an argument. • Distinguish between fact and opinion. • Draw conclusions.”

I would argue that the Basic level requirements are pretty rudimentary.

Things get more obvious when we look at the additional comments about vocabulary for the different NAEP Achievement Levels.

For NAEP Basic, the framework says:

“Readers at the Basic level will generally have limited concrete vocabularies that consist primarily of words at and below grade level. Knowledge of these words will be limited to the most familiar definition, making it difficult to identify the appropriate meaning of a word among the distractors.”

Limited vocabulary for the grade level is not a suitable target, period.

So, in fact, the 2009 NAEP Frameworks don’t say what Anonymous would like us to believe.

That is probably enough on this subject. If Anonymous wants Kentucky to aim at low standards that won’t get kids prepared for what they need next, I think he or she has very little company. And, I still have not seen anyone at NAGB, NCES, or the US Department of Education, or most independent research organizations, actually claim that NAEP Basic is without question the most suitable target for state education systems.

Anonymous said...

I don’t know where to start, maybe with three points regarding Mosquin and Cromy.

[1] You say, “Now, it turns out the evidence Anonymous cites comes from a different organization, one that starts out in its report making it clear that the opinions therein have not necessarily been approved by either NAGB or NCES.” True, but not the whole truth. The year after receiving the NVSP report, NAGB and NCES prepared and published reports for the NAEP 2005 assessment that -- for the first time ever -- placed focus on the “Percent At or Above Basic” statistic. This significant change in long-standing NAEP reporting practices clearly shows that NCES and NAGB understood, accepted, and implemented Mosquin and Cromby’s findings. This is just one more example of a great trait shared by NAGB and NCES. They both give heed to findings from sound research.

[2] You seem confused by the term basic proficiency level. The term refers to the level of achievement addressed in NCLB. It refers to the commonly used definition of proficiency that is associated with achieving grade-level expectations, not to the special definition that NAGB stipulated a decade before NCLB was passed. As NAGB explained in 2001, it is “important to understand clearly that the Proficient achievement level does not refer to ‘at grade’ performance. Nor is performance at the Proficient level synonymous with ‘proficiency’ in the subject. That is, students who may be considered proficient in a subject, given the common usage of the term, might not satisfy the requirements for performance at the NAEP achievement level.”

[3] It is true that Mosquin and Cromy did not argue that “NAEP Basic is the right target for states to use.” They did argue, however, that it is the correct target for states to use for confirming AYP results. The fact is that NCLB and U.S. Department of Education guidance do require states to focus on achievement as defined in NAEP Basic. “Right” is a policy issue; “correct” is a technical issue. You are welcome to argue about the “rightness” of NCLB and Dept. of Education requirements placed on states (you might even find me in agreement with some of your arguments), but you are so out-of-bounds when you condemn your state or any other state for following a “correct” course of action as mandated by law.

While many of your "claims" still need to be addressed, I have time remaining only for two brief comments about the reading framework and NCLB expectations:

[4] Text below from the reading framework seems to suggest that NAEP Basic students might be expected to receive grades in classroom work no lower than C-/C. It’s not going to happen, but it would be a great thing if ALL grade 4 students could perform at least at the NAEP Basic level by 2014:

“Grade 4 students at the Basic level should be able to: • Find the topic sentence or main idea. • Identify supporting details. • Identify author’s explicitly stated purpose. • Make simple inferences.”

“Readers at the Basic level will generally have limited concrete vocabularies that consist primarily of words at and below grade level. Knowledge of these words will be limited to the most familiar definition, making it difficult to identify the appropriate meaning of a word among the distractors.”

[5] Text below from the reading framework seems to suggest that NAEP Proficient students might be expected to receive classroom grades no lower than B+/A-. It strains rationality to claim that the intent of NCLB was to have ALL grade 4 students performing above grade-level expectations by 2014:

“Grade 4 students at the Proficient level should be able to: • Identify author’s implicitly stated purpose. • Summarize major ideas. • Find evidence in support of an argument. • Distinguish between fact and opinion. • Draw conclusions.”

“Proficient readers will have sizable meaning vocabularies including knowledge of many words and terms above grade level. They will also have greater depth of knowledge of words (beyond the most common meaning). Proficient readers will be flexible with word meanings and able to extend the senses of words whose meanings they know in order to appropriately fit different contexts and understand passage meaning.”

As NAGB has said, some of the best students you know will not score NAEP Advanced. This means that some of the best students you know will score "only" NAEP Proficient. I don't think NCLB requires every student in America to achieve at the same level as "some of the best students you know."

# # #

Anonymous said...

My apology to James Chromy for the incorrect spelling of his name.

Richard Innes said...

All of Anonymous' lengthy comments aside, his argument still comes down to a defense of NAEP "Basic or Above" as an appropriate NCLB target. I have now assembled more evidence in an April 25th main Blog item that strongly challenges that position.

Find that here: http://bluegrasspolicy-blog.blogspot.com/2009/04/which-naep-achievement-level-is.html.