Monday, December 21, 2009

Would you call this “Substantial” progress?

Over at the Prichard Committee’s blog, they are waxing ecstatic about a new report that Prichard head Bob Sexton and Prichard number cruncher Susan Weston just released called, "Substantial and Yet Not Sufficient: Kentucky's Effort to Build Proficiency for Each and Every Child."

I think they only got the title half right.

This graph shows the latest available eighth grade results for Kentucky from the National Assessment of Educational Progress.


After almost two decades of KERA, I don’t see how anyone can be very happy about the fact that less than one in three of our eighth grade kids is proficient in any of these subjects, while in the critical areas of reading, writing and “rithmetic,” the proportion is closer to only one in four students.

“Substantial” progress – Hardly!

Even if we had started at zero back in 1990 (and we didn’t start that low), the proficiency rate growth is so slow that we are decades away from seeing rates that I would call “substantial” in Kentucky’s NAEP results.

Not sufficient – You Bet!

12 comments:

Anonymous said...

NAEP does suggest that Kentucky students are matching their national peers in the percentage of eighth graders who are meeting grade-level expectations. NAEP's Kentucky vs. national percentages of eighth grade students at or above grade-level expectations were 65 vs. 67 in 2003, 64 vs. 68 in 2005, 69 vs. 70 in 2007, and 70 vs. 71 in 2009.

Anonymous said...

Sorry, the statistics above are for eighth grade mathematics.

Richard Innes said...

RE: Anonymous Dec 21 at 3:21PM

Sorry, Anonymous, but NAEP does not show what you allege.

Very simply, the NAEP does not report scores in an "at or above grade level" format.

The numbers you cite are for kids who only obtained partial mastery of the NAEP subject material. The official classification for this less-than-desired performance level is the percent of students achieving "At or Above Basic." This is not a suitable target.

NAEP reports say, "Basic denotes partial mastery of prerequisite knowledge and skills that are fundamental for proficient work at each grade."

That, most definitely, is not an "at or above grade level" expectation. It's only partial mastery of just basic and prerequisite skills.

Your attempt to redefine NAEP results to confuse our readers is unacceptable.

By the way, the nonsense in this improper use of NAEP scores is very similar to an "at or above grade level" abuse that the Jefferson County school system is committing with the CATS KCCT achievement level scores. I have written plenty about that already.

For our other readers who don't want to be hoodwinked by the "at or above grade level nonsense," find the "right stuff" about the NAEP by checking this blog and other Bluegrass Institute Web products like www.freedomkentucky.org.

One more point: since the original post was about the lack of "substantial" progress, consider even Anonymous' data. Even using the At or Above Basic NAEP percentages that Anonymous quotes, Kentucky eighth graders were two points behind the nation in 2003 and closed that gap by only one point over the next six years. Does anyone think that is substantial? And, to repeat, those numbers are for kids performing to a low target that most definitely isn't what NAEP wants (which is the NAEP "Proficient" level).
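For readers who want to check the arithmetic, here is a minimal sketch in Python using only the At or Above Basic percentages Anonymous quoted above (the figures come from that comment and are not independently verified here):

# Kentucky vs. national "At or Above Basic" percentages for grade 8 math,
# as quoted in the earlier comment (not independently verified).
at_or_above_basic = {
    2003: (65, 67),  # (Kentucky %, national %)
    2005: (64, 68),
    2007: (69, 70),
    2009: (70, 71),
}

for year, (kentucky, national) in sorted(at_or_above_basic.items()):
    gap = national - kentucky
    print(f"{year}: Kentucky {kentucky}%, nation {national}%, gap {gap} point(s)")

# The Kentucky-vs.-nation gap moves from 2 points in 2003 to 1 point in 2009,
# a change of just one point over six years.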

Note: You can find the NAEP scores and definitions in the 2009 NAEP Mathematics Report Card at: http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2010451.

Anonymous said...

Here's an interesting quote from NCES/NAEP about the interpretation of "NAEP Proficient." You will note that the NCES/NAEP definition is very different from the one the Bluegrass Policy Blog is trying to promote here.

"NAEP's policy definition of its 'Proficient' achievement level is 'competency over challenging subject matter' and is implicitly intended to be higher than grade-level performance." [Andrew Kolstad, Senior Technical Advisor, Assessment Division, National Center for Education, Statistics]

Kentucky eighth graders did gain only one point on their national peers in mathematics. What makes the gain interesting is that while Kentucky gained one point on the nation, the nation itself gained four points. No basis for condemnation here!

Richard Innes said...

RE: Anonymous 22Dec09 at 9:31 AM

The Bluegrass Institute isn't pushing its 'own' NAEP definitions. The definitions stand by themselves.

I already cited the definition of NAEP "Basic" in my earlier comment.

Here are the other definitions for NAEP Achievement Levels from the 2009 NAEP Report Card.

"Proficient" represents solid academic performance. Students
reaching this level have demonstrated competency over
challenging subject matter.

"Advanced" represents superior performance.

By the way, while he is at NCES, Andy Kolstad wouldn't be the official, final voice on NAEP Achievement Level definitions even if he did say what Anonymous alleges (I would like a reference for that alleged comment). The National Assessment Governing Board is the final determining organization for NAEP definitions.

The NAEP report card does not say what Kolstad is alleged to have said. I did not find any such language in the expanded discussions of NAEP Achievement Level Scores on the NCES website, either.

Perhaps Anonymous confused comments about the "Advanced" score in NAEP with the "Proficient" score.

In any event, I see nothing in any of the definitions about NAEP Basic being an indication of "At or Above Grade Level" performance.

Note: You can find the NAEP scores and definitions in the 2009 NAEP Mathematics Report Card at: http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2010451.

Anonymous said...

The confusion about the difference between NAEP Proficient and NAEP Advanced is not mine. Here is a quote from a 2001 National Assessment Governing Board publication that tells how NAEP Proficient should be understood: "Notice that there is no mention of 'at grade level' performance in these achievement goals. In particular, it is important to understand clearly that the Proficient achievement level does not refer to 'at grade' performance. Nor is performance at the Proficient level synonymous with 'proficiency' in the subject. That is, students who may be considered proficient in a subject, given the common usage of the term, might not satisfy the requirements for performance at the NAEP achievement level. [...] Finally, even the best students you know may not meet the requirements for Advanced performance on NAEP." This means that some of the best students you know who are more than proficient in the subject -- given the common meaning of proficient -- will score at NAEP Proficient, not at NAEP Advanced.

I've read somewhere that NAEP Basic starts at the C-/C+ classroom performance level, while NAEP Proficient starts at the B+/A- classroom performance level.

Anonymous said...

I found this clarification about the NAEP Proficient achievement level in the NAEP 2009 reading framework (p. 47), which is an official National Assessment Governing Board publication: "Proficient readers will have sizable meaning vocabularies including knowledge of many words and terms above grade level."

Richard Innes said...

RE: Anonymous 22Dec09 at 12:18

It's no surprise that NAGB didn't want an "at or above grade level" tag applied to the "Proficient" level score. The term "Proficient" implies a suitable target. Current average performance in US schools isn't good enough to be a target.

Since Anonymous doesn't mention it, I assume NAGB said nothing about "at or above grade level" applying to NAEP "Basic," either. As I mentioned earlier, I have not seen any previous attempt to equate the two.

I have not seen any discussion of classroom grade levels being associated with NAEP achievement level scores, either. Of course, all sorts of independent writers say all sorts of things about NAEP. Some are more accurate than others.

I had another thought about the "at or above grade level" business. I think some are trying to say that current average performance is "at or above grade level."

I suppose you could make a case for such a definition, but the problem, as I previously mentioned, is that current academic performance across the country is perceived by most to be too low. US performance on international tests like TIMSS and PISA strongly reinforces that perception.

In this sense, if we only shoot for current "at or above grade level" performance, we shoot way too low.

There are also definitional issues. How, exactly, do you define "at or above grade level"? Is it measured by the 50th percentile on a test? The 40th percentile? Anything above the 20th percentile? By a set of detailed performance standards?

What does the term really mean? What should it mean? What does the public interpret it to mean? How about educators?

NAEP certainly offers no help in answering these questions, as NAEP avoids the term.

Anonymous said...

Here is the problem. In 1990, the National Assessment Governing Board set and defined three achievement levels, i.e., Basic, Proficient and Advanced, as goals to which the nation should aspire. In 2001, the National Assessment Governing Board made it perfectly clear that NAEP Proficient is more than “grade-level” performance.

Also in 2001, the No Child Left Behind Act required states to establish achievement levels with the same labels (i.e., proficient and advanced) but made the states define the labels differently. The No Child Left Behind Act required states to identify grade-level expectations for their students, and to define “proficient” as meeting grade-level expectations. As a result, there are two federal definitions of “proficient,” one for the national assessment and one for state tests.

When NAEP reports percent at or above Proficient, it looks at students whose work is more than (or above) grade-level performance. When states report percent at or above proficient, they look at students whose work is at or above grade-level performance. It is pure nonsense to compare NAEP “percent at or above Proficient” to a state’s “percent at or above proficient.” While the Bluegrass Policy Blog is free to ignore the No Child Left Behind Act when convenient, a state must always attend to the law's demands.

Richard Innes said...

RE: Anonymous 22Dec09 2:31 PM

There has indeed been considerable concern about what passes for proficient work on state assessments like CATS versus what the technical experts who create the NAEP consider to be truly proficient performances.

As events have proved, the states actually had free rein to set their standards at any level and to revise those standards pretty much at will (as Kentucky did, further downward, in 2007).

Most states set their expectations very low, and many then lowered them even more over time, but NCLB certainly didn't require that.

I think Congress expected states to do their job well and expected standards closely aligned to the NAEP. That is probably why NCLB most definitely requires all states to participate in NAEP math and reading if they want to continue to receive federal money.

We see dramatically different results for state and NAEP testing today because the states violated the trust of congress to do the right thing for kids versus doing things to protect the status quo for the education bureaucracy.

You're entitled to your opinion that it is pure nonsense to compare NAEP and state tests, but that opinion isn't very widely shared.

For example, NCES has completed several studies that directly compare state results to the NAEP. NCES probably wouldn't have done that if it was inappropriate.

Anyway, Congress and the US Department of Education are clearly upset that the states took the low road on NCLB test standards.

That is why there is now great emphasis on getting the National Governors' Association/Council of Chief State School Officers common core standards up and running as the de facto national standard that each state will follow. It remains to be seen if this effort will work out in practice.

If you try to sell the idea that NCLB required states to adopt low testing standards, I think you will find yourself in very limited company.

NCLB did make a mistake. It trusted the states to set standards that would help our kids survive in worldwide competition. The states let us all down, but the states didn't have to do that because of NCLB.

One last point: how, exactly, do you think the Bluegrass Institute is ignoring NCLB? Please provide some specific examples. Your general accusation won't do.

Anonymous said...

Six rejoinders: [you said], my reply

[1] There has indeed been considerable concern about what passes for proficient work on state assessments like CATS versus what the technical experts who create the NAEP consider to be truly proficient performances.

It is surprising after all this time that the Bluegrass Blog still don’t now that “proficient” on the state test had to pass an on-site review conducted by a team of out-of-state experts in curriculum, measurement and statistics before the term could be implemented (a review of technical experts, if you will). The U.S. Department of Education assigned membership to the team, not the state.

[2] We see dramatically different results for state and NAEP testing today because the states violated the trust of congress to do the right thing for kids versus doing things to protect the status quo for the education bureaucracy.

We see dramatically different results for state and NAEP testing when someone makes apples to oranges comparisons between the two tests. These dramatic differences reside in the organizations and individuals who conduct and publish bogus comparisons, not in anything the states have done.

[3] You're entitled to your opinion that it is pure nonsense to compare NAEP and state tests, but that opinion isn't very widely shared.

You totally misrepresent my opinion. It is pure nonsense to compare NAEP apples to state oranges. The NAEP Validity Studies Panel (in the American Institutes for Research), for example, has statistically demonstrated that NAEP Basic is the most appropriate comparison for state proficient. It seems these NAEP experts don’t measure up to your standards.

[4] For example, NCES has completed several studies that directly compare state results to the NAEP. NCES probably wouldn't have done that if it was inappropriate.

I am unaware of any NCES studies that “directly compare state results to NAEP.” NCES has conducted studies using the NAEP scale to define and compare the “rigor” of the various state standards, and to compare the rigor of the state standards to the NAEP achievement level cut-scores. (NCES mapping study No. 2010-456, for example.) The only clear finding from these NCES mapping studies is that there is little or no correlation between the rigor of a state’s standards and the overall student achievement in the state.

[5] If you try to sell the idea that NCLB required states to adopt low testing standards, I think you will find yourself in very limited company.

My position is that NCLB required states in 2001 to define the term “proficient” by the common meaning of the term, namely meeting the state’s grade-level expectations for each grade (and enforced that requirement via a mandatory review process conducted by curriculum, measurement, and statistics experts). NAEP did not use the common language definition for the term in 1990. In fact, the National Assessment Governing Board stipulated a definition for NAEP that describes some of the best students you know as “Proficient.” (Footnote: it's hard for me to believe that Congress in NCLB intended that by 2014 one hundred percent of America's students would be numbered among the "best students" that you know.)

[6] One last point: how, exactly, do you think the Bluegrass Institute is ignoring NCLB? Please provide some specific examples. Your general accusation won't do.

The Bluegrass Blog, for instance, as illustrated by all of the above, absolutely ignores the NCLB burden placed on the states to define “proficient” differently than NAEP defined “Proficient.” Ignoring this has led to bogus studies (i.e., NAEP Proficient vs. state proficient) that misinform and undermine public dialog by creating and reporting magical “dramatic differences” in student achievement as reported by the state test and NAEP.

Richard Innes said...

RE: Anonymous 23Dec09 10:53 AM

Anonymous writes:

“It is surprising after all this time that the Bluegrass Blog still don’t now (sic) that “proficient” on the state test had to pass an on-site review conducted by a team of out-of-state experts in curriculum, measurement and statistics before the term could be implemented (a review of technical experts, if you will). The U.S. Department of Education assigned membership to the team, not the state.”

We are well aware of that review. Anonymous misreads what happened. All this process was allowed to do in each state was confirm that the state tests did indeed measure the watered down goals that the states had set for themselves. The reviewers were not allowed to alter the low rigor of the tests. That is why the current degree of rigor in the state assessments varies so widely.

Anonymous writes more. The quick gist of much of those comments is that Anonymous essentially doesn’t understand the existence and implications of several recent National Center for Education Statistics (NCES) studies.

NCES has published at least two comparison studies of the type Anonymous claims are “bogus.”

The most recent is: “Mapping State Proficiency Standards Onto NAEP Scales: 2005-2007,” on line at: http://nces.ed.gov/nationsreportcard/pubs/studies/2010456.asp

The NCES did an earlier study, “Mapping 2005 State Proficiency Standards Onto the NAEP Scales,” on line at: http://nces.ed.gov/nationsreportcard/pubs/studies/2007482.asp

Mapping state standards onto the NAEP scales involves a basic and fundamental comparison activity between the state tests and the NAEP, whether Anonymous admits it or not.
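For readers unfamiliar with how such a mapping works, here is a rough sketch in Python of the percentile-matching idea behind it. It is only a simplified illustration with made-up scores, not the NCES procedure or its data; the grade 8 mathematics cut scores mentioned in the closing comment (Basic at 262, Proficient at 299) are the published NAEP values.

# A rough sketch of the mapping idea (hypothetical numbers, not NCES data):
# if a state reports P percent of its students "proficient" on its own test,
# find the NAEP score reached or exceeded by that same P percent of the
# state's NAEP sample. That score is the state cut score's rough NAEP-scale
# equivalent, which can then be compared to the NAEP achievement level cuts.

def naep_equivalent(state_pct_proficient, naep_scores):
    """Return the NAEP score reached by the top state_pct_proficient percent."""
    ranked = sorted(naep_scores, reverse=True)      # highest scores first
    count = round(len(ranked) * state_pct_proficient / 100)
    count = max(1, min(count, len(ranked)))         # keep the index in range
    return ranked[count - 1]                        # score of the last student "in"

# Hypothetical example: ten NAEP scale scores for one state's sample,
# and a state test reporting 70 percent "proficient."
sample_scores = [312, 298, 290, 284, 279, 273, 268, 261, 255, 240]
cut = naep_equivalent(70, sample_scores)
print(f"NAEP-scale equivalent of this hypothetical state cut score: {cut}")
# With the grade 8 math cut scores (Basic = 262, Proficient = 299), this
# hypothetical state's "proficient" lands above Basic but well below Proficient.

Comparisons of that kind are the sense in which the studies discussed here place most state "proficient" cut scores in the neighborhood of NAEP Basic.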

So, Anonymous is calling the activities of the National Center for Education Statistics “bogus.”

It should also be noted that the 2005-2007 report is co-authored by Victor Bandeira de Mello and Charles Blankenship of the American Institutes for Research and Don McLaughlin of Statistics and Strategies. While the report doesn’t mention it, McLaughlin has also been a member of the NAEP Validity Studies Panel. That’s an interesting collection of individuals and organizations that are engaging in “bogus” activity, if Anonymous were to be taken at face value.

Anonymous also misreads the implications of the American Institutes for Research study that did show most state standards for “Proficient” are equivalent to the NAEP “Basic” level. That’s not a surprise. The point is that the NAEP Basic level is too low a target. The fact that the states are shooting at too low a target is just substantiated, yet again, by the AIR study. Partial mastery of material can’t possibly be a suitable target, but, as I pointed out in an earlier comment in this string, that is what NAEP “Basic” represents.

One more comment. NCLB did require states to define Proficient against their standards, but it didn’t require states to have low standards. In fact, the law required the states to have,

“(i) challenging academic content standards in academic subjects that—
(I) specify what children are expected to know and be able to do;
(II) contain coherent and rigorous content; and
(III) encourage the teaching of advanced skills; and
(ii) challenging student academic achievement standards that—
(I) are aligned with the State’s academic content standards.”

The standards were supposed to be challenging. They didn’t turn out that way. But, nothing in NCLB said they had to be set low. Quite the contrary was hoped for, but not delivered.

That is why NCES is doing all those studies as Congress gets ready to fix this problem.