How to interpret—and not misinterpret—upcoming NAEP results

Recently my daughter asked me to describe my job. I had to think for a minute. Statistical interpreter? Translator of results? Nothing quite worked. I needed an elevator pitch, one my quasi-curious teenager could easily grasp without too much eye-rolling.

“Every two years I tell people how the country’s school systems educate America’s students,” I told her. She looked almost impressed, and I congratulated myself for omitting any mention of statistical significance or performance levels.

Even better, I could add that the next round of these results is imminent; they come out every two years.

We are currently preparing for the next release of results. Mark your calendars for early 2025. In the meantime, I'll tell you what I'll be looking for in the results and what I'll avoid when I talk about them; namely, misNAEPery, a clever term for the ways NAEP results can be misused and misinterpreted. Kudos to Morgan Polikoff for laying out the generic varieties of misNAEPery. My goal here is to highlight specific pitfalls to avoid in this upcoming release.

No magic 8-ball. NAEP is not a magic 8-ball of political vindication or failure. The NAEP reading scores, for example, will not serve as a referendum on whether the science of reading is real or whether it works. State results in 2024 will inevitably differ from those of 2022 and 2019, perhaps in small ways, perhaps in large ways. Either way, that cannot be read as proving or disproving the science of reading.

BPE (the pre-pandemic era). It's easy to attribute any drop in scores since 2019 to Covid: the results fall, so the pandemic is to blame. But NAEP scores were declining even before the pandemic disrupted schooling; NAEP 8th grade reading scores began slipping in 2015. Think beyond the pandemic.

Please stop this ride. NAEP results from 2024 should not dictate or derail policy implementation in 2025. Trends require more than one or two data points, and NAEP cannot evaluate a specific policy. Implementing policy takes time, persistence, consistent investment, and patience. A single 2024 data point should not signal the need to stop, drop, and run, nor the need to throw everything at a particular intervention. See the science of reading point above.

No to the nyah nyah. If one state's average goes up and another's goes down, that does not justify "I told you so!" claims. Countless factors shape a state's or district's average, and what rises now may fall later. Do check the results to see who is doing well. (Massachusetts led the way in the early 2000s; more recently, Mississippi.) That is a smart way to identify and share promising practices. No gloating, just learning, knowing that your mileage may vary.

Proficient, but not proficient. Remember: NAEP achievement levels indicate what percentage of students demonstrate "competency over challenging subject matter." That is what we call NAEP Proficient (note the italics!) to distinguish it from what states call proficient on their own state assessments. They are not the same thing, even though they use the same word. The distinction may seem fussy, but it matters: different assessments, different purposes, different cut scores. Comparing performance across states on a common scale is possible, but only through the state mapping study.

Be real. NAEP results may or may not differ from previous years' results; that is the nature of assessment. But will we suddenly see 50 percent of students reach NAEP Proficient in math? Probably not. That isn't a spoiler; it's just realism. And if the results show fewer than 50 percent at NAEP Proficient, does that mean the education system should be torn down? No. The results simply outline the scope of the challenge and underscore the P in NAEP: not performance, but Progress.

So far I’ve focused on what not to do, which is a bummer. Let me pivot to the positive.

No shortcuts. Progress comes through hard work, not miracle cures or panaceas. The results show us broad patterns: the initial decline in average scores almost a decade ago is what prompted a hard look at reading instruction. NAEP offers no easy solution. It tells the story.

They do their part. It's worth repeating: progress isn't easy. School administrators, staff, teachers, and students work hard every day. They know it, and we appreciate it. Nothing in any release negates that. Period.

A little competition. I warned you not to gloat if your state or district looks good on the next report card. But I didn't say to forgo healthy competition. Maryland State Superintendent Carey Wright, formerly Mississippi's superintendent, galvanized education improvement in the Magnolia State by refusing to let the adage "At least we're not Mississippi" stand any longer when NAEP results were announced. She seems determined to keep making progress in her new state and will no doubt be watching how Maryland's neighbors fare.

NAEP as the backbone. Accomplished researchers use NAEP data as the backbone of practical tools for understanding how states and districts are helping their students make academic progress. See, for example, the Education Recovery Scorecard and the New York Times's school district tool.

Do your part. Prioritize school attendance. Read to your children at home, even if they're teenagers. Who doesn't love having a good story read to them? Check your audiobook history for proof. Encourage your children to read books at home and everywhere else.

One more thing: don't forget. The next NAEP results are scheduled for release in early 2025. Stay tuned.
