Rethinking Emergency Remote Learning

The story that we’ve been hearing—and telling—about emergency remote learning is something like this:

  1. The COVID-19 pandemic hit in early 2020.

  2. Physical schools closed and shifted immediately to emergency remote learning.

  3. Emergency remote learning wasn’t the same as well-planned online learning, because it was implemented with almost no planning.

  4. Emergency remote learning also wasn’t nearly as effective as in-person instruction, for the same reasons.

  5. Therefore students “lost” considerable amounts of learning compared to expectations, as evidenced by NAEP and other assessments.

That’s the simple and compelling story. But now we have data suggesting the reality is more complex, at least when it comes to comparing emergency remote learning with in-person instruction during the pandemic.
 
The data are clear that student achievement declined during the pandemic; NAEP scores, for example, demonstrate the drop. From NPR:
 
“Math and reading scores for students across the country are down following years of disrupted learning during the pandemic. On Monday, the National Assessment of Educational Progress (NAEP), also known as the Nation's Report Card, released a full report for the first time since 2019; the results show a slight dip in reading scores and a drop in math. (snip)
 
When you compare the most recent results to past years, it paints a stark picture:
 
In 2022, the average fourth-grade math score decreased by 5 points to its lowest level since 2005. The average eighth-grade math score decreased by 8 points to its lowest level since 2003.”
 
Other outlets were even more negative:
 
“COVID and the resulting *school closures* spared no state or region. The pandemic resulted in historic learning setbacks for America's children, erasing decades of academic progress and widening racial disparities…”
 
From The74:
 
“What staggers is the scale of the declines — especially in math. Multiple states saw drops of more than 10 points in math at the fourth- and eighth-grade levels. Reading results, too, were adversely impacted — again, sometimes substantially. There is the occasional bright spot, yet overall it’s an across-the-board disaster for the United States.”
 
I added the emphasis in the first quote because it reflects a common sentiment: that remote learning was a primary cause of learning loss. Many advocates and politicians called for schools to re-open because they believed emergency remote learning wasn’t effective; the corollary was that re-opening schools would produce better instruction and learning outcomes.
 
But the latest data call that assumption into question. From Chalkbeat:
 
“Using the latest national and state test score data, a team of researchers found that districts that stayed remote during the 2020-21 school year did see bigger declines in elementary and middle school math, and to some degree in reading, than other districts in their state. 
 
But the losses varied widely — and many districts that went back in person had bigger losses than districts that stayed remote. The pattern is inconsistent enough that school closures, it seems, were not the primary driver of those drops in achievement.
 
‘Based on the discussion before these results came out, you’d think that the only thing driving achievement losses would be remote learning, but actually that does not seem to be the case,’ said Thomas Kane, a Harvard professor of education and economics who co-led the research. ‘I was really surprised by these results.’” (emphasis added)

Other sources reported similar findings. For example, from The Hechinger Report:

“…there were no easy explanations and no clear connections between policy decisions on remote learning and how much academic achievement suffered.
 
‘There’s nothing in this data that says we can draw a straight line between the time spent and remote learning, in and of itself, and student achievement,’ said Peggy Carr, commissioner of the National Center for Education Statistics (NCES). ‘We have massive comprehensive declines everywhere, where in some cases, they were in remote learning longer or shorter than others. It’s just too complex to draw the straight line.’”

How should we think about these new findings? It’s unclear. For one thing, previous studies had found links between remote learning and lower student achievement, and one new data set, even one as comprehensive as NAEP, doesn’t rebut all other findings. There is also plenty of anecdotal evidence of poor emergency remote instruction, so the earlier data fit perceptions.

But sometimes it’s exactly when data fit perceptions that we must look more closely. In this case, perhaps some drivers of student outcomes during the pandemic were related to broad COVID-19 impacts on students’ health and wellbeing, and these factors were more important than instructional modality.

It’s unclear at this point, and may remain unclear for a long time. The evidence that emergency remote learning was different from high-quality online learning remains. And, to be clear, it’s not that the data now show remote learning worked well; rather, the data suggest that both emergency remote learning and onsite learning produced poor results during the pandemic.
