Since March 2020, there has been much speculation about “COVID slide,” the adverse impact of interrupted, remote, and hybrid learning on student achievement. NWEA, a research-based not-for-profit assessment organization, projected after the spring 2020 school closures that students could return to the classroom in fall 2020 with less than 50% of a typical school year’s gains in math, but in November 2020 it reported drops in performance around half of what was projected. Full results from the 2020-2021 school year showed that students made less growth and had lower overall performance in reading and, especially, in math.
Now, in addition to NWEA’s ongoing reporting, which focuses primarily on K-8 data, ACT has released its first year-over-year study of its suite of assessments for students in 5th through 12th grade.
Prior research shows that ACT tests are sensitive to instruction; in other words, ACT scores increase as students have more time in the classroom and more exposure to rigorous coursework, and scores decrease—when other factors are held constant—with disruptions in learning opportunities.
The study summarizes data from school-day testing on ACT Aspire Interim tests (grades 5-10), PreACT (grade 10), and the ACT test (grades 11-12) during the 2020-21 school year. The findings represent only schools that tested a comparable number of students before the pandemic (in the 2018-19 and/or 2019-20 school years) and during it (in the 2020-21 school year), and include only school-day testers, so the tests were likely administered in schools offering at least some hybrid or in-person instruction.
Across assessments and grade levels, the report indicates score declines, suggesting that disruptions due to the pandemic had a negative effect on students’ learning opportunities.
On ACT tests administered during the school day in fall 2020 and spring 2021, average scores decreased in every subject (see Table below). To better understand what these score declines mean for student learning, ACT translates them into instructional months.
[Typically, ACT test scores increase with each additional month of schooling by 0.31 points in English, 0.19 points in math, 0.18 points in reading, and 0.19 points in science. An ACT English score decline of 1.02 points, then, is comparable to 3.3 fewer months of instruction (1.02/0.31 = 3.3). ACT also represents changes in percentile units, allowing comparisons of score changes for tests with different score scales.]
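The conversion in the aside above is simple division: a score decline divided by the subject’s typical per-month gain yields an equivalent number of months of lost instruction. A minimal sketch of that arithmetic, using the per-month gain rates reported in the study (the function name and structure here are illustrative, not ACT’s):

```python
# Average ACT score points typically gained per month of schooling,
# as reported in the study.
MONTHLY_GAIN = {
    "English": 0.31,
    "math": 0.19,
    "reading": 0.18,
    "science": 0.19,
}

def months_of_instruction(score_decline: float, subject: str) -> float:
    """Express a score decline as an equivalent number of months of lost instruction."""
    return score_decline / MONTHLY_GAIN[subject]

# The article's worked example: a 1.02-point English decline
print(round(months_of_instruction(1.02, "English"), 1))  # → 3.3
```

The same division applies to any subject; for example, a 0.19-point math decline would correspond to roughly one month of instruction.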
Across the subjects and grade levels assessed, score declines represented approximately one to three months of lost instruction.
A Trend in Lower Grades
Across all grades included in the analysis of ACT Aspire Interim test scores (Table below), the average percentile score change ranged from a maximum decline of eight points in fifth and sixth grade math to a minimum decline of one point in tenth grade English and science. The general trend is that lower grade levels seem to be affected more than upper grade levels and that score declines were most severe in math.
The findings show that students have still made learning gains over the last year—just not to the degree we would expect to see under normal circumstances. These score declines are relatively small, leading to the conclusion that students can catch up with additional instructional time. The rhetoric around “COVID slide” and “learning loss” has become divisive, leading to alarmism and fearmongering on the one hand and outright denial on the other. This ACT research points to a more specific way to measure and respond to concerns around interruptions in learning: represent them as specific “delays” by subject, by grade level, and by student.
As we continue to track research measuring learning delays by subject and by grade, we can remain focused on some of the simple premises behind the research:
1) Good, rigorous instruction leads to growth in skills that raise both scores and grades.
2) Measurement matters because the impact of interrupted learning varies by student, by subject, and by grade level.
3) While the research indicates no need for the panic that some of the earlier predictions provoked, we can take specific, impactful action to support students and prevent this trend from persisting or compounding.
In the end, this research gives us simple direction: measure progress and focus instructional responses on learning delays this year, so students maintain academic progress next year and beyond.