Liz Twist, Head of Assessment Research and Development
‘Missed’ or ‘lost’ learning is something we have been hearing a lot about recently, but what does it actually mean? I would declare a preference for ‘missed’ learning over ‘lost’ learning – how can you lose something you never had? Given that schools were partially closed several times between March 2020 and March 2021, and that some children missed additional schooling as they needed to isolate, it is completely reasonable that they will have missed covering parts of the curriculum that schools might have expected to cover in a normal school year.
We now have evidence about the actual impact of the pandemic on children’s achievement. One source is a study NFER is conducting with key stage 1 pupils in England, funded by the Education Endowment Foundation. While this study is ongoing, we have analysed and published two rounds of findings and diagnostic information on the impact of partial school closures at KS1, and the potential implications for practice in years 1 and 2. These compared how year 1 and 2 children performed in standardised assessments in the 2020/21 academic year with the performance of previous cohorts.
Findings from autumn 2020 showed that these children were, on average, two standardised score points behind where we might have expected them to be. This is equivalent to two months less progress, i.e. performing as though they were two months younger. And ours is not the only study to have found an impact of this size.
But of course, this is the average. Some children have made more progress than this, and some, critically, have made less. And no one would deny that some children were already trailing behind their peers before the pandemic struck. But it’s important to note: most children are making progress. The pandemic may have affected the rate of progress in some basic skills such as reading and maths, but children continue to develop and learn, out of school as well as in school.
All of which points to the need to be able to identify as clearly as possible the specific aspects of learning that need to be addressed in the time left in this school year. One way of doing this is to undertake a question analysis from standardised test data. This involves looking at how your class or year group perform, individually and collectively, compared to a nationally representative standardisation sample, and then using that information to inform teaching. When that’s combined with the rich data available from diagnostic commentaries, then there’s a real opportunity to target teaching effectively as part of the recovery plan.
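To make the idea of a question analysis concrete, here is a minimal sketch in Python. All the figures are invented for illustration: the class responses and national facility values are hypothetical, and real standardised tests would supply the national figures in their teacher guides. The sketch simply compares the proportion of a class answering each question correctly (its facility) with the corresponding national value, flagging questions where the class is noticeably behind.

```python
def facility(answers):
    """Proportion of pupils answering a question correctly (1 = correct, 0 = incorrect)."""
    return sum(answers) / len(answers)

# Each row is one pupil; each column is one question. Invented data.
class_scores = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 0],
    [0, 1, 0, 1],
]

# Hypothetical national facilities for the same four questions.
national_facility = [0.80, 0.70, 0.55, 0.65]

# Transpose so we have one tuple of answers per question.
questions = list(zip(*class_scores))

gaps = []
for q_index, answers in enumerate(questions):
    gap = round(facility(answers) - national_facility[q_index], 2)
    gaps.append(gap)
    # Flag questions where the class is more than 10 percentage points behind.
    flag = " <- review" if gap < -0.10 else ""
    print(f"Q{q_index + 1}: class {facility(answers):.2f}, "
          f"national {national_facility[q_index]:.2f}, gap {gap:+.2f}{flag}")
```

With these invented figures, question 3 stands out as well below the national facility, so it would be a candidate topic for targeted teaching; questions where the class is at or above the national figure need less attention.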
NFER’s spring tests include these diagnostic commentaries. While you may opt to use the summer-standardised tests in the coming months, in this year of disruption it is also appropriate to use the spring tests in the summer term, which gives schools the flexibility to focus on the diagnostic information in the teacher guide. Because the spring tests were standardised at a slightly earlier point in the school year, they are also slightly easier than the summer series, which may be appropriate this year.