Tracking Attainment Without a Counterfactual – Lessons from the Covid-19 Pandemic: A Meta-method Review
15 January 2026
Drawing on expertise in complex sampling, psychometrics, and observational studies, this methodological review highlights the need for robust, comprehensive methodologies for tracking educational attainment during disruptions.
The report reviews seven major studies that tracked pupil attainment during and after the Covid-19 pandemic in England to provide actionable recommendations for future research both before and during widespread, prolonged educational disruptions.
The seven studies, commissioned in challenging circumstances, sought to understand the impact of school closures and disruption on learning across all key stages. They vary widely in their methodologies, measures and conclusions.
Using an extended ROBINS-E risk-of-bias framework, the report identifies common risks of bias arising from inadequate adjustment for confounding, selection effects, missing data, post-exposure interventions, and limited validation of outcome measures.
It also considers risks of misinterpretation, given the complexity of findings and the potential for overgeneralisation.
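To make the assessment framework concrete, the sketch below shows one way the extended domain judgements could be recorded and summarised per study. The domain names paraphrase the published ROBINS-E domains plus the extensions discussed above; the study name and judgements are entirely hypothetical.

```python
from dataclasses import dataclass
from typing import Dict

# ROBINS-E bias domains (paraphrased), plus the misinterpretation
# extension considered in this review.
DOMAINS = [
    "confounding",
    "measurement_of_exposure",
    "selection_of_participants",
    "post_exposure_interventions",
    "missing_data",
    "measurement_of_outcome",
    "selection_of_reported_result",
    "risk_of_misinterpretation",  # extension beyond core ROBINS-E
]

JUDGEMENT_ORDER = ["low", "some_concerns", "high", "very_high"]

@dataclass
class StudyAssessment:
    study: str
    judgements: Dict[str, str]  # domain -> judgement

    def overall(self) -> str:
        """Overall risk is the most severe judgement across domains."""
        return max(self.judgements.values(), key=JUDGEMENT_ORDER.index)

# Hypothetical judgements, for illustration only.
example = StudyAssessment(
    study="Study A (KS2 reading)",
    judgements={**{d: "some_concerns" for d in DOMAINS}, "missing_data": "high"},
)
print(example.overall())  # -> "high"
```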
Key Findings
The report makes several recommendations aimed at improving the reliability and validity of future studies:
- As a foundation for future studies, comprehensive causal models of learning and participation should be developed and agreed upon by the research community (see the causal-model sketch after this list).
- Studies should be designed so that their results can feed into any subsequent meta-analyses, which improved coordination between research teams would facilitate (see the pooling sketch after this list).
- Greater use of questionnaires should be considered to reduce the impact of, and help interpret, selection bias and missing data. Surveying both selected and non-selected schools, and both participating and non-participating schools, would yield valuable confounder measurements (see the weighting sketch after this list).
- Missing-data analysis should include sensitivity analyses under a range of plausible missing not at random (MNAR) scenarios (see the delta-adjustment sketch after this list).
- To enable efficient use of existing data in future studies, the data gathered by test publishers should support direct matching to the National Pupil Database, and permissions for the use of pupil test data should operate on an opt-out basis and explicitly cover such research (see the linkage sketch after this list).
- Score and analysis validity should be strengthened to ensure that reported findings reflect true effects rather than artefacts of data generation or analytical procedures (see the reliability sketch after this list).
- Reporting transparency should be enhanced to prevent over-generalisation and to ensure that models and assumptions are presented comprehensively and clearly for technical experts.
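Causal-model sketch: a minimal illustration, using Python and networkx, of what an agreed causal model might look like as a directed acyclic graph. The variables and edges are illustrative assumptions, not a proposed consensus model.

```python
import networkx as nx

# Hypothetical causal DAG for pandemic-era attainment: the nodes and
# edges are illustrative assumptions, not an agreed model.
dag = nx.DiGraph([
    ("school_closure", "remote_learning_quality"),
    ("remote_learning_quality", "attainment"),
    ("household_deprivation", "remote_learning_quality"),
    ("household_deprivation", "attainment"),
    ("household_deprivation", "test_participation"),
    ("prior_attainment", "attainment"),
    ("prior_attainment", "test_participation"),
    ("attainment", "test_participation"),  # selection: weaker pupils sit out
])

assert nx.is_directed_acyclic_graph(dag)

# Parents of the outcome suggest a candidate adjustment set for the
# closure -> attainment effect (a full analysis would use d-separation).
print(sorted(dag.predecessors("attainment")))
```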
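Pooling sketch: a minimal DerSimonian-Laird random-effects pooling of study-level effects, illustrating what designing studies for meta-analysis enables. The effect sizes and standard errors are invented.

```python
import numpy as np

def random_effects_pool(effects, ses):
    """DerSimonian-Laird random-effects pooled estimate."""
    effects, ses = np.asarray(effects, float), np.asarray(ses, float)
    w = 1.0 / ses**2                              # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)        # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance
    w_star = 1.0 / (ses**2 + tau2)
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2

# Invented study-level "months of learning loss" estimates and SEs.
pooled, se, tau2 = random_effects_pool([-2.1, -1.4, -3.0], [0.4, 0.5, 0.6])
print(f"pooled={pooled:.2f}, se={se:.2f}, tau^2={tau2:.3f}")
```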
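Weighting sketch: a minimal illustration of how questionnaire covariates collected from both participating and non-participating schools could feed inverse-probability-of-participation weights. All variable names and the simulated data are hypothetical.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical school-level frame: questionnaire covariates observed for
# both participating and non-participating schools.
rng = np.random.default_rng(0)
frame = pd.DataFrame({
    "fsm_rate": rng.uniform(0, 0.6, 500),   # free school meals rate
    "urban": rng.integers(0, 2, 500),
})
# Simulate participation that is more likely in low-deprivation schools.
p = 1 / (1 + np.exp(-(1.0 - 3.0 * frame.fsm_rate)))
frame["participated"] = rng.random(500) < p

model = LogisticRegression().fit(frame[["fsm_rate", "urban"]], frame["participated"])
propensity = model.predict_proba(frame[["fsm_rate", "urban"]])[:, 1]
frame.loc[frame.participated, "weight"] = 1 / propensity[frame.participated]

# The weighted participant mean of fsm_rate should move back toward the
# full-frame mean, undoing the simulated selection.
part = frame[frame.participated]
print(frame.fsm_rate.mean(), part.fsm_rate.mean(),
      np.average(part.fsm_rate, weights=part.weight))
```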
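Delta-adjustment sketch: a minimal pattern-mixture sensitivity analysis in which imputed scores are shifted by a range of deltas to represent MNAR scenarios (for example, "missing pupils would have scored lower"). The data are simulated, and single-value imputation is used only for brevity; a real analysis would combine this with multiple imputation.

```python
import numpy as np
import pandas as pd

# Simulated pupil scores with 20% missingness.
rng = np.random.default_rng(1)
scores = pd.Series(rng.normal(100, 15, 1000))
scores[rng.random(1000) < 0.2] = np.nan

observed_mean = scores.mean()
# Shift imputed values by delta to express increasingly severe MNAR
# assumptions, and watch how the headline estimate moves.
for delta in [0, -3, -6, -9]:
    imputed = scores.fillna(observed_mean + delta)
    print(f"delta={delta:>3}: estimated mean = {imputed.mean():.2f}")
```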
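Linkage sketch: a minimal illustration of direct matching between publisher test data and a National Pupil Database extract via a shared pupil identifier (here a hypothetical "upn" column). Real NPD linkage operates under strict data-governance arrangements not shown here.

```python
import pandas as pd

# Hypothetical publisher test data and NPD extract sharing an identifier.
publisher = pd.DataFrame({"upn": ["A1", "A2", "A3"], "score": [98, 104, 87]})
npd = pd.DataFrame({"upn": ["A1", "A2", "A4"], "fsm": [True, False, True]})

# Direct match on the shared identifier; the indicator column records
# which publisher records found an NPD counterpart.
linked = publisher.merge(npd, on="upn", how="left", indicator=True)
match_rate = (linked["_merge"] == "both").mean()
print(linked, f"\nmatch rate: {match_rate:.0%}")
```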
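Reliability sketch: one way to strengthen score-validity evidence is routine reporting of internal consistency for outcome measures; the sketch computes Cronbach's alpha on simulated item-level data. The choice of alpha here is illustrative, not a technique prescribed by the report.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal-consistency reliability for an (n_pupils, n_items) matrix."""
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    k = items.shape[1]
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Simulated responses for a hypothetical 10-item test: a common ability
# factor plus item-level noise.
rng = np.random.default_rng(2)
ability = rng.normal(size=(400, 1))
items = ability + rng.normal(scale=1.0, size=(400, 10))
print(f"alpha = {cronbach_alpha(items):.2f}")
```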