By Julie McCulloch, Director of Policy at the Association of School & College Leaders (ASCL)
Monday 2 December 2019
In 2001, the International Dictionary of Educational Terminology (if such a tome doesn’t exist, it really should) gained a new phrase. PISA-shock (noun, the impact of a country’s results in the new Programme for International Student Assessment not matching that country’s own perception of its education system) hit a number of countries, most famously Germany, particularly hard, and led to unprecedented soul-searching among policy-makers around the world.
Education systems are currently bracing themselves for the latest PISA eruption. The results of the 2018 PISA tests are set to be published on 3 December. Taken by 600,000 15-year-olds in 79 countries, the tests will yield data pored over from Albania to Vietnam. Focusing particularly this time on reading, the results will provide a compelling snapshot of how well we are preparing young people across the globe for their future lives.
The PISA tests are designed to assess students’ knowledge and skills in reading, maths, science, financial literacy and ‘global competencies’, as well as their capacity to apply their thinking. The tests are intended to help policy-makers and anyone else interested in education to answer three questions:
- Are schools adequately preparing young people for adult life?
- What kind of learning environments do we find in high-performing countries?
- Can schools improve the futures of students from disadvantaged backgrounds?
These are important questions. And over the last eighteen years the OECD (which administers the tests) has used the ever-expanding data the test results provide to propose some answers. For example, the findings suggest that money matters – but only up to a point, after which results tend to level out. There appears to be no direct correlation between the time students spend in school and their learning outcomes (at a national level) – what matters is how productively that time is spent. And some countries appear to be doing much better for their disadvantaged students than others. Astonishingly, in 2012, the most disadvantaged students in Shanghai performed as well on the maths test as the least disadvantaged students in the USA.
As ever, such statements raise many more questions than they answer. How much money is enough – and does that vary between countries? What does ‘productive’ time in the classroom look like – and is this the same in different societies and cultures? What’s the story behind the Shanghai maths results?
School and college leaders across the UK, struggling to make it through the last few weeks of a long, dark autumn term, may be forgiven for looking up briefly on 3 December before getting back to the day job. And, in many ways, they’d be right to question the weight now put on these assessments. ASCL’s long-held view on the way in which we judge our schools and colleges is that we must avoid giving too much importance to any one measure, or narrow set of measures. Over-focusing on any small subset of data risks not only providing a false picture of a school’s effectiveness, but also distorting the education it provides, as schools strive to meet those measures to the detriment of other, equally important, factors.
As with individual schools, so with education systems. The PISA tests undoubtedly tell us something, and something useful, about the standard of education in the countries that take part. The opportunity to benchmark the performance of one country against 78 others is valuable, in an age in which many of the young people leaving our schools and colleges at 16 or 18 will be competing for jobs with others from around the world.
But we, and more importantly the policy-makers making decisions based on these results, must be cautious about how we interpret and respond to what we find when we open that PISA results envelope next week. Just as a school is not defined by its SATs or GCSE results, a country is not defined by its PISA results.
We should treat the results as an opportunity to think deeply about what we want for our young people, to explore the questions they raise, and to add to the knowledge we already have about our own education systems. Let’s move beyond kneejerk responses and PISA-shock, and encourage a measured, thoughtful response to what is, after all, one piece of information among many.
NFER has conducted the survey on behalf of the DfE (England), the Department of Education (Northern Ireland), the Welsh Government and the Scottish Government. However, this publication does not necessarily reflect the views of the respective departments.