Ask the expert: do methods matter?

In this ‘Ask the expert’, Emily Jones, Research Director at NFER, explores the relative success of different strategies used by pupils to complete maths questions in NFER’s new tests for year 6 pupils.

When interpreting assessment results, it can be tempting to focus on the number of pupils who find the correct answer. However, it can also be revealing to study the incorrect answers given and the range of strategies used by pupils, as these can provide additional insight into which misconceptions need to be addressed or which skills pupils must develop.

It is for these reasons that NFER has developed diagnostic commentaries for all papers in the new range of year 6 tests. These commentaries are designed to help teachers identify the common errors and incorrect approaches that their pupils have taken and to understand the root causes. Further, they provide suggestions for how to address these in subsequent teaching. The commentaries are the result of an in-depth analysis of the test performance of a large, nationally representative sample of pupils, the scale of which has enabled a focus on pupils of different abilities. The broad guidance therefore makes suggestions about how to improve the attainment of lower achieving pupils whilst also providing extension for more able pupils.

An example of the analysis carried out can be seen in the question below. This relatively simple question prompted no common errors but a look at the strategies for answering the question is quite revealing.

[Image: the test question, a division question with answer 68]

| Proportion of pupils… | lower achieving | middle achieving | higher achieving |
| --- | --- | --- | --- |
| answering correctly | 16% | 67% | 90% |
| not attempting the question | 32% | 4% | <1% |
| using short division and answering correctly | 14% | 63% | 85% |
| using short division and answering incorrectly | 30% | 23% | 8% |
| using an alternative written strategy and answering correctly | <1% | 4% | 4% |
| using an alternative written strategy and answering incorrectly | 9% | 5% | 1% |
| using no written strategy and answering correctly | 1% | <1% | 1% |
| using no written strategy and answering incorrectly | 13% | 2% | <1% |
 

Short division

In total, 60% of pupils gave the correct answer: 68. Predictably, most pupils arrived at the answer using short division, and in all three achievement groups (lower, middle and higher) most marks were achieved this way. However, both the proportion of pupils attempting this method in the first place and their relative success with it increase with achievement level. Fewer than half of lower achieving pupils (44%) thought to try this method, while 32% didn’t attempt the question at all. In contrast, 86% of middle achieving pupils used short division, though they executed it with less success than the higher achieving group.
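As an illustration, the digit-by-digit short division procedure can be sketched in code. The actual test question is not reproduced above, so the worked example below uses a hypothetical division, 612 ÷ 9, chosen only because it also gives 68.

```python
def short_division(dividend: int, divisor: int) -> tuple[int, int]:
    """Digit-by-digit short division (the 'bus stop' method taught in year 6).

    Works left to right through the dividend's digits, carrying each
    remainder onto the next digit. Returns (quotient, remainder).
    """
    carry = 0
    quotient_digits = []
    for digit in str(dividend):
        value = carry * 10 + int(digit)   # bring down the next digit
        quotient_digits.append(value // divisor)
        carry = value % divisor           # remainder carried to the next column
    quotient = int("".join(str(d) for d in quotient_digits))  # drop any leading zeros
    return quotient, carry

# Hypothetical example (not the actual test question):
print(short_division(612, 9))  # (68, 0)
```

Each loop iteration mirrors one column of the written method: divide, record the quotient digit, and carry the remainder to the next column.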

Other methods

Alternative strategies to short division led to fewer correct answers, but the effectiveness of these strategies also varies with achievement. The lower and middle achieving groups were roughly equally likely to try a method other than short division, but while almost half of the middle achieving pupils using an alternative approach gained a mark, fewer than one in ten lower achieving pupils were successful. The likelihood of achieving the mark by an alternative method was greater still for higher achieving pupils: of the few who didn’t use short division, four in five gained the mark. These pupils were also more likely to be able to work out the answer with no written working, perhaps because they were able to visualise the short division method without writing it down. In comparison, only a small proportion of the lower achieving pupils who showed no working managed to achieve the mark.
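The success rates quoted above can be recomputed directly from the “alternative written strategy” rows of the table. A quick sketch, treating “<1%” as 0.5% purely for illustration:

```python
# Percentages taken from the table's "alternative written strategy" rows;
# "<1%" is approximated as 0.5% for illustration only.
alternative = {
    "lower achieving":  {"correct": 0.5, "incorrect": 9.0},
    "middle achieving": {"correct": 4.0, "incorrect": 5.0},
    "higher achieving": {"correct": 4.0, "incorrect": 1.0},
}

for group, pct in alternative.items():
    attempted = pct["correct"] + pct["incorrect"]
    success_rate = pct["correct"] / attempted
    print(f"{group}: {success_rate:.0%} of alternative attempts gained the mark")
```

This reproduces the figures in the text: roughly 5% for lower achieving pupils (“fewer than one in ten”), 44% for middle achieving pupils (“almost half”) and 80% for higher achieving pupils (“four in five”).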

What does this all mean?

This example demonstrates that the approaches used to tackle questions vary between the three achievement groups. To improve their performance, lower achieving pupils need to become more familiar with short division so that they can recognise when to use it; additionally, more practice would help to improve their accuracy. The example also highlights that all pupils, regardless of ability, need reminding of the importance of showing their working, as doing so is more likely to lead to a mark. Methods, therefore, do matter: a written method acts as a visual aid, making it easier to organise mathematical thinking, monitor progress through the task and check the final answer.

Did you know?

The NFER Tests range includes mathematics assessments for use across years 1-6. These standardised tests provide reliable standardised and age-standardised scores to help you confidently monitor attainment and progress, and are supported by a free online analysis tool.

Written by Emily Jones, Research Director at NFER.

Emily has been with NFER since 2001 and has worked on a wide range of projects, centred on developing test materials in science and mathematics, including for the Key Stage 1 and Key Stage 2 statutory tests in England and the Australian statutory NAPLAN test.

Emily is responsible for the development of NFER Tests in mathematics, reading, grammar and punctuation and spelling. She is also a consultant member of the TIMSS (Trends in International Mathematics and Science Study) 2019 Science and Maths Item Review Committee and in recent years has contributed to the development of their new practical science e-assessment tasks, as well as developing and trialling the standard science questions.

Do you have a question on assessment that you’d like to put to one of our assessment team? Send it through to us at assessmenthub@nfer.ac.uk.