What are the challenges of assessing life skills?

By Lisa Kuhn, Research Manager

Wednesday 1 May 2024


In the second of a series of blogs on life skills assessment, assessment experts at NFER consider the challenges and complexities of developing life skills measures.

From Kenya to New Zealand, education systems around the globe have recently started to shift to competency-based curricula. As well as the more traditional cognitive skills usually taught as part of the school curriculum (such as literacy and numeracy), competency-based curricula include a focus on the development of life skills, such as problem-solving or critical thinking. Such skills are directly related to a range of education outcomes: they can influence, for instance, school attendance, academic success and employability, and they are necessary for leading a happy and healthy life.

As a result, one question that NFER and other assessment experts have been grappling with is how to reliably measure life skills in young people. When developing a new measure, there are a number of factors to consider. Researchers and statisticians must ensure that the tool is reliable, meaning that the construct is measured in a consistent way across questions, participants and time points. There are also challenges to validity: how accurately does the tool capture the intended construct, and how readily can the findings be applied to different situations? Throughout the development of a life skills measure, we commonly encounter complexities that need to be navigated carefully:

Theoretical challenges

  • The constructs underlying these skills are inherently difficult to categorise. In developing life skills measures, careful consideration is required to ensure that assessment domains are clearly defined and that assessment questions target the relevant skills.
  • There are no universally agreed approaches for measuring these skills, even within very similar settings, and definitions might need to be adapted for individual contexts.
  • The measurement tool needs to be fit for purpose. The intended use should be determined early on in order to ensure the outcomes are valid. For example: Is this a formative or summative assessment? What will the data be used for? 

Methodological challenges

  • Self-report tools, such as questionnaires, are the most common approach to measuring skills. These have limitations, as they can introduce bias, but they are comparatively easy to administer and analyse; a basic reliability check on such questionnaire data is sketched after this list. More innovative item types require considerably more resources to develop, but the resulting data might offer a truer reflection of skills (see our first blog in this series).
  • There is a trade-off between measuring a skill holistically and in depth versus measuring at a scale that can be efficiently applied to a wide pool of participants. Tool developers are often dealing with constrained resources and must therefore weigh up the most appropriate assessment method.
  • In developing skills assessments, we must determine the minimum level required to pass the assessment (known as standard setting). This can be difficult because curriculum coverage of competency-based skills is still evolving. There has, however, been much progress; for example, the Victorian Curriculum and Assessment Authority has defined standards for critical and creative thinking in its curriculum.
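To make the reliability point above more concrete, the sketch below shows one routine check that is often run on self-report questionnaire data: internal consistency, summarised here with Cronbach’s alpha. It is a minimal, hypothetical illustration in Python with made-up responses, not code or data from an NFER study.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Internal-consistency reliability for a set of questionnaire items.

    item_scores: 2-D array-like, one row per respondent, one column per item,
    where all items are intended to measure the same construct.
    """
    scores = np.asarray(item_scores, dtype=float)
    n_items = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)       # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (n_items / (n_items - 1)) * (1 - item_variances.sum() / total_variance)

# Illustrative, made-up data: six respondents answering four Likert-type
# items intended to tap the same problem-solving construct.
responses = [
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [3, 4, 3, 3],
]
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")
```

A low alpha suggests the items are not capturing the construct consistently; a high alpha alone, of course, says nothing about validity.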

Practical challenges

  • School curricula are still often dominated by traditional cognitive skills, meaning that life skills are not always integrated into teaching practices. Implementing skills-based curricula and the corresponding assessments may put additional pressure on schools and teachers.
  • Context matters: delivery mode (for example, online versus pen and paper), sources of bias (for example, cultural influences and expectations) or socioeconomic factors (for example, parental income) can introduce unexpected variation in the data.
  • Any data that reveals sensitive information about an individual, such as information relating to their mental wellbeing, is classified as ‘special category’ data. Compared with data from cognitive assessments, it must be treated with more caution, and participants must be made aware of the relevant ethical and safeguarding implications.

Despite these challenges, significant progress has been made in determining the best approaches for measuring skills and we are pleased to have worked across a variety of life skills projects. For example, in our work supporting UNICEF and the World Bank in collaboration with the International Association for the Evaluation of Educational Achievement (IEA) and Roehampton University, we developed an assessment based on the Life Skills and Citizenship Education (LSCE) framework for use in the Middle East and North Africa (MENA) region (see our spotlight page that showcases more of our work on life skills).

The take-away message from our work to date is that there isn’t a ‘one size fits all’ approach to developing life skills measures. Further methodological and statistical adjustments might need to be considered to minimise bias. For example, anchoring vignettes, which work a little like practice questions, ask every respondent to rate the same hypothetical scenarios; differences in how people rate these identical scenarios reveal differences in how they use the response scale, and their self-report answers can be adjusted accordingly (a simple sketch of this idea follows below). Evaluating the strengths and limitations of different approaches carefully, and being aware of the potential implications for the data, are key to measuring skills in a meaningful manner and to supporting the learner’s journey. The education community is making brilliant strides in the development of life skills measures, and we look forward to continuing our contribution to this vital area.
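As a rough illustration of the anchoring-vignette idea, the sketch below applies the simple non-parametric recoding often described in the survey-methods literature: each respondent’s self-rating is re-expressed by its position relative to their own ratings of shared hypothetical vignettes, assuming those vignette ratings are consistently ordered. It is a simplified, hypothetical example, not an implementation from our own studies.

```python
def recode_against_vignettes(self_rating, vignette_ratings):
    """Non-parametric anchoring-vignette recoding.

    self_rating: the respondent's rating of themselves (e.g. on a 1-5 scale).
    vignette_ratings: the same respondent's ratings of shared hypothetical
        vignettes, listed from the 'lowest' to the 'highest' scenario and
        assumed to be consistently ordered.

    Returns the position of the self-rating relative to the vignettes:
    1 = below the lowest vignette, 2 = equal to it, 3 = between the first
    and second vignettes, and so on. Because everyone rates the same
    vignettes, these positions are more comparable across respondents
    than raw self-ratings, which are affected by individual scale use.
    """
    position = 1
    for v in vignette_ratings:
        if self_rating > v:
            position += 2      # clearly above this vignette
        elif self_rating == v:
            position += 1      # ties with this vignette
            break
        else:
            break              # below this vignette: stop here
    return position

# Two made-up respondents who both rate their own critical thinking as 4,
# but use the response scale very differently when rating identical vignettes.
print(recode_against_vignettes(4, [2, 3, 5]))  # -> 5 (above the first two vignettes)
print(recode_against_vignettes(4, [4, 5, 5]))  # -> 2 (only ties with the lowest vignette)
```

More sophisticated model-based adjustments exist, and ties or inconsistently ordered vignette ratings need careful handling; the point is simply that shared vignettes provide a common yardstick against which self-reports can be rescaled.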

References

  1. UNICEF 
  2. CBC materials 
  3. Kia Ora - NZ Curriculum Online 
  4. Victorian Curriculum and Assessment Authority 

NFER Links

  1. Life skills and wellbeing spotlight page
  2. International development case study - Life skills assessment development for UNICEF and The World Bank