Measures bank: evaluation questions for STEM outreach providers

Date published: 27 April 2026

Overview

This measures bank offers STEM engagement providers example questions that can be included in, or provide inspiration for, their research and evaluation. It has been developed from EngineeringUK’s own research and evaluation. 

The resource is designed to help users:

  1. consider the outcomes of the STEM outreach they want to evaluate
  2. find survey questions relevant to their own research and evaluation

Key information

The measures bank was created in 2021 from a range of resources including The Engineering Brand Monitor and EngineeringUK’s own student and teacher evaluation surveys.

It's been updated using resources such as the Science Education Tracker 2023 and findings from EngineeringUK’s cognitive testing with young people in 2024. 

Updates to the measures bank include:

  • adding the 'age group' each question is appropriate for
  • separating STEM fields in questions
  • definitions of key terms (such as engineering, technology and science jobs)
  • using the word ‘jobs’ instead of ‘careers’ in questions for young people
  • removing outdated questions

The measures bank has example questions to ask young people, teachers, and parents. They can be included in, or provide inspiration for, STEM engagement providers’ research and evaluation. The questions come from:

  • The Engineering Brand Monitor, an annual survey of young people, their parents, and STEM secondary teachers on their knowledge, perceptions and understanding of STEM and engineering
  • teacher feedback surveys, which collect feedback from teachers whose students have taken part in EngineeringUK programmes
  • student evaluation surveys, which assess the effectiveness of EngineeringUK’s events and activities in inspiring young people to pursue STEM education and careers
  • the Science Education Tracker 2023, which tracks evidence on key indicators for science engagement, education, and career aspirations among young people in England
  • findings from EngineeringUK's cognitive testing with young people in 2024, in which we asked young people to explain their thought process while answering survey questions. This helped us assess how clear and accessible the questions were

The resource is designed to help users consider the outcomes they want to evaluate, and find relevant questions for their own programme evaluation.

It will continue to be updated periodically based on internal reviews, user feedback, and measures shared by others across the STEM community.

Who this is for

  • STEM outreach organisations

How to use the measures bank 

Download the measures bank spreadsheet, then follow the steps below.

The tabs and dropdown filters in the spreadsheet will help you consider the STEM engagement outcomes you want to evaluate. You can then find the survey questions relevant for evaluating your activities.

Before using the measures bank, consider the outcomes of your activity. Ask yourself: 

  • what's the aim of the evaluation?
  • what outcomes are you trying to evaluate? 

Outcomes refer to the changes you want to make through your STEM engagement activity that contribute towards achieving a final goal. The final goal should be ambitious, relevant to the needs of your target group and, to some extent, linked to your activity.

An example of a final goal could be, 'to achieve a diverse engineering workforce that meets the needs of the UK, now and into the future'. 

Once your final goal is explicitly stated, the next step is to work backwards: define the intermediate outcomes through which your activity progresses towards that goal. 

Given the many factors that can contribute towards a final goal (such as institutions, policies, norms and resources) it's important your outcomes are realistic and specific.

At EngineeringUK, we measure outcomes in line with the impact framework, which helps the engineering community consider how to measure the collective and long-term impact of STEM outreach.

We use the COM-B model, a framework for understanding behaviour change. It suggests that 3 key factors: 

  1. Capability
  2. Opportunity
  3. Motivation

interact to produce behaviour (B). 

The impact framework is based on the COM-B model: to make an informed decision about whether a STEM career is right for them, young people need the capability, opportunity and motivation to do so. 

To help you measure outcomes in line with the impact framework, we've categorised the questions in the measures bank spreadsheet. These are based on topics matching the sub-elements of the COM-B model, below:

Capability: Young people have the skills necessary to pursue educational and career routes into engineering

  • Hard skills: young people have the hard skills needed for engineering, such as maths and physics
  • Soft skills: young people have the soft skills needed for engineering, for example problem-solving, team-working and presenting

Opportunity: Young people know the next steps, and have access to the opportunities, to pursue routes into engineering

  • Careers information: young people know about the career opportunities in, and pathways into, engineering
  • Educational opportunities: young people have access to educational opportunities to pursue pathways into engineering, for example triple science

Motivation: Young people are motivated to imagine an engineering occupation as a possibility for them

  • Access to opportunity: young people have access to engineering employability and work readiness opportunities, such as mentoring and work placements
  • Image and application: young people have an accurate picture of what engineering entails and its vast range of applications
  • Inclusivity and attainability: young people perceive engineering to be a relatable and inclusive profession
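If you keep your own question bank in code or a database, the COM-B mapping above can be expressed as a simple lookup. This is a minimal sketch: the element and topic names come from the mapping above, but the code itself (its names and structure) is purely illustrative and not part of the measures bank:

```python
# COM-B elements mapped to the sub-element topics used in the measures
# bank spreadsheet. The names come from the mapping above; the code
# structure itself is an illustration, not part of the measures bank.
COM_B_TOPICS = {
    "Capability": ["Hard skills", "Soft skills"],
    "Opportunity": ["Careers information", "Educational opportunities"],
    "Motivation": [
        "Access to opportunity",
        "Image and application",
        "Inclusivity and attainability",
    ],
}


def element_for_topic(topic):
    """Return the COM-B element a topic belongs to, or None if unknown."""
    for element, topics in COM_B_TOPICS.items():
        if topic in topics:
            return element
    return None
```

For example, `element_for_topic("Soft skills")` returns `"Capability"`, which lets you roll survey results for individual topics up to the COM-B element they evidence.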

You may find it helpful to reflect on the outcomes of your activity to see which outcomes are most relevant for you. This can help you select questions that allow you to measure progress towards your outcome.

For example, if an outcome of a STEM engagement activity is that 'young people have the skills necessary to pursue educational and career routes into engineering', filter the questions in the 'Young person' tab by 1 of these topics: 'hard skills' or 'soft skills'.

The type and number of questions you ultimately choose to include in your survey will depend on a variety of factors. These are not only based on your outcomes, but also the way your activity is delivered, the objectives of your evaluation as well as the resources available to you.

Start by choosing the audience you want to evaluate. Ask yourself: who will be answering your survey questions?

Through the tabs on the measures bank spreadsheet, you can select from:

  • young people
  • parents
  • teachers

Choose the type of measure. Ask yourself: what aspects of your programme do you want to measure? For each audience tab, we’ve included additional filters. You can choose from:

  • impact measures which can be used to evaluate the extent to which the activity is successful in achieving its outcomes
  • process measures which can be used to collect feedback on the activity and how it was delivered
  • context measures which can be used to measure the sociodemographic characteristics of respondents. These can also be used to analyse how the impact of the activity differs for these sub-groups

Select a topic. For each audience tab in the spreadsheet you can use this filter to narrow down your search further to see questions relating to particular topics of interest for your evaluation.
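If you export a spreadsheet tab to CSV, the same three-step narrowing (audience, then measure type, then topic) can be sketched in plain Python. The column headers and sample rows below are assumptions made for illustration, not the spreadsheet's actual headers or questions:

```python
import csv
import io

# Hypothetical CSV export of a measures bank tab; the headers and
# example rows are invented for illustration only.
SAMPLE_CSV = """audience,measure_type,topic,question
Young people,Impact,Hard skills,How confident do you feel about maths?
Young people,Impact,Soft skills,How confident do you feel working in a team?
Teachers,Process,Delivery,How well organised was the activity?
"""


def filter_questions(rows, audience=None, measure_type=None, topic=None):
    """Apply the audience, measure type and topic filters in turn."""
    for row in rows:
        if audience and row["audience"] != audience:
            continue
        if measure_type and row["measure_type"] != measure_type:
            continue
        if topic and row["topic"] != topic:
            continue
        yield row["question"]


rows = list(csv.DictReader(io.StringIO(SAMPLE_CSV)))
selected = list(filter_questions(rows, audience="Young people", topic="Hard skills"))
# selected is ["How confident do you feel about maths?"]
```

Each keyword argument mirrors one of the spreadsheet's dropdown filters; leaving one as `None` leaves that filter open, just as an unfiltered column does.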


Take time to review the definitions. In the ‘glossary’ tab we've included definitions of key terms used in the measures bank. 

You can refer to the ‘updates made in 2026’ column to see the updates made to each question if helpful.

There are also links to further reading that you may find useful when developing your own evaluation.


Key learnings from 2024's cognitive testing

We interviewed 12 young people aged 10 to 14 for the cognitive testing and asked them to explain their thought process while answering survey questions. This helped us assess how young people interpret our questions, and whether the language used is clear and accessible.

We then piloted the questions at The Big Bang Fair 2024. Students completed surveys after spending at least an hour exploring the event. 

We've updated the questions for young people using these key learnings:

  1. Tailor questions to content: make sure your questions align with the specific content of your STEM outreach. This means deciding whether to include questions about engineering, technology, or science based on the main focus of your activities. Including all options will lengthen the survey and increase the burden on young people responding, so choose relevant topics carefully. For example, if a programme focuses solely on engineering, avoid technology or science questions, as they may not reflect the programme’s impact accurately
  2. Separate STEM fields in questions: address engineering, technology, and science separately in surveys. Responses differ between these fields, so it's important to be specific in your questions to gather accurate data
  3. Provide definitions of key terms: defining terms such as ‘engineering jobs’, ‘technology jobs’, and ‘science jobs’ helps young people understand what you're asking. The content of STEM outreach needs to be explicit about its definitions of STEM jobs. This helps to accurately measure changes in interest resulting from the activities, rather than relying solely on young people's pre-existing perceptions
  4. Use ‘jobs’ instead of ‘careers’: a 'job' is more meaningful, especially for younger respondents who might not be thinking about a longer-term ‘career’