Lead Institution: National Coordinating Centre for Public Engagement
This project created training resources to support HE STEM practitioners in developing their evaluation work, in particular the evaluation of STEM outreach activities. By developing their understanding and skills in evaluation, practitioners would be able to improve the effectiveness of their projects and demonstrate this to others. The project was prompted by a range of drivers: the access agreements provided by universities need to be backed by evidence of effective outreach practice; the funding spent on outreach needs to be justified in terms of effective outcomes; and many in the sector want to evidence the impact of their work.
Informed by a survey of current practice within the sector (with 161 respondents), three training courses were developed. The Beginner’s Guide to Evaluation was aimed at those with little experience of evaluation, and sought to build confidence and skills in developing an evaluation plan. The Evaluation Masterclass was for those with more evaluation experience who wanted to better understand how to make an impact with their evaluation. The Training the Trainer course sought to support trainers to run their own Beginner’s Guide to Evaluation course within their institution. Finally, three one-hour Plug and Play sessions were developed that could be integrated into conferences and other meetings. These covered: Measuring Quality in Engagement; Logic Models & Theories of Change; and Creative Ways to Evaluate.
The courses were supported through an online Ning site where participants could download all the resources and ask questions of the course trainers. You can find out more at www.publicengagement.ac.uk/evaluating-stem-outreach
The final report for the entire project can be downloaded below:
Evaluating STEM Outreach: Final Report
1. The Beginner’s Guide to Evaluation course aimed to build knowledge and confidence for HE STEM practitioners to develop their evaluation work. The evaluation of the course clearly indicated that it delivered this, and much more: it was great to hear participants reflect on their increased confidence in developing an evaluation plan for their own work.
2. The Evaluation Masterclasses provided a great opportunity for people to discuss evaluation practice in more detail. The evaluation indicated that participants really valued coming together to discuss impact and reporting, and the interactive nature of the course was much appreciated.
3. The impact agenda is encouraging more people to evidence the impact of their research, and many of the resources developed as part of this project are equally relevant to this wider audience seeking to develop their evaluation practice. This is particularly true of the Evaluation Masterclass. In addition, we have had interest from HE staff in Europe who are tackling the same questions.