Yet again, a TASO report has concluded there is limited sector evidence of causal impact regarding interventions which aim to close equality gaps in higher education. Since 2020, a variety of TASO reports have reached similar conclusions and recommended more and better evaluation across the sector. This is echoed by the Office for Students (OfS), which is about to set expectations for an increase in the “quality and volume” of evaluation within institutions as part of updated Access and Participation Plans.

This perceived lack of progress on gathering evaluative evidence could lead to various conclusions. A further conclusion is that the sector is stagnating, grappling with issues in intervention design and evidence generation that have manifested into “wicked” problems – open-ended and with no obvious solution. I believe the positivist focus on collating causal evidence is too restrictive – but with colleagues, I have written about this before.

I also believe that the sector has not been sufficiently supported to develop evaluation confidence, which is a crucial step in the development of effective practices. Continuous calls to “evaluate causally” and “increase quality and volume” may only contribute to the problem. No-one learns by just being told to “do more” and “do it better”.

In 2021, a team from Sheffield Hallam University published a similar recommendation in a review of evidence of demonstrable impact on access, retention, attainment, and progression – the sector needed better evaluation and evidence of impact on long-term student outcomes. With further support from AdvanceHE, we developed a pedagogically informed resource to build confidence with intervention design and evaluation (#ChangeBusters, Theory of Change Game). This resource is now being used to develop evaluation confidence across the HE sector.

I have been involved in other work which has embedded evaluation capacity building. During a 3-year independent evaluation of the Scottish Enhancement Themes, we built in capacity building for developing confidence and skills as part of the commission, and designed a Universal Evaluation Framework, accessible for all, to support better evaluation design and reporting in the future. In a participatory evaluation of the PGCert/MA Student Engagement at the University of Winchester, we provided evaluation capacity building for participants (students) to enable meaningful evaluation co-design. With a similar ethos, the Evaluation Collective committed to supporting HE practitioners with evaluation, recognising varying levels of confidence, experience, skills, and knowledge.