Designing and trialling program evaluation processes, protocols and tools: Reframing 'evidence of impact' for democratic accountability

Year: 2019

Author: Carter, Jenni; Callaghan, Fiona; Comber, Barbara; Kerkham, Lyn

Type of paper: Abstract refereed

Abstract:
This paper reports on themes and critical questions emerging from a collaborative project designed by researchers in the School of Education, University of South Australia and curriculum consultants in Catholic Education South Australia (CESA). The intention of the project, CESA Professional Development Network Inquiry: Designing and trialling program evaluation processes, protocols and tools, is to investigate the affordances of innovative approaches to demonstrating ‘evidence of impact’ of CESA’s professional learning programs.

It is conventional wisdom that improving professional learning for educators is a crucial step in transforming schools and improving student achievement, but haunting this conventional wisdom is the question of ‘evidence of impact’. Current policy narrowly frames this question in terms of causal and measurable phenomena: input-output = measurable deliverables. In the current performative policy context, there are pressures to generate ‘objective data’, to comply with system requirements, and to pursue causal relationships between professional learning and improved student achievement. However, CESA’s Network Inquiry approach to professional learning also provides an opportunity to explore an alternative framing of ‘evidence of impact’ that privileges a range of qualitative data over benchmarks and standardised checklists to capture changes in teacher knowledge, classroom practice and student learning.

In this dissonant space where competing ideologies and conflicting expectations are at work, the consultants are engaging in ongoing collaborative efforts to understand and explain how CESA’s professional learning programs support teachers to adjust or adapt their classroom practice, or redesign their curriculum, or develop evaluative thinking that informs their ongoing teaching. Recognising the limitations of ‘evidence of impact’ in the form of feedback at the end of a professional learning session, end-of-year reflections or surveys, the consultants are grappling with the differing goals and multiple co-existing frameworks and accountability measures that govern their work.

This research encourages dialogue, reflection and deliberation as the consultants collect, share and interpret data alongside the research team. It seeks to engage them in ongoing collaborative efforts to understand and explain the effects of CESA’s professional learning programs, and to make judgments in a way that is sensitive to, and relevant for, their own contextualised settings. It is working towards an approach to generating program evaluation data that opens up critical discussion and expands the possibilities for consultants to change the narrative about what counts as ‘evidence of impact’.
