Responding to ‘tricky’ data: examples from a study of induction and mentoring in New Zealand early childhood services

Year: 2013

Author: Nolan, Andrea, Nuttall, Joce

Type of paper: Abstract refereed

Abstract:
In this paper we present a reflexive account of research practice when faced with unexpected, and potentially problematic, findings in commercial research. We begin by briefly describing the project, which was commissioned by the New Zealand Teachers Council to evaluate the effectiveness of the Guidelines for Induction and Mentoring and Mentor Teachers (NZTC, 2011). The three-year project is being undertaken by a team of New Zealand and Australian researchers who bring a range of research experience, methodological preferences, and ontological assumptions. We also touch on some of the realities of commercial research, including the need to negotiate the research design in a short space of time, reshape research tools to meet client needs, and remain sensitive to client expectations.
In the main part of the paper we explore what happened to our attempt to capture the dynamics of mentoring programs when we decided to draw on three potentially incongruent methodologies. To try to ‘cover all bases’ in researching such a messy social phenomenon, we brought together the conceptual resources of program logic (Frechtling, 2007), ‘third generation’ activity theory (Engeström, 1987), and statistical analysis (Connolly, 2007) to create a methodological ‘tool kit’. But a major methodological crack appeared, and ‘tricky’ data from the early childhood education surveys and case studies slipped through. As a consequence of applying our disparate methodologies and underlying epistemologies to meet client expectations, we found ourselves experiencing the puzzlement and awkwardness of unexpected data.
We share some examples to show how these data acted as a catalyst for questioning our research design. Our explorations confirmed for us the need to be mindful of the strengths and weaknesses of differing approaches, the importance of reflecting on the research process itself, and the richness to be found in questioning tensions between data sets. Ultimately, they led us to think of our experience not as a failure to apply a mixed toolkit, or a flaw arising from the conceptualisation of our research design, but rather as an ‘opportunity for knowing’ that could sharpen our methodological thinking. We conclude by reflecting on how tricky data forced us to renew our attentiveness as researchers to the political, temporal, and historical aspects of educational settings. Our aim here is not to present findings but to share the back story of a research process that has renewed our sensitivity to the parallel narratives of research practice we experience as researchers.