Measures of doctoral student expectations

Year: 2016

Authors: Holbrook, Allyson; Shaw, Kylie; Scevak, Jill; McInerney, Dennis; Preston, Greg; Fairbairn, Hedy

Type of paper: Abstract refereed

Abstract:
Early research on the mismatch between doctoral student expectations and experience primarily explored the experience of supervision and course choice (Hair, 2009; Moxham et al., 2012; Spaulding & Rockinson-Szapkiw, 2012), but tended to neglect the research task itself. More recent work on expectation mismatch has highlighted that students emphasise expectations about the research task (Holbrook et al., 2014) and about how the demands of the degree affect them (Juniper et al., 2012). Other authors have pointed out that candidates are often unable to accurately envisage the demands of a PhD (Kandiko & Kinchin, 2012). Holbrook et al. (2014) found that more than one third of 104 interviewees in an Australian study of doctoral student metacognition reported that doing a PhD was hard or challenging, but no more or less so than they had expected. Most, however, noted a clear mismatch. This is a potential problem, as mismatch has been linked to attrition (Gardner, 2009). One unanticipated finding was that supervision received relatively little emphasis in students' references to their expectations, suggesting that a focus on supervision to the exclusion of other factors is likely to offer a limited model for understanding barriers to progress. Where student expectations of supervision were not met, however, this negative mismatch was significantly related to low satisfaction (Holbrook et al., 2014). This is more in line with previous research that identified the importance of supervision in the student experience (Pole et al., 1997; Zhao et al., 2007; Moxham et al., 2013). How strong these relationships are remains to be tested, because there have been no validated measures of expectations that traverse the breadth of student experience. This paper reports on the piloting of a 51-item instrument developed to measure expectations. Through a number of iterations of Principal Components Analysis, the 51 items were reduced to 30, forming six scales of five items each: Community, Supervisor Support, Supervisor Control, University, Task and Future. All scales had strong reliabilities. Mean responses fell between 'Tend to Agree' and 'Agree'. Agreement with Supervisor Control was significantly stronger, and with Supervisor Support significantly weaker, than with all other scales. The scales differentiated between demographic groups on all variables except Entry Qualification, Months Enrolled and First in Family to do a PhD. Clearly, expectations data have the potential to make an important contribution to understanding more about attrition and completion.
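
The abstract does not include the analysis itself, but a minimal sketch of the kind of iterative PCA item reduction and reliability check it describes might look like the following. This is illustrative only: the simulated data, item names, component count, loading cutoff and alpha calculation are assumptions for the example, not details taken from the study.

```python
# Illustrative sketch of iterative PCA item reduction and a reliability
# check; all data and thresholds here are hypothetical.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of items forming one scale."""
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    k = items.shape[1]
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def reduce_items(responses: pd.DataFrame,
                 n_components: int = 6,
                 loading_cutoff: float = 0.40,
                 max_iterations: int = 10) -> pd.DataFrame:
    """Iteratively drop items whose strongest component loading is weak."""
    current = responses.copy()
    for _ in range(max_iterations):
        if current.shape[1] <= n_components:
            break
        scaled = StandardScaler().fit_transform(current)
        pca = PCA(n_components=n_components).fit(scaled)
        # Loadings: correlations between items and components.
        loadings = pca.components_.T * np.sqrt(pca.explained_variance_)
        max_loading = np.abs(loadings).max(axis=1)
        weak = current.columns[max_loading < loading_cutoff]
        if len(weak) == 0:
            break
        current = current.drop(columns=weak)
    return current

# Simulated 5-point Likert responses: 51 items driven by 6 latent dimensions.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 6))
noise = rng.normal(scale=0.8, size=(200, 51))
raw = latent[:, np.arange(51) % 6] + noise   # each item loads on one dimension
responses = pd.DataFrame(np.clip(np.round(3 + raw), 1, 5).astype(int),
                         columns=[f"item_{i + 1}" for i in range(51)])

retained = reduce_items(responses)
print(f"{retained.shape[1]} items retained")

# Reliability check on one simulated scale (items sharing a latent dimension).
scale_1 = responses[[f"item_{i + 1}" for i in range(0, 51, 6)]]
print(f"Cronbach's alpha for scale 1: {cronbach_alpha(scale_1):.2f}")
```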
