
Q&A: ‘what works’ in education, with Bob Lingard, Jessica Gerrard, Adrian Piccoli, Rob Randall and Glenn Savage (chair)

See the full video here

Evidence, expertise and influence are increasingly contested in the making of Australian schooling policy.

More than ever, policy makers, researchers and practitioners are being asked to defend the evidence they use, justify why the voices of some experts are given preference over others, and be critically aware of the networks of influence that determine what counts as evidence and expertise.

The release of the ‘Gonski 2.0’ report raises a number of complex questions about the use of evidence in the development of schooling policies, and the forms of expertise and influence that are increasingly dominant in shaping conversations about the trajectory of schooling reform.

The report signals an ever-increasing presence of federal government influence in shaping schooling policy in Australia’s federal system. It also strongly reflects global shifts towards a “what works” reform narrative, which frames policy decisions as only justifiable in cases where there is evidence of demonstrable impact.

Proposals such as the creation of a ‘national research and evidence institute’ by the Labor party, and related proposals by the Australian Productivity Commission to create a national ‘education evidence base’, signal a potentially new era of policy making in Australia, in which decisions are guided by new national data infrastructures and hierarchies of evidence.

These developments raise serious questions about which kinds of evidence will count (and can be counted) in emerging evidence repositories, which experts (and forms of expertise) will be able to gain most traction, how developments might change the roles of federal, state and national agencies in contributing to evidence production, and the kinds of research knowledge that will (or will not) be able to gain traction in national debates.

On November 6th, I hosted a Q&A Forum at the University of Sydney, co-sponsored by the AARE ‘Politics and Policy in Education’ Special Interest Group and the School and Teacher Education Policy Research Network at the University of Sydney.

It featured Adrian Piccoli (Director of the UNSW Gonski Institute for Education), Jessica Gerrard (senior lecturer in education, equity and politics at the University of Melbourne), Bob Lingard (Emeritus Professor at the University of Queensland and Professorial Research Fellow at the Australian Catholic University) and Rob Randall (CEO of the Australian Curriculum, Assessment and Reporting Authority).

What follows is an edited version of the event, featuring some key questions I posed to the panelists and some of their highlight responses.


Glenn: I want to start by considering the changing role and meaning of ‘evidence’ and how different forms of evidence shape conditions of possibility for education. What do you see as either the limits or possibilities of “what works” and “evidence-based” approaches to schooling reform?

Bob: It seems to me the ‘what works’ idea works with a sort of engineering conception of the relationship between evidence, research, policy making and professional practice in schools, and I think it also oversimplifies research and evidence … I would prefer a relationship between evidence (and evidences of multiple kinds) to policy and to practice which was more of an enlightenment relationship rather than an engineering one … I think policy making and professional practice are really complex practices, and I think we can only ever have evidence-informed policy and evidence-informed professional practice, I don’t think we can have evidence-based … I think ‘what works’ has an almost inert clinical construction of practice. And I think there’s an arrogant certainty.

Adrian: The problem with the ‘what works’ movement is that it lends itself, particularly at a political level, to there being a ‘silver bullet’ to education improvement and the thing you launch the silver bullet on is a press release. I’ve always said the press release is the greatest threat to good education policy because it sounds good, in the lead up to an election, to say things like ‘independent public schools work’ so fund them, or it might be a phonics check, so let’s fund this because it works, but I think it lends itself to that kind of one-dimensional approach to education policy. But education reform is an art. What makes the painting great? It’s not the blue or the yellow or the red, it’s actually the right combination of those things. Education, at a political level, people can try to boil it down to things that are too simple.

Rob: I actually think the term [what works] is a useful term. If I go back to when I first started teaching, it’s a good question, ‘what works?’ Can you give me some leads? It’s not a matter of saying ‘this is it entirely’, but we’ve got to be careful of how the language enables us and not continue to diss it.

Glenn: NSW has created its Centre for Education Statistics and Evaluation, which describes itself as Australia’s first ‘data hub’ in education, one that will tell us “what works” in schools and ensure decisions are evidence-informed. The Centre’s website tells us that NSW works with the concept of ‘an evidence hierarchy’. At the top of the hierarchy is ‘the gold standard’, which includes either ‘meta analyses’ or ‘randomised controlled trials’. To me this raises a question: how might the role of researchers be shifting now that ‘the best’ evidence is primarily based on large-scale and quantitative methods?

Jess: To me it’s a funny situation to be in when your bread and butter work is producing knowledge and evidence but you find yourself arguing against the framing and enthusiastic uptake of something like ‘evidence-based policy’. Particularly concerning is this hierarchical organisation of evidences where randomised controlled trials, statistical knowledge and other things like meta analyses are thought to be more certain, more robust, more concrete than other forms of research knowledge, such as qualitative in-depth interviews with school teachers about their experiences. The kind of knowledge that is produced through a statistical or very particular causal project becomes very narrow because it has to bracket out so many other contextual factors in order to produce ‘a certainty’ about social phenomena. We can’t rely on a medical model, where RCTs come from, for something like classroom practice, and you can see this in John Hattie’s very influential book Visible Learning. You just have to look at the Preface, where he says that he bracketed out of his study any out-of-school factor. When you think about that it becomes unsurprising that the biggest finding is that teachers have the most impact, because you’ve bracketed out all these other things that clearly have an impact … With the relationship between politics and policy, I think it’s really interesting that, politically speaking, evidence-based policy becomes very popular around some reforms, yet not around others. School autonomy is a great example: there’s no evidence to say that has a positive impact on student achievement, yet it gets rolled out. There’s no RCT on that, there’s no RCT on the funding of elite private schools, but yet we do these things. I think we can get into a trap of ‘policy-led evidence’ when political interests try to wrestle evidence for their own purposes.

Glenn: Let’s consider which ‘experts’ tend to exert the most influence in schooling. A common claim is that some groups and individuals get more of a say than others in steering debates about schooling. In other words, not everyone ‘gets a seat at the table’ when decisions are made – and even when they do, voices are not always equally heard. A frequent criticism, for example, is that certain think tanks or lobby groups, or certain powerful and well-connected individuals, are often able to exert disproportionate power and influence. Would any of you like to comment on those dynamics and the claim that it might not be an even playing field of influence?

Bob: I think ‘think tank research’ is very different from the kind of research that’s done by academics in universities. The think tank usually has a political-ideological position, it usually takes the policy problem as given rather than thinking about the construction, I think it does research and writes reports which have specific audiences in mind, one the media and two the politicians. I remember once when I did a report for a government and the minister told me my problem was that I was ‘two-handed’. I’d say ‘on the one hand this might be the case, and on the other hand…’, but what he wanted was one-handed research advice, and I think in some ways the think tanks, that’s what they do.    

Glenn: Another important dimension here is that even when one’s voice is heard, often what ‘the public’ hears is far from the full story. And I think this is where we need to consider the role of the media and the 24-hour news cycle we now inhabit. For example, so much of what we hear about ‘the evidence’ driving schooling reform is filtered through the media; but this is invariably a selective version of the evidence. Do any of you have any thoughts or reflections on this complex dynamic between the media, experts, evidence and policy?

Adrian: Good education policy is really boring, right? It’s boring for the Daily Telegraph, it’s boring for the Sydney Morning Herald, it’s boring for the ABC, Channel 7, it’s boring. You talk curriculum, you talk assessment, you talk pedagogy, I mean when was the last time you saw the ‘pedagogy’ word in a news article? … what’s exciting is ‘you know what, here’s the silver bullet’ … and the public and media and the political process doesn’t have the patience for sound evidence-based education reform.

Rob: I think we’re at risk of underestimating the capability of the profession in terms of interpreting and engaging with this. I think we’re at risk of under-estimating the broader community.

Glenn: To me, it seems there’s something peculiar in terms of how expertise about education is constructed. For example, in the medical profession, many would see the expertise as lying with the practitioners themselves, the doctors, surgeons, and so on, who “possess” the expertise and are, therefore, the experts. If education mirrored this, then surely the experts would be the teachers and school leaders – and expertise would lie in their hands? But this often seems to be far from the way expertise is talked about in schooling. Instead, it seems the experts are often the economists, statisticians and global policy entrepreneurs who have little to do with schools. Why is it that the profession itself seems to so often be obscured in debates about expertise and schooling reform?

Jess: What we see now is because education and schooling is such a politically invested enterprise, with huge money attached to it, it’s never really been wrestled from the hands of government in terms of a professional body. So, a body like AITSL, for instance, which is meant to stand in as a kind of professional body, isn’t really representative of the profession, it doesn’t have those kinds of links to teachers themselves as the medical equivalent does. So, we’re in a curious state of affairs, I think you’re right Glenn, where who counts as having expertise are often not those who are within the street level, within the profession … We don’t have enough of an opportunity to hear from teachers themselves, to have unions and teachers as part of the public discussion, and when they are a part of the discussion they’re often positioned as being argumentative or troublesome as opposed to contributing to a robust public debate about education.

Bob: As we’ve moved into the kind of economies we have, the emphasis on schooling as human capital and so on, it is those away from schooling, the economists and others, who I think have formulated the big macro policy, rather than the knowledge of the profession.

Glenn: Up to this point we’ve been mainly talking about influence in terms of specific individuals, or groups, but also I think certain policies and forms of data also exert significant influence. I need only mention the term NAPLAN in front of a group of educators to inspire a flood of conversations (and often polarised opinion) about how this particular policy and its associated data influence their work. Is it a stretch to say that these policy technologies and data infrastructures now serve as political actors in their own right? Is there a risk when we start seeing data itself as a “source of truth” beyond the politics of its creation?

Jess: I think it’s absolutely seen in that way and I think that’s the problem with the hierarchy of knowledge or evidence. There’s a presumption that these so-called higher or more stable forms of knowledge can stand above the messiness of everyday life in schools or the complexity of social and cultural phenomena … there’s no way a number can convey the complexity, but because they seem so tantalisingly certain, they then take on a life of their own.

Adrian: NAPLAN is the King Kong of education policy because it started off relatively harmless on this little island and now it’s ripping down buildings and swatting away airplanes. I mean it’s just become this dominant thing in public discourse around education.    

Rob: Let’s not get naïve about how people are using it [NAPLAN]. People use the data in a whole range of ways. It’s not that it’s good on one side and bad on the other … now if we want to, we could take the data away, or we could actually say, ‘let’s have a more complete discussion about it’ … give parents the respect they deserve, I do not accept that there’s a whole bunch of parents out there choosing schools on the basis of NAPLAN results.

Glenn: To finish tonight, I want to pose a final ‘big sky’ question. The question is: If you had the power to change one thing about how the politics of evidence, expertise or influence work in Australian schooling policy, what would that be?

Bob: I would want to give emphasis to valuing teacher professional judgment within the use of data and have that as a central element rather than having the data driving.

Adrian: I would make it a legal requirement that systems and governments have to put the interests of children ahead of the interests of adults in education policy.

Jess: I think I’m going to give a sociologist’s answer, which is to say that I think what I would want to see is greater political commitment to acknowledging the actual power that is held in the current production of data and the strategic use of that. The discussion also needs to address the ethical and political dimensions of education and schooling beyond what data can tell us.

Rob: I would like to pursue the argument about increasing the respect and nature, the acknowledgment of, and the expectation of, the profession … I think there is a whole bunch of teachers out there who do a fantastic job … given their fundamental importance to the community, to the wellbeing of this country going forward I’d be upping the ante for the respect for and expectation of teachers.


Glenn C. Savage is a senior lecturer in education policy and sociology of education at the University of Western Australia. His research focuses on education policy, politics and governance at national and global levels, with a specific interest in federalism and national schooling reform. He currently holds an Australian Research Council ‘Discovery Early Career Researcher Award’ (DECRA) for his project titled ‘National schooling reform and the reshaping of Australian federalism’ (2016–2019).

PISA-shock: how we are sold the idea our PISA rankings are shocking and the damage it is doing to schooling in Australia

When the first PISA results were released in 2001, there was a reaction in Germany that is now referred to as ‘PISA-shock’. It was likened to a tsunami-like impact where the perceived poor performance of German children compared with those in other countries participating in the international rankings dominated the news in Germany for weeks. Germans had believed they had one of the best schooling systems in the world and this first round of PISA results seriously challenged their perception. The shock led to major changes in education policy that Germany is still dealing with today.

Part of Germany’s PISA-shock was also precipitated by the fact that Finland was the outstanding performer in all the PISA tests in 2000. Historically, Finland had looked to other nations, including Germany, to learn about how schooling might be improved.

The term PISA-shock is now used widely within education circles. We would define PISA-shock as the impact of PISA results when those results are disjunctive with a nation’s self-perception of the quality of the schooling system.

We believe Australia also experienced PISA-shock in 2009 and this was subsequently compounded in 2012. Education policy changed here too as a result of PISA-shock. As with Germany, Australia is still dealing with the fallout of those changes.

In this blog post we want to look at what happened with that PISA-shock. Specifically, we want to look at how it played out politically and educationally in Australia, the role the Australian media played, and, most importantly, what Australia should be doing about its PISA-shock.

What is PISA?

The OECD’s PISA was first administered in 2000 and has run every three years since. PISA tests a sample of 15-year-olds in all participating nations on measures of reading, mathematical and scientific literacies. The number of participating nations has increased substantially since 2000, with 71 nations participating in 2015, including the 35 OECD member countries. The PISA results are reported in December of the year after the test is administered.

The test reports results on two dimensions, namely quality and equity. Quality refers to a nation’s performance on each of the tests, which usually have a mean score of 500, and documents the comparative performance against all other participating nations. Equity refers to the strength of the correlation between students’ socio-economic backgrounds and performance. Interestingly and importantly in policy terms, PISA results have shown that high performing nations tend to have more equitable schooling systems.

PISA-shock around the world

Germany’s PISA-shock had real policy impact, leading to a large number of reform measures at both national and Länder (state) levels, aimed at improving Germany’s subsequent PISA performance. We note here that Germany, like Australia, has a federal political structure and that while some of the states did well on PISA 2000, others did poorly. The aggregated German results, however, demonstrated poor comparative performance overall.

We believe the German PISA-shock in 2001 and its significant policy impact were important factors in ensuring the legitimacy and significance of the PISA testing regime.

From the time of the first PISA, more nations have participated, giving even greater significance to PISA in national policy reforms. As more nations have participated, and as PISA has continued to provoke PISA-shocks, there has been enhanced media coverage in national and metropolitan newspapers of each nation’s comparative performance.

In 2009, several cities and provinces in China participated in PISA for the first time, yet the Chinese government only allowed Shanghai’s results to be made public. We stress here that Shanghai is not representative of China, and indeed access to the results of all participating systems suggests that, at an aggregated level, China did much worse than Australia in 2009. However, it was Shanghai’s stellar performance on all the test measures that precipitated a PISA-shock in Australia.

PISA-shock in Australia

Political context

There is a specific context to Australia’s PISA-shock. Since the time of the Hawke/Keating governments, Australia has been seeking to reorient its economic policies towards Asia. There has also been much talk of the 21st century being the Asian Century, with the socio-political and economic rise of China. Australia’s response to Shanghai’s results must be seen in this context. The federal Labor government had commissioned the Ken Henry-led white paper on Australia in the Asian Century and the nation’s economic future in the region.

2009 and the beginning of our PISA-shock

There was a great deal of media coverage in 2010 in Australia of Australia’s poor and declining comparative performance on PISA 2009. We had our own ‘tsunami-like impact’ of media coverage. All major news services covered our ‘declining’ rankings and broadcasters and media commentators offered much advice as to why Australian schooling was ‘failing’.

Also contributing to this PISA-shock was the fact that four of the top-performing systems in PISA 2009 were located in East Asia (Shanghai, South Korea, Hong Kong and Singapore).

2012 and our PISA-shock deepens

Contributing further to Australia’s PISA shock was the extensive media coverage given in January 2012 to a report produced by an independent think tank, the Grattan Institute, Catching Up: Learning from the best school systems in East Asia.

The Prime Minister at the time, Julia Gillard, along with Australian and East Asian education system leaders, Andreas Schleicher from the OECD, and a number of academics, had attended a seminar convened by the Grattan Institute in late 2011 focusing on the East Asian schooling systems that had performed so well on PISA 2009. Media coverage of the Grattan report and of this meeting caused another spike in attention in January 2012, one that could fairly be described as a media ‘frenzy’ about Australia’s PISA performance.

We note that ‘research reports’ produced by think tanks like the Grattan Institute are written with a media audience in mind. They are purposefully produced to impact on politicians and policy makers, and on the broader public through the media. They utilise the genre of a high-quality media story rather than that of an academic research report. Think tank use of publicly available PISA data has real media effects.

In the front-page story in The Australian on 24 January 2012, the headline read: ‘We risk losing education race, PM warns’. In this story the then Prime Minister, Julia Gillard, was quoted as saying:

Four of the top five performing school systems in the world are in our region and they are getting better and better … On average, kids at 15 in those nations are six months ahead of Australian kids at 15 and they are a year in front of the OECD mean … If we are talking about today’s children – tomorrow’s workers – I want them to be workers in a high-skill, high-wage economy where we are still leading the world. I don’t want them to be workers in an economy where we are kind of the runt of the litter in our region.

Flawed use of mean scores

When framing counts and comparisons, the press frequently utilised mean scores to rank participating countries as evidence of performance. In reading, Australia went from a mean score of 528 in 2000 to 512 in 2012, a drop of sixteen points, with a drop of seven points in scientific literacy, from 528 in 2000 to 521 in 2012. The worst change was in mathematical literacy, where the country fell 29 points, from a mean score of 533 in 2000 to 504 in 2012. This enabled the media to dramatise the results as a downward trend (with visuals such as graphs) and provided greater opportunity for sensationalism. Using mean scores and country ranks, for example, Australia’s performance in mathematics shows a downward trend, with a significant decline starting in 2003 and a fall out of the top 10 by 2006.

We would suggest that discussions of a country’s performance based solely on mean scores and averages are flawed. Focusing on Australia’s mean scores hides the substantial disparities between the performance of the States and Territories. The ACT, for example, always does well, while Tasmania and the Northern Territory always do poorly. All of the league tables and visual representations that the Australian media continue relentlessly to produce are therefore flawed.

There is limited coverage of the equity measure, which shows a strengthening correlation between socio-economic background and performance, and thus a substantial socio-economic status (SES) impact on performance. Furthermore, the number of 15-year-old Australians from the bottom quartile of socio-economic background who perform in the top categories on each of the tests has declined sharply since the first test was administered in 2000.

The education fallout from PISA-shock in Australia

An upshot of this Australian PISA-shock was the Gillard government legislating, through amendments to the Australian Education Act, a target that Australia would be back in the top five on PISA by 2025.

We see this as classic ‘goal displacement’. We believe what is required is better quality and more equitable outcomes for all young Australians. That needs to be the target; it needs to be the goal of policy. Improved performance on PISA would flow from policy interventions aimed at achieving that goal. What we need is redistributive and targeted funding, along with research-informed interventions for classroom and school change.

Following Shanghai’s stellar performance on PISA 2009 and the extensive media coverage of Australia’s declining comparative performance, Australia joined the nations that have responded very seriously, in political and policy terms, to PISA-shocks.

Very different results if we go back to the original set of countries

However, we point out that the results look very different if we go back to the original set of countries that participated in PISA in 2000 and compare Australia’s results against that particular set.

Only 43 nations participated in PISA 2000; however, the number of participating countries has grown substantially since then, with 65 nations participating in 2012 and 71 in 2015. Many of the additional countries are East Asian nations with Confucian traditions. Four of the top five ranks in 2009 were held by Australia’s East Asian neighbours.

These increases in the number of participating countries are rarely acknowledged in the press when discussing Australia’s position in global rankings. But this is a fundamental piece of information. Simple mathematics suggests that when the number of participants grows, a country’s rank is likely to fall, irrespective of any change in its performance.

Furthermore, it is probably only statistically reliable to compare longitudinal changes in performance across the years when one of the test domains was the major focus (e.g. science in 2006 and 2015). This is neglected in media coverage.

We conducted a subsequent analysis of Australia’s PISA rank using only those countries that participated in all five test years (2000–2012). Only 32 countries participated in PISA in every year, with data available across the three literacies. Our analysis illustrates the arbitrariness of ranking countries on mean scores without taking into account the increase in the number of participating countries over the years.

In each of the literacies, Australia is ranked higher in 2009 and 2012 when analysed against the 32 countries than when compared with all the participating countries of a particular year, making the changes in position less dramatic. In mathematics, for example, Australia is placed 12th rather than 19th; in reading, 9th rather than 13th; and in science, 10th rather than 16th.

Our comparisons, like those of the newspapers, were conducted longitudinally across independent data sets (year of test). The difference was that we held the number of participating countries constant, thereby eliminating this source of variance and producing a very different set of rankings.
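The re-ranking effect described above can be sketched in a few lines of code. The scores and country names below are invented purely for illustration (they are not actual PISA results); the point is simply that a country’s rank can fall when high-scoring newcomers join the comparison pool, even though its own mean score is unchanged, and that re-ranking against a fixed pool removes that artefact.

```python
# Hypothetical sketch: a country's rank falls when more (high-scoring)
# participants join, even if its own score is stable.
# All scores below are invented for illustration only.

def rank(scores, country):
    """1-based rank of `country` when scores are sorted high to low."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return ordered.index(country) + 1

# Original (small) pool of participants
pool_2000 = {"Australia": 528, "CountryA": 540, "CountryB": 510, "CountryC": 495}

# Later round: Australia's score unchanged, but new high performers have joined
pool_2012 = dict(pool_2000, NewEntrant1=556, NewEntrant2=545, NewEntrant3=538)

print(rank(pool_2000, "Australia"))  # 2
print(rank(pool_2012, "Australia"))  # 5 - rank falls purely because the pool grew

# Restricting the later round to the original participant set
restricted = {c: pool_2012[c] for c in pool_2000}
print(rank(restricted, "Australia"))  # 2 - the "decline" in rank disappears
```

This is the logic of holding the comparison set constant across test years: any remaining movement in rank then reflects changes in relative performance, not changes in who sat the test.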

Importance of sociocultural and socio-political differences

Except for Finland, all the other countries in the top five on the 2009 and 2012 PISA are Asian. Each of these nations differs significantly from Australia in sociocultural and socio-political terms, yet they are still identified as reference societies for Australian educational reforms. A nation’s position as a reference society, it seems, is no longer conditioned and legitimated by similarities of society and schooling system (as with, say, the UK in the past), but by its placement in the global PISA rankings.

Media constructions also emphasise policy, rather than structural inequality explanations of national performance. While the Australian press did not stop referencing Finland, coverage also included Asian nations, especially Shanghai in 2009. The Australian reported, ‘Shanghai, which joined the international testing movement in 2009 and ousted Finland from the top spot it had occupied for almost 10 years’ with the Sydney Morning Herald adding, ‘Australian policy makers could learn much from China’. The Grattan Institute Report (mentioned above) sought to draw on the high performing East Asian nations to make policy suggestions for Australia.

Major cultural, demographic and political differences between Finland and Australia, and between Shanghai and Australia (with Shanghai erroneously treated as representative of all of China), did not prevent media constructions of Shanghai as a suitable reference system for Australian schooling.

Talking about ‘Australian’ performance hides the large disparities within Australia

The media speak of Australia’s performance far more than they speak of, say, New South Wales’ or Western Australia’s performance on PISA. This approach hides quite large disparities in performance across the various state schooling systems. Yet Australia oversamples on PISA (more of our children sit the tests than is required) precisely so that the results can be disaggregated to school-system level (other countries, such as the US, do not do this). The media rarely acknowledge these disparities in their PISA reporting.

In the analyses of PISA 2012, Western Australia and the Australian Capital Territory did very well, while the Northern Territory and Tasmania performed comparatively poorly. This went largely unreported; what we saw instead was the media’s fixation on national average scores and international league tables.

What we should be doing with PISA results

As suggested above, PISA provides important data for policy makers on the quality and equity of schooling systems. As we have already noted, the media fail to report the increasing inequities in Australian schooling. There is a deafening media silence about this situation; indeed, almost no media coverage of equity in respect of PISA.

The PISA test is administered every three years (beginning in 2000), with the results for each round released in December of the following year. In the year after the results are published, the OECD releases very detailed secondary analyses of the PISA data, with these reports usually running to about 1,200 pages.

While there is always a huge media frenzy over the initial release of the international rankings, there is seldom any coverage of these subsequent detailed reports. In our view, it is these analyses that should inform policy makers and, indeed, the Australian people.

The PISA-shock style of media coverage has huge policy effects: governments make decisions with lasting fallout for our education systems as a result of this coverage. However, the deep inequities of performance based on socio-economic background that show up in the detailed PISA results, and the differences between the jurisdictional schooling systems, are where the media should be shining the spotlight. This is where the real story of what is happening in school education in Australia can be uncovered, and where policy makers should be searching for policy-changing data.

There is a pressing need for the inequities uncovered by PISA testing to be addressed by federal and state governments, through both funding and policy. We think these inequities are intertwined with broader structural inequalities and historical legacies, which also need to be addressed by a range of new social policies.

As with all tests, PISA should be used for the purposes for which it was constructed: to help policy makers make informed decisions about schooling to ensure we have high quality and equitable schooling systems.

Full report: Counting and comparing school performance: an analysis of media coverage of PISA in Australia, 2000–2014

 

Aspa Baroutsis is a senior research fellow in the Faculty of Education at Queensland University of Technology. She is currently working on the Learning to write in the early years project (ARC DP150101240). Her research interests include media, policy, social justice, science education, digital technologies and literacies.

Bob Lingard is a Professorial Research Fellow in the School of Education at The University of Queensland, where he researches in the sociology of education. His most recent books include Globalizing Educational Accountabilities (Routledge, 2016), co-authored with Wayne Martino, Goli Rezai-Rashti and Sam Sellar; National Testing in Schools (Routledge, 2016), the first book in the AARE series Local/Global Issues in Education, co-edited with Greg Thompson and Sam Sellar; and The Handbook of Global Education Policy (Wiley, 2016), co-edited with Karen Mundy, Andy Green and Antonio Verger. Bob is a Fellow of the Australian Academy of Social Sciences and co-editor of the journal Discourse: Studies in the Cultural Politics of Education. You can follow him on Twitter @boblingard86

The creeping commercialisation of public schools

The privatisation of public education is attracting a lot of attention around the world, but what is happening within public schooling is flying under the radar. The increase in commercialisation within public schooling, both in Australia and internationally, attracts far less scrutiny. Commercialisation is the creation, marketing and sale of education goods and services to schools by private providers.

With commercialisation, private providers work with and within public schools to support schooling processes. They do not take over the delivery and running of schools in the way privatised school models do, such as low-fee for-profit schools, some charter schools in the US, academies in the UK or free schools in Sweden.

In the commercialised school, public monies intended for public schooling are being used to fund the operation of commercial businesses. However, the scope of commercial activities in schools remains largely invisible to taxpayers, as commercialisation has crept into schools as a seemingly necessary way to deliver education in the 21st century.

On this point, it is worth noting that commercialisation has a long (and relatively uncontroversial) history in schools, beginning with commercially produced textbooks in the early 20th century. Similarly, schools have tended to involve the private sector in transportation, food supply, and specialised instruction and facilities. Since the 1990s, however, many educators have become interested in, and concerned about, the scale and scope of commercialisation.

The increasing economy of standardisation

In Australia, for example, the creation of a national system of schooling (e.g. the Australian Curriculum, NAPLAN, a national funding approach) has created an economy of scale that is attractive to businesses, which now have the opportunity to become major suppliers to school systems in local education markets. Commercial providers can exploit increasing standardisation to offer ready-made ‘solutions’ to the various education ‘problems’ schools face in improving student outcomes at scale, meaning they can develop a product and sell it nationally.

These services complement and supplement basic education provision, often in a context where bureaucratic or central support is being withdrawn. They include curriculum content, assessment services, data infrastructures, digital learning, remedial instruction, professional development for staff and school administration support.

It’s not all bad

Not all aspects of schooling have become commercialised. Many teachers are doing what they have always done, going about their business without engaging in commercialisation. However, there are particular services that are considered useful, even necessary, for teachers to do their jobs effectively.

Our recent research on the extent of commercialisation in Australian public schooling, commissioned by the New South Wales Teachers Federation, the largest teachers’ union in Australia, surveyed AEU members and found that 40% of participants regarded resources and curriculum materials that supported their development of innovative learning experiences as important. Indeed, 28% of teachers reported regularly using commercial lesson plans.

Similarly, many participants argued that ICT solutions, including attendance and timetabling software as well as programs that assist in recording, summarising and reporting student assessment, were necessary purchases from the private sector, particularly because teachers, school leaders and even education departments do not have the skills or expertise to develop these services and programs themselves.

But commercial providers should not influence decision-making or de-professionalise our teachers

Those responses that argued for some level of commercialisation in public schools tended to offer a caveat for commercial assistance, suggesting commercial providers should not be able to influence school, state or national decisions about curriculum, pedagogy or assessment.

What teachers and school leaders did express concern about was the idea that increasing commercialisation would intensify the de-professionalisation of teaching. For example, some respondents referenced their unease with the outsourcing phenomenon in schools, particularly in Health and Physical Education (HPE): rather than employing a specialist HPE teacher, schools contract an external provider to come in and deliver HPE for them. Often this results in sports coaches rather than teachers delivering these lessons. An associated concern is that these providers are not four-year, university-trained teachers and are far from experts in curriculum, pedagogy and assessment. Ultimately, this jeopardises the academic value placed on subjects like HPE.

Transferring of costs to parents

Others expressed concern about how the costs of commercial programs were being transferred to parents. For example, one participant observed that at their school parents are asked to pay for their child’s subscription to online learning programs, and if they were unwilling or unable to pay, their child would not be able to use the program while all other students could.

Given our research is exploratory, we do not know how common this practice is, but it is certainly cause for concern in a public education system that has historically been considered free and based on principles of social democratic equality.

‘Free’ public schooling in jeopardy

Interestingly, it was this traditional, social democratic view of public education that many teachers argued was being jeopardised by increasing commercialisation: 72% of respondents were significantly concerned that schools were being run like businesses, and 68% were significantly concerned that schools would be increasingly privatised and commercialised, following the path of reform in the US or even in Australia’s own VET sector. Respondents to the open-ended survey question called on governments and education departments to learn from these failed models and implement stricter regulation of the role of commercial providers in schools.

We need to learn more and do more about commercialisation in public schooling

It must be stressed that this survey was an exploratory study, the first research of its kind in Australia, and its limitations mean that causal conclusions should not be drawn from it. We are only beginning to map this phenomenon in Australia, and further research is needed to understand the affordances of commercialisation, because some commercialisation in schools is inevitable. We also need to consider the point at which commercialisation has detrimental effects on the rationale for public schooling.

It is clear we need a strong and informed system to help regulate commercial activities in public schools and ensure that we are putting student interests before profits.

 

Anna Hogan is a lecturer in the School of Human Movement and Nutrition Sciences at the University of Queensland. Anna has been researching the commercialisation and privatisation of education policy and practice. She is currently working on projects that investigate the commercialisation of Australian public schooling, global for-profit models of schooling, the effects of curriculum outsourcing on teachers’ work and the commercialisation of student health and wellbeing. Anna has recent publications in the Australian Educational Researcher, Journal of Education Policy, Discourse: Studies in the Cultural Politics of Education, and Critical Studies in Education.

‘School funding on a budget’ paper (a justification for dumping Gonski) is nonsense, here’s why

The Centre for Independent Studies is an Australian free market think tank that produced a policy discussion paper School funding on a budget in the lead up to the first Coalition federal budget in April, 2014. The paper was part of the think tank’s campaign to get the Australian Government to reduce spending.

The paper got substantial media coverage in the lead-up to the Abbott/Hockey budget that reduced funding to states and territories by $80 billion. This is significant because the paper provided a justification for the government’s failure to implement Gonski and helped push an agenda for further privatisation of Australian schooling.

As these policies have implications for all Australians I decided to have a closer look at what the paper said, how it was said and the evidence used to justify its stance.

School Funding on a Budget (SFoB)

SFoB is an exemplar of the think tank report genre. It is written in plain language by its author, Jennifer Buckingham, and purports to be a research report, a claim effected through academic accoutrements such as tables, footnotes and appendices. It has the user in mind and is ready-made for mainstream media. Even the campaign name, TARGET30 (all in upper case), has the feel of an advertising slogan about it. At 27 pages, the paper is long enough for policy makers, politicians and journalists to take seriously, but not so long as to put them off reading it.

There are eight tables, nine figures, 60 footnotes and two appendices. The footnotes cross-reference other CIS reports and those of other think tanks, the work of a conservative free-choice US foundation that promotes school vouchers, and the reports of consultancy firms such as PricewaterhouseCoopers. These references are granted equivalence with academic research by influential Australian education academics such as Steve Dinham and John Hattie, and with OECD analyses in PISA reports and Education at a Glance.

Something of the political framing is indicated in the section of the report outlining why government spending on schools must be reviewed: the supposed need to reduce government expenditure and to enhance ‘productivity’ in schooling. This is an economistic construction of teachers’ work and achievements.

The argument goes that increased funding does not result in better outcomes for students, therefore funding should be linked to improvements in outcomes, and to do this, tight controls should be in place.

Please go to my full paper (link at the end of this blog) for more detail about my thoughts on all of this.

Here, I want to take serious issue with the argument categorically stated in SFoB that there is no relationship between increased funding and improved student outcomes.

The OECD demonstrates quite unequivocally that schooling systems with the most equitable funding approaches are those that achieve the best PISA outcomes, that is, higher quality and more equitable outcomes. Beyond a threshold level of funding, what matters is the equitable targeting of additional funds. This is the Gonski approach to school funding: targeting those most in need. It is also important to acknowledge that since the first Programme for International Student Assessment (PISA) in 2000, the strength of the correlation between socio-economic background and performance across the 34 member countries has grown.

Additionally, across this period there has been a decline in the percentage of resilient students in all OECD countries, that is, students in the bottom socio-economic quartile who perform in the top two categories (poor students who achieve top results). For Australia, this figure was 8% in 2003 and 6% in 2012. This is the period of the school reforms supported by the CIS and that SFoB seeks to strengthen even further. It is also a period of growing inequality, a reality not acknowledged in SFoB.

SFoB speaks of the decline in Australia’s PISA performance. On one reading this is correct. However, the performance of the different states and territories is aggregated to depict a crisis, and the substantial differences between school systems are hidden. Broken down, we see a vastly different and more informative picture: Western Australia and the ACT, for example, perform very well indeed, while Queensland sits at the overall average for both quality and equity. It is the very poor comparative performance of the Northern Territory and Tasmania that contributes most to the picture of declining Australian PISA performance.

This evidence reflects structural poverty in the Northern Territory’s remote Indigenous communities, and Tasmania’s high levels of youth unemployment and lower overall socio-economic profile. I would stress that the pressing educational policy issue in contemporary Australian schooling is equity. SFoB passes over this in silence.

The report gives eight ‘tips’ for reducing government expenditure on schools

Here are the ‘tips’ and my responses. (The use of ‘tip’ rather than ‘recommendation’ is, I suggest, another way this report seeks media attention.)

Tip 1 is to ‘revise the federal government funding model’

The Coalition government has been forced to do this, given its rejection of the Gonski model.

Tip 2 is ‘abolish the federal Department of Education’

Any remaining programs, it is suggested, could be run through other federal departments or agencies, and 90% of federal recurrent and capital funding for schools could be funded and overseen through Treasury. In assessing, and rejecting, this recommendation, we need to remember the Whitlam government’s reason for federal involvement in schooling: to ensure that all young Australians, no matter where they lived, their socio-economic background or which schools they attended, had the same educational opportunities as all others. Back then it was mainly about giving more funding to poor Catholic schools.

This was an equity framing of federal involvement in schooling and it is one still needed. This is why we need a federal department, redistributive funding as articulated by Gonski and targeted federal programs such as the Rudd/Gillard government’s National Partnerships, which were abolished by the Abbott government.

Tip 3 is ‘reduce the cost of state and territory bureaucracy’.

This amounts to reducing ‘out of school’ costs relative to ‘in school’ costs, and it is linked to greater devolution of education policy and funding directly to schools. Interestingly, in research I have conducted recently with a group of schools in regional Queensland, the major criticism proffered by principals has been that they are now responsible for everything, without systemic support.

Tip 4 is to remove ‘mandatory class size minimums and eschew further class size reductions’

This is a covert criticism of the teacher unions, who have lobbied long and hard for class size reductions, and it is seen as a straightforward way to rein in expenditure. SFoB also links class size reduction to the employment of more teachers, pointing out that in most OECD countries the ‘major commitment of education expenditure is teaching staff’. Interestingly, the evidence on the relationship between class size and student achievement (and other kinds of outcomes) is equivocal, and it suggests the biggest impact is in the early years and for disadvantaged students. Class size is thus an equity issue, but it is not recognised as such by SFoB.

SFoB also argues that class size reductions have negatively affected both teacher salaries and teacher quality because of the growth in teacher numbers. Yet the evidence SFoB provides for rejecting any relationship between class size and student performance consists of think tank reports, including from the CIS itself and the Grattan Institute.

Tip 5 is ‘Education Bursaries for low-income students to use at non-government schools’

SFoB argues that such bursaries would save money, as more government money is expended on government schools than on independent schools. SFoB states, ‘Low-income students could be offered an education bursary valued above the average per student expenditure on non-government schools but below the average cost of attending a government school (say $10,000)’. This is choice at its extreme, destabilising the democratic and social justice purposes and qualities of government schooling.

Tip 6 is ‘charge high-income families to attend government schools’

The specific charge mentioned is $1000, a small amount, but one that would be the ‘thin edge of the wedge’, so to speak. The report notes there are half a million ‘students in government schools from families with a household income that might be considered high’. Reflecting the ideological bent of the CIS, this call is presented as equitable because the schools these high-earning families’ children attend receive $15,000 per student in government funding, while some low-income families pay for their children to attend non-government schools with less government support. This is a perverted reconstruction of equity.

Tip 7 is ‘reduce the oversupply of teachers by elevating entry standards to teaching degrees’

I have some sympathy for this ‘tip’. We see here the tension between state intervention and market-driven approaches to university enrolments: the ‘tip’ is represented as a question of supply and demand, and, at one level, of economic efficiency. It is interesting that the recent review of teacher education commissioned by the previous federal minister rejected such a call for minimum entrance standards to teaching degrees, while the Australian Education Union supports a minimum entrance score.

Tip 8 is ‘decentralise teacher employment and make it easier for principals to dismiss ineffective teachers’

This is also part of the devolution agenda supported by the CIS. Historically, Australian schooling systems have had centralised staffing because of the difficulty of staffing schools in remote communities and in very disadvantaged urban communities. This, too, is an issue of social justice and equity.

Conclusion

SFoB is about agenda setting and ideas for policy in the context of a down-sized state and fast policy making. It sought to use a political moment to drive an agenda that, in my opinion, would further entrench inequalities in Australian schooling.

The Abbott government appointed Professor Steven Schwartz, currently an academic advisor to the Centre for Independent Studies (CIS), to chair the Australian Curriculum, Assessment and Reporting Authority (ACARA), and Dr Jennifer Buckingham, author of the SFoB paper, to the board of the Australian Institute for Teaching and School Leadership (AITSL).

I won’t be alone in wondering if CIS will be as influential with the Turnbull government as it was with the Abbott government.

 

Bob Lingard is a Professorial Research Fellow in the School of Education at The University of Queensland, where he researches in the sociology of education. His most recent books include Globalizing Educational Accountabilities (Routledge, 2016), co-authored with Wayne Martino, Goli Rezai-Rashti and Sam Sellar; National Testing in Schools (Routledge, 2016), the first book in the AARE series Local/Global Issues in Education, co-edited with Greg Thompson and Sam Sellar; and The Handbook of Global Education Policy (Wiley, 2016), co-edited with Karen Mundy, Andy Green and Antonio Verger. Bob is a Fellow of the Australian Academy of Social Sciences and co-editor of the journal Discourse: Studies in the Cultural Politics of Education. You can follow him on Twitter @boblingard86

The full paper, Think Tanks, ‘policy experts’ and ‘ideas for’ education policy making in Australia, can be found here