PISA-shock: how we are sold the idea our PISA rankings are shocking and the damage it is doing to schooling in Australia

By Aspa Baroutsis and Bob Lingard

When the first PISA results were released in 2001, the reaction in Germany was so strong that it is now referred to as ‘PISA-shock’. The impact was likened to a tsunami: the perceived poor performance of German children compared with those in other participating countries dominated the news in Germany for weeks. Germans had believed they had one of the best schooling systems in the world, and this first round of PISA results seriously challenged that perception. The shock led to major changes in education policy that Germany is still dealing with today.

Part of Germany’s PISA-shock was also precipitated by the fact that Finland was the outstanding performer in all the PISA tests in 2000. Historically, Finland had looked to other nations, including Germany, to learn about how schooling might be improved.

The term PISA-shock is now used widely within education circles. We would define PISA-shock as the impact of PISA results when those results are disjunctive with a nation’s self-perception of the quality of its schooling system.

We believe Australia also experienced PISA-shock in 2009 and this was subsequently compounded in 2012. Education policy changed here too as a result of PISA-shock. As with Germany, Australia is still dealing with the fallout of those changes.

In this blog post we want to look at what happened with that PISA-shock. Specifically, we want to look at how it played out politically and educationally in Australia, the role the Australian media played and, most importantly, what Australia should be doing about its PISA-shock.

What is PISA?

The OECD’s PISA was first administered in 2000 and then every three years. PISA tests a sample of 15 year-olds in all participating nations on measures of reading, mathematical and scientific literacies. The number of nations participating has increased substantially since 2000 with 71 nations participating in 2015, including the 35 OECD member countries. The PISA results are reported in December of the year after the test is administered.

The test reports results on two dimensions, namely quality and equity. Quality refers to a nation’s performance on each of the tests, which are scaled so that the mean score is around 500, and documents comparative performance against all other participating nations. Equity refers to the strength of the correlation between students’ socio-economic backgrounds and their performance. Interestingly, and importantly in policy terms, PISA results have shown that high-performing nations tend to have more equitable schooling systems.

PISA-shock around the world

This PISA-shock had real policy impact in Germany, leading to a large number of reform measures, at both national and Länder (state) levels, aimed at improving Germany’s subsequent PISA performance. We note here that Germany, like Australia, has a federal political structure and that while some of the states did well on PISA 2000, others did poorly. However, the aggregated German results demonstrated overall poor comparative performance.

We believe the German PISA-shock of 2001 and its significant policy impact were important factors in ensuring the legitimacy and significance of the PISA testing regime.

Since that first PISA, more nations have joined, giving PISA even greater significance in national policy reforms. As participation has grown and as PISA has continued to provoke PISA-shocks, media coverage of nations’ comparative performance in national and metropolitan newspapers has intensified.

In 2009, several cities and provinces in China participated in PISA for the first time. Yet the Chinese government intervened and allowed only Shanghai’s results to be published. We stress here that Shanghai is not representative of China; indeed, access to the results of all participating systems suggests that, at an aggregated level, China did much worse than Australia in 2009. However, it was Shanghai’s stellar performance on all the test measures that precipitated a PISA-shock in Australia.

PISA-shock in Australia

Political context

There is a specific context to Australia’s PISA-shock. Since the time of the Hawke/Keating governments, Australia has been seeking to reorient its economic policies towards Asia. There has been much talk as well of the 21st century being the Asian Century with the socio-political and economic rise of China. Australia’s response to Shanghai’s results must be seen in this context. The federal Labor government had commissioned the Henry Review on Asia and Australia’s economic future.

2009 and the beginning of our PISA-shock

There was a great deal of media coverage in 2010 in Australia of Australia’s poor and declining comparative performance on PISA 2009. We had our own ‘tsunami-like impact’ of media coverage. All major news services covered our ‘declining’ rankings and broadcasters and media commentators offered much advice as to why Australian schooling was ‘failing’.

Also contributing to this PISA-shock was the fact that four of the top-performing systems in PISA 2009 were located in East Asia (Shanghai, South Korea, Hong Kong, Singapore).

2012 and our PISA-shock deepens

Contributing further to Australia’s PISA-shock was the extensive media coverage given in January 2012 to a report produced by an independent think tank, the Grattan Institute: Catching Up: Learning from the best school systems in East Asia.

The Prime Minister at the time, Julia Gillard, along with Australian and East Asian education system leaders, Andreas Schleicher from the OECD, and a number of academics, had attended a seminar convened by the Grattan Institute in late 2011 focusing on the East Asian schooling systems that had performed so well in PISA 2009. Coverage of the Grattan report and of this meeting caused another spike in reporting in January 2012, one that could fairly be described as a media ‘frenzy’ about Australia’s PISA performance.

We note that ‘research reports’ produced by think tanks like the Grattan Institute are written with a media audience in mind. They are purposefully produced to influence politicians, policy makers and the broader public through the media. They utilise the genre of a high-quality media story rather than that of an academic research report. Think-tank usage of publicly available PISA data has real media effects.

In a front-page story in The Australian on 24 January 2012, the headline read: ‘We risk losing education race, PM warns’. In this story the then Prime Minister, Julia Gillard, was quoted as saying:

Four of the top five performing school systems in the world are in our region and they are getting better and better … On average, kids at 15 in those nations are six months ahead of Australian kids at 15 and they are a year in front of the OECD mean … If we are talking about today’s children – tomorrow’s workers – I want them to be workers in a high-skill, high-wage economy where we are still leading the world. I don’t want them to be workers in an economy where we are kind of the runt of the litter in our region.

Flawed use of mean scores

When framing counts and comparisons, the press frequently utilised mean scores to rank participating countries as evidence of performance. In reading, Australia went from a mean score of 528 in 2000 to 512 in 2012, a drop of sixteen points, and dropped seven points in scientific literacy, from 528 in 2000 to 521 in 2012. The worst change was in mathematical literacy, where the country fell 29 points, from a mean score of 533 in 2000 to 504 in 2012. This enabled dramatised media coverage of a downward trend (with visuals such as graphs) and provided greater opportunity for sensationalism. Measured by mean scores and country ranks, Australia’s performance in mathematics shows a downward trend, with a decline beginning in 2003 and Australia falling out of the top 10 by 2006.

We would suggest that discussions of a country’s performance based solely on mean scores and averages are flawed. Focusing on Australia’s mean scores hides the substantial disparities between the performance of the States and Territories. The ACT, for example, always does well, while Tasmania and the Northern Territory always do poorly. The league tables and visual representations that the Australian media relentlessly produce are therefore flawed.
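The way a single national mean can mask wide jurisdictional variation can be sketched numerically. The figures below are hypothetical, chosen only to illustrate the point, and are not actual PISA scores:

```python
# Illustrative sketch with hypothetical jurisdiction mean scores (not real
# PISA data): a respectable-looking national mean can conceal large gaps
# between jurisdictions.
state_means = {"ACT": 540, "NSW": 515, "VIC": 512, "WA": 520, "TAS": 480, "NT": 460}

# Unweighted average for simplicity; a real national figure would weight
# each jurisdiction by its enrolments.
national_mean = sum(state_means.values()) / len(state_means)

# Gap between the strongest and weakest jurisdiction.
spread = max(state_means.values()) - min(state_means.values())

print(f"national mean: {national_mean:.1f}")  # 504.5
print(f"spread: {spread} points")             # 80
```

A headline built on the single figure of 504.5 would say nothing about the 80-point gap between the strongest and weakest jurisdictions in this toy example.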

There is limited coverage of the equity measure, which shows a strengthening correlation between socio-economic background and performance and a substantial socio-economic status (SES) impact on performance. Furthermore, the number of 15-year-old Australians from the bottom quartile of socio-economic background who perform in the top categories on each of the tests has declined sharply since the first test was administered in 2000.

The education fallout from PISA-shock in Australia

An upshot of this Australian PISA-shock was the Gillard government legislating, through amendments to the Education Act, a target that Australia would be back in the top five in PISA by 2025.

We see this as classic ‘goal displacement’. We believe what is required is better quality and more equitable outcomes for all young Australians. That needs to be the target; it needs to be the goal of policy. Improved performance on PISA would flow from policy interventions aimed at achieving that goal. What we need is redistributive and targeted funding, along with research-informed interventions for classroom and school change.

Following Shanghai’s stellar performance on PISA 2009 and the extensive media coverage of Australia’s declining comparative performance, Australia joined the nations that have responded very seriously in political and policy terms to PISA-shocks.

Very different results if we go back to the original set of countries

However, we point out that the results would look very different if we went back to the original set of countries that participated in PISA in 2000 and compared Australia’s results against that particular set of countries.

Only 43 nations participated in PISA 2000; however, the number of participating countries has grown substantially since then, with 65 nations participating in 2012 and 71 in 2015. Many of the additional countries are East Asian nations with Confucian traditions. Four of the top five ranks in 2009 were held by Australia’s East Asian neighbours.

These increases in the number of participating countries are rarely acknowledged in the press when discussing Australia’s position in global rankings. But this is a fundamental piece of information. Simple mathematics would suggest that a country’s rank is likely to fall as the number of participants increases, irrespective of any change in its performance.
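The ‘simple mathematics’ here can be sketched directly. The scores and country labels below are hypothetical and purely illustrative, not real PISA results:

```python
# Illustrative sketch (hypothetical scores, not real PISA data): a country's
# rank can fall simply because new participants join, even when its own
# mean score is unchanged.

def rank(scores: dict, country: str) -> int:
    """Return the 1-based rank of `country` by mean score, highest first."""
    ordered = sorted(scores.values(), reverse=True)
    return ordered.index(scores[country]) + 1

original_cohort = {"A": 530, "B": 520, "Australia": 515, "C": 500}
print(rank(original_cohort, "Australia"))  # 3 (of 4)

# Two new high-scoring systems join; Australia's score has not changed.
expanded_cohort = {**original_cohort, "D": 540, "E": 525}
print(rank(expanded_cohort, "Australia"))  # 5 (of 6)
```

In this toy example ‘Australia’ drops from 3rd to 5th without its score moving at all, which is exactly the effect that an expanding pool of participants can have on a league-table position.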

Furthermore, longitudinal comparisons of performance are probably only statistically reliable across the years in which the same test domain was the major focus (e.g. science in 2006 and 2015). This is neglected in media coverage.

We conducted a subsequent analysis of Australia’s PISA rank using only the countries represented in all five test years (2000–2012). Only 32 countries participated in every PISA cycle with data available across the three literacies. Our analysis illustrates the arbitrariness of ranking countries by mean scores without taking into account the growth in the number of participating countries over the years.

In each of the literacies, Australia ranks higher in 2009 and 2012 when analysed against these 32 countries than when compared with all the participating countries of a particular year, making the changes in position less dramatic. In mathematics, for example, Australia is placed 12th rather than 19th; in reading, 9th rather than 13th; and in science, 10th rather than 16th.

Our comparisons, like those of the newspapers, were conducted longitudinally across independent data sets (one per test year). The difference was that we held the number of participating countries constant, thereby eliminating this source of variance and producing a very different set of rankings.

Importance of sociocultural and socio-political differences

Except for Finland, all the other countries in the top five on the 2009 and 2012 PISA are Asian. Each of these systems is significantly different from Australia in sociocultural and socio-political terms, yet they are still identified as reference societies for Australian educational reforms. A nation’s referential position, then, is no longer conditioned and legitimated by similarities of society and schooling system (as with the UK in the past), but by its placement in the global PISA rankings.

Media constructions also emphasise policy explanations of national performance rather than structural-inequality explanations. While the Australian press did not stop referencing Finland, coverage also took in Asian nations, especially Shanghai in 2009. The Australian reported that Shanghai ‘joined the international testing movement in 2009 and ousted Finland from the top spot it had occupied for almost 10 years’, with the Sydney Morning Herald adding, ‘Australian policy makers could learn much from China’. The Grattan Institute report (mentioned above) sought to draw on the high-performing East Asian nations to make policy suggestions for Australia.

Despite major cultural, demographic and political differences between Finland and Australia, and between Shanghai and Australia, and despite Shanghai erroneously being seen as representative of all of China, the media still constructed Shanghai as a suitable reference system for Australian schooling.

Talking about ‘Australian’ performance hides the large disparities within Australia

The media speak of Australia’s performance more than they speak of, say, New South Wales’ or Western Australia’s performance on PISA. This approach hides quite large disparities in performance across the various state schooling systems in Australia. Yet Australia oversamples on PISA (more children sit the tests than is required) precisely so that the results can be disaggregated to the level of individual school systems (other countries, such as the US, do not do this). The media rarely acknowledge these disparities in their PISA reporting.

In the analyses of PISA 2012, Western Australia and the Australian Capital Territory did very well, while the Northern Territory and Tasmania performed comparatively poorly. This went largely unreported; what we saw instead was the media’s fixation on national average scores and international league tables.

What we should be doing with PISA results

As suggested above, PISA provides important data for policy makers on the quality and equity of schooling systems. As we have already noted, the media fail to report the increasing inequities in Australian schooling. There is a deafening media silence about this situation; indeed, almost no media coverage of equity in respect of PISA.

The PISA test has been administered every three years since 2000, with the results for each cycle released in December of the following year. In the year after the results are published, the OECD releases very detailed secondary analyses of the PISA data, with these reports usually running to about 1,200 pages.

While there is always a huge media frenzy over the initial release of the international rankings, there is seldom any media coverage of these subsequent detailed reports. In our view, it is these analyses that should inform policy makers and indeed the Australian people.

The PISA-shock style of media coverage has huge policy effects: governments make decisions with lasting consequences for our education systems as a result of it. However, the deep inequities of performance based on socio-economic background that show up in the detailed PISA results, and the differences between the jurisdictional schooling systems, are where the media should be shining the spotlight. This is where the real story of what is happening in Australian school education can be uncovered, and where policy makers should be searching for policy-changing data.

There is a pressing need for the inequities uncovered by PISA testing to be addressed by federal and state governments, through both funding and policy. We think these inequities are intertwined with broader structural inequalities and historical legacies, which also need to be addressed by a range of new social policies.

As with all tests, PISA should be used for the purposes for which it was constructed: to help policy makers make informed decisions about schooling so that we have high-quality and equitable schooling systems.

Full report: Counting and comparing school performance: an analysis of media coverage of PISA in Australia, 2000–2014


Aspa Baroutsis is a senior research fellow in the Faculty of Education at Queensland University of Technology. She is currently working on the Learning to write in the early years project (ARC DP150101240). Her research interests include media, policy, social justice, science education, digital technologies and literacies.



Bob Lingard is a Professorial Research Fellow in the School of Education at The University of Queensland, where he researches in the sociology of education. His most recent books include: Globalizing Educational Accountabilities (Routledge, 2016), co-authored with Wayne Martino, Goli Rezai-Rashti and Sam Sellar; National Testing in Schools (Routledge, 2016), the first book in the AARE series Local/Global Issues in Education, co-edited with Greg Thompson and Sam Sellar; and The Handbook of Global Education Policy (Wiley, 2016), co-edited with Karen Mundy, Andy Green and Antonio Verger. Bob is a Fellow of the Australian Academy of Social Sciences and Co-Editor of the journal Discourse: Studies in the Cultural Politics of Education. You can follow him on Twitter @boblingard86

7 thoughts on “PISA-shock: how we are sold the idea our PISA rankings are shocking and the damage it is doing to schooling in Australia”

  1. peter O'Brien says:

    What about questioning the validity of PISA in the first place? Didn’t Gillard also introduce NAPLAN after discussions with Joel Klein? What about the US and their destruction of public education? We are heading down the same path.

  2. Aspa Baroutsis says:

    Thanks for your comments, Peter. They are all great points that you make and potential topics for additional blogs either by Bob and me, or others. Ta,

  3. Bob Lingard says:

My position is not so much opposing PISA per se, but rather criticising its usual usage and the way media representations of Australia’s changing PISA results have such profound policy effects. Sensible usage of PISA results, including the extensive secondary analyses, would be and should be useful for policy makers. There are real lessons in the data regarding equity issues in Australian schooling. These appear to go uncommented on. I tend to agree with Thomas Piketty here that denial of numbers has never helped the most disadvantaged.
    Thanks, Peter.
I would make similar comments about NAPLAN. In my view, it is the politicisation of the test, the impact of the My School website and its census character that detract from its potential usefulness.

  4. Karen Yager says:

The other aspect of the PISA tests that is ignored is that the questions are grounded in applied learning. Only a small proportion of the Mathematics syllabuses is about applied learning. In NSW, Working Mathematically fits this but is only one aspect of the curriculum. Having met one of the lead writers for the PISA tests in Finland, it is obvious that the tests reflect their approach to curriculum. Other countries, such as British Columbia in Canada, are introducing more applied learning.

  5. Aspa Baroutsis says:

    The alignment of curriculum, pedagogy and assessment is very important. Thanks for making that point, Karen.

  6. Bob Lingard says:

    Agree. PISA is not curriculum based. Rather, it measures the application of knowledge. It is interesting in this respect, that Japan has introduced application of knowledge type questions in part B of their national test.
    Thanks for your comment.

  7. Bob Lingard says:

    In the 2018 PISA a measure of global competence has been added to the usual measures of maths, science and reading literacies. Interestingly, the Japanese Ministry has just decided to not participate in this element of PISA 2018.
