New South Wales, Day 1, Term 1. A whole-staff meeting to begin the school year. At some point after the Principal’s address, the Leader of Learning (or similar) starts their PowerPoint to go through the previous year’s HSC results subject by subject, particularly the number of Band 6s. In non-selective high schools, invariably the English, Humanities, Visual Arts, Music, PDHPE and even Maths teachers are patting themselves on the back and being lauded by the school leadership. This is in complete contrast to the responses from, and directed towards, the science teachers. Equally, when principals have their HSC performance review meetings with their own superiors, it is quite often a tricky conversation regarding the ‘performance’ of the sciences.
What is the obsession with Band 6s? Band 6s sound elite, the very best. But the fact is that a Band 4 or 5 in a difficult subject such as Physics or Chemistry may make as big a contribution – or even a bigger one – to the ATAR (Australian Tertiary Admission Rank) (more on that later) than a Band 6 in, say, Music. Yet Band 6s are the only metric made publicly available and shared with the media.
Band 6s and exam results from NESA (raw and moderated; see Raw Results and Band 6s later) are not to be confused (but are so often confused) with ATAR scores, which are calculated by the Universities Admissions Centre (UAC). The ATAR is a rank, not a mark, indicating a student’s position relative to all the students in their age group in their state. The ATAR is calculated from an aggregate of ‘scaled’ marks across a student’s courses. This is totally different from the ‘moderating’ done by NESA in NSW (see later). Importantly (and often detrimentally) for teachers, only students are measured by ATARs; teachers are measured by their Number of Band 6s. So students in subjects that scale well, such as Physics, will receive a good ATAR contribution if they perform reasonably well in that subject. But ‘reasonably well’ – say a high Band 4 to Band 5 – doesn’t cut it for teachers measured by their Number of Band 6s. It is also interesting to note that last year a Band 1 (a fail!) in Physics could rank higher than a Band 4 in Visual Arts and a Band 3 in Legal Studies, Ancient History, Business Studies and PDHPE.
Last millennium in the UK, I was fortunate enough to have two thirds of my A-Level Physics class attain ‘A’ grades in a non-selective government school. In NSW, it took me four years to achieve just one Band 6 across two non-selective schools. For the past few years, in my work supporting schools of all sectors in the sciences, I regularly spend much of Term 1 advocating for science teachers, coordinators and principals who are feeling the heat for “poor Band 6 results”. I constantly witness (and have suffered first-hand) the negative impacts of these judgements on teacher and principal morale and well-being. Student well-being is also being detrimentally impacted by similar unfair judgements and subject comparisons in Year 11 and HSC Trial (raw) exam results.
And what is the cause of all of this anxiety? The completely flawed metric of comparison that is the ‘Number of Band 6s’. So why is this overly blunt measure – which appears in all school marketing literature, on school billboards and in the Sydney Morning Herald, and has become part of the NSW vernacular – so flawed as a point of comparison? The answer is that it was never intended to be a point of comparison in the first place, particularly between subjects.
The NSW HSC is a standards-based assessment. The whole construct of NSW HSC standards-based assessment was devised by Dr John Bennett, former Chief Executive of the Office of the Board of Studies NSW (now NESA), as part of his PhD thesis. Dr Bennett’s supervisor was Professor Jim Tognolini, now Director of the Centre for Educational Measurement and Assessment at The University of Sydney, who has been a senior advisor on educational measurement issues for every state and territory education department and examination board, including NESA. The whole premise of the standards-based model is to maintain the integrity and consistency of measuring standards year on year for an individual subject, i.e. that a Band 6 in Chemistry in 2021 is comparable with a Band 6 in Chemistry in 2020. There is no mandate or mechanism to say that a Band 6 is comparable between subjects.
In this standards-based model, each subject has its own set of ‘Band Descriptors’ (now called Band Descriptions) describing typical student performance for each of Bands 2–6. These Band Descriptors were devised by the respective subject experts. They provide guidance for marking school-based assessments (although this raises an issue discussed later), but most importantly, they provide the standards against which subject-specific ‘Judge Markers’ measure student HSC examination responses. This means that the Band Descriptors for, say, PDHPE were devised by experienced PDHPE teachers and are judged annually by experienced PDHPE teachers; the same can be said for Physics, or any other subject. What this means is that the standards are different for each subject. If the Band Descriptors are different for each and every subject, and they are interpreted and judged differently in each subject, then we cannot use the ‘Number of Band 6s’ as a comparison between subjects. It stands to reason – it is simply comparing apples with oranges.
Following the release of the 2020 HSC results, in a quote in the SMH, Professor Tognolini reiterated:
“we’ve never convinced the community that a band 6 in physics was not designed to be the same as a band 6 in biology or a band 6 in chemistry”.
(This example is somewhat ironic since in the new science syllabuses every science subject has the same Band Descriptions, but the general point is being made by one of the original designers of the HSC itself). In the same article, Dr Timothy Wright, former Headmaster of Shore stated:
“it is really hard to get a Band 6 in say Chemistry and easier in say Business Studies”.
A few days prior, also in the SMH, a NESA spokesperson is quoted as saying:
“[the] number of Band 6s achieved in science courses can’t be compared with the number achieved in other courses”.
Consider the following examples comparing Band Descriptions between subjects:
Business Studies:
“demonstrates knowledge and some understanding of business functions and operations”
Biology, Chemistry, Physics, Earth & Environmental Science, Investigating Science:
“demonstrates sound knowledge and understanding of scientific concepts”
How is “some understanding” comparable to “sound understanding”? It is not.
Business Studies:
“demonstrates comprehensive knowledge and understanding of business functions and operations”
Biology, Chemistry, Physics, Earth & Environmental Science, Investigating Science:
“demonstrates an extensive knowledge and understanding of scientific concepts, including complex and abstract ideas”
Again, there is much greater rigour in the supposed equivalent Band Description in the sciences compared to the non-science.
Perhaps most telling of all is where the Band Description for a typical Band 5 student in any of the 2 unit science subjects is:
“applies knowledge and information to unfamiliar situations…”.
Applying knowledge to ‘unfamiliar’ situations doesn’t appear in any other subject apart from Mathematics (and there only for Band 6). If a student merely learns all of the content of a science, they cannot get above a Band 4 unless they can also apply their skills to unfamiliar situations, whereas this is not the case in any other subject.
Equity of Access to Band 6s?
In 2017, I attempted to publish an article in The Conversation entitled Battle of the Bands: HSC Physics and Chemistry bottom of the Band 6 charts (co-authored with my PhD supervisor and co-supervisor). The article was refined, approved by editors and ready to go, only to be pulled at the 11th hour for external reasons. The article looked at data for 25% of the State to determine the rate of access to Band 6s across all HSC subjects in high schools in NSW. What was important about our analysis was that rather than compare blunt total numbers of Band 6s (which are readily available on the NESA website), we made a ‘common-cohort comparison’, i.e. what was an individual student’s relative access to Band 6s in one subject compared with their own results across their other subjects?
The findings were staggering. Students in Physics and Chemistry (in non-selective schools) were only 26% and 27% as likely, respectively, to achieve a Band 6 as they were in the average of their other subjects. By way of comparison, students in PDHPE, Community & Family Studies and Society & Culture were twice (200%) as likely, and students in Music 1 and Design & Technology were two-and-a-half times (250%) as likely, to achieve a Band 6 as in the average of their other subjects. In extremis, this was a tenfold – one order of magnitude – difference! That is hardly equitable access to Band 6s. Our findings confirmed what science teachers have been reporting for years: even though the most able students often studied Physics and Chemistry, the relatively low numbers of Band 6s awarded across the State in total, combined with the over-representation of selective schools in these subjects, left non-selective schools fighting over scraps in terms of access to Band 6s. Even in a school where, say, the performance of Physics is well above the State average, it is still destined to be below average compared with the other subjects in the same school (by definition, some subjects – usually about half – have to be below average when compared against each other within the same school, yet we still persist with this type of in-house comparison).
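The ‘common-cohort comparison’ can be sketched in a few lines of Python. The student IDs, subjects and bands below are entirely invented for illustration; the actual study used NESA data for 25% of the State.

```python
# Hypothetical HSC results: student -> {subject: band achieved}.
# All names and bands here are made up for illustration only.
results = {
    "s1": {"Physics": 5, "English Advanced": 6, "Music 1": 6},
    "s2": {"Physics": 6, "Chemistry": 5, "PDHPE": 6},
    "s3": {"Chemistry": 4, "Music 1": 6, "English Advanced": 5},
}

def relative_band6_access(results, subject):
    """Ratio of the cohort's Band 6 rate in `subject` to the same
    students' Band 6 rate across their other subjects."""
    took_subject = band6_in_subject = 0
    other_entries = band6_in_others = 0
    for bands in results.values():
        if subject not in bands:
            continue  # the cohort is only students who took the subject
        took_subject += 1
        band6_in_subject += bands[subject] == 6
        for other, band in bands.items():
            if other != subject:
                other_entries += 1
                band6_in_others += band == 6
    rate_in_subject = band6_in_subject / took_subject
    rate_in_others = band6_in_others / other_entries
    return rate_in_subject / rate_in_others
```

On this toy data, Physics returns a ratio below 1 (students did worse there than in their other subjects) and Music 1 a ratio above 1. A ratio of 0.26 would correspond to the Physics finding above: students were only about a quarter as likely to earn a Band 6 in Physics as in the average of their other subjects.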
Gaming the System
So if you can’t compare Band 6s, and it is more difficult to get Band 6s in some subjects than others, yet schools are still being measured by their numbers of Band 6s, what can be done?
A genuine, yet morally wrong, short-term way to maximise Band 6s is to guide students away from subjects with a low frequency of Band 6s. We know this happens already with subjects like English Standard: even though many students are better suited to English Standard, many schools push them into English Advanced, where Band 6s are more frequent. If this strategy is applied to the sciences, then schools simply stop offering the sciences. This is happening already in some quarters, not least because of the compounding shortage of science teachers, let alone science-trained teachers. In the short term, this could genuinely address some of a school’s shortfall of Band 6s, but it is only a short-term solution. If a school stops offering any of the sciences, particularly the traditionally ‘rigorous’ ones such as Chemistry and Physics, then the school will ‘residualise’ as aspirational families reject it and attend elsewhere, where the full complement of sciences is offered.
Raw Results and Band 6s
Further confusion and anxiety reign, with many schools, students and parents misunderstanding raw exam results and their relationship with performance bands. In every HSC exam, a student’s raw exam mark is moderated by NESA subject by subject, based on the Judges’ interpretation of that year’s exam in line with that subject’s Band Descriptors. For example, in one particular subject, a raw HSC exam mark of 76% might be moderated up to a 90, i.e. a Band 6, and a raw exam mark of only 18% might be moderated up to a 50, i.e. a Band 2. In another subject, a raw 93% might moderate to 90 (Band 6) and a raw 52% to 50 (Band 2). However, as mentioned, many people – particularly students and parents, and sometimes school leadership – don’t understand this. They think that a raw exam mark translates directly and equally to a band, i.e. that a raw exam mark of 90+ is needed for a Band 6 in any subject. Following the example above, a Band 6-performing student in the first subject, with a raw mark of 76 in their Year 11 exams or Year 12 Trials, may incorrectly think they are only operating at Band 4, and the adults around them may think so too. A statistically more commonplace example might be a student achieving only a 46% raw mark in the first subject in a school exam and interpreting that as a fail, whereas the same score may scrape a Band 4 when moderated in the HSC. This misunderstanding can lead to undue anxiety, misplaced self-deprecation, damaged self-efficacy, students dropping the wrong subjects and, yet again, flawed comparisons between subjects.
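To picture the moderation idea (and only to picture it: NESA’s actual process rests on the Judges’ decisions each year, not on any formula), one can sketch the mapping from raw marks to reported marks as linear interpolation between the raw cut-offs set for each band. The cut-offs below are invented, chosen so that a raw 76 aligns to 90 (Band 6) and a raw 18 to 50 (Band 2), matching the first subject in the example above.

```python
def moderate(raw, raw_cutoffs):
    """Map a raw exam mark onto the reported 0-100 scale by linear
    interpolation between band cut-offs. `raw_cutoffs` lists the raw
    marks judged to be the bottoms of Bands 2-6. Purely illustrative:
    the real alignment is judgement-based, not a fixed formula."""
    reported = [0, 50, 60, 70, 80, 90, 100]  # bottoms of Bands 1-6, then max
    anchors = [0] + list(raw_cutoffs) + [100]
    for i in range(len(anchors) - 1):
        lo, hi = anchors[i], anchors[i + 1]
        if raw <= hi:
            frac = (raw - lo) / (hi - lo)
            return reported[i] + frac * (reported[i + 1] - reported[i])
    return 100.0

# Hypothetical raw bottoms of Bands 2-6 for a 'hard-marked' subject:
cutoffs = [18, 32, 46, 61, 76]
```

With these invented cut-offs, `moderate(76, cutoffs)` gives 90 (Band 6) and `moderate(46, cutoffs)` gives 70, the bottom of Band 4 – exactly the “46% raw mark that scrapes a Band 4” scenario described above.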
Where to from here with ‘Number of Band 6s’?
So comparing the ‘Number of Band 6s’ between subjects is completely untenable. Does this knowledge help principals? Right now, not in the slightest. They don’t need me telling them the pressure they are under for Band 6s. What we need is for all stakeholders to spell this out publicly and commit to no longer comparing subjects (and ideally schools) by the number of Band 6s. This has to start with the media outlets responsible for publishing the league tables that feed this statewide obsession and the very parochial NSW vernacular of ‘Number of Band 6s’. Along with the media, we need NESA (not just anonymous spokespersons), all school sectors, principals’ associations, parent bodies, teacher associations and universities to formally declare, and abide by, no longer publishing, advertising or comparing subjects (and schools) using the Number of Band 6s. Only by formally deprecating the ‘Number of Band 6s’ can we get to a point where we have “convinced the community”. As one journalist associated with this blog put it when I discussed this with them: “it’s the only metric we have. We’ve asked many times for this to change. More diverse data would stop the league tables.”
However, schools and school systems still need to measure performance. Moving beyond the Number of Band 6s should not be a problem. During these COVID times, we have finessed our metrics from the blunt, not-so-useful ‘Number of Cases’ to more pertinent measurements such as ‘Cases in the Community’. With 75,000+ students sitting HSC exams annually, there is more than enough data to measure statistically significant ‘value add’ performance for every school, subject and even teacher (if class sizes are large enough). Mathematically, this is achieved by ‘multiple regression analysis’, controlling for all other variables such as gender, socioeconomic status, school type etc. (see an example of multiple regression analysis here). Such in-house data sets are already in use: in the Department of Education with Scout, in Catholic schools with the CSNSW HSC Analysis Project, and in independent schools with various analysts. In Victoria, there has been a long-established effort to celebrate value add through the Schools that Excel lists, though what is suggested above would be far more finessed.
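As a minimal sketch of the value-add idea, the following simulates two hypothetical schools, regresses marks on background variables (just prior attainment and a socioeconomic index here; real models like those behind Scout control for many more, with proper error margins), and averages the residuals per school. All numbers, school names and effect sizes are invented.

```python
import numpy as np

rng = np.random.default_rng(42)
n_per_school = 200
true_effects = {"School A": 3.0, "School B": -2.0}  # invented value add

rows, marks, labels = [], [], []
for school, effect in true_effects.items():
    prior = rng.normal(60, 10, n_per_school)   # e.g. prior attainment
    ses = rng.normal(0, 1, n_per_school)       # socioeconomic index
    # Marks depend on background plus the school's true effect and noise
    y = 0.8 * prior + 2.0 * ses + effect + rng.normal(0, 4, n_per_school)
    rows.append(np.column_stack([prior, ses]))
    marks.append(y)
    labels += [school] * n_per_school

X = np.vstack(rows)
y = np.concatenate(marks)
labels = np.array(labels)

# Ordinary least squares on the background variables only; what the
# model cannot explain (the residual) is attributed to the school.
A = np.column_stack([np.ones(len(y)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
residuals = y - A @ coef

# Average residual per school = estimated value add (relative to the mean)
value_add = {s: residuals[labels == s].mean() for s in true_effects}
```

On this simulated data, School A’s average residual comes out well above School B’s, recovering the roughly five-mark gap that was built in, despite both schools drawing students from the same background distribution. That is the essence of a value-add comparison: it asks how students performed relative to expectation, not how many landed in a top band.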
If the expectation is there for public comparison of all schools, the data exists – if all schools are willing to share it. But there’s the catch. Given privacy laws around individual student information, and copyright over the data, no detailed whole-state comparison as suggested above, i.e. across all three sectors, can be published publicly unless everyone signs up to it, which is unlikely. But does that really matter?
Publishable Performance Metrics
Stakeholders need to decide: are they willing to share all of their data so that true ‘value add’ measurements (with error margins) can be reported fairly? Or is everyone happy to have league tables that rank schools only within subjects, taking into account all bands (the Number of Band 6s alone is no longer an option)? Or do we even need public ranking and comparison? The first option, as mentioned, is unlikely to be agreed to, and would require a level of statistical numeracy across society that doesn’t exist, as is evident in our dealing with COVID. The second option is completely achievable: schools could be ranked within individual subjects. This would eradicate the current inaccurate comparison between subjects, but would continue to perpetuate the anxiety induced by comparison between schools. So ultimately, do we even need to publicly publish relative performance metrics at all if they are essentially meaningless and harmful, or can we just keep them in-house to help monitor progress and improve the education of our individual students? Either way, we must stop comparing subjects using the Number of Band 6s.
Dr Simon Crook is director of CrookED Science, a STEM education consultancy, and Honorary Associate at the School of Physics, The University of Sydney. He works with primary and high school teachers and students around many aspects of science and STEM education, and assists the Sydney University Physics Education Research (SUPER) group with their work, including liaising with NESA regarding science syllabuses. His PhD research evaluated the impact of technology on student attainment in the sciences. Previously, Simon was a high school physics teacher for 15 years.