Phonics screening check

The flawed thinking behind a mandatory phonics screening test

The New South Wales Government recently announced it intends to “trial an optional phonics screening test” for Year One students. This seems to follow a similar pattern to South Australia, where the test, developed in the UK, was first trialled in 2017 and is now imposed on all public schools in the state.

The idea of a mandated universal phonics screening test for public schools is opposed by the NSW Teachers Federation, but is strongly advocated by neo-liberal ‘think tanks’, ‘edu-business’ leaders, speech specialists and cognitive psychologists. The controversy surrounding the test began in England, where it has been used since 2012. As in England, advocates of the test in Australia argue it is necessary as an early diagnostic of students’ reading.

No teacher would dispute the importance of identifying students in need of early reading intervention, nor would they dispute the key role that phonics plays in decoding words. However, I strongly believe the efficacy of the test deserves to be scrutinised before it is rolled out across our most populous state, and possibly all Australian public schools.

Two questions deserve to be asked about the test’s educational value. Firstly, is it worthwhile as a universal means of assessing students’ ability in reading, especially as it will be costly to implement? Secondly, does it make sense to assess students’ competence in reading by diagnosing their use of a single decoding strategy?

Perhaps these questions can be answered by interrogating the background to the test in England and by evaluating the extent to which it has been successful.       

What is in the test?

The test, which involves two stages, consists of 40 discrete words that the student reads to their teacher. They do so by first identifying the individual letter-sound (grapho-phonic) correspondences, which they then blend (synthesise) in order to read the whole word. So what is specifically being tested is a synthetic phonics approach to reading, not a phonics approach per se. It could even be argued that calling the test a ‘phonics’ check is a misnomer, since analytic phonics is not included.

Students pass the test by correctly synthesising the letter blends in thirty-two of the forty words.  In order to preserve fidelity to the strategy and to ensure students do not rely on word recognition skills, the test includes 20 pseudo words. In the version used in England, the first 12 words are nonsense words.
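To make the pass rule concrete, here is a minimal sketch in Python of the 32-out-of-40 threshold described above. It is purely illustrative: the function name and the data format are my own assumptions, not drawn from any official marking tool.

```python
# Illustrative sketch only: a hypothetical scorer for the 32-out-of-40 pass rule
# described above. It is not based on any official marking software.

PASS_MARK = 32      # minimum number of words read correctly to pass
TOTAL_WORDS = 40    # 20 real words plus 20 pseudo words

def passes_check(per_word_marks):
    """per_word_marks: a list of booleans, one per word, True when the word
    was correctly blended and read aloud. Returns True if the student passes."""
    if len(per_word_marks) != TOTAL_WORDS:
        raise ValueError("The check presents exactly 40 words")
    return sum(per_word_marks) >= PASS_MARK

# A student who reads 31 words correctly does not pass.
print(passes_check([True] * 31 + [False] * 9))  # False
```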

The background to the phonics screening check in England

We can trace the origins of the phonics screening check in England to two influential sources: ‘The Clackmannanshire Study’ and the ‘Rose Report’. In his 2006 report on early reading, Sir Jim Rose drew heavily on a comparative study conducted by Rhona Johnston and Joyce Watson in the small Scottish county of Clackmannanshire. After comparing the reading achievements of three groups of students taught using different phonics methods, the two researchers concluded that the group taught by means of synthetic phonics achieved significantly better results than either of the two other groups, which were taught by means of analytic phonics and a mixed-methods approach. Although the study received little traction in Scotland and has subsequently been critiqued as methodologically flawed, it was warmly embraced in England, especially by Rose, who was an advocate of synthetic phonics.

The 2006 Rose Report was influential in shaping early reading pedagogy in England. From 2010, systematic synthetic phonics not only became the exclusive method of teaching early reading in English schools, it was made statutory by the newly elected Conservative-Liberal Democrat Coalition under David Cameron. The then Education Secretary, Michael Gove, and his Schools Minister, Nick Gibb, announced a match-funded scheme in which schools were required to purchase a synthetic phonics program. Included in the list of recommended programs was one owned by Gibb’s Literacy Advisor. This program is now used in 25% of English primary schools. In 2012, Gove introduced the phonics screening check for all Year One students (5-6 year olds) in England, and in 2017, Gibb toured parts of Australia promoting the test here.

To what extent has the Phonics Screening Check been successful?

In its first year, only 58% of students in England passed the test, but results have improved in subsequent years. Students who fail the test must re-sit it at the end of Year Two. By 2016, 81% of Year One students passed the test, but since then there has only been an increase of 1%.

Gibb cites this increase in scores over a six-year period as proof that the government has raised standards in reading, and advocates of the test in Australia have seized upon the data as evidence in support of their case.

At face value, the figures look impressive. However, when we compare phonics screening check results with Standard Assessment Test (the UK equivalent to NAPLAN) scores in reading for these students a year later, the results lose their shine. In 2012, 76% of Year Two students achieved the expected Standard Assessment Test level in reading, but last year only 75% achieved the same level. Clearly then, the phonics screening check is not indicative of general reading ability and does not serve as a purposeful diagnostic measure of reading.

In a recent survey of the usefulness of the phonics screening check in England, 98% of teachers said it did not tell them anything they did not already know about their students’ reading abilities. Following the first year of the test in 2012, when only 58% of students achieved the pass mark, teachers explained that it was their better readers who were failing the test. Although these students were successfully making the letter-sound correspondences in the nonsense words, in the blending phase they were reading real words that were similar in visual appearance to the pseudo words.

The conclusion is that authentic reading combines decoding with meaning.

Furthermore, as every teacher knows, high-status tests dominate curriculum content. In this case, giving greater attention to synthetic phonics in order to get students through the test leaves less time for other reading strategies.

Whilst the systematic teaching of phonics has an important place in a teacher’s repertoire of strategies, it makes little sense to treat it as the exclusive method of teaching reading, as is the case in England. Giving it privileged status through a test does exactly that.

Perhaps this is the key reason why, in England, phonics screening check scores have improved but students’ reading abilities have not.

I don’t think Australia should be heading down the same dead-end path.

Dr. Paul Gardner is Senior Lecturer in Primary English, in the School of Education at Curtin University. Until 2014, he taught at several universities in the UK.

How the national phonics test is failing England and why it will fail Australia too

A national test of phonics skills will not improve faltering literacy standards in Australia. The test is being imported from England, where it has been in place since 2011. It has failed to improve national standards in reading in England. Instead, the phonics frenzy of testing and practising nonsense words that has accompanied the implementation of the test appears to be narrowing classroom practice and damaging literacy standards.

The test itself is ill-conceived and poorly structured. Should we wish to test the phonological awareness of our six-year-olds, this test would be inadequate.

So how did we end up even considering the test for Australian children? The process that led to this test being recommended for all Australian six-year-olds was deeply flawed and is an unfortunate example of the growing influence of ultra-conservative think tanks on educational policy.

What is the phonics screening check?

The phonics screening check is a test devised in England. It is conducted one-on-one with Year 1 students (typically aged 6). The children are presented with 40 decodable words. Twenty are pseudo words, indicated as such by an accompanying alien icon. The other 20 are real words, but ideally unknown to the students.

The rationale is that this is a test of pure phonic knowledge, not vocabulary or sight word knowledge. Students need to score 32 from 40 to pass the Check. Those who don’t pass are given intervention using a government mandated synthetic phonics program.

Why was the check recommended and who was involved?

Jennifer Buckingham, from the conservative think tank the Centre for Independent Studies, was appointed to chair the panel tasked with conducting an independent review of the need for Year 1 Literacy and Numeracy checks.

Dr Buckingham was a public advocate for England’s Phonics Screening Check before she was appointed to head the review and write the report. And she continued to publicly advocate for the Check whilst conducting the review, and before the review’s final report was released. So the report’s findings were not surprising.

What was surprising was the report’s lack of reference to any of the peer reviewed research studies that have been conducted on the Phonics Screening Check since its introduction in England.

A review of that research finds little value in the Phonics Screening Check.

The phonics check is not helping England; in fact England is going backwards

The Check is not improving reading comprehension scores in England. This year’s literacy test results are disturbing.

Scores on the phonics screening check do not correlate with scores in reading comprehension tests as measured by England’s national SAT reading tests in Year 2 and Year 6.

In 2016, 91% of Year 1 students passed the Phonics Screening Check. This was lauded as evidence the Check was working because it was forcing teachers to focus on phonics, and therefore students were passing the Check at higher rates than ever before.

In 2017 these ‘successful’ phonics-ready students sat their Year 2 Key Stage 1 reading comprehension test. To pass this reading comprehension test, children only had to score 25 from 40 questions. However, only 76% passed. And only 61% of low SES students passed the test.

It appears then that being poor has more to do with your reading comprehension achievement than knowing your sounds.

It also seems the phonics check hasn’t solved the gender puzzle in reading achievement, as girls consistently outperform boys on both the phonics check (by 7 percentage points in 2017) and the reading comprehension tests (by 9 percentage points in 2017).

Again in 2017, Year 6 children sat the Key Stage 2 reading comprehension test. These are children who sat the Phonics Screening Check in 2011. Those who didn’t pass were placed in synthetic phonics programs mandated by England’s Department for Education until they passed the Check. Yet, this year, only 71% reached the minimum benchmark in their Year 6 reading comprehension test.

Thus, in 2017, more than 1 in 4 English children in Year 6 are not able to read with basic comprehension. The phonics inoculation they were given in their early years patently hasn’t worked, and there is trouble ahead as they move into high school. England should feel very nervous about the next round of PISA results.

The test fails to deliver on any of its claims

Buckingham’s report to the Minister describes the check as a ‘light-touch’ assessment. The research indicates that this is a problematic claim on two counts. It is too ‘light’ to identify and diagnose reading difficulties, but its prominence as a mandatory standardised assessment means its influence on literacy instruction has not been ‘light’.

As a short assessment, it covers only a limited range of phoneme/grapheme relationships, which limits its use as a phonics check. Recent research in England, which pointed this out, goes on to question the purpose and validity of the check.

As a partial assessment of only one reading skill it cannot give a diagnosis of a reading difficulty, and it can offer no direction for subsequent interventions.

Indeed the check has been found to be no more accurate than a teacher’s judgement in identifying struggling readers.

In short, the check doesn’t tell teachers anything they didn’t know already. And it doesn’t tell them what kind of instructional intervention their identified strugglers need.

Heavy-handed effect in England

The phonics screening check has had a very heavy-handed effect on literacy instruction in England. The UK Literacy Association claims it has failed a generation of able readers in the UK.

Students who don’t pass the check are required to re-sit the test after a year-long participation in the government-mandated synthetic phonics program. These programs relentlessly drill the children in out-of-context phonic decoding to prepare them to read the unknown or alien words in the check. The deliberate focus on these non-meaningful words has shifted the focus of literacy instruction away from meaning, despite the fact that evidence suggests the ability to read pseudo words is not a good predictor of later reading comprehension.

England now has the farcical situation where literacy time is spent teaching struggling Year 1 and Year 2 readers to decode pseudo words to pass a test.

As a consequence of the overemphasis on synthetic phonic decoding skills, other reading skills have been sidelined. The very purpose of reading, comprehension, has dropped off the instructional agenda as schools focus on ensuring their students pass the phonics screening check.

Flawed reasoning behind recommending the test 

The report provided to the Minister by the panel headed by Buckingham claims the check is required because the early reading assessments currently used in every state and territory in Australia are inadequate. The report provides a table of ‘necessary’ components of a phonics check, although it is not made clear what research was drawn upon to arrive at those components.

Notwithstanding this limitation, analysis of the English Phonics Screening Check shows it does not even meet the panel’s own requirements for a valid phonics check. Indeed, the existing Northern Territory Foundations of Early Literacy Assessment (FELA) meets more of the panel’s criteria than the proposed Check does.

The Check contains both real and pseudo words. The real words are ideally not in the children’s existing vocabulary. The rationale for the inclusion of pseudo and unfamiliar real words is to ensure the children are relying solely on their phonic knowledge rather than prior familiarity with the word. Thus the check is supposed to be a pure assessment of phonic knowledge.

It does not do that. The test itself is flawed.

Detailed analysis of the flaws in the test

An analysis of 10 of the 40 words in the 2017 English Phonics Screening Check is provided below. The analysis confirms research findings that the Check is neither a pure test of phonic knowledge, nor an accurate assessment of phonic skills.

Scoring real word decoding in the 2017 Check

To achieve a correct answer on the 20 unfamiliar real words in the check, a student must correctly read the ‘real’ word, and not use any other plausible phonic decoding for that word. This makes the Check a vocabulary test rather than a phonics test.

For example, ‘groups’ must be read so the ‘ou’ is pronounced as /oo/. If the children decode this word with the ‘ou’ pronounced as /ow/ as in ‘house’, or /u/ as in ‘tough’, they are marked wrong.

As such, the child is marked on their existing knowledge of the word and its pronunciation. The children who used other accurate phonic possibilities for the letters ‘ou’ are marked incorrect, and we are left with inaccurate information about their phonic knowledge.

Similarly ‘chum’ must be read with the ‘ch’ pronounced as /ch/ in chip, not /k/ as in Chris or /sh/ as in chef.

‘Blot’ must be decoded to rhyme with ’hot’. If the ‘o’ is pronounced as the ‘o’ in ‘so’ or ‘go’ the student is marked wrong.

The ‘oa’ in ‘goal’ must be pronounced to rhyme with ‘foal’. If the student breaks the word into go – al, using the pattern found in ‘boa’, they are marked as wrong.

These examples show the children are being marked on their vocabulary knowledge, not their ability to use phonic knowledge. They are being marked wrong, despite plausible phonic decoding, and as such we have not gathered accurate information about their phonic strengths and weaknesses.
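To see how this scoring approach conflates vocabulary knowledge with phonic knowledge, the short Python sketch below contrasts a marker keyed to a single target pronunciation with one that accepts any phonically plausible decoding. The word entries, the rough pronunciation codings and the function are hypothetical illustrations built from the examples above, not the official 2017 marking guidance.

```python
# Illustrative sketch only. The pronunciation codings are hypothetical examples
# based on the discussion above, not the official 2017 marking guide.

# Marking as described in the article: one accepted pronunciation per word.
SINGLE_TARGET = {
    "groups": {"gr-oo-ps"},            # 'ou' as /oo/ only
    "chum":   {"ch-u-m"},              # 'ch' as in 'chip' only
}

# An alternative marker that accepts any phonically plausible decoding.
PLAUSIBLE = {
    "groups": {"gr-oo-ps", "gr-ow-ps", "gr-u-ps"},   # 'ou' as in 'group', 'house', 'tough'
    "chum":   {"ch-u-m", "k-u-m", "sh-u-m"},         # 'ch' as in 'chip', 'Chris', 'chef'
}

def mark(word, attempt, scheme):
    """Return True if the attempted decoding is accepted under the scheme."""
    return attempt in scheme.get(word, set())

# A child who decodes 'groups' with 'ou' as in 'house' makes a phonically
# plausible attempt, yet fails under the single-target scheme used in the check.
print(mark("groups", "gr-ow-ps", SINGLE_TARGET))  # False
print(mark("groups", "gr-ow-ps", PLAUSIBLE))      # True
```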

Scoring pseudo word decoding in the 2017 Check

To achieve a correct score for the pseudo words, the students must decode the word using only the phonemes identified in the marking guidelines.

For example, the pseudo word ‘braits’ is only marked correct when the ‘ai’ is pronounced as /ay/ as in ‘rains’. If the child decodes the word using the /a/ in ‘plaits’, or the /e/ in ‘said’, they are marked incorrect.

Given ‘braits’ is not a real word it is unclear why only one phonological interpretation is allowable. And it is unclear what we have learned about the child’s phonological skills, given they were marked wrong when their decoding was correct.

The pseudo word ‘zued’ is only marked correct if the ‘ue’ is pronounced as /oo/ as in ’too’, ‘to’ or ‘two’. If the students use any other pronunciation of ‘ue’ as heard in ‘duet’, ‘cruel’, ‘suede’ or ‘cue’ they are marked incorrect.

The ‘ue’ pattern is assessed yet again in the pseudo word ‘splue’. Once again the only decoding effort marked as correct is the /oo/ as in ’too’, ‘to’ or ‘two’.

The ‘ue’ digraph is being tested twice in 40 words, and with only the one pronunciation marked as correct. It leaves unanswered how the ‘ue’ in ‘cue’, ‘league’, ‘duet’, ‘cruel’, and ‘suede’ might be assessed.

‘Tay’ is designated a pseudo word in the 2017 test, which I’m sure the Scots would be surprised to hear given it is the name of Scotland’s longest river. Another reason for Scottish independence perhaps?

To score correctly on this word the students must rhyme it with ‘pay’.

However, it turns out ‘tay’ is also a real word in the vocabulary of a Turkish six-year-old. It is the Turkish word for a baby horse, pronounced like the English word ‘tie’ to rhyme with ‘aye’.

It is also a real word in the vocabulary of a Vietnamese six-year-old. It is the Vietnamese word for hand, also pronounced like the English word ‘tie’ to rhyme with ‘aye’.

What information has been gained, or missed, about these children’s linguistic competence by marking their decoding as incorrect?

Scoring two syllable words in the 2017 Check

There are 36 one-syllable decodable words in the Check and 4 two-syllable words. The two-syllable words are particularly problematic when using a synthetic, left-to-right decoding method, which is the theoretical basis of the Check and of the accompanying mandated instructional interventions.

‘Model’ was one of those four two-syllable words in 2017.

If the word is decoded left to right using synthetic decoding processes we are likely to read the word as ‘mo’ to rhyme with ‘so’ and ‘del’ to rhyme with ‘hell’. As a consequence we end up with a word that sounds like the way we pronounce the word ‘modal’. If a child decodes the word in this manner, they will be marked wrong in the check.

It is necessary to have the word ‘model’ in your vocabulary to pronounce it correctly, to know which syllable takes the emphasis, and to know that the second vowel is reduced to a schwa sound.

‘Reptiles’ is also on the 2017 test. Using the left-to-right approach taught in synthetic phonics programmes, this word can be plausibly broken up as follows: rep – til – es. This would be marked wrong in the Check. Marking such an attempt as incorrect fails to take account of the phonic knowledge the student has. Consider, in contrast, if the word had been ‘similes’. If the child had broken the word into si – miles, they would have been marked as incorrect. The only way a child would know to break the words into rep – tiles, or sim – il – es, is if they already have the words in their vocabulary.

So the check fails to do even what it purports to do, that is, measure phonological processing.

England should pull the plug on this test

The Phonics Check has failed to deliver the desired improvements in reading comprehension in England. It was worth a shot, but it is time to pull the plug.

It has failed because it attends to only one early reading skill, and thus distorts reading instruction in the early years to the detriment of reading comprehension in the later years.

It has failed because the Check is faulty and ill-constructed. It is unable to successfully assess the one skill it seeks to assess, phonological processing, and as such cannot even provide accurate diagnostic information to teachers.

Facing our literacy challenges in Australia

Australia can avoid falling into the same trap. Like England, we clearly have literacy challenges in the upper years of primary and secondary school. Our NAPLAN results for Years 7 and 9 make this very evident. But these are not challenges with the basic skills of phonological decoding of simple words and nonsense stories of Pip and Nip. These are challenges with depth of vocabulary and the capacity to deal with the complex syntactic structures of written texts across the disciplines.

It is crucial the State and Territory Ministers of Education are not distracted from these real challenges by placing false hope in a Phonics Screening Check. It is time to dump the idea.

 

Misty Adoniou PhD is Associate Professor in Language, Literacy and TESL at the University of Canberra. She was a primary school teacher for 10 years before moving to Greece and teaching and consulting in the area of English Language Teaching for 7 years. She has received numerous Teaching Awards including the Vice-Chancellor’s Award for Teaching Excellence, and was the Lead Writer of the Federal Government’s Teachers’ Resource for English Additional Language/Dialect (EAL/D) learners. She sits on a number of national and international advisory boards as a literacy expert.

 

Misty Adoniou, with UK Professors Greg Brooks (a member of the Rose Report panel), Terry Wrigley and Henrietta Dombey, has contributed to a book edited by distinguished researcher Margaret Clark (OBE), published this month, outlining how England came to adopt the Phonics Screening Check and providing more detail of its impact on teaching and learning, as well as its costs.

Margaret Clark (Ed.) (2017). Reading the Evidence: Synthetic Phonics and Literacy Learning. Birmingham: Glendale Education. Available on Amazon.