Wednesday 8 November 2017

Straw men and obfuscation: My response to Misty Adoniou on the Phonics Screening Check


This week, A/Prof Misty Adoniou of the University of Canberra published a piece entitled "How the national phonics test is failing England and why it will fail Australia too" on the AARE Blog.

In her blogpost, Misty relies on teachers' poor knowledge of the structure of language, something she herself has reported in an academic publication, to construct a series of weak, straw man arguments about phonics teaching and its assessment. For the record, here is what Misty said about teachers' knowledge of language and literacy in a 2014 paper (my emphasis):

 “The consequences of a lack of content knowledge in teaching literacy can be serious, with Shulman (1986) indicating that lack of content knowledge results in narrowed and regressionist pedagogies as teachers resort to replicating own past experiences with instruction in language. In particular, to be effective in teaching children who struggle with literacy, they need a strong content knowledge of the English language (Spear-Swerling & Cheesman, 2012). Numerous accounts of beginning teachers note a lack of content knowledge about how the language works – most particularly, the basic constructs of the English language (Alderson & Hudson, 2013; Hadjioannou & Hutchinson, 2010; Moats et al., 2010; Washburn, Joshi, & Cantrell, 2011; Wong, Chong, Choy, & Lim, 2012). Spear-Swerling and Cheesman (2012) suggest that without good content knowledge in the area of literacy “teachers may provide inadvertently confusing instruction to children” (Spear-Swerling & Cheesman, 2012, p. 1692).”

So - how are teachers, with their "lack of content knowledge" to judge the veracity of the claims Misty makes in this recent blogpost? I hope my responses below are of some assistance to them in this endeavour. I will select Misty's key points (shown in red) and respond to them (in black).

(The) testing and practicing nonsense words that has accompanied the implementation of the test appears to be narrowing classroom practice and damaging literacy standards.

Assessing nonsense words (more correctly referred to as pseudo-words) is the only way of knowing that it is children’s actual decoding skills (otherwise known as phonics skills) that are at work when they read a word aloud. It is, after all, a check of phonics, not a check on reading for meaning. There are other assessments for that. As anyone with any familiarity with the Simple View of Reading will know, both skill sets must be in place in order for children to become effective readers.

It is unfortunate if some teachers are misunderstanding the function of the check so fundamentally that they are “teaching” pseudo-words – perhaps in the hope of randomly hitting on a few that will actually turn up in the check and that children will remember them? This is obviously misguided and misses the point that if a skill is in place, it can be applied across a range of conditions. That said, nonsense or pseudo-words should not be unfairly demonised. Many children’s books contain what adults might refer to as nonsense, or made-up, words, and the only way that these can be lifted off the page is through knowledge of phoneme-grapheme correspondences. Context will not help you decode “quidditch”, for example.

Is Spike Milligan’s Ning Nang Nong poem to be banned in schools because it contains nonsense words? 

Come on.

Should we wish to test the phonological awareness of our six year olds this test would be inadequate.

This is a particularly puzzling statement, as the PSC does not set out to assess phonological awareness (PA). PA and its derivative, phonemic awareness, are important predictors of reading success, but they are not what is being targeted in the PSC, in the same way that vocabulary, fluency, and comprehension are not targeted. The Phonics Screening Check has a focus on, well… phonics.

Why, you may ask, would we need a screening check on this aspect of early reading instruction? The answer to that question lies in the contested, "ugly duckling" status of phonics in the instruction toolkit in recent years, as discussed here.

The process that led to this test being recommended for all Australian six year olds was deeply flawed and is an unfortunate example of the growing influence of ultra-conservative think tanks on educational policy.

This is simply an attempt to alarm the reader early on by suggesting that the PSC is just a Machiavellian plot to bring down modern society. I was on the Year 1 Literacy and Numeracy Panel that recommended a trial of the PSC and can assure readers I have no political affiliations one way or the other. 

Politics is the smoke-screen people hide behind when science is not on their side.

Move on. Nothing to see here.

A review of that research finds little value in the Phonics Screening Check.

The “review of the research” that Misty refers to here is in fact one single study conducted by a UK body that was opposed to the introduction of the check in the UK. It reports the views of 494 respondents – hardly anything that could be said to approximate a representative sample of UK early years teachers. 

Again, nothing to see here.

In 2017 these ‘successful’ phonics-ready students sat their Year 2 Key Stage 1 reading comprehension test. To pass this reading comprehension test, children only had to score 25 from 40 questions. However, only 76% passed. And only 61% of low SES students passed the test.
It appears then that being poor has more to do with your reading comprehension achievement than knowing your sounds.
It also seems the phonics check hasn’t solved the gender puzzle in reading achievement, as girls consistently outperform boys on both the phonics check (by 7 percentage points in 2017) and the reading comprehension tests (by 9 percentage points in 2017).

Here, Misty conveniently glosses over the fact that in 2016 the Year 2 comprehension testing used in the UK changed from previous years and became more demanding, so no comparisons can be made with previous years.

I agree with Misty that being poor is a significant challenge – for students, teachers, and educational systems more widely. Much of my research in the last twenty years has focussed on students from disadvantaged backgrounds, so it is no surprise to me that a social gradient exists with respect to the knowledge and skills children bring to school with them. Eminent researchers such as Sir Michael Marmot have devoted their professional careers to trying to influence social determinants of health. Asking a 10-minute PSC to achieve this after seven years is a bit fanciful.

It’s all about baby steps.

That said, it is entirely possible, given the particular advantage that children from low-SES backgrounds derive from explicit teaching (see Snow, 2016), that low SES students are deriving a specific benefit from exposure to the PSC and the teaching that sits around it. This kind of subgroup analysis is exactly the nuanced inquiry that is needed in this space, and we will have an opportunity to ask this question if the check is employed in Australia.

And similarly, no, a 10-minute check and the teaching behind it will not counter the biopsychosocial influences that come packaged as gender. Again, it’s all about baby steps.

Again in 2017, Year 6 children sat the Key Stage 2 Reading comprehension test. These are children who sat the Phonics Screening Check in 2011. Those who didn’t pass were placed in synthetic phonics programs mandated by the English Department of Education, until they passed the Check. Yet, this year, only 71% reached the minimum benchmark in their Year 6 reading comprehension test.

What Misty fails to mention here is that this represents an increase on the previous result of 66%. I think we would call that a move in the right direction, and a result that warrants staying the course to see where the trends go over the next few years.

None of us arrived at our current rather parlous position overnight, and we won’t trade out of it overnight either. A shift from 66% to 71% represents tens of thousands of students being on stronger educational trajectories, something we all strive for every day.

As a short assessment, it assesses a limited range of phoneme/grapheme relationships, which limits its use as a phonics check.

The very nature of screening is that a full range of possibilities is not explored. To do so is to enter into diagnostic testing, which is a completely different ballpark.

I agree with Misty that a PSC should not be construed as a fail-safe early detection system for children who may go on to display reading difficulties (sometimes referred to as dyslexia). However, the fact that the results are immediately available to teachers means that red flags will be raised in some cases, and appropriate referrals will be made. Let’s not ask any more of this measure than what it can reasonably deliver.

It is a straw man, however, to say that the PSC fails at something it was not designed to do. My coffee machine doesn’t wash the dishes. It wasn’t designed to.

Misty also provides a number of examples of what she presents as flawed test items in the PSC. All measures have potential flaws, and this is where good test design, development, piloting, and refinement come in. None of the examples Misty describes constitutes a “deal-breaker” – they reflect examples where Misty’s knowledge of language could be employed to strengthen item development and assuage some anxieties about the content of the screen.

Australia can avoid falling into the same trap. Like England, we clearly have literacy challenges in the upper years of primary and secondary school. Our NAPLAN results for Year 7 and 9 make this very evident. But these are not challenges with the basic skills of phonological decoding of simple words and nonsense stories of Pip and Nip. These are challenges with depth of vocabulary and the capacity to deal with the complex syntactic structures of written texts across the disciplines.

Yes, we do have major literacy challenges but there is no data on which to base the claim that all is well with the decoding skills of struggling older students. Again, the Simple View of Reading invites a nuanced appreciation of student strengths and difficulties across the range of linguistic domains that support reading success.

It is not a question of phonics vs. non-phonics – that is an artificial distinction that is not empirically supported. It is also insulting and derogatory to refer, by implication, to decodable readers as “Nonsense Stories”. Often these stories are far more plausible and narrative-based than the repetitive, predictable scripts found in the levelled readers widely used in Australian classrooms.

The UK Literacy Association claims it has failed a generation of able readers in the UK.

Well yes, they would "claim" this, wouldn’t they, because they are opposed to the check. But Misty – repeating a broad, baseless generalisation does not transform it into a statement of fact. It is still a broad, baseless generalisation.

I know that Misty has an extensive knowledge of language and how it works and I know she spends considerable amounts of time delivering professional development to teachers to try to back-fill some of the gaps left by pre-service education that neglects to provide teachers with this foundation (see references at this link). 

Given this knowledge, and the fact that Misty claims to be "pro-phonics" instruction, it is perplexing and disappointing that she uses her position of influence to obfuscate rather than inform. 

The problem with straw men, you see, is that they can be scary and they can be used to start bush fires. 

© Pamela Snow (2017)