Friday, 27 November 2015

Stuff they don't tell you about research success


 

This week, I was asked to say a few words at the opening of a Higher Degrees by Research (HDR) festival, at the Bendigo campus of La Trobe University, where I am the Head of the Rural Health School.

This gave me pause for thought about why some people are happy, productive and "successful" in the research space, and others are not.

Here are a few pearls I've gleaned along the way in my own research career, which started relatively late, as I spent some 13 years in clinical practice before returning to study and completing a PhD. As an aside, I don't regret a single one of those years in clinical practice - they provided rich, complex experience and gifted me some precious and enduring friendships.

So, what have I learned along the way?
  1. After you've carefully selected your PhD supervisors (that's another blog-post in itself), make it your business to soak up all the mentoring and support they can offer. You'll never again be on the nursery slopes, so accept your ignorance and naiveté and drink from the font of your supervisors' wisdom. Listen carefully. It's fine to make some mistakes along the way, but you don't have to make all of them.
  2. Your research supervisors are a bit like your parents - they are expected to literally "supervise" you, to give you feedback (positive and negative), to set boundaries (you can't answer every known question on your topic) and to set timelines (it's not OK to roll your candidature over year after year like your car registration). You don't have to like your supervisors and you don't have to be their friends. (That said, I count myself as very fortunate to still be on very good terms with both of my PhD supervisors and I still publish with them both from time to time).  As with the parent-child relationship, you are expected to develop increasing independence over time.
  3. Your PhD probably won't change the world. It's actually an apprenticeship in which you're meant to be learning stuff. A lot of what you learn goes under the heading of "hidden curriculum" - how the publishing game works, how the hallowed halls of academia function, which stats program is easiest to use, and where to buy the best coffee on campus.
  4. Find out what matters in academia and do more of it. People who get ahead typically have a clear program of research, and say no to temptations to stray too far from it. They also work incredibly hard, invariably in what might otherwise be seen as their own time: evenings and weekends. "But that's bad for work-life balance" I hear you say. Well maybe it is and maybe it isn't. You'll need to decide at different points in your career how this plays out, but you need to understand that those with whom you're competing for research funds and academic posts are almost certainly working many more hours than those for which they are paid. Someone had to say it.
  5. Publishing matters, so if you're not a strong writer, you need to develop your skills, or prepare to be lost in the publication crowd.
  6. Don't be too distracted by presenting at conferences. Don't get me wrong - conferences are important for sharing data, receiving feedback, and networking with colleagues. But a conference presentation doesn't carry the same weight on your CV as a peer-reviewed publication. If you're wanting an academic career, it's the latter that is important.
  7. Think about where you're going to publish. This means considering journal Impact Factors, target audience, and the actual quality of your manuscript. Be strategic (and realistic) about the match between the size and rigour of your study and the likelihood that the Editor of Nature/The Lancet/BMJ etc will be sitting by the phone waiting to hear from you.  
  8. Learn about metrics such as h-Indices; a small worked example follows this list. Sure, they are highly reductionist and potentially even flawed. If you have an h-Index of 10, you're hardly getting credit for that amazing Cochrane Review that has been cited 280 times, as it just has to sit alongside the other nine papers that have each been cited at least 10 times. Remember too, that your work might be oft-cited because people think it's a good example of a poor methodology or shoddy practice. I wonder how many times the (subsequently retracted) Andrew Wakefield autism-MMR study was cited? Learn to love all your h-Indices equally, whether the "official" offering from Scopus, or the always higher version offered by Google Scholar (because it picks up a lot of grey literature not included by Scopus).
  9. Don't be afraid to change tack in your research career. I started off studying communication impairment and psychosocial outcome after traumatic brain injury and now have a focus on two key areas: language skills of young offenders and literacy education (as many of you would realise, there's a sad link between the two). I do some related work on young people in the state care system, but take care to always articulate clear links between my research interests.
  10. Make sure you can answer the "so what?" question about your research. If you're going to spend a good part of your work (and non-work) time consumed with a particular issue, you need to be able to explain to funding bodies why it matters. Your research should also, therefore, pass the pub test (or failing that, the grandmother test) - it needs to be able to be packaged to make sense to the tax-payer who may well be asked to fund it.
  11. Related to the "so what" question is the notion of translational impact. What are your findings going to translate into and how? Your answers to these questions should drive your dissemination strategy, covering peer-reviewed journals, reports, conference presentations, and the use of social media.
  12. Collaborate. It's been said that if you want to go fast, go alone, but if you want to go far, go together. To be honest, I think you can only go fast on your own to a point. You can sometimes quickly get certain specific tasks done on your own, but if you want to achieve significant outcomes in the research space, you need to form collaborations with others who share your focus and interest. But you don't all have to fall into the photocopier. It can be tremendously beneficial to have different paradigms, disciplines, and methodologies represented in your team - provided there are good reasons that are driven by the research agenda, not by misguided charity about finding a role for a drifter who has lost their way research-wise. Remember too, that funding bodies look at the quality and make-up of teams and this assessment can be weighted quite heavily in the overall rating of your project.
  13. Kiss a few frogs. By that, I mean cold-call people interstate and overseas whose work is cognate to yours, and share your most recent publication (in which you have hopefully cited their work). Most researchers are delighted to hear that someone far away is aware of their work and has taken the trouble to get in touch. I've formed a number of enduring international collaborations in this way and have published with at least two of them. Sometimes you won't get a response, and sometimes it will be like a lukewarm bath. That's OK, and it may not be about you - it may be because their life is complicated at the moment and your timing was unfortunate.
  14. Expect setbacks. They will probably be many and, at times, quite painful. I read a wonderful article via Twitter recently, called Me and My Shadow CV. Read it. It's a great reminder that we don't see the rejected manuscripts and grants, or the unsuccessful job applications, when we look at the profile of someone we see as an academic star. However, it's the entries on this ghost-document that provide us with valuable learnings, not to mention an extra layer or two on the rhino hide we call academic resilience.
  15. Speaking of Twitter, if you're not using this incredibly valuable platform, you almost certainly should be. I have discovered whole new professional global networks of people who are interested in things I'm interested in. Invariably these days, if I come across something new and interesting, it's via Twitter. Try, too, to follow a few people whose views you don't necessarily share - it's good to have your assumptions nudged from time to time, and to know how others think on your topic of interest (even if you're pretty positive that they're wrong).
  16. Be reliable. Successful researchers can't tolerate unnecessary weights in their saddle bags. Be known for being the person who states what they will do, commits to delivering in a timely and thorough manner, and then does so.
  17. Enjoy the ride. For all the lows and frustrations, the life of a researcher is deeply satisfying. There is great personal satisfaction in having a paper published after the long haul of funding, ethics approval, data collection and delays, multiple manuscript drafts, late-night data-wrangling, responding to appallingly misguided reviewer comments, and other setbacks of various forms. I am reminded of this when, from time to time, I receive an email out of the blue from a total stranger, telling me how they have used such-and-such a paper to change their practice or influence a policy maker. H-Indices are all well and good, but it doesn't get much better than knowing that somewhere, you've made a small difference on the ground.
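
A brief aside on the arithmetic behind point 8: the h-index is simply the largest number h such that h of your papers have each been cited at least h times. Here is a minimal sketch in Python, using entirely made-up citation counts, showing why a single heavily cited paper barely moves the index:

def h_index(citations):
    # Largest h such that h papers have at least h citations each.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical record: one review cited 280 times, nine other papers
# cited 10-15 times, and two cited rarely. The 280 citations inflate
# the total citation count, but the h-index still comes out at 10.
papers = [280, 15, 14, 13, 12, 11, 10, 10, 10, 10, 3, 1]
print(h_index(papers))  # prints 10
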
(C) Pamela Snow 2015


Wednesday, 21 October 2015

How to influence your students' brain chemistry and other handy hints

[Image: science activities for kids]


I'm not the first academic commentator to be critical of the rapid infiltration of classrooms by so-called "neuroscience". Indeed some teachers at the coalface are also sceptical of this latest education movement, though it is not always easy for them to speak out against the zeitgeist when a charismatic presenter spruiks the virtues of teaching to the "whole brain". I say "so-called neuroscience" because in the main, this is neuroscience-lite at best, and neuroflapdoodle at worst. Either way, it comes packaged in an over-simplified box and is promoted to teachers via equally over-simplified sound-bite messages.

I think it's fair to say the average classroom teacher has limited-to-zero knowledge of neuroscience. That is not a criticism of teachers, nor of teacher training. Teachers need to understand factors that promote good learning and positive behaviour. They also need to learn to apply skills in the classroom that increase the likelihood of good learning and positive behaviour occurring. Knowing something about the inner workings of the human brain may or may not be useful in these endeavours - I really don't know. The question is akin to whether I need to know about the inner workings of a carburettor in order to be a good driver. Maybe we should devote time in the pre-service teacher education curriculum to neuroscience and maybe we shouldn't. However, I am quite certain that we should not be feeding teachers (at any stage of their careers) a diet of pseudo-neuroscience, and dressing it up as "research-based".

Education has a long history of loving its fashions and fads, and one of the more recent skirt-lengths is so-called "Brain-Based" (or "Whole Brain") learning. There are a large number of YouTube clips demonstrating this approach in action. If you're not familiar with it, I suggest you have a look and decide for yourself.

At least one Faculty of Education at an Australian university is offering teacher professional development on this approach. Yes, it's possible, in spite of the rigours of peer-review, to find research to support almost anything. However, we don't accept such reasoning in medicine, aviation, or engineering, so why should we accept it in education?

The term "research-based" devalues the currency if it only means "there's a study published in a journal somewhere that says this might be okay". Medicine woke up more than twenty years ago to the idea that not all evidence is created equal, and hence in the health sciences, we refer to levels of evidence. This hierarchy gives us a yardstick with which to exercise our scepticism and offers some protection against the hasty (and potentially premature) adoption of approaches that may be no better than current practice, may create an opportunity cost, or may actually be harmful. In health sciences, of course, we are exposed to the impact of our poor practices, as people deteriorate and sometimes die when we don't get it right. In education, practitioners are quarantined from having to see the effects of poor practices; children disappear from view, progressing automatically to the next grade, until an inevitable tipping point at which an ongoing relationship with education is no longer tenable.

Why does Education not engage with the notion of levels of evidence?

Children are not in a position to give or withhold consent regarding the teaching and learning experiences they are exposed to in the classroom, and in most cases, their parents are not either. So it's the responsibility of the other adults in the village to call out practices that do not maximise the limited developmental window that is available to convey core skills to all children in the early years.

Faculties of Education running courses for teachers on brain-based learning is akin to Faculties of Medicine running courses for doctors on homeopathy. The latter would see editorials in major newspapers and would potentially threaten the good standing of the medical program in the eyes of accrediting bodies.

Come on, Education. It's time to get serious about producing, accessing, stratifying, and using evidence.


(c) Pamela Snow 2015


Tuesday, 8 September 2015

Can we just leave the brain out of this please?


It might sound like a strange request from someone who has spent much of her professional life trying to achieve a modest grasp on the modest body of knowledge about how the human brain functions....but enough is enough!

Often posts on this blog are motivated by my frustration at something in the media that goes against the evidence-based practice grain, but happily this post is (mainly) motivated by two excellent pieces this week, both calling for more straight-talking and less "neurobollocks" in the education sphere. The first is by Melbourne Speech Pathologist, Alison Clarke, and was published in The Age and Sydney Morning Herald: Children with learning difficulties need programs based on science, not anecdote and neurobabble and the second was published on The Conversation, by Jared Cooney Horvath and Gregory Donoghue, PhD students at the University of Melbourne: So much talk about ‘the brain’ in education is meaningless.         


If you work in education, as a teacher, clinician, or researcher, I'd strongly recommend that you read both pieces. Their central thesis is that "neuro this" and "neuro that" is blinding sensible people to the simple notion that learning and behaviour are observable, but human brains are not. Teachers have never been able to directly observe children's brains at work, and it's probably better if things stay that way. Even in the bizarre, science-fiction world where that was possible, seeing the brain "working" may not equate to seeing a child learning, or behaving in a settled and focussed way. We already over-attribute from functional MRI studies, so rather than getting excited about "changing brains", it would be awesome if we could instead focus on changing knowledge and skills. This seems to be an ideologically distasteful position to some, and is clearly not nearly exotic enough for those who have been stung by the Cupid's Bow of neuroscience.


The article at this link is an example of what happens when people are bedazzled by neuroscience - they start to think that a common-sense decision like giving children breakfast is some kind of sciency innovation that no-one has thought of before. The teachers at this school also set ambitious goals for their students, and take a careful approach to analysing and understanding behaviour and emotions. Wonderful. But these approaches have nothing to do with neuroscience, and everything to do with good cognitive, behavioural, and developmental psychology.  Like many others, I am very impressed with the clear gains that this school has made, but why do they need the pseudo-imprimatur of being called "neuroscience"? Were they not good enough for the children of 2015 previously?

To presenters of professional development sessions for teachers, I say this:

Unless you are a qualified neuroscientist, you probably shouldn't be talking about neuroscience. It's not subject matter that can be done justice by people without proper credentials, and most importantly, without a realistic grasp of the limits to their knowledge and the (major) limits of the modest applications of neuroscience in education.

To teachers attending PD sessions in which neuroscience is discussed, I say this:

Maintain a high level of scepticism. Ask questions, such as:
  • Where did you gain your neuroscience degree?
  • Doesn't everything we do, every day, "change our brains" in some way?
  • Isn't all learning "brain-based" and hasn't it always been so?
  • What actual classroom innovations can be unequivocally traced to a neuroscience discovery?
  • If we see a classroom approach resulting in improved learning and / or behaviour, shouldn't we do more of it? 

Talk of the brain is, unfortunately, the latest fashionable distraction in education. Cognitive psychology has given us decades of evidence about what works / helps / detracts in education, but that has become the Cinderella to neuroscience's rather noisy, if not ugly, sisters.

Let's do our teacher colleagues a big favour and stop this runaway neuro-brain-train before it becomes the next generation-long disaster in education.
I hope we're not already too late.





(C) Pamela Snow 2015

Friday, 14 August 2015

Dyslexia Dystopia

Last weekend I was privileged to hear Professor Julian (Joe) Elliott speak in Melbourne on the topic most strongly associated with his name - the Dyslexia Debate. This was a keenly anticipated event for me, given my interest in the topic and the fact that I share his view that "dyslexia" is a term that has run its course in learning disability circles.  Joe's visit to Australia was sponsored by Learning Difficulties Australia and bravo to them for doing so.



Dyslexia has become something of a black sheep in the family of "Dys" terms that health professionals such as psychologists, speech pathologists and occupational therapists learn about in their pre-service education. Have you ever heard anyone talking about the Dyspraxia Debate? Or the Dysarthria Debate? No, of course not, because those are terms that have stayed on the nosological leash and in the main don't cause users much complaint. If you're a clinician working in a hospital and someone tells you that the elderly lady in Bed 17 has dysphasia, you'll know to expect someone who won't express herself very clearly and is likely to have at least some difficulties understanding you. All good. But if you're also told that the young apprentice in Bed 18 has dyslexia, what will you expect then? Someone with reading difficulties perhaps? Somehow, though, a quasi-medical term like dyslexia cuts more ice (for some people and in some situations) than the simple descriptor "reading difficulties" (or "problems", or "disorder").

Professor Joe Elliott is a lively, impassioned and engaging speaker. The central thesis in his argument is that dyslexia is not a scientifically robust term that differentiates a particular (special) group of poor readers from other "less special" poor readers, who, by virtue of circumstance (e.g., lack of family resources) have not been "anointed" by the diagnostic label dyslexia. Note though that Professor Elliott is not claiming that children with reading disorders don't exist or that their needs are not important. Nothing could be further from the truth.

He does, however, argue the following*:

  1. It is not helpful to assign a quasi-medical label (dyslexia) to some children whose reading skills are significantly below those of their peers, and not to others. This assignment occurs on the basis of socio-economic and resourcing issues, as much as on prevailing culture in professional and educational circles about the use of the term. The flow-on effects of this inequity are considerable. Children diagnosed with "dyslexia" may be deemed eligible for additional support services and accommodations (e.g., additional time to complete exams), while those who are simply poor readers will not receive such services and also risk the double-jeopardy of being labelled lazy and/or dull.
  2. The evidence-based interventions that work for children with reading difficulties are the same, irrespective of the label applied.  There's a great deal of snake-oil out there, to be avoided by teachers and parents at all costs. The focus needs to be on what works, not on differentiating who "needs" one of the many pseudo-scientific interventions in the marketplace vs who needs assistance with the underlying psycholinguistic competencies that promote reading success (e.g., phonemic awareness, decoding skills, vocabulary development).  Related to the issue of pseudoscience, it's also important to note that Prof Elliott observed that "Neuroscience for education is massively overblown". This point has been well made by Professor Dorothy Bishop of Oxford University and also on this humble blog.
  3. It is not helpful to use so-called "discrepancy" criteria to diagnose reading problems. Children with high and low IQs can have reading (decoding) difficulties, though IQ is important with respect to reading comprehension.
  4. Avoiding a label of "laziness" is not a sufficient reason to diagnose dyslexia. We should assume that all children can and will learn to read, and need to ensure that appropriate instructional environments are provided to promote success. On this, Prof Elliott also made the observation that "Whole Language ruined an entire generation of weak readers in the UK". He also observed that "We've only been reading since yesterday in evolutionary terms". This fact is often lost on Whole Language advocates, who erroneously claim that reading and writing are as natural as speaking and listening. Not so.
Discussants at this event were Associate Professor Tim Hannan of the School of Psychology at Charles Sturt University; Ms Mandy Nayton, Executive Officer of the Dyslexia SPELD Foundation WA; Professor Tom Nicholson of Massey University, NZ; and Alison Clarke, Melbourne Speech Pathologist. As one would expect from such a range of stakeholders, views varied on the utility of the term dyslexia, with Mandy Nayton in particular arguing that the debate is potentially a distraction from the wider issue of improving instruction. There was strong, but not unanimous, support for adopting the US Response to Intervention (RTI) approach, which is highly data-driven and seeks to ensure optimal academic and behavioural outcomes for all children. We must remember, though, that RTI is only as good as the assumptions that are made about the quality of the instruction that occurs at Tier 1 (universal, classroom-based teaching). If we do better at that level, we should have only small percentages of children (certainly fewer than 10%) needing services at Tiers 2 and 3. At the current time, I don't believe we can be confident in Australia that we are getting it right at Tier 1, and this belief is borne out in national data on unsatisfactory reading progress by Australian students (see this previous blogpost for links).

I think the genie's out of the bottle on the term dyslexia, and its usefulness for children who struggle to read (and their teachers and parents) has run its course - in the same way that the term dysphasia ran its course twenty years ago as a descriptor for childhood language disorders. Although there has been some unhelpful terminological detouring in that domain as well, there is at least an appetite now for plain(er) labelling.

So - let's round up this errant black sheep of the "Dys" Family and put him in a secure enclosure where he can't cause further mischief with our already muddled thinking on this important issue.

*I have drawn here on the following publication: Elliott, J. (2014). The dyslexia debate: Some key myths. Learning Difficulties Australia 46(1&2).

See also Professor Elliott's book, co-authored with Elena Grigorenko: The Dyslexia Debate.

(C) Pamela Snow 2015

Thursday, 6 August 2015

Can we talk about NAPLAN?

I have avoided blogging about NAPLAN* until now, because I know it's a deeply divisive topic, and is one that can elicit some particularly impassioned responses from teachers, academics, and parents. I just did a quick search on the hashtag #NAPLAN on Twitter and as you can see, the result confirms that there's a range of views on this important issue:


As with all divisive topics on which well-motivated people disagree, there are a number of issues that need to be fleshed out. To be clear, I am in favour of national testing, but that doesn't mean that I am blind to the need for some improvements.

I don't intend here to dissect individual stakeholder arguments one-by-one, but in general, here's some of what I've read and heard in recent days, and some thoughts in response:

  1. NAPLAN isn't assessing knowledge and skills that are relevant in modern education (or some variant of this, along the lines of the specific tasks not being appropriate).

    Much is written and discussed these days about the importance of modern education preparing students for an uncertain future in a complex world. The future has always been uncertain, but I see no indications that the importance of literacy and numeracy will diminish, particularly as these apply to gaining access to higher education and skilled employment. Many teachers and teacher educators are dismissive of data from PIRLS testing, and Australian Bureau of Statistics data on literacy levels in this country, but for me, the compelling cry comes from the Industry Skills Council of Australia 2011 report "No More Excuses", which points out that literally millions of Australians lack the literacy, numeracy and language skills to cope with the demands of the workplace. What was that again about preparing students for the demands of a complex world?
  2. Taking part in NAPLAN testing is stressful for students, especially for those who are struggling academically or facing particular psychosocial adversities in their lives. We should protect such children from this stress.

    I find this argument particularly fuzzy and just a bit disingenuous. Yes, there will be some children who experience anxiety about any kind of testing - including that which teachers do outside of the NAPLAN process. Anxiety is not necessarily harmful, in fact in the right "dose" it can be beneficial to performance, and can equip us to better confront future challenges. It is also a normal human emotion and one that we all need to learn to manage, so we can cope with a range of everyday uncertainties and stresses. How will children and adolescents learn to habituate to the stress of testing if they are sheltered from it? Controlled exposure, under the calm lead of a skilled teacher should significantly address such concerns. Removing everyday stress from children does not teach them how to deal with stress.  Life is testing and testing should not be a taboo word. Ramping up talk of "high-stakes" testing, however, is irresponsible and not child-friendly.
  3. Teachers already assess their students and know "who's who" with respect to achievement and needs.

    So the argument here seems to be that data is OK when teachers collect it, but data is on the nose when it is collected by government authorities. Of course we would expect teachers to know who is achieving, who is excelling, and who is struggling, based on their observations and interactions with students every day. However we could also expect that the notional "bar" shifts as a function of the community and school in which a teacher is working. Given the inescapable reality of socio-economic status (SES) and teacher practice variations between schools, isn't it helpful for teachers to know how students in similar and different settings are going? Aren't teachers just a bit curious about what is happening in other schools and pondering why?
  4. Other professions don't have to undergo this kind of "government scrutiny" of their work outputs.

    Every profession is different, and if what you do is important (and you're tax-payer funded), you should assume that someone will want to look over your shoulder at some point and see how well you're going. Maybe your performance could be used to assist others who are not achieving as well as you are. Maybe there are aspects of your performance that could be tweaked. Is that such a terrible thing? Some schools are obviously "punching above their weight" with respect to NAPLAN scores, e.g., as reported here. If I were a primary school teacher, I'd be very keen to hear about what they've been up to at Harrisfield PS in recent years.
  5. The data arrives too late in the academic year for it to be of real use to teachers and schools in addressing individual student needs.

    I agree with this criticism. Let's speed up the cycle and get data back to teachers and schools much earlier in the year. In return though, teachers and schools should stop discouraging so-called "weaker" students from participating in NAPLAN testing.
  6. Variations in performance reflect inequities in how schools are funded.

    This seems to be much more an ideological position than one that can be argued from any kind of evidence base, particularly as some schools buck the demographic trends as a result of specific pedagogical approaches they adopt. Sure, we can do a lot more to improve equity and fairness with respect to school funding, but if funding is made more equitable and NAPLAN results continue to be highly variable, what will the nay-sayers resort to as an argument then?
  7. The existence of NAPLAN narrows teachers' focus and distorts curriculum delivery.

    I think the best answer to this came from a young Grade 3 teacher I spoke with recently. She said her principal asked her earlier in the year what she was doing to prepare her students for NAPLAN testing. Her reply? "Teaching them well".

We do need to keep talking about NAPLAN, but in ways that authentically put the educational interests of children ahead of professional vested interests. We would expect nothing less from colleagues in medicine, engineering and aviation, whose work also involves enormous trust on the part of the community. As I've said many times on this blog, the work of teachers is no less important than that of any of those other professions, and as such, we expect engagement with data from a range of sources. NAPLAN is but one of those.


(C) Pamela Snow 2015


Sunday, 2 August 2015

Santa Claus, Homeopathy, and Phonics: Where's the link?


Most of us can remember when we found out that Father Christmas, the Easter Bunny, and/or the Tooth Fairy were not real entities. For me, the disappointment of discovering that my parents were Santa was offset by the relief that the illogical premise of his work schedule no longer needed to be reconciled. I must have been a budding empiricist even as a seven year old. Letting go of illogical but cherished belief-systems is an important rite of passage in childhood. It hurts a little, but it leads to greater maturity and depth of understanding about oneself and the complex world in which we live. Sometimes we have to let go of cherished beliefs as adults too.

I often think that for primary teachers whose pre-service education has been dominated by Whole Language-based ideology and pedagogy, exposure to the scientific evidence on what works (and who gets left behind) with respect to reading instruction must feel somewhat akin to losing a belief-system like the idea that a fat jolly bloke in a red suit flies around the world bringing presents to all of the children of the world (well, to those of a particular belief-system, and even then, not in an equitable fashion....let's not try to untangle those loose ends today).

There are a number of challenges in having discussions about evidence-based practice with teachers, and none of these reflect on teachers, per se. They do, however, reflect on teacher training.

  1. Systematic synthetic phonics instruction is strongly favoured by the cognitive psychology literature as a basis for early reading instruction, and some children, notably those from more disadvantaged backgrounds, derive particular benefit from such approaches. Teachers who work in communities characterised by high levels of economic, social and human capital will likely find that many (but by no means all) children will make the transition to literacy almost irrespective of the instructional focus, because their classrooms have more "high readiness" than "low readiness" children with respect to learning to read. This is somewhat akin to the fact that doctors who work in such areas may see lower rates of childhood illnesses caused by airborne pathogens, because such illnesses are more common where living conditions are over-crowded. Our everyday experiences are a powerful driver of what we see as "normal" and "abnormal".
  2. If teachers accept, in spite of their pre-service education, that Whole Language based approaches such as expecting children to memorise lists of commonly occurring words, and the use of three-cueing strategies are not optimal, what do they do then? Become overnight experts on delivering systematic synthetic phonics instruction? Not easy.
  3. Teachers are typically not taught the skills of reading and critically appraising scientific research. The power of anecdote and personal experience prevails when such a skill vacuum exists.
  4. Teachers have been sold a crock in so-called "Balanced Literacy". This is a slick attempt by some teacher educators to pay lip-service (no pun intended) to phonics-based early instruction, through pitches such as "Oh it's OK. We've moved on from the reading wars now. Now we teach Balanced Literacy, so phonics is in the mix". 
In the mix?

Let's consider the so-called "Five Big Ideas" in literacy education (phonemic awareness, phonics, vocabulary development, comprehension, and reading fluency). How much emphasis on the first two is enough? And when should phonemic awareness and phonics be introduced and called upon in the learning process? You might think you'd find some answers in the Australian document entitled "The Place of Phonics in Learning to Read and Write" by Emmitt et al. (2013). Instead, this document takes a perversely undermining position with respect to the importance of phonics instruction. The purpose of this blogpost is not to deconstruct the work of Emmitt et al., but rather to use it as an example of a modern guide for teachers that promotes what I've come to think of as homeopathic doses of phonemic awareness and phonics instruction. Sure, they're in the mix, but in doses that prevent systematic skill acquisition by early learners. Effective phonics instruction requires specialist knowledge of the structure of the English language, and this unfortunately has been shown to be significantly lacking in the teacher workforce - again, no fault of teachers, and something I will come back to in a future blogpost. 

In the meantime, though, we need to think long and hard about what it means for children to be receiving patchy and often weak instruction in phonics. Phonics does not stand alone. It's necessary but not sufficient to get beginning readers off the blocks and into the transformational world of deriving meaning from written text. But it needs to be taught well if early inequities in reading readiness are to be removed in the critical first three years of school.
 


I do wonder though, how we can move from our current impasse. Four decades of Whole Language dominated teacher education and classroom practice stands between today's children and exposure to evidence-based reading instruction. Maybe I need to believe in one of these:

 



(C) Pamela Snow 2015



Thursday, 23 July 2015

Trust me.....I'm a teacher

Imagine a newspaper headline that read "It's time medical researchers allowed doctors to do their job". The story might go something like this:

Chief Medical Officers from around the country have today committed to not allowing the practices of medical staff to be influenced by medical research. Speaking on behalf of his colleagues, Dr Sid Nurk stated "Enough is enough. Doctors have been to university and they know what is right for their patients. Researchers aren't looking after patients, so they have no place in clinical decision making". Dr Nurk said from now on, patients would just have to work around the issue of doctor variability and accept that patients get better under some doctors and not under others. 

It sounds a bit silly, doesn't it? And yet, the education equivalent of Dr Nurk's thinking is exactly what is proposed by Melbourne teacher and newspaper columnist Christopher Bantick, in a recent article in The Age entitled "It's time researchers let teachers do their job".

It's astonishing and deeply concerning to see a teacher argue such an anti-intellectual corner. No profession, whether teaching, medicine, engineering, law, or aviation, should be allowed to "do its own thing" unfettered by scrutiny from interested stakeholders, taxpayers, friendly critics and/or researchers. That a teacher would publicly assert such a position affirms some of the worst stereotypes about education operating in an evidence-vacuum. It also leaves no obvious room for finding common ground characterised by genuine curiosity about what works best in the classroom, and under what circumstances. Academics whose research has a focus on education (of whom I am one) are typically motivated not only by intellectual rigour, but also by a sense of social justice that compels them to action.


If evidence matters when we are treating cancer, building bridges, or flying aeroplanes, why doesn't it matter when we're educating the next generation of doctors, engineers and pilots?

(C) Pamela Snow 2015