Is the new Victorian learning difficulties screening assessment any good?
From the end of this year, all children starting school in this state will be screened for learning difficulties.
The Minister’s media release says, “…the improved process will be the first step in the early detection of learning difficulties, so that children could get the support they need sooner” and that the policy means “properly supporting students with additional learning needs so they get the opportunity at a great education.” (Does poor grammar in the Minister for Education’s media releases make you feel a bit despairing, too?)
A useful free online document called Selecting Screening Instruments explains why school entry screening is a worthwhile thing to do: “The goal of universal, early reading screening is to identify children at risk of future failure before that failure actually occurs. By doing so, we create the opportunity to intervene early when we are most likely to be more effective and efficient. Therefore, the key to effective screening is maximizing the ability to predict future difficulties”.
What are the features of a good early reading screening test?
A good early reading screening test measures things that matter over time (i.e. is valid and predicts well) and gives consistent results (i.e. is reliable).
It sorts out the kids who are going to be fine from the kids who are going to struggle, without too many false positives (worrying about kids who are actually going to be fine) or false negatives (failing to identify kids who are actually going to struggle).
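The trade-off between false positives and false negatives is usually quantified as sensitivity (the proportion of genuinely at-risk children the screen flags) and specificity (the proportion of children who will be fine that it correctly passes). A minimal sketch of the arithmetic, using invented numbers purely for illustration:

```python
def screening_accuracy(true_positives, false_negatives, false_positives, true_negatives):
    """Return (sensitivity, specificity) for a screening test.

    sensitivity = flagged at-risk children / all at-risk children
    specificity = passed not-at-risk children / all not-at-risk children
    """
    sensitivity = true_positives / (true_positives + false_negatives)
    specificity = true_negatives / (true_negatives + false_positives)
    return sensitivity, specificity

# Hypothetical cohort of 100 school entrants: 20 go on to struggle
# (16 flagged, 4 missed), 80 turn out fine (10 flagged, 70 passed).
sens, spec = screening_accuracy(16, 4, 10, 70)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")
# sensitivity = 0.80, specificity = 0.88
```

A screen with low sensitivity is the "wait-to-fail" problem in miniature: the children it misses get no early help.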
It has norms which allow any child’s performance to be compared with the typical performance of their peers.
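Norm-referenced scoring works by converting a child's raw score into a standard score or percentile relative to the normative sample. A toy sketch of the conversion (the mean and standard deviation here are invented, not taken from any real test's norms):

```python
from statistics import NormalDist

# Hypothetical norms: letter-name knowledge at school entry,
# mean of 12 letters with a standard deviation of 5 (invented figures).
NORM_MEAN, NORM_SD = 12, 5

def standard_score(raw_score):
    """z-score relative to the (hypothetical) normative sample."""
    return (raw_score - NORM_MEAN) / NORM_SD

def percentile(raw_score):
    """Percentile rank, assuming raw scores are roughly normally distributed."""
    return 100 * NormalDist().cdf(standard_score(raw_score))

# A child naming 7 letters sits one SD below the (hypothetical) mean;
# a child naming 17 sits one SD above it.
print(f"raw 7  -> z = {standard_score(7):+.1f}, percentile ~ {percentile(7):.0f}")
print(f"raw 17 -> z = {standard_score(17):+.1f}, percentile ~ {percentile(17):.0f}")
```

This is exactly what a homemade, un-normed assessment cannot do: without a normative sample, a raw score tells you nothing about where a child sits relative to their peers.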
What are the best predictors of reading and writing difficulties?
The best predictors at school entry of difficulty with reading and writing the following year (Grade 1) are phonemic awareness and letter knowledge.
However, the best predictors of persistent and severe reading and writing difficulties (Grade 3 and beyond) are oral vocabulary and rapid automatised naming.
Good screening tests take into account how children’s learning changes over time, and can identify not only the kids likely to take a while to get off the ground, but also the few who are likely to need extra help in the long-term.
High-quality literacy screening tests
There are a number of high-quality standardised tests suitable for use in screening for learning difficulties at school entry, such as these US tests:
- Predictive Assessment of Reading (the best screener, according to Selecting Screening Instruments)
- DIBELS
- AIMSWeb
- PALS
- The RAN/RAS
I find the UK Children’s Test of Nonword Repetition useful too, although it’s not a comprehensive screener. It takes four minutes, has norms for ages 4-8, is simple enough for even preschoolers, and doesn’t disadvantage children whose home language isn’t English or children who’ve hardly seen a book and don’t know what a rhyme is. Poor non-word repetition is quite a good predictor of language and reading difficulties.
Homemade is only good if you know what you’re doing
I’m a bit mystified as to why our education department has chosen to improve its existing English Online Interview rather than buying an off-the-shelf, valid, reliable, norm-referenced screening tool (or tools) for use with Victorian children.
Our state government is still smarting from a couple of recent, massive DIY technology fiascos: the education department’s Ultranet (see “Corruption-fighting body to investigate botched $180 million Ultranet project for schools”) and our public transport system’s Myki card (see “Outsmarted: Victoria pays the price”). So you’d think they’d be wary of commissioning something new when they could be buying something already known to work well elsewhere.
I can’t find any statistics to back up the education department’s website statement that the existing English Online Interview is “a powerful online tool for assessing the English skills of students in Years Prep to 2”.
Until it’s finalised and researched extensively, there won’t be any statistics showing how validly the revised version predicts future reading and/or writing performance, how accurately it pulls out the “worry about” kids and not kids who’ll be fine, or whether it allows us to compare one child’s performance with that of their peers in a meaningful way.
A closer look at the current English Online Interview
Only Victorian education department employees can access the English Online Interview. However, website information indicates that the assessment for school beginners currently involves a child:
- Having a conversation with a teacher, which the teacher evaluates using rubrics.
- Retelling a story, again evaluated via rubric.
- Answering questions about the story just retold. The website video says at least half the questions are easy for most students.
- Writing their name, drawing a picture about the story they’ve just heard and retold, writing about their picture, and then reading what they have written to the teacher. The website video about this task says, “The teacher scores how relevant the student’s writing is, based on what the student says their writing tells you. The teacher also records how easy it is to match what the student has read to what the student says. The teacher then looks at the student’s writing to see if there are any identifiable words. Most children at the start of Prep aren’t writing yet, however there are a very small number who can actually write a recognisable sentence with punctuation”.
- Identifying and generating rhyming words. The Grammar Obsessive in me wasn’t sure whether to laugh or cry when, in the demonstration video, the teacher asks the child, “does bed and shed rhyme?”
- Giving a name or sound for upper and lower case letters of the alphabet. About two-thirds of students are reported to be able to get at least five letters right.
- Matching words with the same first sound (about half the children are said to succeed) and last sound (very few succeed).
- Using a highly repetitive, predictable book: starting at the front, turning pages in order, identifying a word or letter, listening to the teacher read the first two pages, then finishing off the story (the video says this may be done by making it up from the pictures if necessary), then answering a comprehension question, assisted by the pictures.
- Attempting to read another, more difficult text (but most children can’t).
As noted above, phonemic awareness and letter knowledge are the best predictors of reading problems in Grade 1, and oral vocabulary and rapid automatised naming are the best predictors of problems in Grade 3 or later.
The English Online Interview currently includes tasks requiring phonemic awareness and letter knowledge, but there seems to be no research reported on the tasks used, at least in the public domain.
There aren’t any tasks in the current English Online Interview assessing oral vocabulary or rapid automatised naming, so these would need to be added. But unless the added tasks have already been researched and found to be valid, reliable and accurate, we won’t know how good they are until that research is done.
Perhaps some of the other tasks on this assessment are useful to teachers for other purposes, though I’m not sure why statistically robust, research-based tools like the CELF-4 Screening Test aren’t used to screen oral language, or whether children identified as having language difficulties are routinely referred to a Speech Pathologist. Children with Severe Language Disorders seem to get no extra funding any more, and Speech Pathologists in schools are as rare as hen’s teeth, so however they do it, teachers must find identifying children with significant oral language difficulties rather a thankless task.
I’m not sure why children just starting school are being asked to write a sentence, or read a book that’s too hard for them. Trying to spot gifted and hot-housed kids?
I also wonder how such an assessment might be improved to the point where it meets the criteria for a good school entry learning difficulties screening tool (valid, reliable, accurate, normed). It seems quite subjective and descriptive, and doesn’t even report statistics, let alone attempt to be statistically robust.
I know tests cost money, but not accurately identifying kids with learning difficulties currently costs us a lot more in the long run. After Toupee-gate, our education bureaucracy needs to put some serious work into rebuilding our confidence that it has its spending priorities right.
Getting teachers to administer long assessments which are not as informative or well-targeted as they might be also has an opportunity cost.
Screen-then-intervene or intervene-then-screen?
In the UK, all school beginners are supposed to get explicit, systematic synthetic phonics teaching from the first week of school, then they test to see who’s not catching on after a few months, and provide them with extra teaching.
Children start school with a vast range of language and (pre) literacy skills. Some of them don’t speak English. Some can’t say long sentences. Some aren’t read stories because their parents are illiterate. Others can already read and write at school entry, and engage you in complex, polysyllabic, vast-vocabulary conversation for hours. Lucky them.
In the UK, everyone’s taught how to sound out words and all the main spelling patterns, in small, fast steps. Yes, some children already know a lot about this, but it’s better to give them an opportunity to shine than to go too fast for others, leaving them behind and miserable.
Children’s learning of the things they have been taught is then assessed, using tests like the PERA. Children across the country must also do the Phonics Screening Check. Strugglers identified on these tests are referred for additional intervention.
The UK system seems a better way to minimise the singling-out of strugglers, but really I don’t mind which system we use, as long as we abandon our current wait-to-fail system.
If our government wants us to have confidence in the new screening system, it needs to put a lot more information about the screening tool’s validity and reliability, accuracy and norms in the public domain.
Until that’s available, if I were a Principal, I’d be looking for a statistically robust screening tool to help my teachers focus their reading and writing intervention where it’s needed most. I guess it goes without saying that I’d also require all early years and special needs staff to have explicit systematic synthetic phonics training and resources. There’s no point identifying problems you don’t know how to solve.
Yes amazing how the Victorian Education Department continues to find its own ways to do things instead of applying evidence based practice. As an ex primary teacher and training speech pathologist I know how long these one on one interviews take and how little information they actually give the teacher. What happens with this information? – not a lot! A quality teacher would gather this information informally within the first month if not fortnight a child was in their class. Rapid Automatized Naming is the second key (after being able to discriminate phoneme sounds) to language learning difficulties as it is linked to memory and attention. Those kids that read ‘the’ correctly on one page and on the next page get stuck on ‘the’ again. Hope some Principals and Prep teachers out there are reading your post Alison and do what they need to tick the Ed. department boxes but add evidence based practice into the mix.
Thanks for another thoughtful post. I am looking for a readiness screening tool to help me work with the children at preschool who need help – usually I know they need help but not necessarily in all areas. Someone has suggested the Who am I? tool. Do you know about this one and would you recommend it? I think it does more than reading readiness.
Hi Louise, I’m not familiar with more general school readiness assessments, but I’d suggest you look for assessments that have predictive validity, classification accuracy, and norm-referenced scoring, as per the Selecting Screening Instruments paper at http://www.literacyhow.com/wp-content/uploads/2014/07/http-literatenation.orgwp-contentuploads201311102513-select-screening-ebsprd1.pdf. Education is awash with things that people just made up on the basis of their own beliefs and experiences, and they get taken far too seriously. Look for something that has proper data backing it up, which focusses on variables that really do make kids happy and successful at school. There must be a whole academic literature about this; try going to Google Scholar and putting in “School readiness assessment”.
It sounds awfully like the NSW Best Start Assessment? It is not a valuable tool for screening learning difficulties at all. Best Start does not give a good enough “spread” of students experiencing difficulties. Students who start school on “Cluster 0” could in fact be much further behind. I wish the school departments would just invest in valid screening assessments/tools too!
Hi Alison – I’ve posted your review via the International Foundation for Effective Reading Instruction here: http://www.iferi.org/iferi_forum/viewtopic.php?f=3&t=577&p=906#p906
Thank you!
The assessment certainly sounds very similar to the NSW Best Start Assessment. Best Start is now in its 8th year having been first trialled in 2008. An ‘evaluation’ that was done in 2008 has never been publicly released and unfortunately Best Start is now so firmly entrenched in other aspects of curriculum delivery and teacher professional learning that it is unlikely it will ever be replaced. There is now a Best Start K to 2 Literacy Continuum with ‘markers’ that have never been validated or normed. Claims are made that the assessment is diagnostic and assists with programming. This is rubbish. The assessment evaluates phonemic awareness with 3 rhyming items and 3 ‘initial sound’ items and by asking children what new word is made when a sound is deleted. The phonics component does not include assessment of letter sound knowledge for all single letters. When I am programming I actually want to know far more than Best Start is capable of revealing. Best Start was touted as the answer for improving literacy levels as measured by Basic Skills/NAPLAN scores and the number of students in the lowest bands. To the best of my knowledge there has been no positive flow on effect from Best Start to NAPLAN as the number of students in the lower bands has remained fairly constant.
Hi Alison, I am enjoying reading all your blog posts. 🙂
Can you tell me what you mean by oral vocabulary? Do you mean the range of words used by the child as well as enunciation of those words? How does one test for oral vocab?
I am an optometrist who is interested in children with learning difficulties. I do visual perceptual screening and I include some auditory tests (TAAS and a visual-auditory integration test) and the Developmental Eye Movement test, which includes an automatic naming component. So I have some idea about these things but am wanting to learn more! I am aware that slow automatic naming correlates with learning difficulties. Just wondering if you can give me any further info about this. If a child has slow automatic naming and deficient oral vocab (whatever that means!), are there specific interventions for these 2 things? Can you improve either of these skills directly (how?) or are these just signs that the child will need particular help with early literacy skills?
Hi Suzy, thanks for the nice feedback. By oral vocabulary I mean the number of words the child knows, nothing too fancy. The Peabody Picture Vocabulary Test used to be the classic test for this, but there are a lot of newer tests used by Speech Pathologists and others, and of course receptive vocabulary is slightly different from expressive vocabulary. The RAN/RAS tests are standardised tests just of Rapid Automatised Naming, but a lot of other tests also have subtests tapping this skill e.g. there is a criterion-referenced subtest on the CELF-4 used by Speech Pathologists. There isn’t any specific intervention to improve RAN, we just need to be aware if a child has slow RAN so they get enough time to do tasks. Speech Pathologists do a lot of work on vocabulary as this is very responsive to intervention. But I can’t give you a brief answer about how this is done, in the same way you can’t give me a one-sentence answer about how one tests someone’s vision! It’s something that requires professional training and expertise. All the best, Alison
The greatest problem lies at the state education departmental level. Schools are bound to conduct their assessments within the dept guidelines. They are often time-consuming, subjective and have little value diagnostically. The “continuums”, “clusters” & “markers” that follow are also subjective and for the most part very “wishy-washy”! Most teachers find these assessment tools very frustrating and of little benefit in explicitly assessing the curriculum they teach from.
I have investigated the screening tools you mentioned and have not found one that doesn’t cost approximately $20/student. Not expensive at all in terms of the cost of not discovering our at-risk students. However, as it stands, I do not have access to these funds as they are tied to other “initiatives”. Apart from running more sausage sizzles etc., do you have any advice regarding “affordable” screening tools, please? It will be too late for too many students if we wait for the “powers that be” to financially support such initiatives!
There is a bit of debate within pro-phonics circles about whether early screening is even necessary, as if we had good explicit phonics teaching happening in all classrooms of five-year-olds and the teachers were very systematically working through phonemes and spellings and checking at every stage whether kids were “getting it”, then it would soon be obvious who wasn’t, and they could be referred for extra small group work within six months of starting school. Anyone still not getting it after that would need more detailed assessment to work out why and tailor a program to their specific needs. Ultimately there are three tasks that need to be accomplished if all kids are to learn to read and spell satisfactorily:
1) Teach everyone in a structured, explicit way to ensure that the 20% of kids who won’t learn unless taught this way are not excluded from mainstream instruction. The rest of the kids will learn no matter what methodology is used, though their spelling will be better if they get explicit systematic phonics teaching. It’s possible to screen everyone on school entry and pull out the 20% of kids most likely to struggle, and teach them literacy differently from the remaining 80%, but that creates a whole extra set of logistical problems for schools, especially small schools, as well as not being very inclusive. Also, screening is not perfect so there would be a few kids assigned to the wrong sort of teaching.
2) After about six months of systematic phonics teaching, give extra small group work to anyone who seems to be struggling with any of the Big 5 areas of literacy: phonemic awareness, phonics, vocabulary, fluency and/or comprehension, targeting their weak areas, and working with a Speech Pathologist where speech-language difficulties seem to be the root cause of the problem(s).
3) After about eighteen months, only about 3-5% of kids should still be struggling, and they need extra, detailed assessment and intervention. The UK Phonics check is their government’s way of getting an idea from outside schools whether their teaching is working, and we can use last year’s check for free here if a post-18-months-of-good-teaching measure is needed: https://www.gov.uk/government/publications/phonics-screening-check-2015-materials. My problem is that too often here, steps 1 and 2 are not followed, so we end up with too many kids eighteen months into their education and still not able to read or write much, and then somehow we are meant to help them catch up, when many of them just need the teaching and they’ll be fine, whereas a small number will have persistent problems.
Hi Alison,
Can you forward the research you are referring to that is associated with this please –
“However, the best predictors of persistent and severe reading and writing difficulties (Grade 3 and beyond) are oral vocabulary and rapid automatised naming.”
I am doing an assignment on oral language and vocab acquisition.
Now you’re testing me! I’ve been reading lots of different things about this, I guess Maryanne Wolf is the key author (she’s recently been here), and she says there are some kids who seem OK in the early years of schooling, who don’t have particular problems with PA or letter knowledge, or if they get some direct instruction they can learn to sound words out, but once they get a bit older they can’t speed their reading up because they have slow Rapid Automatised Naming (RAN). This means they cannot get fluent enough to read well for comprehension. If their oral vocabulary is also low, even if they can sound a word out, the oral version of the word is missing from their lexicon (or is misfiled because its phonological structure isn’t correct), so again they are left unable to comprehend unless the meaning is obvious from the context, which it often isn’t. So kids with semantically-based oral language impairments are at a disadvantage as their upper strand of Hollis Scarborough’s “reading rope” (you can google the diagram) is weak. I hope since you’re a student you can access the full text of articles like http://www.sciencedirect.com/science/article/pii/0093934X9290099Z, without this costing you an arm and a leg. Also this: http://link.springer.com/article/10.1023/A:1013816320290. And in general if you use Google Scholar to search terms like predictors of reading skill or predicting dyslexia, you’ll find a lot of information. Kids who have both phonological and naming problems are said to have a double deficit, so also try “double deficit dyslexia”. Hope that’s helpful
I believe that the Australian Council for Educational Research (ACER) developed the English Online assessment tool. I have had numerous email exchanges with them about the phonemic awareness assessment they were developing for UNESCO, and in the end I gave up.