
According to a preponderance of scientific evidence, phonics instruction — early and systematic — is critical to teaching most children to read. This beachhead, secured after decades of resistance from proponents of the whole-language approach to reading, should signal an end to the reading wars. But the path from settled science to practice in schools can be a long one; and such it has been for phonics over the past 60 to 70 years. Many education experts have bemoaned the fact that the results of scientific research are not disseminated widely in the “real world,” and fail to reach the places where they might make a crucial difference in the adoption of effective teaching methods: school districts, state governments, and (most surprisingly) university teacher education departments.
Breaking it down
Phonics is a way of teaching the correspondence of letters, and combinations of letters, to sounds. Children “sound out” unfamiliar words by breaking them down into phonemes, or the sounds that correspond to the letters in the words. Once the word is reconstructed, children should be able to access its meaning, if they have heard the word spoken and can immediately link it with an object or a concept. But English is an “orthographically deep” language; that is, there are many inconsistencies in the way letter groups map to sounds: the letters “ough,” for example, are pronounced differently in though, through, tough, and cough. This orthographic depth necessitates explicit instruction in these spelling-sound inconsistencies or, research has confirmed, many children will struggle to read, comprehend, and flourish in a literate society.
If it ain’t broke…
Phonics instruction has been employed to teach children how to read for centuries. But the complexity of the English language, and the rote (some say tedious) methods by which phonics helps children decode written language, may have made educators susceptible to the notion that there is a more intuitive, more “natural” way to teach reading; indeed, that learning to read might be a natural process, like learning oral language. Yet the story of how phonics fell out of favor seems almost whimsical, devoid of any clear cause beyond, perhaps, politics. Change for the sake of change might be an apt characterization.
War story
Modern-era sniping about phonics instruction began with a book published in 1948, called On Their Own in Reading: How to Give Children Independence in Attacking New Words. In it, William Gray, who would become the first president of the International Reading Association (now the International Literacy Association), objected to “mechanical phonic drills” that gave rise to “dull, word-by-word reading.” Because adults were able to read whole words quickly, Gray believed that children should be able to do the same, making that “dull” work of sounding out words unnecessary. Supported by observational research, Gray theorized that whole-word memorization was the key to efficient, whole-sentence reading. Thus the look-say method was born, and Dick and Jane were its spokeskids. Many proponents of systematic phonics instruction feel that this type of whole-word reading can appear to work well enough until about the third grade, when students encounter more complex words; it is around this time that they begin to fall behind, and some never catch up.
A withering answering salvo to Gray’s theory came from Rudolf Flesch in his book Why Johnny Can’t Read, published in 1955. Flesch argued that beginning readers who saw the same basic, Tier-1 words over and over again then had to attempt a formidable task: extrapolating, from the likes of look, come, and see, the hundreds of exceptions to the implicit phonological “rules” inculcated by Dick and Jane primers. Why Johnny Can’t Read brought phonics/whole-word hostilities into the open, where they have remained ever since.
Despite these nascent challenges to phonics, Flesch’s message was embraced by many, and it was reinforced in 1967 by Harvard’s Jeanne Chall in a major synthesis of 30 independent studies, which concluded that kids who couldn’t decode were less likely to become good, let alone expert, readers.
The very same year, however, whole-language theorists emerged to join the battle. Whole-language theory went even further than whole-word in its conviction that children wanted and needed more independence: not only to attack new words but to take on whole books, guided primarily by their own interests. The thinking was, at its most basic level, that children would figure out the alphabetic code largely on their own if they could engage with subjects that already interested them.
The most influential of the whole-language theorists in America was Ken Goodman, who in 1967 presented his ideas about “cueing” to an audience of educational researchers. He held that when children encounter an unfamiliar word, they attend first to graphic cues; that is, they consider the letters in the word, especially the first letter, and make a guess at what the word might be. Starting with that inference, they then use syntactic cues, or where the word is placed within the sentence, to help them figure out what part of speech it might be. Finally, semantic cues, or what the other words in the sentence (the ones they can understand) mean, lead them to settle on a likely meaning for the unfamiliar word. It might not be the exact word, but it would make sense, and it would give them a close enough idea of what the sentence was saying to make meaning out of simple texts. In Goodman’s view, this approximate “meaning-making,” facilitated by guesses, was more important than learning the actual word or understanding any nuance of meaning the actual word might convey.
As with whole-word memorization, at lower grades this method might appear to work. But at some juncture in a child’s education, nuance and exactitude become more important. In other subjects, in which students must read to obtain accurate information rather than to entertain themselves, close enough would not be good enough.
Nevertheless, with a substantial boost from another researcher, Frank Smith, Goodman’s whole-language approach took off. Indeed, Smith went somewhat further than Goodman by positing that skipping over a difficult word was “the first alternative and preference,” and that only more exposure to print was necessary to fill in the gaps. Both Goodman and Smith began to publish books aimed directly at teachers, and they said the right things to win practitioners over to their side, stressing empowerment and professional autonomy. James S. Kim, in his 2008 essay “Research and the Reading Wars,” has suggested that Smith’s ideology-driven entrepreneurship (which, by tradition, the majority of educational scientists eschew) led to the widespread adoption of inadequate early reading instruction.
Over the next several decades, however, research refuting the claims of whole-language efficacy piled up. In the 1970s, cognitive psychologist Keith Stanovich began to investigate the principles underpinning whole-language theory and found that poor readers were more dependent on context cues than good readers, a finding replicated by others in the field. Conversely, researchers found that after a couple of years of phonics instruction in their early education, children could read fluently enough to focus on comprehension, while poor readers expended their cognitive capacity evaluating cues, leaving less of it for comprehension.
Cueing was shown to be more laborious than word-by-word reading, and less conducive to meaning-making, a direct contradiction of its claims. In fact, more recent research on the areas of the brain involved in reading has shown that skilled readers — those whose recognition of words has become automatic — can process a word on the page faster than they can process a picture representing the word. The key to this kind of automaticity is phonics. In 1994, 20 years after he began to study the efficacy of the whole-language cueing approach — which he had been prepared to confirm, not disprove — Stanovich wrote, “That direct instruction in alphabetic coding facilitates early reading acquisition is one of the most well established conclusions in all of behavioral science.”
Buttressing his statement was large-scale, real-world evidence: In 1987, as a result of the grass-roots appeal of whole-language immersion, California was persuaded to change its state-wide curriculum framework to whole language — and within a few years, reading scores tanked.
RELATED: New York Times opinion piece “Every State Left Behind”
This and other apparent failures of the approach, along with increased governmental willingness to trust science, have led to a somewhat grudging consensus among some educators that phonics instruction is necessary for the acquisition of reading skills. But they also argue that whole language has its advantages. As a result, practice in classrooms increasingly consists of a combination of the two approaches, termed balanced literacy. The dominant question now is: what’s the right balance?
Next: Finding the balance in “balanced literacy” | Read more of Victory Productions’ blogs on education
Sources:
Why Are We Still Teaching Reading the Wrong Way?
Ending the Reading Wars: Reading Acquisition From Novice to Expert
Early Literacy Development and Instruction
Why Reading Is Not a Natural Process
How Do Kids Learn to Read? What the Science Says