What We’re Reading!
We’re very excited to announce the Fall book for Teaching Channel’s Book Club: How We Learn: The Surprising Truth About When, Where, and Why It Happens by New York Times science reporter Benedict Carey. He uncovers fascinating and surprising research on how distractions, repetition, sleep, and even study location can affect how efficiently the brain learns.
We’re Giving Away 10 Free Books
We also have 10 free copies to give away to teachers. (Thanks, Random House!) We’ll draw names on October 15th from the first 200 people to join Teaching Channel’s #TchLIVE “class” on Remind: https://www.remind.com/join/tchlive. Read the rules here. (Remind is a free and private service that helps us text or email you reminders for our monthly Twitter chats.) (Updated 10/16/14: We’ve texted the winners for this giveaway. Thanks for participating!)
Join Our Book Club Twitter Chat
We’re also very excited to announce that author Benedict Carey, along with Sarah Brown Wessling, will join our Twitter chat on Thursday, November 13 at 7pm EST. All Teaching Channel bookworms are encouraged to attend using the hashtag #TchLIVE. See you there!
• • •
Excerpted from How We Learn by Benedict Carey
Copyright © 2014 by Benedict Carey. Excerpted by permission of Random House. All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.
Introduction: Broaden the Margins
I was a grind.
That was the word for it back in the day: The kid who sweated the details, who made flashcards. A striver, a grade-hog, a worker bee — that kid — and I can see him clearly now, almost forty years later, bent over a textbook, squinting in the glow of a cheap desk lamp.
I can see him early in the morning, too, up and studying at five o’clock: sophomore year, high school, his stomach on low boil because he can’t quite master — what? The quadratic formula? The terms of the Louisiana Purchase? The Lend-Lease policy, the mean value theorem, Eliot’s use of irony as a metaphor for . . . some damn thing?
It’s long gone, the entire curriculum. All that remains is the dread. Time’s running out, there’s too much to learn, and some of it is probably beyond reach. But there’s something else in there, too, a lower-frequency signal that takes a while to pick up, like a dripping faucet in a downstairs bathroom: doubt. The nagging sense of having strayed off the trail when the gifted students were arriving at the lodge without breaking a sweat. Like so many others, I grew up believing that learning was all self-discipline: a hard, lonely climb up the sheer rock face of knowledge to where the smart people lived. I was driven more by a fear of falling than by anything like curiosity or wonder.
That fear made for an odd species of student. To my siblings, I was Mr. Perfect, the serious older brother who got mostly As. To my classmates, I was the Invisible Man, too unsure of my grasp of the material to speak up. I don’t blame my young self, my parents, or my teachers for this split personality. How could I? The only strategy any of us knew for deepening learning — drive yourself like a sled dog — works, to some extent; effort is the single most important factor in academic success.
Yet that was the strategy I was already using. I needed something more, something different — and I felt it had to exist.
The first hint that it did, for me, came in the form of other students, those two or three kids in algebra or history who had — what was it? — a cool head, an ability to do their best without that hunted-animal look. It was as if they’d been told it was okay not to understand everything right away; that it would come in time; that their doubt was itself a valuable instrument. But the real conversion experience for me came later, when applying for college. College was the mission all along, of course. And it failed; I failed. I sent out a dozen applications and got shut down. All those years laboring before the mast and, in the end, I had nothing to show for it but a handful of thin envelopes and one spot on a waiting list — to a college I attended for a year before dropping out.
What went wrong?
I had no idea. I aimed too high, I wasn’t perfect enough, I choked on the SATs. No matter. I was too busy feeling rejected to think about it. No, worse than rejected. I felt like a chump. Like I’d been scammed by some bogus self-improvement cult, paid dues to a guru who split with the money. So, after dropping out, I made an attitude adjustment. I loosened my grip. I stopped sprinting. Broadened the margins, to paraphrase Thoreau. It wasn’t so much a grand strategy — I was a teenager, I couldn’t see more than three feet in front of my face — as a simple instinct to pick my head up and look around.
I begged my way into the University of Colorado, sending an application along with a pleading letter. It was a simpler time then; it’s a state school; and I was accepted without much back-and-forth. In Boulder, I began to live more for the day. Hiked a lot, skied a little, consumed too much of everything. I slept in when I could, napped at all hours, and studied here and there, mixing in large doses of mostly legal activities for which large colleges are justifiably known. I’m not saying that I majored in gin and tonics; I never let go of my studies — just allowed them to become part of my life, rather than its central purpose. And somewhere in that tangle of good living and bad, I became a student. Not just any student, either, but one who wore the burden lightly, in math and physics, and was willing to risk failure in some very difficult courses.
The change wasn’t sudden or dramatic. No bells rang out, no angels sang. It happened by degrees, like these things do. For years afterward, I thought about college like I suspect many people do: I’d performed pretty well despite my scattered existence, my bad habits. I never stopped to ask whether those habits were, in fact, bad.
• • •
In the early 2000s, I began to follow the science of learning and memory as a reporter, first for the Los Angeles Times and then for The New York Times. This subject — specifically, how the brain learns most efficiently — was not central to my beat. I spent most of my time on larger fields related to behavior, like psychiatry and brain biology. But I kept coming back to learning, because the story was such an improbable one. Here were legit scientists, investigating the effect of apparently trivial things on learning and memory. Background music. Study location, i.e., where you hit the books. Video game breaks. Honestly, did those things matter at test time, when it came time to perform?
If so, why?
Each finding had an explanation, and each explanation seemed to say something about the brain that wasn’t obvious. And the deeper I looked, the more odd results I found. Distractions can aid learning. Napping does, too. Quitting before a project is done: not all bad, as an almost-done project lingers in memory far longer than one that is completed. Taking a test on a subject before you know anything about it improves subsequent learning. Something about these findings nagged at me. They’re not quite believable at first, but they’re worth trying — because they’re small, easy, doable. There’s no excuse for ignoring them. In the past few years, every time I have taken on some new project, for work or fun, every time I’ve thought about reviving a long-neglected skill, like classical guitar or speaking Spanish, the self-questioning starts:
“Isn’t there a better way?”
“Shouldn’t I be trying . . . ?”
And so I have. After experimenting with many of the techniques described in the studies, I began to feel a creeping familiarity, and it didn’t take long to identify its source: college. My jumbled, ad-hoc approach to learning in Colorado did not precisely embody the latest principles of cognitive science — nothing in the real world is that clean. The rhythm felt similar, though, in the way the studies and techniques seeped into my daily life, into conversation, idle thoughts, even dreams.
That connection was personal, and it got me thinking about the science of learning as a whole, rather than as a list of self-help ideas. The ideas — the techniques — are each sound on their own, that much was clear. The harder part was putting them together. They must fit together somehow, and in time I saw that the only way they could was as oddball features of the underlying system itself — the living brain in action. To say it another way, the collective findings of modern learning science provide much more than a recipe for how to learn more efficiently. They describe a way of life. Once I understood that, I was able to look back on my college experience with new eyes. I’d lightened up on my studies, all right, but in doing so I’d also allowed topics to flow into my nonacademic life in a way I hadn’t before. And it’s when the brain lives with studied material that it reveals its strengths and weaknesses — its limitations and immense possibilities — as a learning machine.
The brain is not like a muscle, at least not in any straightforward sense. It is something else altogether, sensitive to mood, to timing, to circadian rhythms, as well as to location, environment. It registers far more than we’re conscious of and often adds previously unnoticed details when revisiting a memory or learned fact. It works hard at night, during sleep, searching for hidden links and deeper significance in the day’s events. It has a strong preference for meaning over randomness, and finds nonsense offensive. It doesn’t take orders so well, either, as we all know — forgetting precious facts needed for an exam while somehow remembering entire scenes from The Godfather or the lineup of the 1986 Boston Red Sox.
If the brain is a learning machine, then it’s an eccentric one. And it performs best when its quirks are exploited.
• • •
In the past few decades, researchers have uncovered and road-tested a host of techniques that deepen learning — techniques that remain largely unknown outside scientific circles. These approaches aren’t get-smarter schemes that require computer software, gadgets, or medication. Nor are they based on any grand teaching philosophy, intended to lift the performance of entire classrooms (which no one has done, reliably). On the contrary, they are all small alterations, alterations in how we study or practice that we can apply individually, in our own lives, right now. The hardest part in doing so may be trusting that they work. That requires some suspension of disbelief because this research defies everything we’ve been told about how best to learn.
Consider the boilerplate advice to seek out a “quiet place” and make that a dedicated study area. This seems beyond obvious. It’s easier to concentrate without noise, and settling in at the same desk is a signal to the brain that says, it’s time to work. Yet we work more effectively, scientists have found, when we continually alter our study routines and abandon any “dedicated space” in favor of varied locations. Sticking to one learning ritual, in other words, slows us down.
Another common assumption is that the best way to master a particular skill — say, long division or playing a musical scale — is by devoting a block of time to repetitively practicing just that. Wrong again. Studies find that the brain picks up patterns more efficiently when presented with a mixed bag of related tasks than when it’s force-fed just one, no matter the age of the student or the subject area, whether Italian phrases or chemical bonds. I can’t help thinking again of my own strained, scattered existence in college, up all hours and down napping many afternoons, in blithe defiance of any kind of schedule. I’m not going to say that such free-form living always leads to mastery. But I will argue that integrating learning into the more random demands of life can improve recall in many circumstances — and that what looks like rank procrastination or distraction often is nothing of the kind.
The science of learning — to take just one implication — casts a different light on the growing alarm over distraction and our addiction to digital media. The fear is that plugged-in Emily and Josh, pulled in ten directions at once by texts, tweets, and Facebook messages, cannot concentrate well enough to consolidate studied information. Even worse, that all this scattered thinking will, over time, somehow weaken their brains’ ability to learn in the future. This is a red herring. Distractions can of course interfere with some kinds of learning, in particular when absorption or continued attention is needed — when reading a story, say, or listening to a lecture — and if gossiping on social media steals from study time. Yet we now know that a brief distraction can help when we’re stuck on a math problem or tied up in a creative knot and need to shake free.
In short, it is not that there is a right way and wrong way to learn. It’s that there are different strategies, each uniquely suited to capturing a particular type of information. A good hunter tailors the trap to the prey.
• • •
I won’t pretend, in these pages, that the science of learning has been worked out. It hasn’t, and the field is producing a swarm of new ideas that continue to complicate the picture. Dyslexia improves pattern recognition. Bilingual kids are better learners. Math anxiety is a brain disorder. Games are the best learning tool. Music training enhances science aptitude. But much of this is background noise, a rustling of the leaves. The aim in this book is to trace the trunk of the tree, the basic theory and findings that have stood up to scrutiny — and upon which learning can be improved.
The book unfolds in four sections, and from the bottom up, so to speak. It will begin with an introduction to what scientists know about how brain cells form and hold on to new information. Having a handle on this basic biology will provide a strong physical analogy for the so-called cognitive basis of learning. Cognitive science is a step up the ladder from biology and, most important for us, it clarifies how remembering, forgetting, and learning are related. These two chapters form the theoretical foundation for all that follows.
The second section will detail techniques that strengthen our hold on facts, whether we’re trying to remember Arabic characters, the elements of the periodic table, or the major players of the Velvet Revolution. Retention tools. The third section will focus on comprehension techniques, the kind we need to solve problems in math and science, as well as work our way through long, complex assignments, like term papers, work presentations, blueprints, and compositions. Appreciating how these approaches work, or at least how scientists think they do, will help us remember them and, more critically, decide whether they’re of any practical use — today, in our daily lives. And finally, section four will explore two ways to co-opt the subconscious mind to amplify the techniques we’ve just described. I think of this as the “learning without thinking” part of the story, and it’s a reassuring one to hear — and to tell.
The treasure at the end of this rainbow is not necessarily “brilliance.” Brilliance is a fine aspiration, and Godspeed to those who have the genes, drive, luck, and connections to win that lottery. But shooting for a goal so vague puts a person at risk of worshiping an ideal — and missing the target. No, this book is about something that is, at once, more humble and more grand: How to integrate the exotica of new subjects into daily life, in a way that makes them seep under our skin. How to make learning more a part of living and less an isolated chore. We will mine the latest science to unearth the tools necessary to pull this off, and to do so without feeling buried or oppressed. And we will show that some of what we’ve been taught to think of as our worst enemies — laziness, ignorance, distraction — can also work in our favor.