As they text, tweet, surf and IM, are grad students still learning how to think?
January 2012--When Douglas Robinson attended graduate school in the mid-1990s, he brought only a notebook and a pen to his classes. That’s it. No laptop, no tablet, no smartphone.
Today when Robinson, an associate professor of cell biology, lectures to graduate and medical students at the Johns Hopkins School of Medicine, he looks out over a sea of glowing laptops, occasionally hears the buzz of a cell phone and spies fingers frenziedly punching out text messages. And he wonders what it all means. Are students truly able to concentrate with all of these digital distractions?
Other faculty members are asking similar questions. So too are many cultural scholars and authors, such as Nicholas Carr, author of The Shallows: What the Internet Is Doing to Our Brains. The prognosis, according to Carr, is dim. The habits engendered by the digital revolution—flitting from one web page to the next, issuing missives no longer than 140 characters, expecting answers instantly and continuously—are diminishing our ability to think deeply, critically and creatively.
If such a transformation is, in fact, occurring, then the implications are profound for science and medicine, disciplines reliant on such skills. Will the current generation of plugged-in grad students perform as well as their predecessors?
Indeed, a growing body of research gives weight to fears like Carr’s, showing that the habits engendered by the digital lifestyle come with a cognitive cost.
One of those habits is multitasking. According to Steven Yantis, Johns Hopkins chair of Psychological and Brain Sciences, studies, including his own, show that multitasking simply doesn’t work. “Multitasking is a misnomer,” says Yantis. While people may be able to chew gum and cross the street at the same time, or fold laundry and listen to the radio, the same does not apply to tasks requiring concerted cognitive decisions.
“People tend to believe they’re better at multitasking than they are,” says Yantis. “But tons of evidence shows that, for example, talking on the cell phone while driving is equivalent to driving under high levels of intoxication.”
In his studies, Yantis has found that, rather than complete tasks A and B simultaneously, people are, in fact, shifting from A to B to A and so on, a process that incurs a “task-switch cost.” The brain’s working memory has only so much capacity, explains Yantis. So shifting from task A to task B to task A requires the brain to reload into working memory the details of task A—and that takes extra time.
So students who switch from reading an online book chapter to texting a friend to checking out products on eBay may take longer to absorb the book’s information, or may comprehend it less fully, than if they had given each task their undivided attention.
At the same time, however, our brains may find the siren call of all those distracting beeps and chimes and bright flashing icons neurochemically irresistible, further studies by Yantis and others suggest. The “ping” of an email delivery may excite our neurochemical reward circuits. “I don’t want to throw around the term ‘addiction’ lightly,” says Yantis. “But people can become distracted by information at a level they can’t control.”
But outside of the laboratory and in the classroom, are such mechanisms having an effect on student learning? Are students now more distracted and less capable of deep thought than they used to be? Has student performance declined?
Here’s where the research on technology and distraction offers fewer answers. “You hear a lot of anecdotes about students being more distracted,” says Robinson. “But are they scientifically valid? The issue should probably be studied for our students.”
Nevertheless, Robinson and many other faculty members say that digital technology has undeniably altered the climate in the classroom and lab. Whether or not those tools are changing brain circuitry, they are challenging certain standards of behavior.
Many faculty members have a personal story about a student, an electronic device and a breach of etiquette. For Robinson, it was the undergraduate who, while talking with Robinson in his office about a research project, paused to answer his cell phone. Incredulous, Robinson stood up while the young man chatted on his phone, went to his whiteboard and wrote the following: “Word to the wise: Never ever take a phone call while talking to your professor.”
For Thomas Burke, associate dean for Public Health Practice and Training at the Bloomberg School of Public Health, it was the day he stood in a glass-walled projection booth at the back of a lecture hall while a colleague delivered a lecture. Through the glass, Burke saw a student in front of him browsing hairstyles on her laptop. Burke banged on the glass to get her attention and told her to search for hairstyles some other time.
Laptop computers and tablets, says Burke, “have changed the whole dynamic in the classroom.”
In response, Burke discourages students from using laptops in his lecture classes. In his research lab, Robinson permits people to play the radio but asks them not to wear headphones. “The lab is an interactive environment,” he says. “If you’re walled off because you have headphones on, you’re not able to interact.”
But in many classes, restricting the use of certain technologies isn’t feasible because those same technologies have become an integral part of the learning experience. In school of medicine lecture classes, students may download a presenter’s slides at the start of each lecture and scroll through them on their laptops while listening to the presentation.
This year, for example, Lorsch introduced an “audience-response system” in his lectures. At the start of class each student receives a handheld remote control “clicker” for responding to questions. Periodically during the class, he poses a multiple-choice question about a segment of the lecture. Students answer using their clickers, and a histogram of the group’s answers then appears on the lecture hall screen. Lorsch says he finds the information invaluable in assessing how well he is communicating. If many students register an incorrect answer, he goes over the lecture segment again.
Students, says Lorsch, “haven’t changed biologically. I don’t think they’re less able to concentrate than students were 20 years ago.” However, they may be accustomed to technology-driven interactive learning. So faculty should change with the times. “If students are bored, I figure it’s my job to engage them better,” Lorsch says.
Indeed, technology is not going away. The days of notebook and pen in the classroom are vanishing. And as teachers compete for students’ attention amid the electronic hubbub, the best approach may be to embrace the best of both the old and the new, says Pier Forni, professor of Italian literature and director of the Johns Hopkins Civility Project.
Forni has examined the digital revolution’s impact on learning and everyday life in an article called “The Civil Classroom in the Age of the Net” and in a recently published book, The Thinking Life: How to Thrive in the Age of Distraction.
He believes that deep thinking has suffered as a consequence of the age’s digital distractions. But the solution is to understand how the culture is changing and work with those changes. The “old” way of learning, Forni says, focused on retention: concentrating on an idea long enough to turn it over in the mind, apprehend its different parts and connect the idea to what was already known about the world. The “new” mode emphasizes retrieval—for example, plumbing the millions of pages of the Web for an answer.
In truth, says Forni, both are valuable skills. “The goal is to find a reasonable and enlightened compromise between the necessity of retention and the ability of retrieval.” Good teachers, he says, will guide students in seeking that balance.