
HeadWay Winter 2012

Listening in on language

Date: January 3, 2012

Hillary Ganek, Kristin Ceh, Frank Lin and Deborah Bervinchak are all using the LENA device to evaluate how well children with new cochlear implants are learning language.

Thousands of deaf and hard-of-hearing patients of all ages use cochlear implants (CIs) to navigate the world of sound. But accurately measuring this device’s effects in the youngest patients could mean a world of difference later in life, says otologist Frank Lin.

That’s because, as a series of studies showed in the late 1980s and 1990s, the amount of language a child hears early on has significant effects on language development, school performance and even IQ, Lin explains. Because children who have CIs may not be exposed to as many words as a child with typical hearing, they could be left with permanent deficits unless a doctor or therapist intervenes.

“When our patients come in for regular therapy sessions,” Lin says, “our deaf educators ask their parents a lot of questions. How is he communicating? What kinds of noises is she making? But this just gives them a rough guess of what’s going on at home. We really had no way to directly measure how many words children were hearing and how well the children themselves were communicating.”

Recently, Lin and his colleagues found a tool that will finally give them the data they’ve been lacking. Called LENA, for Language ENvironment Analysis, the device was developed by a nonprofit organization, the LENA Foundation, which set out to automate the methods used by earlier researchers to study the connection between heard words and IQ. It is a huge advance over those older studies, in which researchers visited subjects’ homes in person with tape recorders and spent countless hours tallying every word; LENA streamlines the whole process.

Rather than just adding up words, Lin says, the device gives the therapist a comprehensive picture of how much and what kinds of sounds a child is exposed to, such as how many words are spoken to the child directly, how many utterances the child produces, and how many conversational turns—the back-and-forth between the child and conversational partner—take place. It also analyzes how much background conversation the child hears, how much heard language comes from electronic media such as televisions or radio, and how much extraneous sound the child is hearing throughout the day from other sources.

“The amount of data that LENA gives back in such a user-friendly way is changing the way we do therapy,” says Lin’s colleague, speech-language pathologist Hillary Ganek.

Ganek explains that she and her colleagues, including deaf educators Kristin Ceh and Deborah Bervinchak, now have six LENAs that they hand out on a regular basis to parents whose children recently received CIs. Those children wear the cell-phone-sized device in specially made clothes that keep it high on their chests, an ideal location to sense sound from the child and those around the child. After several days, parents send the device back to Johns Hopkins, where the therapist downloads the collected sound files and uses LENA’s software to analyze their components, a process that takes about two hours because of the wealth of collected data.

Based on what the files show, Ganek explains, she and her colleagues can make recommendations either to help parents continue their good work in helping their children learn language or to correct problems before they become entrenched.

“Before, we could only measure the child’s progress,” Ganek says. “Now we have the ability to measure the parents’ progress, too.”

“We finally have a realistic way,” Lin says, “to see whether our interventions are really making a difference.” 

For more information, call 410-955-9397.