Music on the Brain

By Catherine Gara

February 2017 — About 10 years ago, Xiaoqin Wang visited a school for the deaf in his native China with a colleague. A teacher brought a boy wearing a cochlear implant over to him and reminded the youngster to greet him. But when the boy formed the words “Hi, Uncle,” they came out with a distorted pitch, as if he had a big tongue. Wang explains that Chinese is a tonal language: Each syllable can be pronounced with one of four tones. Sometimes an incorrect intonation changes the meaning of the word; sometimes it just makes the speaker hard to understand.
 
“The boy was perfectly capable of pronouncing the words correctly, but he was unaware of his error,” says Wang, an electrical engineer by training and an expert on the neuroscience of hearing. “That’s partly because cochlear implants don’t distinguish pitch well, which means they are also bad at conveying music and picking out voices in noisy environments.”
 
A cochlear implant. Credit: NIH/NIDCD
It was an “aha” moment for Wang, who for 21 years has worked with small primates called marmosets to understand how the brain processes complex sounds. Thanks to the rich vocalizations of these animals and the similarity of their brains to our own, his research group has identified areas of the brain that process various aspects of sound, learned more about how cochlear implants work and discovered that humans are not alone in our ability to perceive pitch. Now, he wondered whether that work could also find ways to help people with the implants produce tones more accurately.

Bionic Ears
 
At the time of Wang’s visit to the school for the deaf, his research group was in the middle of experiments on a process by which the brain listens to and critiques one’s own speech, known as auditory feedback. When you start singing, hit a wrong note and try again, auditory feedback is at work. 
 
In its studies on marmoset monkeys, Wang’s group identified which neurons fire when there is a mismatch between what the animals intend to “say” and what they actually hear themselves saying. Presumably, these neurons guide the brain to produce sounds that more closely match the intended sound.
 
A marmoset. Credit: iStock
“In patients with cochlear implants, this process is impaired because the device conveys an impoverished version of what the person has pronounced,” says Charles Della Santina, a cochlear implant surgeon and collaborator of Wang’s.
 
Della Santina explains that in normal hearing, sound waves reach the cochlea, a spiral structure in the inner ear. The cochlea converts them into electrical impulses and sends those to the auditory nerve, which passes them along to various parts of the brain for interpretation. Cochlear implants are electronic substitutes for the cochlea.
 
After his visit with the deaf children, Wang says, “I returned to the lab and reviewed the journals. It was already reported that cochlear implant users do poorly with tonal languages and music, and in noisy environments, and very little was known about how the brain processes signals from the device at the level of individual neurons. So we began to investigate these issues in our marmosets.”
 
That project attracted graduate students Luke Johnson and Kai Yuen Lim to Wang’s lab. Just recently, Wang’s team reported some of its latest findings from marmosets that were deaf in one ear. The researchers implanted electrodes into the monkeys’ auditory cortex, the hearing center of the brain, to monitor the activity of more than 1,400 neurons while playing sounds in the good ear and stimulating the auditory nerve of the other with electrical currents, similar to what happens with conventional cochlear implants.
 
They found that many of the neurons that are usually activated by sound did not respond to the electrical stimulation, including some known to respond to complex acoustic features, like pitch.
 
“Not only did the electrical current not reach the right neurons, we think it scattered and activated the wrong neurons, confusing the brain,” says Lim. “But when we used a more advanced, more focused electrode configuration, we were able to activate particular neurons that might allow a cochlear implant patient to understand speech in noisy environments.”
 
He hopes that future studies will shed more light on precisely which neurons need to be stimulated so that cochlear implant users can enjoy the full richness of sounds, especially music.
 
The Sound of Music
 
Music is an ancient language — one common to all human beings that taps into our emotions in a special way. It can bring us to tears, it can rouse soldiers to fight, it can soothe us to sleep. The first humans to make and appreciate music had to be able to hear it. But why were their brains already equipped to understand sounds that had never been made before?
 
“The brain’s ability to appreciate music seems to have evolved before our ability to make it,” says Wang. “Marmosets deviated from humans on the evolutionary path about 30 million years ago but share with us the ability to perceive pitch, which is an essential component of the melodies and harmonies of music.”
 
The first evidence that marmosets could perceive pitch came about a decade ago, when Wang’s team identified a region of neurons in the marmoset brain that was only active after the monkeys heard sounds with pitch, like a melody, not sounds without pitch, like noise. Though it was likely that the monkeys were perceiving pitch, the scientists couldn’t know for sure without some sort of indication from the marmosets that they could tell the difference between tones.
 
It took years, but then-graduate student Xindong Song found that indication. Song and other members of Wang’s lab trained the marmosets to lick a waterspout every time they heard a change in pitch and were able to show that marmosets perceive pitch the same way we do. 
Xindong Song working in the lab.
 
“That was the first time anyone demonstrated humanlike pitch perception in a nonhuman animal,” says Song, now a postdoctoral fellow. “And so far, no other animal has been shown to possess the same quality, though only a few species have been tested because the experiments are so challenging.” 
 
Building on that work, former graduate student Lei Feng and Wang published a study just last month showing that marmosets’ brains contain neurons that can identify not only individual pitches, like the notes of a piano scale, but also pitches played together, as in a chord. They monitored hundreds of neurons to figure out which were responsible for this trait.
 
Despite their virtuoso hearing abilities, marmosets don’t seem to perceive music the way we do, Wang says. Instead, he thinks their pitch discrimination evolved to allow them to make and understand the harmonic vocalizations used by some animals in their native habitat, South American rainforests.
 
“We think marmosets have evolved sophisticated hearing abilities to aid in their survival,” says Wang, “to help them find prey, escape from predators and communicate with each other.”
 
Those same abilities now allow humans to enjoy music and understand tonal languages. Every time we listen to music, sing or try to converse in a noisy room, our brains are working hard to find relationships between tones and pick out important ups and downs. Wang’s team hopes to learn enough about that process to one day enable cochlear implant users to enjoy the sound of music.