When Medicare coverage for a widely used assistive hearing device was threatened, a team of Johns Hopkins medical experts stepped into the breach.
Illustration by Gérard Dubois | Photography by Justin Tsucalas
Neurobiologist Brad May has always been more interested in scientific affairs than political ones. He spends the majority of his professional life in his lab in the Department of Otolaryngology–Head and Neck Surgery at Johns Hopkins, where he studies how the brain processes sound.
May had taken what he calls an “informed but not engaged” approach to politics—enough to get an “I Voted” sticker at elections, but that’s about it. He’d never actively campaigned for candidates, solicited donations for issues or lobbied for any political cause.
Yet last October, May, neurotologist John Carey and audiologist Colleen Ryan-Bane found themselves meeting with a governmental committee—representatives from the Centers for Medicare and Medicaid Services (CMS), which had proposed cutting Medicare coverage for an assistive hearing device known as an osseointegrated auditory implant (OAI). OAIs are widely used by those with single-sided deafness or challenging cases of conductive hearing loss. A loss of coverage would ultimately affect patients at Johns Hopkins and across the country.
“In terms of medical impact,” May remembers thinking, “this could be the most important thing I’ve ever done.”
May has spent the last few decades investigating a basic science question: How do nerve cells in the brain respond to various aspects of sound?
For example, when ears pick up sound, some of the brain’s neurons fire in response. But when that sound emanates from a different location, how does that nerve firing change? Does another group of neurons respond to sound from the second location? Or does the first group continue to fire—but in a different pattern?
Animal models have always been the key to his work. Over the years, starting with his undergraduate degree in zoology, his research has included various members of the animal kingdom, ranging from rats to cats to primates. But starting in 2006, he had his first opportunity to apply his understanding of auditory neuroscience to hearing-impaired patients.
May was contacted by John Niparko, then the director of the Johns Hopkins Listening Center, who surgically implanted OAIs in patients with single-sided deafness. The surgery places a titanium abutment just behind a patient’s nonhearing ear, and a sound processor sits on top of the abutment. When the processor detects sounds from the environment, the abutment picks up its vibrations and sends them through the patient’s skull to the inner ear—the portion that converts sounds into neural impulses sent on to the brain—bypassing the nonfunctioning ear entirely.
Candidates for OAIs must have one normally functioning ear, and many are completely deaf in the other ear. This pattern of single-sided deafness is often encountered among individuals with an acoustic neuroma—a type of “benign” tumor growing on the vestibulocochlear nerve, which connects the ear to the brain—or those who’ve experienced a viral infection, trauma or other cause. Traditional hearing aids don’t work for this population because they only amplify sound, which won’t help in a “dead” ear.
It’s that group of patients that Niparko was most interested in, and he needed May’s help to study them. Niparko and his colleagues had already used sophisticated audiology measurements to study some patients with single-sided deafness who wear OAIs. But the results showed only a slight advantage over not having an OAI at all.
“It wasn’t anything that would knock your socks off, if you saw the data,” May remembers.
Those findings puzzled May, because OAI patients were so enthusiastic about their devices that they couldn’t imagine being without them. They needed them to navigate their work, social and home lives. Such positive reviews aren’t the norm with traditional hearing aids, whose users often report a love-hate relationship with their devices.
With such a discrepancy between the audiology measurements and patient satisfaction, there had to be more to the story.
Perhaps, May reasoned, the audiology tests themselves weren’t measuring the right things. “What we really needed,” he concluded, “was a new test.”
Leaning on his decades of studying how animals process sound, May realized that the problem might be rooted in the focus of standard audiology tests. Rather than looking at what’s happening in the brain, the usual tests focus only on what’s happening in the ear. It’s the difference between perceiving sound versus understanding information, May explains, or the difference between simply hearing and really listening.
Looking for a way to figure out what patients were really getting from their OAIs, May and his colleagues, including audiologist Steve Bowditch, turned to an unlikely resource: aviator call signs. These nicknames were given to pilots as far back as World War I and were intended to simplify the task for ground controllers who needed to send directions to pilots. The call signs were also meant to confuse enemies who might be listening in to radio communications.
May knew that recordings using these call signs already existed in the public domain. The recordings include male and female voices reading directions to call signs along with colors and numbers, directing pilots to sections of maps (e.g., “Arrow, go to Blue 2”). These recordings would be ideal for his testing purposes, because each sentence contains critical information that could be reported in a test situation; in this example, the color blue and the number two.
So Bowditch asked Johns Hopkins patients who had been using OAIs for a minimum of three months if they would be willing to participate in one 20-minute study. They’d sit in front of a row of speakers that delivered voices with call signs and coordinates but would be directed to pay attention to just one voice delivering an assigned call sign—a male voice that delivered coordinates to “Tiger,” for example. Then the volunteers would push buttons on a screen that corresponded to the directions the voice delivered.
In some trials, only the assigned voice delivered instructions, making it easy to pay attention to the target voice and report correct coordinates, Bowditch says. Every subject showed nearly perfect performance on these trials. On other trials, confusing voices delivered instructions from other locations, making it more difficult to pay attention to the target voice. Even Bowditch, who has normal hearing and tried the test himself before he administered it, found it extremely demanding to answer commands correctly as the number of distracting voices increased.
“I was surprised how hard it was,” he remembers. “I wasn’t expecting these patients to show any significant improvement.”
But they did. With the device on, the patients were better able to filter out distracting voices. Error rates fell by about one-half relative to individuals with single-sided deafness who did not use an OAI.
The reason, May suggests, is that like people with normal hearing who have two streams of information derived from their two functioning ears, OAI users also have two streams of information: one from their normal ear, and the other from their device.
These dual streams allowed patients with single-sided deafness to overcome the common challenges of the hearing impaired: going out for dinner at crowded restaurants, teaching in noisy classrooms, being part of the discussion at business meetings with multiple speakers. In a nutshell, May explains, the OAIs were invaluable for their users—and it would be devastating for many if coverage for these devices was lost.
When David Parker, director of federal affairs in Johns Hopkins’ Office of Government and Community Affairs, first heard of OAIs, it was in a communication from the Department of Otolaryngology–Head and Neck Surgery. CMS, he learned, was considering a benefits change that would lump OAIs in the same category as regular hearing aids, which have never been covered.
Parker—a graduate of Johns Hopkins’ Bloomberg School of Public Health who worked at CMS for over a decade before landing at Johns Hopkins in early 2014—wanted to find out about the potential impact on patients with OAIs. In his research, he learned that OAIs are significantly different from traditional hearing aids. Besides necessitating major surgery to install, OAIs also need regular maintenance and require upgrades every four to five years.
“If CMS decided to change its coverage of the device, which included device maintenance and upgrades,” Parker says, “we would have patients with nonfunctioning implants.” (A new processor alone costs between $4,000 and $5,000.)
The stakes were even higher, he says, because private insurance companies usually follow Medicare’s lead—so patients with private coverage would likely be denied eventually, too.
Before making its determination, CMS would provide an open commentary period. Parker requested a meeting with CMS decision-makers so the Johns Hopkins Medicine team could explain why OAIs are vastly different from standard hearing aids.
Carey—chief of the Division of Otology and Neurotology, which performs OAI surgeries—would explain how the surgery and the device itself worked. May could talk about his research showing what the device was actually doing in patients’ brains. And Ryan-Bane, who fits and trains many patients in practices including Carey’s, could tell the committee exactly what she hears from patients about the impact of OAIs on their lives.
Ryan-Bane says the stakes for the CMS decision were incredibly high. Being unable to hear on one side was more than a simple inconvenience for her patients—it was another devastating blow after many had already endured dealing with acoustic neuromas or other problems that caused their deafness.
Part of Ryan-Bane’s evaluation appointment usually includes a demo of the device that patients can wear on a headband. “Some patients cry just during that demo,” she says. “Maybe they’ve already been through a lot of surgeries at this point just to take care of their disease. This is one surgery that feels more positive and forward-moving for them, that can give them some good outcome for the future and give them back something that was lost.”
On the day of their CMS committee meeting in late October, Carey, May and Ryan-Bane piled into Parker’s car to drive to the CMS campus in the Baltimore suburb of Woodlawn.
The team’s initial calm turned to anxiety after they got to their assigned conference room and set up for their presentation. Seven CMS stakeholders strolled into the meeting—more than double what the Johns Hopkins team was expecting.
“They were pretty poker-faced,” Carey remembers. “That was probably the most unnerving part.”
But those hardened visages didn’t last for long, he says. As he and the team began their presentation, the CMS committee was clearly interested, beginning a lively dialogue with him and the other presenters about all aspects of OAIs. When the meeting wrapped up, CMS told the team that their presentation—the last they’d hear before making a decision—was the most informative of any they’d heard.
It was a promising sign, Parker says, but no guarantee. “I thought our team really hit the nail on the head with what CMS needed to make an informed decision,” he remembers, “but there was no predicting what they’d do.”
On Oct. 31, 2014, the news came in: CMS would continue funding OAIs after all, allowing patients across the country to breathe an audible sigh of relief.
The decision and all that led up to it have also impacted the research of May and others. At a recent meeting of the Association for Research in Otolaryngology, May heard that other scientists were using his call sign method to test for listening differences in patients with cochlear implants, another device that aims to restore hearing. Eventually, he predicts, hearing devices such as OAIs and cochlear implants could be better designed to optimize the cues that patients are using to function in complex listening environments, allowing these devices to better simulate normal hearing.
May says that when he first began studying the neuroscience of hearing, he never dreamed he’d end up testifying before a governmental committee that would ultimately decide to fund OAIs for thousands of patients across the country.
“You never know what the application of knowledge will be,” he says, “until something like this happens.”