A Hopkins biomedical engineer hopes to harness the brain’s signals to operate the next-generation prosthetic arm.
May 2009--Nitish Thakor dreams of a day when an amputee will be able to point a finger, turn a doorknob or grasp a cup with a prosthetic arm simply by concentrating on those movements.
He shares this vision with a large international consortium of scientists and engineers who are striving to build what Thakor calls a “space-age” prosthesis, through a contract awarded under the Defense Advanced Research Projects Agency’s Revolutionizing Prosthetics Program. The Johns Hopkins Applied Physics Laboratory is the lead manager of the program.
Current state-of-the-art prosthetic arms have limited movements, which users control with muscle signals. For example, a wearer might shrug a shoulder to make the hand open and close. Thakor and his colleagues envision an arm with 22 degrees of freedom—the wrist will rotate, individual fingers will bend. And all of these movements will respond directly to the user’s intentions, through a system that monitors brain activity. “In contrast to a prosthetic claw,” says Thakor, “this one would be more dexterous and intuitive, in general more anthropomorphic, or humanlike.”
The concept relies on the fact that even after a person loses an arm, her motor neurons continue to fire when she imagines moving her arm, hand or fingers. Thakor aims to decode these neural messages to figure out which neural patterns trigger which activities, and then to use a computer to translate signals generated by the brain’s motor cortex into different movements of a prosthetic arm.
On a recent afternoon in his bustling lab, Thakor and graduate student Ryan Smith run through a series of slides and videos that demonstrate the group’s several levels of research in this field of neuroprosthetics.
One video shows a volunteer wearing a cap embedded with 64 electrodes that monitor his brain waves. The electrodes feed a brain-signal amplifier and processing unit that digitizes these signals and translates them into an EEG recording on a monitor. More specifically, the monitor shows an aspect of the brain waves called the mu band. When the volunteer imagines performing a manual task, such as grasping a ball, the mu band changes its pattern. In a similar manner, the volunteer learns to modulate a somewhat higher-frequency rhythm called the beta rhythm. During training sessions, he learns to shift the patterns of these rhythms at will by focusing his thoughts. The system is called a brain-computer interface.
The researchers have also incorporated a mechanical hand into the system, so that changes in the volunteer’s brain wave patterns are translated into movement of the mechanical hand. Smith calls up another video. It shows a seated volunteer watching a shiny blue robotic hand whose fingers readily round into a loose fist. A researcher tells the volunteer, “Open.” Slowly, with a whirring sound, the hand opens. “Close,” the researcher says. Whirring some more, the fingers close.
The system has shown proof of principle, says Thakor. But it has limitations. Electrodes on the scalp provide only a macroscopic view of brain activity. Thakor would like to peer even closer.
So in a second set of studies, his group is analyzing neural signals recorded from the surface of the brain’s cortex. Placing electrodes in the brain is an invasive procedure, so graduate students Soumyadipta Acharya and Heather Benzare work with patients who have had the electrodes implanted for another purpose. The patients have epilepsy and are in the hospital so that doctors can pinpoint the neural area responsible for their seizures. For these tests, surgeons temporarily lay a grid of electrodes beneath the skull, directly on the surface of the cortex. A neurologist then takes readings from the electrodes to discern which areas relate to seizures and which govern specific functions. In conjunction with those tests, Thakor’s group collects readings from the somatosensory cortex while volunteers perform various manual tasks, ranging from simple finger movements to complex grasps.
The researchers then use neural network algorithms and pattern recognition software that they have developed to identify patterns of brain activity that correspond to specific gestures. Next, Thakor plans to introduce a mechanical hand into the experimental design; a patient’s thoughts of bending an index finger or squeezing a ball will be relayed through the electrodes to a computer and on to the mechanical hand, which will execute the movement. “This summer, we will have a patient sitting in a bed controlling the fingers on a mechanical hand,” says Thakor.
Yet Thakor would like to get closer still to the root of finger movement, down to the level of the individual neuron. So in another set of experiments he is collaborating with a team led by neuroscientist Marc Schieber at the University of Rochester that is implanting ultrathin microelectrodes into the cortices of monkeys. Each microelectrode detects the activity of only 40 to 100 neurons. The researchers then record and interpret the neural signals emitted when the monkeys perform various finger movements. Although the brain may use millions of neurons to perform highly coordinated and complex hand motions, says Thakor, the message of as few as 40 neurons appears to be enough to decode simple gestures. It will take many more electrodes, a much more sophisticated decoding approach, and an extremely sophisticated prosthetic hand to begin to approach human dexterity. “We have a long way to go to create a fully implantable and autonomously brain-controlled dexterous prosthesis,” notes Thakor.
But a day may come when surgeons implant such electrodes into the brains of amputees and relay their output to a prosthetic arm. “That endpoint, however, depends on a whole lot of ethical issues,” says Thakor. It’s one thing to study volunteers who have already agreed to have brain surgery for epilepsy; it’s another to ask an amputee to volunteer for brain surgery as part of a study. “Clinically, can it be justified?” asks Thakor. “The technology would have to be shown to be reliable and suitable for the long term, and the benefit to the patient would have to be greater than the trauma and the risk of surgery.”