Brain-Computer Interface Study
  
 

The Johns Hopkins Department of Physical Medicine and Rehabilitation (PM&R), in collaboration with the Johns Hopkins University Applied Physics Laboratory and the Department of Neurology and Neurosurgery, has been awarded a grant by the Defense Advanced Research Projects Agency (DARPA) to conduct a clinical trial focused on recording from and stimulating the brain of a person with tetraplegia.

    
 
  1. Controlling Assistive Devices

    The implanted microelectrode arrays (MEAs) allow a computer to read the signals coming from the participant’s motor cortex. These signals convey his intention to move. The computer decodes these signals and transmits the result to the robotic arm, allowing the participant to control its movements much as he would his own arm.

    The research team is investigating uses for the MEAs beyond controlling prosthetics. For example, the participant uses neurosignals to control a cursor on a computer screen or to issue commands to a smart home system.

    Additionally, we are testing whether the participant can control two devices at once using only neurosignals, such as a wheelchair and a robotic arm at the same time.

  2. Perceiving Physical Stimuli

    While controlling a prosthetic limb is groundbreaking for someone with paralysis, it only solves half the problem. Without sensation, the participant would be controlling an arm that is essentially numb. Without sensory feedback, it is difficult to use the correct amount of pressure to pick up a cup, for example. Sensation also allows the participant to know where the arm is without having to look at it.

    The goal behind implanting MEAs into the participant’s sensory cortex is to simulate sensation, so that when the prosthetic arm is touched, the participant can feel that touch and pinpoint its location. This ability will make robotic limbs much easier to use.

    The research team wants to take perception beyond the scope of prosthetics as well. The goal is to use sensory cues as feedback for certain actions, such as a tap on the pinky indicating that a smart home system has locked the front door; the sketch below illustrates this kind of event-to-cue mapping.
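To make the idea concrete, here is a minimal sketch of how system events might be translated into tactile cues. It is illustrative only: the event names, the cue map, and the stimulate() stub are hypothetical stand-ins, not the study’s actual software.

    # Illustrative sketch: translating smart home events into tactile cues.
    # The event names, cue map, and stimulate() stub are hypothetical.

    CUE_MAP = {
        "front_door_locked":   {"site": "right_pinky", "pulses": 1},
        "front_door_unlocked": {"site": "right_pinky", "pulses": 2},
        "oven_left_on":        {"site": "right_thumb", "pulses": 3},
    }

    def stimulate(site: str, pulses: int) -> None:
        """Stand-in for delivering a brief pulse train to the electrodes
        whose stimulation the participant perceives at this skin region."""
        print(f"stimulating {site}: {pulses} pulse burst(s)")

    def notify(event: str) -> None:
        """Translate a smart home event into a sensory cue, if one is defined."""
        cue = CUE_MAP.get(event)
        if cue is not None:
            stimulate(cue["site"], cue["pulses"])

    notify("front_door_locked")  # e.g., the participant feels one tap on the pinky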

 
 

What Makes This Study Different

This clinical trial is ongoing, and our research team is still working on publishing our findings. For now, we can share a few things that set this project apart:

Neurosignal Decoder

We built a decoder, a device that reads neurosignals and translates them into movement. Our participant is also learning how to adjust his neurosignals for better control. A simplified sketch of how such a decoder works appears below.
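For readers curious about the mechanics, the sketch below shows the general shape of a neural decoder under a deliberately simple assumption: a linear mapping from binned firing rates to a two-dimensional velocity command, fit by ridge regression. The channel count and calibration data are synthetic placeholders, not the study’s actual decoder.

    import numpy as np

    # Illustrative sketch: a minimal linear decoder. The channel count,
    # calibration data, and regression approach are assumptions for this
    # example, not the study's actual system.

    N_CHANNELS = 96  # hypothetical number of recording channels on one MEA
    rng = np.random.default_rng(0)

    # Stand-in calibration data: binned firing rates (trials x channels)
    # paired with the intended 2-D velocities for those trials.
    X = rng.poisson(lam=5.0, size=(500, N_CHANNELS)).astype(float)
    y = rng.normal(size=(500, 2))

    # Fit decoder weights with ridge regression: W = (X'X + aI)^-1 X'y
    alpha = 1.0
    W = np.linalg.solve(X.T @ X + alpha * np.eye(N_CHANNELS), X.T @ y)

    def decode_velocity(rates: np.ndarray) -> np.ndarray:
        """Translate one bin of firing rates into a (vx, vy) command."""
        return rates @ W

    # Each new bin of neural data becomes a movement command for the arm.
    new_rates = rng.poisson(lam=5.0, size=N_CHANNELS).astype(float)
    vx, vy = decode_velocity(new_rates)
    print(f"velocity command: ({vx:.2f}, {vy:.2f})")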

Bilateral Implantation

Our team was the first to implant MEAs in both hemispheres of the brain. With this bilateral implantation, we aim to allow the participant to control right and left robotic arms independently and to perceive touch on both hands simultaneously, distinguishing one from the other.

Simulating Sensation

We’re working to make sensations feel more natural for the participant, altering the neurosignals we deliver to produce different perceptions of touch. One simple way to vary those signals is sketched below.
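As one illustration of the idea, the sketch below maps a prosthetic fingertip’s sensor pressure onto stimulation amplitude and pulse frequency, a mapping commonly explored in the brain-computer interface literature. Every number and name here is a hypothetical placeholder, not the study’s actual parameters.

    # Illustrative sketch: scaling fingertip pressure to stimulation
    # parameters. All ranges and units here are hypothetical placeholders.

    PRESSURE_MAX = 10.0               # sensor full scale, arbitrary units
    AMP_MIN, AMP_MAX = 20.0, 80.0     # pulse amplitude range (microamps)
    FREQ_MIN, FREQ_MAX = 50.0, 300.0  # pulse-train frequency range (Hz)

    def touch_to_stim(pressure: float) -> tuple[float, float]:
        """Scale fingertip pressure to (amplitude, frequency).

        Stronger presses map to stronger, faster pulse trains, which the
        participant may perceive as a firmer touch.
        """
        p = min(max(pressure / PRESSURE_MAX, 0.0), 1.0)  # clamp to [0, 1]
        amplitude = AMP_MIN + p * (AMP_MAX - AMP_MIN)
        frequency = FREQ_MIN + p * (FREQ_MAX - FREQ_MIN)
        return amplitude, frequency

    for pressure in (1.0, 5.0, 9.0):  # light, medium, firm touches
        amp, freq = touch_to_stim(pressure)
        print(f"pressure {pressure}: {amp:.0f} uA at {freq:.0f} Hz")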

 
 

The Research Team

This study is a close collaboration among several Johns Hopkins groups.

Physical Medicine and Rehabilitation

Pablo Celnik, M.D.
Gabriela Cantarero, Ph.D.
Robert Nickl, Ph.D.
Manuel Anaya, M.D.

Applied Physics Laboratory

Francesco Tenore, M.S.E., Ph.D.
Matthew Fifer, Ph.D.
Matthew Rich, M.S.
Margaret Thompson, Ph.D.
Brock Wester, Ph.D.
Adam Cohen, M.D.

Neurology and Neurosurgery

Research Collaborators and Consultants

David McMullen, M.D.
 
 
Photo: The research team and their participant.
 

Media Inquiries

If you would like to interview the study participant and his care team, please contact Waun’Shae Blount.

  