Listening in on the brain: A 15-year odyssey

Stanford engineers and neurosurgeons have worked together to develop an experimental technology that could one day allow people with paralysis to affect the world around them using only their minds.

By Elizabeth Svoboda

Electrical engineer Krishna Shenoy began dreaming of developing a brain-controlled implant before coming to Stanford in 2001.
Paul Sakuma

Billions of text messages and emails are sent around the globe each day. Few of us ever pause to think about the complex interplay of neurons, synapses and muscles at work as the brain transmits a thought to the fingertips that type and send those messages into the world.

But how this all works is a question of life-altering significance for those who are paralyzed or have severe movement disabilities — and one that electrical engineer and neuroscientist Krishna Shenoy, PhD, and his collaborators have long sought to understand.

Their goal for the past 15 years has been to create a brain-controlled prosthesis, a device that channels thoughts into words or movements. And after years of research with monkeys, Shenoy, a professor of electrical engineering, is part of a national consortium that has begun clinical trials in which a tiny electronic device has been implanted into the brains of three movement-impaired people.

Among other things, this device, about the size of a baby aspirin, allows these people to compose and send text messages merely by thinking about moving their hands. Neurosurgery professor Jaimie Henderson, MD, Shenoy’s longtime friend and collaborator who performed two of the implant surgeries, keeps one such text message from a participant on his phone: “Let’s see your monkey do that!” it reads.

The Stanford researchers recently published results from a clinical trial of the investigational technology. Behind those results lie years of efforts by an interdisciplinary team of neurosurgeons, neuroscientists and engineers who brought different scientific vantages together to solve challenges that would have stumped any single discipline. Institutional support was another key ingredient in this long-term effort aimed at ultimately helping people with paralysis affect the world around them using only their minds.

Though more work lies ahead, this ongoing research shows that new engineering and neuroscience techniques can be directly applied to human patients. The milestone is heartening for Shenoy, who has led the effort to create brain-controlled prosthetic devices since he came to Stanford in 2001. Integral to that success has been his 12-year partnership with Henderson, which he describes as a professional marriage of engineering, science and medicine.

“When you have a clear vision, you involve yourself in as many details as possible and you work with absolute mutual respect, as coequals, it’s pretty interesting what you can do over a couple decades,” Shenoy said.

A born tinkerer

Shenoy, director of Stanford’s Neural Prosthetic Systems Laboratory, had dreamed of bringing the brain-controlled implant into being even before he came to Stanford. A born tinkerer with a desire to improve people’s lives, he traces his motivation for the project back to memories of his childhood in Cedar Rapids, Iowa.

“My mother’s father suffered from multiple sclerosis for around 40 years. He was wheelchair-bound,” he said. “It was not like I ever had a conscious epiphany, ‘I want to help him,’ but I think it subconsciously influenced me greatly.”

Neurosurgeon Jaimie Henderson has been collaborating with Shenoy on developing a brain-computer interface since 2004, and implanted the experimental devices in two patients as part of a clinical trial.
Paul Sakuma

Shenoy grew up thinking about a traditional engineering career, but changed that plan in college when he learned about the nascent field of computational neuroscience. For the first time, researchers were trying to figure out how the brain processes information, and Shenoy — then an undergraduate at the University of California, Irvine — knew he wanted to be part of that herculean effort.

“I had that light bulb turn on,” he said. “Small things, transistors, put together create computers. The brain is spectacular — how is it built from all its component neurons that are wired together?”

After completing his PhD in electrical engineering and computer science at the Massachusetts Institute of Technology, Shenoy headed to the California Institute of Technology to do a postdoctoral fellowship in neuroscience, an experience he likens to “another PhD.” As he used electrodes to record signals from monkeys’ brains, he toyed with the prospect of tapping into the brains of people with paralysis in order to give them entirely new prosthetic devices. Existing methods of helping such people interact, like eye-blink systems or pointers strapped to their heads to let them move a cursor on a computer screen, were slow and cumbersome — and they didn’t work at all for people who didn’t have enough muscle control to operate them. A device that translated thoughts into actions — a true brain-computer interface — would be far more efficient and intuitive, Shenoy figured.

He kept returning to the same question: “How do we design systems capable of listening in on those neurons and interpreting their language?”

After becoming an assistant professor at Stanford in 2001, Shenoy devoted his research to answering that question. He recorded neural activity from hundreds of neurons at the same time while monkeys, whose brains are quite similar to human brains, performed a variety of arm and hand movement tasks. Generations of students, postdoctoral scholars and research staff in Shenoy’s lab used the electrode-array recordings to explore how ensembles of neurons in the motor cortex prepare and guide hand and arm movements. Their goal was to understand the fundamental ways that these neural circuits control arm movements, and to then use this scientific knowledge to design mathematical algorithms for converting, or decoding, this language of the brain into electronic signals for controlling prosthetic devices such as keyboards or robotic limbs.
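
The kind of decoding the lab pursued can be illustrated with a classic toy model — a sketch, not the lab's actual algorithm. In this simplified picture, each simulated motor-cortex neuron fires most strongly for a preferred movement direction (cosine tuning), and summing the preferred directions weighted by firing rate recovers the intended movement. All neuron counts and parameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated population: each neuron gets a random preferred direction in 2-D.
n_neurons = 200
angles = rng.uniform(0, 2 * np.pi, n_neurons)
preferred = np.stack([np.cos(angles), np.sin(angles)], axis=1)  # unit vectors

def firing_rates(direction, baseline=10.0, gain=8.0):
    # Cosine tuning: a neuron's rate peaks when the movement direction
    # aligns with its preferred direction, plus recording noise.
    return baseline + gain * (preferred @ direction) + rng.normal(0, 1.0, n_neurons)

def population_vector(rates, baseline=10.0):
    # Weight each preferred direction by how far its neuron is above baseline,
    # then normalize to get a unit vector for the decoded direction.
    v = (rates - baseline) @ preferred
    return v / np.linalg.norm(v)

intended = np.array([np.cos(0.7), np.sin(0.7)])   # an intended movement direction
decoded = population_vector(firing_rates(intended))
```

With a few hundred noisy neurons, the decoded vector points very close to the intended direction — a hint of why recording from many neurons at once, as Shenoy's lab did, matters so much.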

Better and better

As Shenoy and his colleagues added detail and depth to their understanding of the brain, their decode algorithms kept performing better and better. That success encouraged them to start thinking about bringing this preclinical research to people.

They got a major assist in 2004 when Henderson, then interviewing for a neurosurgery position at Stanford, arrived on campus for a round of meetings. Among other skills, Henderson was an expert at using medically approved electronic devices to stimulate the nervous system for therapeutic purposes. To treat Parkinson’s disease, for instance, surgeons often use deep-brain stimulation, a procedure in which they deliver tiny jolts of electricity to relieve the tremors that characterize the condition. Shenoy had the opposite goal: He wanted to read the brain’s faint electrical signals and translate these into electronic data.

But Gary Steinberg, MD, PhD, professor and chair of neurosurgery, had a hunch Shenoy and Henderson would get along. Steinberg made sure the electrical engineer got penciled into Henderson’s interview schedule. During their first meeting at the Clark Center, the two hit it off.

“It was chemistry,” Shenoy said. “Two people who just clicked.” When he told Henderson about his dream of creating a brain-controlled prosthetic system, Henderson responded, “Yeah, that’s exactly the kind of thing I’d like to do.”

Henderson’s interest in surgical treatment of Parkinson’s disease, coupled with Shenoy’s expertise in recording from large numbers of brain cells simultaneously, led them to propose a joint project investigating brain activity during deep-brain stimulation surgery, using the sensor that Shenoy had been using in his laboratory. Those sessions marked the project’s first tentative forays into human research and cemented the team, Shenoy recalled.

“It started getting us thinking more deeply about how these [implants] could really work, if implanted for years,” Shenoy said. Henderson’s medical expertise helped the engineers figure out what approaches might work in a clinical setting. In turn, Henderson, who holds the John and Jene Blume-Robert and Ruth Halperin Professorship, learned about the finer points of computer control systems — how detailed algorithms allow for swift interpretation of messages coming from the brain.

Steinberg, the Bernard and Ronni Lacroute–William Randolph Hearst Professor of Neurosurgery and Neurosciences, was impressed by Shenoy and Henderson’s teamwork. “They’re not ego-driven,” he said. “That allows them to build these collaborative programs, rather than feeling that as an individual they have to be the shining light.”

As he watched them make necessarily slow but fundamental and systematic progress in those early years of their collaboration, Steinberg told the researchers that when they were ready to start a clinical trial to bring this to people with paralysis, the university would provide the initial resources they needed. With Steinberg’s financial backing and moral support, along with support from Stanford’s interdisciplinary biosciences institute Bio-X, Shenoy and Henderson formed the Neural Prosthetics Translational Laboratory in 2009.

Improving system performance

But before the team could be confident that their technology would give people a nearly natural, responsive experience, they needed to improve overall system performance by refining their understanding of the movement intentions embedded in brain signals. Much of that pioneering work fell to graduate students Paul Nuyujukian and Jonathan Kao. They trained monkeys to think about moving a cursor toward targets on a computer screen and kept track of how the neurons in the monkeys’ brains formed this intention. That allowed them to develop new mathematical algorithms that continuously processed a monkey’s brain signals into the movement commands that controlled the cursor.

In essence, they were decoding how the brain forms an intention and then carries out a movement. Over several years of such experiments, they steadily improved the algorithm. At first, the monkeys managed to hit one target every two to three seconds. By 2012, they were hitting one target per second, because the algorithm was enabling them to make faster, straighter and more controlled movements.

As Nuyujukian recalls, those kinds of results signaled that it was time to “take those techniques, try them in people with paralysis and ask the fundamental question, ‘Does this actually work in the real world?’”

That same year, Henderson surgically implanted the first electrode array in a clinical trial participant at Stanford. The 50-year-old individual had lived for years with amyotrophic lateral sclerosis, also known as Lou Gehrig’s disease, a degenerative condition that causes the loss of motor neurons and can ultimately make movement impossible. As part of the clinical trial’s informed-consent process, it was explained that participating would bring no personal benefit; the individual chose to proceed to help advance the research. Henderson, a seasoned neurosurgeon, typically approached the operating room with calm and composure. Less so this time.

“I was definitely hyped up,” Henderson recalled. “It was one of the biggest operations I’ve ever done in my life.” After the four-hour procedure, the team went out for burritos. Henderson slumped in his chair, exhausted.

Even then the researchers couldn’t relax. They wouldn’t know whether they had been successful for another month or so, when they brought the first participant back into the lab to see whether the implant could pick up electrical signals from the person’s brain and correctly decode them into movement controls. During that first lab session, the data was a bit difficult to interpret — the team had to make some computing tweaks to ensure the participant’s brain signals came through reliably. But it soon became clear that the surgery had been a success: The participant moved a cursor to an on-screen target just by thinking about it.

“We definitely, as a lab, opened a bottle of Dom Pérignon,” Henderson said.

The celebration was, of course, preliminary. Years more work lay ahead. Another participant received an implant at Stanford, and a third at Massachusetts General Hospital as part of the multi-institutional BrainGate consortium, bringing to three the participants contributing to the present round of results. Testing the results required some creative on-the-spot adjustments. Nuyujukian, by then a postdoctoral scholar at Stanford, recalls visiting the participants’ homes after the surgery with fellow postdoctoral scholar Chethan Pandarinath, now an assistant professor at Emory University and Georgia Institute of Technology. They would arrive with a cart filled with computers and electronic recording equipment. For a few minutes at a time, they would ask each participant to watch a cursor moving back and forth on a screen and pretend they were controlling it with their mind. This process was meant to calibrate the algorithms responsible for converting patterns of neural activity into computer cursor movements.
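
That calibration routine — recording neural activity while the participant watches a moving cursor and imagines controlling it, then fitting a map from activity to cursor velocity — can be sketched in a few lines. This is a simplified, hypothetical model (simulated neurons, a plain least-squares fit), not the BrainGate team's actual decoder; every name and parameter here is illustrative.

```python
import numpy as np

class OpenLoopCalibrator:
    """Collects (neural activity, displayed cursor velocity) pairs during an
    observation session, then fits a linear decoder by least squares."""

    def __init__(self):
        self.rates, self.velocities = [], []

    def observe(self, firing_rates, displayed_velocity):
        # One calibration sample: what the electrodes recorded while the
        # cursor on screen was moving with this velocity.
        self.rates.append(firing_rates)
        self.velocities.append(displayed_velocity)

    def fit(self):
        X = np.asarray(self.rates)          # samples x channels
        Y = np.asarray(self.velocities)     # samples x 2
        W, *_ = np.linalg.lstsq(X, Y, rcond=None)
        return W                            # maps firing rates -> velocity

# Simulated calibration session with a made-up linear tuning model.
rng = np.random.default_rng(1)
tuning = rng.normal(size=(32, 2))           # 32 channels, 2-D velocity
cal = OpenLoopCalibrator()
for _ in range(300):
    v = rng.normal(size=2)                          # cursor shown on screen
    r = tuning @ v + 0.05 * rng.normal(size=32)     # imagined-movement activity
    cal.observe(r, v)
W = cal.fit()

# After calibration, fresh activity decodes to the intended velocity.
intent = np.array([0.8, -0.3])
decoded = (tuning @ intent) @ W
```

Once the decoder is fit, the system can switch to closed-loop use: the participant's own decoded output, rather than a pre-scripted cursor, drives what appears on screen.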

Nuyujukian recalls watching, awestruck, as study participants became proficient at moving the cursor from one letter to the next, tapping out words, then whole sentences, using only their brains. “To see that actually happen is unbelievable. It was hard to not display emotion on the spot,” he said. From time to time, he and the participants would chat about how incredible all this was.

Now, 15 years into the effort and with many more years of development and testing ahead, team members think that collaborative problem solving remains the key to success. “This group was so careful and just laid out every step along the way,” Nuyujukian said. Next, the team plans to test the implant in even more participants — and expand the types of devices people can operate. “The follow-on to this is allowing a person to use an off-the-shelf tablet device,” Henderson said. “And to do so 24/7, so that the participant has access to assistive devices at any time.”

For all of the success in getting to this point, Shenoy remains cautious, yet deeply optimistic.

“Neurotechnologies such as this will alter how we think about treating nervous system disorders,” he said. “And perhaps even how we think about what it means to be human.”

Shenoy, Henderson and Steinberg are members of Bio-X and the Stanford Neuroscience Institute.

About Stanford Medicine

Stanford Medicine is an integrated academic health system comprising the Stanford School of Medicine and adult and pediatric health care delivery systems. Together, they harness the full potential of biomedicine through collaborative research, education and clinical care for patients. For more information, please visit med.stanford.edu.
