More than 15 years ago, a group of academics around the country, including John Donoghue, a professor of neuroscience at Brown University, began to wire the brains of primates to computers. Those early primate studies revealed what it would take to make the technology work in people. But new advances in microelectronics, mathematical decoding of neural signals, and software were needed before it could work in any practical sense.
"We're able now to pick up a surprising amount of information," says Donoghue, whose research was the foundation of Cyberkinetics, the company he cofounded in 2001, where he works one day a week as chief scientific officer. "But it's a tiny sample of what's going on. Just reaching for a cup of coffee uses tens or hundreds of millions of neurons. We're picking up close to 100."
To make the technology accessible to all quadriplegics, Donoghue is working to replace the wiring with wireless communications. He's also using more sophisticated sensors, which pick up even more information, so that patients can refine their control of prosthetic devices and robotics. And the professor is seeing the research gain speed as his team gains new knowledge with every new patient and as other researchers leap into the fray.
"It's coming very quickly," he says. "A number of groups have miniaturized essential components. The challenge is sealing it to withstand a harsh environment."
"The ability to link brain signals to a computer or an artificial arm is just the beginning," says Dr. Rezai, who has been pushing the envelope on deep-brain stimulation, implanting devices in the brain that emit gentle pulses of electrical stimulation to keep the body's command-and-control unit humming. Ultimately, researchers will be able to detect aberrant electrical activity in the brain, including patterns associated with Parkinson's disease, depression, and anxiety. "The Cyberkinetics system and others are giving us snapshots of the nervous system," he says.
Right now, the system is used to detect the normal signals associated with movement. Donoghue and others, though, are working on other ways to utilize the technology.
"There's a lot of blue-sky thinking," says Surgenor. "There's interest in providing information to the brain that would augment feeling and hearing."
One group of researchers, led by Mark Humayun, a retinal surgeon and biomedical engineer at the Doheny Eye Institute at the University of Southern California, has been using the technology to translate camera images into the brain's electronic language of sensory perception. The team is beginning to produce some fuzzy images, which the researchers hope to gradually refine into sight.
This kind of technology, says Heywood, can transform lives, possibly even society.

On a recent flight back from Los Angeles, Heywood was reading iWoz, the memoir of computer legend Steve Wozniak, and he reached the page where the software whiz described himself as being the first person to type a word on a computer keyboard and have it appear on a screen.
“Stephen and [Matt Nagle] were the first people to think without typing and have their words come up on-screen,” he says. “This is world-changing technology.”