Currently viewing the category: "Neuron-computer Interface"

The BrainGate collaboration, led by Dr. John Donoghue of the Department of Neuroscience at Brown University, recently announced that it has begun recruiting patients for clinical trials of the second iteration of its neural interface system.

The first round of trials was run by Cyberkinetics, an independent neurotech company founded by Dr. Donoghue, which has since pulled out of this next phase due to funding difficulties. Now in the hands of a purely academic team based at Massachusetts General Hospital, these exciting trials will help guide the next generation of neurotechnological interfaces between human brain activity and direct actions on a computer and, eventually, control of prosthetic devices.

The near-term goal of this research is to develop a technology that can assist patients with degenerative neurological paralysis, in which the brain is trying to talk to the body, but the body just isn’t listening. The BrainGate system trains itself to decode electrical activity in the brain and translate the recorded signals into commands for controlling an external device. In effect, BrainGate is a bridge that re-routes neural communication to a device designed to replace lost function.
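In spirit, the decoding step can be sketched as a linear mapping fit from recorded firing rates to intended movement, a common baseline in the BCI literature. BrainGate’s actual decoder is more sophisticated; everything below, including the array sizes, the fake training session, and the weights, is purely illustrative:

```python
import numpy as np

# Illustrative only: decode a 2-D cursor velocity from the firing
# rates of 96 hypothetical recorded units via ordinary least squares.
rng = np.random.default_rng(0)
n_samples, n_units = 500, 96

# Fake calibration session: known cursor velocities paired with the
# firing rates "recorded" while those movements were imagined.
true_weights = rng.normal(size=(n_units, 2))
rates = rng.poisson(5.0, size=(n_samples, n_units)).astype(float)
velocity = rates @ true_weights + rng.normal(scale=0.1, size=(n_samples, 2))

# "Train" the decoder: least-squares fit mapping rates -> velocity.
weights, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Decode a new burst of activity into a cursor command (vx, vy).
new_rates = rng.poisson(5.0, size=(1, n_units)).astype(float)
decoded_velocity = new_rates @ weights
```

The key point of the sketch is the calibration step: the system first observes brain activity alongside known intentions, and only then can it translate activity alone into control.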

In previous work, the critical success was converting brain activity into control of a cursor on a computer screen. Although this seems a trivial activity, understanding the neuroscience behind the actual relationship between specific brain activity and mechanical control of our environment remains vital to the future of neurotechnology. Now, with the BrainGate2 trials starting soon, opportunities to discover new science will hopefully bring us closer to devices that assist patients with ALS, spinal cord injuries, stroke, and many other conditions, empowering them to live their lives to the fullest.

“Brain-computer interface begins new clinical trial for paralysis” :: EurekAlert :: June 10, 2009 :: [ READ PRESS RELEASE ]

 

Recently, we discussed the developments from the Wadsworth Center of a minimally-invasive, thin-film technology to enhance electrocorticography (ECoG) recordings (read). Similar to the more common electroencephalogram (EEG) method, which uses an array of electrodes stuck on your scalp to receive electrical signals from your neurons, ECoG uses an array of electrodes resting directly on the surface of your brain, allowing for a more direct electrical view of neural activity. This view still covers an averaged signal from a large number of talking neurons and still does not resolve individual electrical signals. However, with the bony skull out of the way, the electrodes sure have a clearer shot at picking up the electric fields.

The importance of this work from Wadsworth is that the brain and its violent bodyguard, the immune system, don’t really like having things hanging around that the body didn’t make on its own. So, typical implanted devices will quickly be degraded by the body’s immune response. Here, the specialized implanted ECoG devices are lasting six to twelve months in human patients, but the goal is to improve the device life-cycle to five to ten years.

Through their collaboration with clinical neurologists and biomedical engineers at Washington University in St. Louis, Missouri, the Wadsworth group, led by Gerwin Schalk, is taking the technology to the next step by integrating the recording activity with specialized software that maps brain activity to computer control. The implanted ECoG array, with its more detailed map of brain activity, allows a specific correlation to be observed between physically clicking a computer mouse button, for example, and the resulting pattern of neural firings in the brain. The patient can then train their thoughts to reproduce similar neural activity and, with a direct connection to the computer, the mouse click appears without the click.
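The mapping step described above amounts to training a classifier on labeled brain activity. As a toy sketch (not the actual Wadsworth/Neurolutions pipeline; the channel count, feature values, and nearest-centroid rule are all invented for illustration), one could classify a window of ECoG features as “click” versus “rest”:

```python
import numpy as np

# Toy sketch: label windows of ECoG band-power features as "click"
# or "rest" using a nearest-centroid rule learned from calibration data.
rng = np.random.default_rng(1)
n_channels = 16

# Hypothetical calibration data: features recorded while the patient
# physically clicked the mouse, and while the patient rested.
click_trials = rng.normal(loc=1.0, scale=0.3, size=(40, n_channels))
rest_trials = rng.normal(loc=0.0, scale=0.3, size=(40, n_channels))

centroids = {
    "click": click_trials.mean(axis=0),
    "rest": rest_trials.mean(axis=0),
}

def decode_window(features):
    """Return the label of the nearest calibration centroid."""
    return min(centroids, key=lambda lbl: np.linalg.norm(features - centroids[lbl]))

# A new window of imagined-click activity lands near the "click"
# centroid, so the software can issue the click without the click.
new_window = rng.normal(loc=1.0, scale=0.3, size=n_channels)
label = decode_window(new_window)
```

The same structure scales, in principle, to the multi-dimensional case: more labels, more centroids, and far more calibration data per motion.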

The interfacing process is being licensed to a start-up company in St. Louis called Neurolutions, which will be working to improve the software and training process to bring it to market for applications in neuroprosthetics. The challenge for further advancement begins with the unfortunate fact that just clicking a mouse button doesn’t get us very far in life. Even moving fingers and arms requires multi-dimensional spatial control, and with that comes an unknown number of different neural patterns required simply to raise your arm to reach the mouse on top of the desk. All of the corresponding neural activity–move shoulder up, rotate elbow, lift index finger, shift arm to the right, etc.–will need to be mapped, trained, and accessed to control a prosthetic device… and each human might have different neural patterns for the same physical motion.

“Reading the Surface of the Brain” :: Technology Review :: June 3, 2009 :: [ READ ]

“Brain-Computer Interface Technology Licensed to Missouri Firm” :: NY State Dept. of Health Press Release :: March 25, 2009 :: [ READ ]

 

At this year’s Intel Developer Forum in San Francisco, the final keynote address hosted by Justin Rattner, CTO of Intel Corp, focused on the next forty years of computing and how the gap is being bridged between machines and the human mind.

Mr. Rattner described the coming likelihood of “The Singularity,” previously discussed here on Neuron News (read), where the continued exponential growth of computing power will result in machines that surpass the “intelligence” of the human brain. It is not obvious that, just because a device can process information at a higher level than the brain, it will automatically be imbued with what we consider “intelligence.” However, there will no doubt be computers in the future with complexity similar to that of the brain and ridiculously higher processing capabilities… so, we’ll see what happens.

The hour-long keynote webcast is quite interesting, with several demonstrations of emerging technologies that will bring machines more human-like qualities. In particular, the non-invasive “mind-reading” headset technology from Emotiv Systems is demonstrated on stage with a human-to-computer game interaction. (In fact, they plan to begin shipping their neurotech headset in late 2008 for only $300!)

The headset records electrical activity from the brain through the skull and translates the signals, or your “thoughts,” into real actions in a computer game. Make a scary face, and frighten your virtual alien invaders away; focus on lifting a large rock that is blocking your path, and the virtual object levitates out of your way so your avatar may continue through the game world.
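Conceptually, the game-side logic is an event map: when the headset reports a detected expression or “thought” with enough confidence, a bound in-game action fires. This is only a guess at the shape of such a system (the event names, the confidence threshold, and the function are all invented here, not Emotiv’s actual SDK):

```python
# Hypothetical event-to-action map, in the spirit of the Emotiv demo.
event_actions = {
    "furrow_brow": "scare_aliens",   # make a scary face
    "push": "lift_rock",             # focus on lifting the rock
}

def handle_event(event, confidence, threshold=0.7):
    """Fire the mapped game action only when detection confidence is high."""
    if confidence >= threshold and event in event_actions:
        return event_actions[event]
    return None
```

A low-confidence or unmapped detection simply does nothing, which is what keeps stray thoughts from constantly triggering the game.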

So, this Christmas, when you invite your friends over for game night, be sure to think carefully… because your fleeting imaginations might show up on the game screen for all to see!

“Research and Development: Crossing the Chasm between Humans and Machines” :: Intel Developer Forum Keynote :: August 21, 2008 :: [ VIEW WEB CAST ]

 

Dr. John Donoghue, Professor of Neuroscience at Brown University, delivered an informative, hour-long presentation at the National Institutes of Health this past April covering his group’s important research in neurotechnology interfacing. The talk is part of the NIH Neuroscience Seminar Series and is a must-see for learning first-hand about the profound advancements and exciting technologies being developed for connecting into our brain and nervous system.

Neuron News has also added a link in our Neurotech Resources list to the NIH VideoCast Archives for Neuroscience presentations, so you’ll be able to keep track of the latest reports and advancements in neuroscience being presented at NIH.

Pop some popcorn, sit back, and watch the video linked below, and then post your comments and thoughts here on Neuron News!

“Merging Mind and Machines: Neural Interfaces to Restore Lost Function in Humans” :: NIH VideoCasting Event :: April 14, 2008 :: [ VIEW VIDEO PRESENTATION ]

“Our lab investigates how the brain turns thought into action…”
The Donoghue Lab at Brown [ VISIT ]

 

Can several hundred thousand rat neurons living in culture control the movements of a mechanical robot? Apparently so, at least to some extent: researchers at the University of Reading, including Dr. Ben Whalley, have created a working “rat-brain-controlled” robot.

The controlling unit is, however, much smaller than a full rat brain, but in many respects it is actually much more interesting and exciting: the controller is a dish of rat neurons growing and interconnecting on top of an electrode array, which both records electrical activity and electrically stimulates the cultured neuron network, all sitting in a temperature-controlled environment in the lab, safely separated from the actual robot. Wireless technology transmits the electrical information to and from the culture and a mobile block with wheels and sonar sensors.

The electrical signals are filtered through software into movement controls for the robot. When the robot bumps into a wall, the sonar returns a signal to the culture dish to provide electrical feedback to the network. To date, the research has created a moving robot, and the team is now working to “train” the living neural network to appropriately respond to its environment… i.e., “don’t bump into the wall when you hear it coming.”
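The closed loop described above can be caricatured in a few lines. Everything here is invented for illustration (the electrode count, the filtering rule, and the feedback law are not the Reading group’s actual setup); the point is the cycle of record, move, sense, stimulate:

```python
import random

# Toy closed loop in the spirit of the Reading experiment: simulated
# "culture" activity drives the robot; sonar proximity to a wall feeds
# an electrical stimulus back to the culture.
random.seed(42)

def read_culture_activity(stimulus):
    """Fake multi-electrode reading: baseline firing boosted by stimulation."""
    return [random.gauss(1.0 + stimulus, 0.1) for _ in range(8)]

def activity_to_command(activity):
    """Filter electrode activity into a movement command for the robot."""
    left, right = sum(activity[:4]), sum(activity[4:])
    return "turn" if abs(left - right) > 0.5 else "forward"

stimulus = 0.0
for step in range(20):
    command = activity_to_command(read_culture_activity(stimulus))
    distance_to_wall = random.uniform(0.0, 2.0)  # sonar reading (metres)
    # Feedback: the closer the wall, the stronger the stimulus (0 to 1).
    stimulus = max(0.0, 1.0 - distance_to_wall)
```

“Training” the network then amounts to hoping that repeated wall-proximity stimulation reshapes the culture’s activity so that “turn” becomes its learned response to an approaching wall.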

The group is particularly interested in how this neural network can create memories, and how it will respond to imposed degradations of the physical network. This may lead to further clues about the progression of neurological disorders, including Alzheimer’s and Parkinson’s. Even with the focus on medical advancements for human disease, this research program at Reading is extremely exciting as a pure application of neurotechnology, working to develop a direct neuron-computer interface, and its results will be quite useful for the broader advancement of neurotech devices.

“Rat-brain robot aids memory study” :: BBC News :: August 13, 2008 :: [ READ with VIDEO ]

“Rat brain-controlled robot to give important neurological insights” :: The Tech Herald :: August 13, 2008 :: [ READ ]

“A ‘Frankenrobot’ with a biological brain” :: asiaone News :: August 14, 2008 :: [ READ ]

 

Taking direct electrical measurements from a living brain and even from a single neuron cell requires an invasive connection between the localized electrochemical environment in the cell and a sharp, prickly, prodding metal stake of death.

An electrode might sound harmless, but it can take the form of a gigantic (in the reference frame of a tiny neuron) needle of metal or other conducting material that could either damage living tissue, or be rejected by the hosting biological system and quickly encapsulated by tissue, effectively disengaging the pointy invader.


image courtesy PhysOrg.com

Recently, a collaboration led by Edward Keefer from the University of Texas Southwestern Medical School has discovered that coating these harmful, but necessary, electrical recording devices with the ever-popular carbon nanotube makes them the neuron’s newest fuzzy best friend. The nanotubes not only enhance the signals received from directly implanted electrodes, but they have also been shown to be bio-compatible, so they might even minimize the damage caused to the specimen. In fact, Keefer claims the efficiency of the cell-electrode interface is improved by at least a thousand times.

The development of neurotechnological devices–hardware that interconnects directly with nervous tissue and even individual neurons–depends not only on producing electrical connections with highly sensitive signal transmission, but also on the cells tolerating these needles sticking around. The carbon nanotube coating approach could be a critical step in advancing neurotechnology toward a future level of high-res recording devices as well as localized, highly-controllable stimulus systems.

“Carbon Nanotube-Coated Electrodes Improve Brain Readouts” :: PhysOrg.com :: August 12, 2008 :: [ READ ]