When a person with paralysis imagines moving a paralyzed limb, neurons in the brain region that controls movement still fire as if they were trying to make the limb move again. Although a neurological injury or disease has broken the pathway between brain and muscle, the region that emits the signals remains intact and functional.
In recent years, neuroscientists and neuroengineers working on prostheses have begun to develop implantable sensors that can measure brain signals from individual neurons. After feeding these signals through a mathematical decoding algorithm, they have used them to control computer cursors by thought alone. This field of study is known as neural prosthetics.
A team of researchers at Stanford University, in the United States, has now developed a new algorithm, called ReFIT (Recalibrated Feedback Intention-Trained).
The algorithm greatly improves the speed and accuracy of neural prostheses that control the cursor on the computer screen.
In side-by-side trials with rhesus monkeys, cursors controlled by the ReFIT algorithm doubled the performance of existing systems, approaching the performance of the real arm.
Better yet, more than four years after implantation, the new system still works robustly, while previous systems showed a steady decline in performance over time.
“These findings could lead to much better performance and robustness of prostheses implanted in people with paralysis, something we are actively pursuing as part of the BrainGate clinical trials,” said Krishna Shenoy, who coordinated the experiment.
Monitoring neural activity in real time
The system is based on a silicon chip implanted in the brain, which records “action potentials” in neural activity through an array of electrodes and sends the data to a computer.
The frequency at which action potentials are generated provides information about the direction and speed of the intended movement.
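To make this concrete, here is a minimal sketch of one classic way firing rates can be turned into a movement direction — a population-vector decode over cosine-tuned neurons. All numbers, the 96-channel array size, and the tuning model are illustrative assumptions, not the actual decoder described in the article.

```python
import numpy as np

# Hypothetical sketch: decoding intended movement from firing rates.
# Each simulated neuron is "cosine tuned": it fires fastest for one
# preferred direction. Summing preferred directions weighted by each
# neuron's rate above baseline gives a population-vector estimate.

rng = np.random.default_rng(0)
n_neurons = 96                                   # assumed array size

# Assumed preferred direction (unit vector) for each neuron
angles = rng.uniform(0, 2 * np.pi, n_neurons)
preferred = np.stack([np.cos(angles), np.sin(angles)], axis=1)

def decode_velocity(rates, baseline):
    """Weight each neuron's preferred direction by its firing rate
    above baseline, then average across the population."""
    weights = rates - baseline                   # spikes/s above rest
    return weights @ preferred / len(rates)      # 2-D velocity estimate

# Simulate rates for a rightward intended movement, direction (1, 0):
baseline = 20.0
true_dir = np.array([1.0, 0.0])
rates = baseline + 15.0 * (preferred @ true_dir)  # cosine tuning
v = decode_velocity(rates, baseline)              # points roughly right
```

The decoded vector recovers the intended direction; its length grows with how strongly the population modulates, which is the sense in which rate carries speed information.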
The ReFIT algorithm decodes these signals with an accuracy that represents a quantum leap over previous models.
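ReFIT belongs to the family of Kalman-filter decoders, which blend a movement model with each new neural observation. Below is a generic textbook Kalman filter sketch of that family — the matrices and the 20-channel usage example are my illustrative assumptions, not the published ReFIT parameters.

```python
import numpy as np

# Generic Kalman-filter decoder sketch (the family ReFIT belongs to).
# State x = cursor velocity; observation y = vector of firing rates.

class KalmanDecoder:
    def __init__(self, A, W, C, Q):
        # A: state transition, W: process noise,
        # C: neural tuning (rates ~ C @ velocity), Q: observation noise
        self.A, self.W, self.C, self.Q = A, W, C, Q
        n = A.shape[0]
        self.x = np.zeros(n)         # state estimate (velocity)
        self.P = np.eye(n)           # state covariance

    def step(self, y):
        # Predict from the movement model
        x_pred = self.A @ self.x
        P_pred = self.A @ self.P @ self.A.T + self.W
        # Correct with the neural observation y
        S = self.C @ P_pred @ self.C.T + self.Q
        K = P_pred @ self.C.T @ np.linalg.inv(S)
        self.x = x_pred + K @ (y - self.C @ x_pred)
        self.P = (np.eye(len(self.x)) - K @ self.C) @ P_pred
        return self.x

# Illustrative usage: recover a constant 2-D velocity from 20 noisy channels
rng = np.random.default_rng(1)
C = rng.normal(size=(20, 2))                     # assumed tuning matrix
dec = KalmanDecoder(A=np.eye(2), W=0.01 * np.eye(2), C=C, Q=np.eye(20))
v_true = np.array([1.0, 0.0])
for _ in range(100):
    x = dec.step(C @ v_true + 0.1 * rng.normal(size=20))
```

The filter's balance between model and data is what lets a decoder stay stable on noisy spike counts; ReFIT's contribution lies in how the matrices are fit, not in the filter equations themselves.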
In most research on thought-controlled neural prosthetics, scientists record brain activity while the user moves, or imagines moving, an arm, and then analyze the data retrospectively.
Researcher Vikash Gilja worked out how to do this online, in closed-loop control: the computer analyzes the neural data and updates the display in real time, so the monkey sees visual feedback while neurally steering the cursor toward a target on the screen.
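The closed loop described above can be sketched as a simple simulation: on each cycle the computer decodes a velocity, moves the cursor, and the subject's next command reacts to the new position. The aiming model, noise level, and gains here are all hypothetical stand-ins for the real monkey-plus-decoder loop.

```python
import numpy as np

# Hypothetical closed-loop sketch: each cycle, a velocity is decoded,
# the cursor moves, and the "subject" (simulated here as always aiming
# at the target) produces the next command based on what it now sees.

def simulate_closed_loop(target, steps=50, dt=0.05, gain=2.0, seed=0):
    rng = np.random.default_rng(seed)
    cursor = np.zeros(2)
    for _ in range(steps):
        aim = target - cursor                  # feedback: subject sees cursor
        dist = np.linalg.norm(aim)
        if dist < 0.05:                        # close enough to "click"
            break
        intended = aim / dist                  # unit vector toward target
        decoded = intended + 0.3 * rng.normal(size=2)   # noisy decode
        cursor = cursor + gain * dt * decoded  # update and re-render
    return cursor

final = simulate_closed_loop(np.array([1.0, 1.0]))
```

The key difference from retrospective analysis is that decoding errors are visible to the subject immediately, so behavior and decoder influence each other within each trial.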
Oddly enough, one of the key elements of Gilja's work was how to stop the cursor, not how to move it. While previous algorithms reach the target almost as quickly as the new one, they often overshoot it, taking extra time to bring the cursor back and requiring multiple passes to “click” on the target.
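One way this stopping problem is commonly addressed, and the idea ReFIT's name alludes to, is intention-based relabeling at retraining time: each recorded velocity sample is replaced by what the subject presumably intended, a vector rotated toward the target, and zero once the cursor is on the target. The function below is my reconstruction of that idea, not the published code.

```python
import numpy as np

# Sketch of intention-based recalibration (my reconstruction): relabel
# recorded cursor velocities with the presumed intent before retraining
# the decoder. On target, the presumed intent is to stop.

def recalibrate(velocities, positions, target, hold_radius=0.1):
    """Return intention-relabeled velocities for decoder retraining."""
    relabeled = []
    for v, p in zip(velocities, positions):
        to_target = target - p
        dist = np.linalg.norm(to_target)
        if dist < hold_radius:
            relabeled.append(np.zeros_like(v))          # intent: stop
        else:
            speed = np.linalg.norm(v)
            relabeled.append(speed * to_target / dist)  # rotate to target
    return np.array(relabeled)

# Illustrative run: one off-target sample, one on-target sample
vels = np.array([[0.0, 1.0], [0.5, 0.0]])
poss = np.array([[0.0, 0.0], [0.95, 0.0]])
out = recalibrate(vels, poss, target=np.array([1.0, 0.0]))
```

Training on these corrected labels teaches the decoder both to aim more directly and to hold still on the target, which is exactly the “stopping” behavior the overshooting decoders lacked.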