Researchers at Duke University have demonstrated a technique that allows a monkey to control two virtual arms using signals tapped directly from its brain.
The research is designed to help develop better surgical, medical, or prosthetic options for humans who have lost the use of a limb, but it could also be adapted to the growing variety of non-keyboard controls for drones, robots, wearable computing devices, and other digital hardware.
Using an interface surgically implanted in the monkey’s brain, researchers recorded activity from a record-setting 500 neurons across multiple areas of the brain and interpreted that activity consistently enough to let a rhesus monkey manipulate a pair of arms on a digital avatar.
The monkey was presented with a three-dimensional animated image of a monkey moving two coconut-like spheres against a dark background. It was able to relate its thoughts to movements of a video avatar well enough to get the avatar to move the spheres using basic but voluntary pushing and pulling motions.
The real innovation wasn’t simply allowing the monkey to control a robotic or animated arm, but getting it to control two arms simultaneously and make them work together to accomplish a task.
Such “bimanual” movements are critical to everything from typing on a keyboard to opening a can, according to Miguel Nicolelis, professor of neurobiology at Duke and lead author of the paper, which was published in a recent issue of the Proceedings of the National Academy of Sciences.
That ability suggested the monkey was able to imagine that the avatar’s arms were its own and incorporate them into the internal body image the brain uses to control voluntary movements, according to the report.
Work on the interface revealed that normal motor functions are handled by large groups of neurons, sometimes in different locations, rather than by individual neurons or clusters confined to a single spot. Each arm was controlled by a different portion of the monkey’s brain, but knowing which sections controlled one arm didn’t help researchers locate those controlling the other.
That finding indicates the brain scales in non-linear ways as activity increases, demonstrating that brain architecture is both extremely flexible in the functions it can adopt and even harder to decode by cerebral geography than previously thought.
The paper was unable to conclude, based on the increasing capabilities of both robotics and digitally enhanced primates, whether the inevitable destruction of humanity and resulting post-apocalyptic dystopia would look more like Terminator or Planet of the Apes.
Image: Duke Univ./Miguel Nicolelis