June 5, 2013
Researchers Fly Robot Drone With Their Thoughts
In a jaw-dropping feat of engineering, electronics turn a person's thoughts into commands for a robot.
At the University of Minnesota, a new technology is turning science fiction into reality. In the lab of biomedical engineering professor Bin He, several young people have learned to use their thoughts to steer a flying robot around a gym, making it turn, rise, dip, and even sail through a ring.
The technology, pioneered by He, may someday allow people robbed of speech and mobility by neurodegenerative diseases to regain function by controlling artificial limbs, wheelchairs, or other devices. And it's completely noninvasive: brain waves are picked up by electrodes in an EEG (electroencephalography) cap on the scalp, not by a chip implanted in the brain.
A report on the technology has been published in the Journal of Neural Engineering.
"My entire career is to push for noninvasive 3D brain-computer interfaces, or BCI," says He, a faculty member in the College of Science and Engineering. "[Researchers elsewhere] have used a chip implanted into the brain's motor cortex to drive movement of a cursor [across a screen] or a robotic arm. But here we have proof that a noninvasive BCI from a scalp EEG can do as well as an invasive chip."
Mapping the brain
He's BCI system works thanks to the geography of the motor cortex—the area of the cerebrum that governs movement. When we move, or think about a movement, neurons in the motor cortex produce tiny electric currents. Thinking about a different movement activates a new assortment of neurons.
Sorting out these assortments laid the groundwork for the BCI, says He.
"We were the first to use both functional MRI and EEG imaging to map where in the brain neurons are activated when you imagine movements," he says. "So now we know where the signals will come from."
The brain map showed that imagining making fists—with one hand or the other or both—produced the most easily distinguished signals.
"This knowledge about what kinds of signals are generated by what kind of motion imagination helps us optimize the design of the system to control flying objects in real time," He explains.
Image source: Journal of Neural Engineering / He
Now it's the real deal: controlling an actual flying robot, a commercially available AR.Drone quadcopter. He's computers interface with the WiFi controls that come with the robot; after translating EEG brain signals into a command, a computer sends the command to the robot over WiFi.
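The command path the article describes, decoded mental command in, WiFi control message out, might look like the following sketch. The command strings, mapping, and UDP port here are illustrative placeholders, not the AR.Drone's real control protocol:

```python
import socket

# Hedged sketch of the BCI-to-drone command path: a decoded motor-imagery
# class is translated into a control message and pushed to the drone over
# WiFi. All command names and network details below are hypothetical.

COMMANDS = {
    "left_fist": "TURN_LEFT",
    "right_fist": "TURN_RIGHT",
    "both_fists": "RISE",
    "rest": "HOVER",
}

def to_command(decoded_class):
    """Map a decoded motor-imagery class to a drone control message."""
    return COMMANDS.get(decoded_class, "HOVER")  # fail safe: hover

def send_command(message, host="192.168.1.1", port=5556):
    """Fire-and-forget the message to the drone over UDP (WiFi)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(message.encode(), (host, port))
```

Defaulting unrecognized classes to a hover command reflects a common BCI design choice: when the decoder is unsure, the safest action is no action.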
The journal article describes how five men and women learned to guide the flying robot. The first author is Karl LaFleur, who was a senior biomedical engineering student during the study.
"Working for Dr. He has been a phenomenal experience," says LaFleur, who plans to put his knowledge to use when he enters the U's Medical School next year. "He has so much experience with the scientific process, and he is excellent at helping his students learn this process while allowing them room for independent work. Being an author on a first-person journal article is a huge opportunity that most undergraduates never get."
"I think the potential for BCI is very broad," says He. "Next, we want to apply the flying robot technology to help disabled patients interact with the world.
"It may even help patients with conditions like stroke or Alzheimer's disease. We're now studying some stroke patients to see if it'll help rewire brain circuits to bypass damaged areas."
SOURCE University of Minnesota
By 33rd Square
Topics: BCI, Bin He, Biomedical Engineering, brain-machine interface, drone, flying robot, mind control, mind control robot, robotics, University of Minnesota