The study was published today in IOP Publishing’s Journal of Neural Engineering.
Five subjects (three female, two male) who took part in the study were each able to successfully control the four-blade flying robot, also known as a quadcopter, quickly and accurately for a sustained period of time.
"Our study shows that for the first time, humans are able to control the flight of flying robots using just their thoughts sensed from a noninvasive skull cap," says Bin He, lead author of the study and biomedical engineering professor in the University of Minnesota’s College of Science and Engineering. "It works as well as invasive techniques used in the past."
He says this research is intended to help people who are paralyzed or have neurodegenerative diseases regain mobility and independence.
"We envision that they’ll use this technology to control wheelchairs, artificial limbs or other devices," He says.
The noninvasive technique, called electroencephalography (EEG), is a unique brain-computer interface that records electrical activity of the subjects’ brain through a specialized, high-tech EEG cap fitted with 64 electrodes.
"It’s completely noninvasive. Nobody has to have a chip implanted in their brain to pick up the neuronal activity," says Karl LaFleur, a senior biomedical engineering student during the study and one of the paper’s authors.
The researchers say the brain-computer interface system works due to the geography of the motor cortex—the area of the cerebrum that governs movement. When we move, or think about a movement, neurons in the motor cortex produce tiny electric currents. Thinking about a different movement activates a new assortment of neurons. Sorting out these assortments laid the groundwork for the brain-computer interface used by the University of Minnesota researchers, He says. This new study builds upon previous research at He’s lab in which subjects were able to control a virtual helicopter on a computer screen.
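The idea of "sorting out these assortments" can be illustrated with a toy sketch. Motor imagery suppresses the mu rhythm (roughly 8–12 Hz) over the motor cortex on the side opposite the imagined hand, so comparing mu-band power between a left and a right motor-cortex electrode hints at which hand the subject is imagining. This is a minimal illustration, not the researchers' actual decoding pipeline; the sampling rate, electrode names (C3/C4), and synthetic signals below are assumptions for demonstration only.

```python
import numpy as np

FS = 256           # sampling rate in Hz (assumed for this sketch)
MU_BAND = (8, 12)  # mu rhythm band, which desynchronizes during motor imagery

def band_power(signal, fs, band):
    """Average spectral power of `signal` within `band` (Hz)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[mask].mean()

def classify_imagery(c3, c4):
    """Toy left/right classifier from two motor-cortex channels.

    Imagining right-hand movement suppresses mu power over the LEFT
    motor cortex (electrode C3), and vice versa, so the channel with
    the lower mu power points to the imagined hand.
    """
    return "right" if band_power(c3, FS, MU_BAND) < band_power(c4, FS, MU_BAND) else "left"

# Synthetic one-second epoch: mu rhythm suppressed on C3 (right-hand imagery)
rng = np.random.default_rng(0)
t = np.arange(FS) / FS
c4 = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(FS)        # strong mu rhythm
c3 = 0.2 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(FS)  # suppressed mu
print(classify_imagery(c3, c4))  # → right
```

Real systems use far more electrodes (the study's cap has 64) and trained statistical decoders rather than a single band-power comparison, but the contralateral-suppression principle is the same.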
"We were the first to use both functional MRI and EEG imaging to map where in the brain neurons are activated when you imagine movements," He says. "So now we know where the signals will come from."
During the study, the subjects faced away from the quadcopter and were asked to imagine using their right hand, left hand, or both hands together, instructing the quadcopter to turn right, turn left, or rise, respectively; imagining neither hand brought it back down. The quadcopter flew with a pre-set forward motion and was otherwise controlled only by the subjects’ thoughts.
The subjects were positioned in front of a screen that relayed images of the quadcopter’s flight through an on-board camera. Brain signals were recorded by the cap and sent to the quadcopter over Wi-Fi.
After several training sessions, the subjects were required to fly the quadcopter through two large rings suspended from a gymnasium ceiling. A battery of statistical tests was used to measure how each subject performed. In a control experiment, a group of subjects also directed the quadcopter with a keyboard, allowing a comparison between a standard input method and brain control.
He says the potential for the brain-computer interface developed at the University of Minnesota is very broad.
"Our next step is to use the mapping and engineering technology we’ve developed to help disabled patients interact with the world," He says. "It may even help patients with conditions like autism or Alzheimer’s disease or help stroke victims recover. We’re now studying some stroke patients to see if it’ll help rewire brain circuits to bypass damaged areas."
In addition to He and LaFleur, other University of Minnesota researchers involved in the study include Alexander Doud, a medical and biomedical engineering graduate student; Kaleb Shades, a biomedical engineering undergraduate student; Eitan Rogin, an undergraduate computer science and engineering student; and Kaitlin Cassady, a biomedical engineering research tech staff member.
The University of Minnesota study was primarily funded by the National Science Foundation (NSF).
Source: University of Minnesota