Research brings a future of mind-reading robots ever closer
If you think the world gets closer to thought-controlled smart tech and robotic personal assistants with the release of every new i-device, you might be right.
And thanks in part to work led by the Univ. of Cincinnati's Anca Ralescu, we may be even closer than you realize.
Prof. Ralescu of the Dept. of Electrical Engineering and Computing Systems will discuss her team's research aims and current progress on brain-computer interfaces at the International Human-Centered Robotics Symposium (HuCeRo).
Brain-computer interface uses electroencephalography—a measure of the brain's electrical activity—to help distinguish which brain signal corresponds with the body's performance of a particular intended action. In these experiments, Shikha Chaganti, a graduate student in computer science advised by Ralescu, specifically targeted brain impulses generated when a person thought about going from a sitting position to standing and vice versa. Computers process this data—which can be reinforced by combining it with measures of electrical activity in muscle—in order to detect these brain signals and interpret their intent. The idea is to allow a person to use thought alone to communicate with a computer about the intent to move.
"The problem is quite difficult," Ralescu says. "We are experimenting with processing the signal and selecting useful features from it, and designing a classifier capable of distinguishing between these two transitions—sitting to standing and standing to sitting."
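To make the pipeline concrete, here is a minimal sketch of the general idea: extract simple features from a signal window, then classify the window as one of the two transitions. Everything here is invented for illustration—the features (signal power and slope), the nearest-centroid classifier, and the synthetic data are assumptions, not the team's actual method.

```python
# Toy illustration of a feature-extraction + classification pipeline for
# distinguishing two movement intents. NOT the researchers' method: the
# features, classifier, and synthetic "EEG" windows are all invented here.
import math
import random

def extract_features(window):
    """Crude features from a 1-D signal window: mean power and mean slope."""
    power = sum(x * x for x in window) / len(window)
    slope = (window[-1] - window[0]) / (len(window) - 1)
    return (power, slope)

def train_centroids(labeled_windows):
    """Nearest-centroid classifier: average feature vector per class label."""
    sums, counts = {}, {}
    for label, window in labeled_windows:
        f = extract_features(window)
        s = sums.setdefault(label, [0.0] * len(f))
        for i, v in enumerate(f):
            s[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: tuple(v / counts[label] for v in s)
            for label, s in sums.items()}

def classify(centroids, window):
    """Assign the window to the class with the nearest feature centroid."""
    f = extract_features(window)
    return min(centroids, key=lambda label: math.dist(f, centroids[label]))

# Synthetic stand-ins for signal windows: one transition rises, the other falls.
random.seed(0)
def synth(rising):
    base = range(64) if rising else range(63, -1, -1)
    return [t / 64 + random.gauss(0, 0.05) for t in base]

train = [("sit-to-stand", synth(True)) for _ in range(20)] + \
        [("stand-to-sit", synth(False)) for _ in range(20)]
model = train_centroids(train)
print(classify(model, synth(True)))  # the slope feature separates the classes
```

In practice a real interface would replace each stage with far more sophisticated machinery (multi-channel EEG preprocessing, richer feature selection, a trained statistical classifier), but the structure—signal in, features out, label out—matches the process the article describes.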
Ralescu's work eventually could be used in conjunction with another project being presented at HuCeRo by UC's Gaurav Mukherjee, a master’s student in mechanical engineering in UC’s College of Engineering and Applied Science (CEAS), and Grant Schaffner, an asst. prof. in UC’s Dept. of Aerospace Engineering and Engineering Mechanics. Mukherjee and Schaffner designed and built a spring-assisted leg exoskeleton that can help people with impaired mobility. By integrating Ralescu's brain-computer interface into the exoskeleton, someone using the device could think, "I'm going to stand," and they'd receive a robotic boost as they rose to their feet.
Source: Univ. of Cincinnati