MIT CSAIL’s Daniela Rus has developed an EEG/EMG robot-control system based on brain signals and finger gestures.
Building on the team’s previous brain-controlled robot work, the new system detects in real time whether a person notices a robot’s error. By measuring muscle activity, it also lets the person use hand gestures to select the correct option.
According to Rus: “This work, combining EEG and EMG feedback, enables natural human-robot interactions for a broader set of applications than we’ve been able to do before using only EEG feedback. By including muscle feedback, we can use gestures to command the robot spatially, with much more nuance and specificity.”
The researchers used a humanoid robot from Rethink Robotics, while a human controller wore electrodes on their head and arm.
Human supervision increased the rate of correct target selection from 70 to 97 per cent.
The goal is a system that can be used by people with limited mobility or language disorders.