Brain-controlled systems that can process thoughts and translate them into commands that move objects can restore some communication and movement to those who can't speak or move. But users of these systems can suffer from mental fatigue.
Christian Isaac Peñaloza Sanchez, a Ph.D. candidate in cognitive neuroscience applied to robotics at the University of Osaka, Japan, has designed an intelligent interface for these systems that minimizes mental fatigue.
His interface, called the Automating a Brain-Machine Interface System, can learn up to 90 percent of the user's instructions, which allows it to operate autonomously after a while.
The system consists of electrodes placed on the scalp, which measure brain activity in the form of EEG signals. These are used to detect patterns generated by various thoughts, the user's mental state -- whether he or she is awake, drowsy, asleep, etc. -- and his or her level of concentration. It also includes a graphical interface that displays the available devices or objects and interprets EEG signals to assign user commands and control the devices.
In addition, there are wireless sensors distributed throughout the room that send environmental information -- such as temperature or lighting -- mobile hardware actuators that receive signals to turn appliances on and off, and an artificial intelligence algorithm.
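To picture what the learning algorithm consumes, each observation fuses sensor readings, the decoded mental state, and any command the user issued. A minimal sketch of such a record follows; all field names, units, and values are illustrative assumptions, not details from the actual system:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SystemSample:
    """One fused observation for the learning algorithm.
    Field names and units are hypothetical."""
    temperature_c: float        # from a room sensor
    lighting_lux: float         # from a room sensor
    mental_state: str           # decoded from EEG: "awake", "drowsy", ...
    concentration: float        # 0.0-1.0, decoded from EEG
    command: Optional[str]      # command the user issued, if any

# Example: a warm, well-lit room where an alert user opens the window.
sample = SystemSample(28.5, 300.0, "awake", 0.8, "open_window")
print(sample.mental_state, sample.command)
```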
"The latter collects data from the wireless sensors, the electrodes and user commands to learn a correlation between the environment of the room, the mental state of the person and his or her common activities," Peñaloza Sanchez said in a press release. "We give learning capabilities to the system by implementing intelligent algorithms, which gradually learn user preferences. At some point it can take control of the devices without the person having to concentrate much to achieve this goal."
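One simple way to learn such preferences is to count which command the user issues in each context and act autonomously once one command clearly dominates. This is only a rough sketch under that assumption; the actual algorithm, contexts, and confidence threshold are not described in the source:

```python
from collections import Counter, defaultdict

class PreferenceLearner:
    """Learns which command a user tends to issue in a given context
    (environment reading plus mental state). Illustrative only."""

    def __init__(self, confidence_threshold=0.8):
        self.history = defaultdict(Counter)   # context -> command counts
        self.confidence_threshold = confidence_threshold

    def observe(self, context, command):
        """Record one (context, user command) pair."""
        self.history[context][command] += 1

    def suggest(self, context):
        """Return a command if one dominates this context, else None."""
        counts = self.history[context]
        if not counts:
            return None
        command, n = counts.most_common(1)[0]
        if n / sum(counts.values()) >= self.confidence_threshold:
            return command
        return None

learner = PreferenceLearner()
for _ in range(4):
    learner.observe(("warm", "awake"), "open_window")
learner.observe(("warm", "awake"), "turn_on_fan")
print(learner.suggest(("warm", "awake")))   # dominant command: open_window
```

Once `suggest` returns a command, the system can issue it directly, sparing the user the sustained concentration that manual EEG control demands.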
For example, an individual can use it to control an electric wheelchair and move it to the living room using basic commands (forward, backward, left or right), which the system learns. The next time the user wants to take the same trip, he or she only needs to press a button or think about it for the chair to navigate automatically to the desired destination.
Once the system operates automatically, the user no longer has to concentrate to control devices. However, the system continues to monitor the EEG data to detect a signal called Error-Related Negativity, which occurs when people become aware of an error committed by themselves or by a machine.
For example, when the room is warm, the user expects the window to open automatically; if the system instead makes a mistake and turns on the TV, the human brain detects the error spontaneously, without any effort by the user. The command that caused the error can then be corrected and the system re-trained.
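The correction loop above can be sketched as: watch the post-action EEG for a sharp negative deflection (the error-related negativity), and if one appears, un-learn the association that triggered the wrong action. Everything here, including the detection rule, the threshold, and the simulated samples, is an illustrative assumption rather than the system's actual method:

```python
def ern_detected(eeg_window, threshold=-5.0):
    """Flag an error-related negativity: a sharp negative deflection
    shortly after an action. Threshold (microvolts) is an assumed value."""
    return min(eeg_window) <= threshold

# Tiny in-block memory of learned context -> command associations,
# including one mistakenly learned action.
memory = {("warm", "awake"): "turn_on_tv"}

post_action_eeg = [1.2, 0.4, -6.3, -2.1]   # simulated samples (microvolts)
if ern_detected(post_action_eeg):
    # Un-learn the association so the system can be re-trained
    # the next time the user issues a command in this context.
    memory.pop(("warm", "awake"), None)

print(memory)   # the wrong command has been removed
```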