Brain Computer Interface to Assistive Robots

Abstract

Able-bodied humans express and communicate their needs and ideas through the exchange of verbal and non-verbal information. However, people with neuromuscular disorders, whether inherited or acquired, have significant needs and no practicable way to communicate them. Today, human interaction with machines and robots is commonplace, and robots can be controlled and communicated with relative ease. For people with neuromuscular disabilities, however, the gap is wider: either the machine or the human fails to communicate and process the information presented. Assistive robots are becoming popular and more accessible to those who need them, but communicating with an assistive robot remains a struggle, since the neuromuscular disabled have no effective way to communicate verbally or physically. The question this research aims to address is how we can bridge the information-exchange gap between robots and disabled humans and provide them with basic communication abilities. To bridge this gap, the research focuses on Brain-Computer Interface (BCI) communication based on Electroencephalography (EEG). EEG was chosen because it is the most economical and non-invasive modality. Error-Related Potentials (ErrP), a subcomponent of the EEG that occurs naturally in the brain in response to an unexpected error, serve as the sole communication channel from human to robot. In this work, we show that ErrP signals from the stimulated human brain can be translated in real time into binary control signals that control a PR2 robot, correcting its errors as it performs a particular task. Here we consider the task of binary selection among real household objects: the robot initially decides the category of each object at random.
EEG signals are collected from the stimulated human brain and decoded in real time; if an ErrP is detected, the robot immediately corrects its trajectory and classifies the object into the other group, otherwise it continues on its planned trajectory. Through this continuous process, the robot is trained by the disabled user on the binary object-selection task. With this work, the disabled will be able to communicate with the robot and train it without any external human aid.
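The decode-and-correct loop described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: `detect_errp` is a hypothetical stand-in for the real-time EEG classifier (here reduced to thresholding a single score), and the category flip stands in for the PR2 trajectory correction.

```python
import random

# The two bins of the binary household-object selection task (names are illustrative).
CATEGORIES = ("left_bin", "right_bin")

def detect_errp(eeg_score, threshold=0.5):
    """Hypothetical stand-in for the real-time ErrP decoder: returns True when
    the decoded EEG window indicates the user perceived the robot's choice as
    an error. A real decoder would classify a multichannel EEG epoch."""
    return eeg_score > threshold

def correct_choice(initial_choice, eeg_score):
    """If an ErrP is detected, flip to the other category; otherwise the robot
    continues with its planned choice."""
    if detect_errp(eeg_score):
        return CATEGORIES[1 - CATEGORIES.index(initial_choice)]
    return initial_choice

# One trial: the robot first guesses a category at random, then the human's
# ErrP response (if any) corrects it.
initial = random.choice(CATEGORIES)
final = correct_choice(initial, eeg_score=0.8)
```

Because the ErrP only ever flips a binary choice, the human never issues explicit commands; repeated trials let the robot converge on the user's intended grouping.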

Date
Mar 15, 2019
Location
UNC Charlotte
Sunny Arokia Swamy Bellary
Engineer/Scientist II

EPRI Engineer | AI Enthusiast | Computer Vision Researcher | Robotics Tech Savvy | Food Lover | Wanderlust | Team Leader @Belaku | Musician |
