Human-Robot Cooperation Using EEG Signals With Self-Learning.

My thesis committee (from left): Dr. James M. Conrad, me, Dr. Andrew R. Willis, Dr. Thomas P. Weldon

Abstract

Able-bodied humans express and communicate their needs and ideas with each other through the exchange of verbal and non-verbal information. However, people with neuromuscular disorders, whether inherited or acquired, often have significant needs and no practical way to communicate them. Human interaction with machines and robots is now commonplace, and most humans can control and interact with robots with relative ease, but this communication remains very difficult for people with neuromuscular disorders. Assistive robots have become popular and more accessible, yet the people they assist still struggle to communicate with them. This research aims to build a communication bridge between assistive robots and disabled humans and to provide basic control abilities such as object selection tasks. It focuses on Brain-Computer Interfaces (BCI) that communicate via Electroencephalography (EEG); EEG sensors are used because they are cost-effective and non-invasive. Error-Related Potentials (ErrP), a sub-component of EEG signals, are used because they occur naturally in the brain in response to an unexpected error, and they serve as the sole communication channel from human to robot.

In this work, we show that ErrP signals from the stimulated human brain can be translated in real time into binary control signals that correct a PR2 robot's errors as it performs a particular task. Here the task is the selection and sorting of household objects. The robot initially decides the category of each object by random selection. EEG signals are collected from a simulated human brain and decoded in real time: if an ErrP is detected, the robot immediately corrects its trajectory and assigns the object to the other group; otherwise it continues on the planned trajectory. Through this continuous process, the robot is self-trained by the disabled user for the object selection and classification task. With this work, disabled users will be able to communicate with the robot and train it without any external human aid.
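To illustrate the correction loop described above, here is a minimal sketch in Python. It is not the thesis implementation: the names `read_eeg_window`, `decode_errp`, `move_object`, and the two-bin `CATEGORIES` list are hypothetical placeholders standing in for the real-time EEG acquisition, the ErrP classifier, and the PR2 control stack.

```python
# Hedged sketch of an ErrP-driven correction loop with a simple self-learning memory.
# All helper names are illustrative placeholders, not the thesis's actual code.

import random

CATEGORIES = ["group_A", "group_B"]        # two sorting bins for household objects (assumed)
feedback = {c: {} for c in CATEGORIES}     # per-object counts accumulated from ErrP feedback


def choose_category(obj):
    """Use past human feedback if any exists; otherwise pick a category at random."""
    scores = {c: feedback[c].get(obj, 0) for c in CATEGORIES}
    if max(scores.values()) == 0:
        return random.choice(CATEGORIES)   # initial random selection, as in the abstract
    return max(scores, key=scores.get)


def sort_object(obj, read_eeg_window, decode_errp, move_object):
    """Sort one object, correcting the choice when an ErrP is decoded from the EEG."""
    choice = choose_category(obj)
    move_object(obj, choice)               # robot begins its planned trajectory
    if decode_errp(read_eeg_window()):     # ErrP detected => the human observed an error
        choice = next(c for c in CATEGORIES if c != choice)
        move_object(obj, choice)           # correct the trajectory: place in the other group
    feedback[choice][obj] = feedback[choice].get(obj, 0) + 1   # self-learning update
    return choice


# Toy usage with simulated stand-ins for the EEG stream and the robot:
if __name__ == "__main__":
    for obj in ["cup", "spoon", "plate"]:
        sort_object(
            obj,
            read_eeg_window=lambda: None,                     # would return a buffered EEG window
            decode_errp=lambda _w: random.random() < 0.3,     # simulated ErrP detector
            move_object=lambda o, c: print(f"placing {o} in {c}"),
        )
```

The only design point the sketch tries to capture is that a single binary signal (ErrP present or absent) is enough both to correct the current action and to accumulate training feedback over repeated trials.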

Date
Apr 24, 2019 11:00 AM — 12:00 PM
Event
Master's Thesis Defense
Location
UNC Charlotte

Sunny Arokia Swamy Bellary
Research Engineer - Robotics

EPRI Engineer | AI Enthusiast | Computer Vision Researcher | Robotics Tech Savvy | Food Lover | Wanderlust | Team Leader @Belaku | Musician |
