Towards Artificial Emotional Intelligence for Heterogeneous System to Improve Human Robot Interactions

Date
2018
Authors
Erol, Berat Alper
Abstract

Human-machine interaction grew out of human-computer interaction through graphical user interface applications. Later, these interfaces were applied to human-like machines, and human-robot interaction (HRI) was born. Increased demand for and accessibility of such robots brought a new era of domestic robots, and innovations in smart environments and Internet of Things devices made domestic robots members of our social environment. The aptitude for identifying the emotional states of others and responding to the emotions they express is an important aspect of human social intelligence. Robots are expected to become prevalent in society in the near future to assist humans in various tasks, and HRI is of critical importance in the assistive robotics sector. Smart digital assistants and assistive robots often fail when a request is not well defined verbally. When the assistant fails to provide the desired service, the person may exhibit an emotional response, such as anger or frustration, through changes in their voice or facial expression. As interactive robots focus on serving human users in specific tasks in their daily activities, it is critical for robots to understand not only language but also human psychology. In this dissertation, a novel affection-based perception architecture for HRI is presented, in which the robot is expected to recognize human emotional states to encourage natural bonding between the human and the robotic artifact. The system is built upon a high level of heterogeneity, including a low-cost 3D-printed humanoid robot torso and an unmanned ground vehicle. The research aims to improve HRI by leveraging machine and deep learning approaches for emotion detection, decision-making, and object identification. Furthermore, it emphasizes multi-modal approaches to user-specific emotion recognition in dynamic environments.
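The abstract describes multi-modal emotion recognition combining vocal and facial cues. One common way such modalities are combined is late fusion of per-modality classifier outputs; the sketch below illustrates the idea only. The emotion label set, the weighting scheme, and the `late_fusion` helper are illustrative assumptions, not the dissertation's actual architecture.

```python
import numpy as np

# Hypothetical emotion label set for illustration; the dissertation's
# actual class set may differ.
EMOTIONS = ["neutral", "happy", "angry", "sad"]

def late_fusion(face_probs, voice_probs, face_weight=0.6):
    """Combine per-modality class probabilities by weighted averaging.

    face_probs / voice_probs: softmax outputs from two independent
    classifiers (stand-ins for the facial-expression and voice models
    the abstract alludes to).
    """
    face_probs = np.asarray(face_probs, dtype=float)
    voice_probs = np.asarray(voice_probs, dtype=float)
    # Weighted average keeps the result a valid probability distribution.
    fused = face_weight * face_probs + (1.0 - face_weight) * voice_probs
    return EMOTIONS[int(np.argmax(fused))], fused

# Example: the face model is uncertain, but the voice model hears anger.
label, fused = late_fusion([0.4, 0.3, 0.2, 0.1], [0.1, 0.1, 0.7, 0.1])
# → label is "angry": the fused distribution is [0.28, 0.22, 0.40, 0.10]
```

A fixed `face_weight` is the simplest choice; learned or confidence-dependent weights are a natural refinement when one modality is unreliable (e.g., a noisy room degrading the voice channel).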

Description
This item is available only to currently enrolled UTSA students, faculty or staff.
Keywords
Emotion recognition, Human-robot interactions, Indoor navigation, Intelligent personal assistant, Internet of robotic things, Object identification
Department
Electrical and Computer Engineering