Towards Artificial Emotional Intelligence for Heterogeneous System to Improve Human Robot Interactions

dc.contributor.advisor: Jamshidi, Mo
dc.contributor.author: Erol, Berat Alper
dc.contributor.committeeMember: Madni, Asad M.
dc.contributor.committeeMember: Prevost, John J.
dc.contributor.committeeMember: Cao, Yongcan
dc.creator.orcid: https://orcid.org/0000-0001-5734-4865
dc.date.accessioned: 2024-02-09T20:49:19Z
dc.date.available: 2020-08-13
dc.date.available: 2024-02-09T20:49:19Z
dc.date.issued: 2018
dc.description: This item is available only to currently enrolled UTSA students, faculty or staff. To download, navigate to Log In in the top right-hand corner of this screen, then select Log in with my UTSA ID.
dc.description.abstract: Human-machine interaction grew in importance through graphical user interface applications, building on human-computer interaction. Later, these interfaces were applied to human-like machines, and the field of human-robot interaction (HRI) was born. Increased demand for and accessibility of these robots ushered in a new era of domestic robots, and innovations in smart environments and Internet of Things devices have made domestic robots members of our social environment. The aptitude for identifying the emotional states of others and responding to the emotions they display is an important aspect of human social intelligence. Robots are expected to become prevalent in society in the near future to assist humans in various tasks, and HRI is of critical importance in the assistive robotics sector. Smart digital assistants and assistive robots fail quite often when a request is not well defined verbally. When the assistant fails to provide the desired service, the person may exhibit an emotional response such as anger or frustration through changes in their voice or facial expression. As interactive robots focus on serving human users in specific tasks in their daily activities, it is critical for robots to understand not only language but also human psychology. In this dissertation, a novel affection-based perception architecture for HRI is presented, in which the robot is expected to recognize human emotional states to encourage natural bonding between the human and the robotic artifact. The system is built upon a high level of heterogeneity, including a low-cost 3D-printed humanoid robot torso and an unmanned ground vehicle. The research aims to improve HRI by leveraging machine and deep learning approaches for emotion detection, decision-making, and object identification. Furthermore, it emphasizes multi-modal approaches to user-specific emotion recognition problems in a dynamic environment.
dc.description.department: Electrical and Computer Engineering
dc.format.extent: 192 pages
dc.format.mimetype: application/pdf
dc.identifier.uri: https://hdl.handle.net/20.500.12588/3344
dc.language: en
dc.subject: Emotion recognition
dc.subject: Human-robot interactions
dc.subject: Indoor navigation
dc.subject: Intelligent personal assistant
dc.subject: Internet of robotic things
dc.subject: Object identification
dc.subject.classification: Electrical engineering
dc.subject.classification: Robotics
dc.subject.classification: Artificial intelligence
dc.title: Towards Artificial Emotional Intelligence for Heterogeneous System to Improve Human Robot Interactions
dc.type: Thesis
dc.type.dcmi: Text
dcterms.accessRights: pq_closed
thesis.degree.department: Electrical and Computer Engineering
thesis.degree.grantor: University of Texas at San Antonio
thesis.degree.level: Doctoral
thesis.degree.name: Doctor of Philosophy

Files

Original bundle

Name: Erol_utsa_1283D_12621.pdf
Size: 32.86 MB
Format: Adobe Portable Document Format