A Sim2Real approach to augment underrepresented data for dynamic emotion expression recognition

Resource type
Thesis
Thesis type
(Thesis) M.Sc.
Date created
2021-09-28
Authors/Contributors
Abstract
Robots and artificial agents that interact with humans should be able to do so without bias and inequity, but facial perception systems have notoriously been found to work more poorly for certain groups of people than for others. In our work, we aim to build a system that can perceive humans in a more transparent and inclusive manner. Specifically, we focus on dynamic expressions on the human face, which are difficult to collect for a broad set of people due to privacy concerns and the fact that faces are inherently identifiable. We address this problem with a Sim2Real approach in which we use a suite of 3D simulated human models to create an auditable synthetic dataset covering 1) underrepresented facial expressions beyond the six basic emotions, such as confusion; 2) ethnic and gender minority groups; and 3) a wide range of viewing angles from which a robot may encounter a human in the real world.
Document
Extent
37 pages.
Identifier
etd21677
Copyright statement
Copyright is held by the author(s).
Permissions
This thesis may be printed or downloaded for non-commercial research and scholarly purposes.
Supervisor or Senior Supervisor
Thesis advisor: Lim, Angelica
Language
English
Member of collection
Download file
etd21677.pdf (21.91 MB)