Embodied conversational agents (ECAs) are designed with the goal of achieving natural and effortless interactions with humans by displaying the same communication channels we use in our daily interactions (e.g., gestures, gaze, facial expressions, verbal behaviors). With advances in computational power, these agents are increasingly equipped with social and emotional capabilities to improve interaction with users. Recently, research efforts have focused on modeling empathy, the human trait that allows us to share and understand each other's feelings. The emerging field of computational empathy aims to equip artificial agents with empathic behavior, which has shown great promise in enhancing human-agent interaction. However, two issues arise in this research endeavor. First, even though a variety of disciplines have extensively examined empathic behavior, there is minimal discussion of how that knowledge can be translated into computational empathy research. Second, modeling and implementing a complex behavior such as empathy poses a great challenge to the fluent and automated integration of these behaviors in real-time, multi-modal interaction with ECAs. This thesis aims to model and implement empathy in embodied conversational agents while addressing both of these issues. To achieve this goal, an extensive literature review of the definitions and models of empathy from various disciplines is provided. Building upon this background knowledge, a model of empathy suitable for interactive virtual agents is presented, comprising three hierarchical layers of behavioral capabilities: emotional communication competence, emotion regulation, and cognitive mechanisms. This dissertation further provides suggestions on how to evaluate the perceived empathy of such a system, as there are no agreed-upon standards or best practices for evaluation metrics in this novel field.
Following the establishment of these theoretical foundations, the levels of empathic behavior were implemented in an ECA with real-time spoken conversation capabilities, including synchronized gestural and emotional behavior. Evaluations of this system, called M-PATH, showed that the proposed levels of behavioral capabilities increased the perception of empathy as well as the perceived usefulness, human-likeness, and believability of the agent. This dissertation further demonstrates that implementing empathic behaviors in artificial agents not only improves our interactions with them but can also enhance our understanding of empathy itself, by providing a controlled environment in which to implement and test our theories.
Copyright is held by the author.
This thesis may be printed or downloaded for non-commercial research and scholarly purposes.
Thesis advisor: DiPaola, Steve