Investigating the role of social eye gaze in designing believable virtual characters

Thesis type
(Dissertation) Ph.D.
Abstract
A crucial issue in computationally modeling believable characters is determining the connection between high-level character attributes and the non-verbal behaviours, such as gestures, head position, eye gaze, and facial expressions, that display those attributes in a given situation. Current industry practice is to have artists animate characters, occasionally based on motion capture from actors. To procedurally adapt a character's behaviour to a situation, a general, valid, and computationally tractable model of non-verbal behaviour must be developed. This computational instantiation must preserve the aesthetic qualities of the incorporated material while presenting a usable set of interaction modalities. This dissertation examines how to capture the most relevant parts of social exchanges so that they can be represented in a computational model and then implemented within an interaction model. Specifically, I investigated the ability to send social signals related to status through a virtual human's eye gaze. Gaze is a critical component of social exchanges; it can make characters engaging or aloof, and it establishes a character's role in a conversation. Drawing on the psychological literature on gaze behaviour, I constructed a verbal-conceptual computational model of gaze for virtual humans that links several behaviour qualities to the concept of status from improv acting. This cross-domain model provides the basis for the social behaviour of procedurally animated characters. This dissertation validates aspects of that model related to length of looks, movement velocity, head posture, and mutual gaze. First, I evaluated the model through four Amazon Mechanical Turk studies, each of which asked 50 participants to compare two videos of a scripted scenario between characters animated with the SmartBody procedural animation system.
These studies found significant differences in how participants evaluated the characters' status. Second, I designed an interactive system that incorporated eye tracking, dialogue, and control of a character in the SmartBody environment. This allowed 34 participants to practice hiring interviews, during which I varied the character's social gaze model. Again, significant differences in the perception of status were found.
Copyright statement
Copyright is held by the author.
This thesis may be printed or downloaded for non-commercial research and scholarly purposes.
Senior supervisor
Thesis advisor: DiPaola, Steve
Download file
etd10504_MNixon.pdf (24.67 MB)
