Movement affect estimation in motion capture data

Resource type: Thesis
Thesis type: (Thesis) M.Sc.
Date created: 2018-05-01
Author: Li, Dishen
Abstract
The problem that we address is that of movement affect estimation: estimating the emotional states expressed in motion capture data. Motion capture data contains only a skeleton, with no information about body type or facial expressions. Our data consist of professional actors and dancers performing movements such as walking, sitting, and improvisation. Machine learning models are then built on these motion capture recordings to learn the affective states behind the movements. Overall, we conducted a series of three experiments. First, using the labels that were given to the actors as ground truth, our Hidden Markov Model (HMM) classifiers reached over 70% accuracy in predicting the affective state, out of nine possible affective states. Second, we attempted recognition by establishing a ground truth from ratings: we took a continuous approach, asking university students to rate each movement in valence and arousal simultaneously. These ratings were then used as ground-truth labels for supervised machine learning with stepwise linear regression, which achieved a high coefficient of determination, the performance metric used in this experiment. In our third experiment, informed by further literature review after the first two, we gathered more data through a crowdsourcing platform and modified our machine learning techniques by switching to rank-based methods. In this case, rather than assigning an absolute numerical rating to quantify the affective state of a movement, each movement is ranked relative to the other movements along the dimensions of affect. Results are then analyzed using the Goodman-Kruskal gamma. The performance of the models in this approach is highly dependent on the movement type, in that a consistent movement pattern leads to a more consistent ranking; the rank-based approach also appears to be more effective for recognizing affect in postures.
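The Goodman-Kruskal gamma mentioned above measures agreement between two rankings from concordant and discordant pairs: G = (C − D) / (C + D). A minimal sketch (the function name and sample data are illustrative, not from the thesis):

```python
def goodman_kruskal_gamma(x, y):
    """Rank correlation between two equally long sequences of ratings.

    Counts pairs (i, j) whose ordering agrees in both sequences
    (concordant) or disagrees (discordant); ties are skipped.
    """
    concordant = discordant = 0
    n = len(x)
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    if concordant + discordant == 0:
        return 0.0  # all pairs tied: no ordering information
    return (concordant - discordant) / (concordant + discordant)

# Two raters ranking four movements by, e.g., arousal:
print(goodman_kruskal_gamma([1, 2, 3, 4], [1, 2, 3, 4]))  # perfect agreement -> 1.0
print(goodman_kruskal_gamma([1, 2, 3, 4], [4, 3, 2, 1]))  # reversed order   -> -1.0
```

Because only pair orderings matter, gamma is insensitive to the absolute scale of the ratings, which is what makes it suitable for the rank-based third experiment.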
Document
Identifier: etd19738
Copyright statement: Copyright is held by the author.
Permissions: This thesis may be printed or downloaded for non-commercial research and scholarly purposes.
Scholarly level: Supervisor or Senior Supervisor
Thesis advisor: Pasquier, Philippe
Download file: etd19738.pdf (2.09 MB)