Supplemental data files for the thesis Modeling empathy in embodied conversational agents.
Author: Yalçın, Özge
Date created: 2019-08-28
This zipped folder contains the videos generated for levels of empathic interaction between a human and the embodied conversational agent. The collection includes videos of a human telling emotional stories (angry, happy, sad) while the embodied conversational agent gives non-verbal feedback under backchanneling, mimicry, and affective-matching conditions. The zipped folder includes 9 videos: 3 interaction scenarios (backchannel, mimicry, affect match) for each of three emotional stories (happy, sad, angry).
Author: Yalçın, Özge
Date created: 2019-05-01
This zipped folder contains the videos generated for levels of empathic interaction between a human and the embodied conversational agent, with both verbal and non-verbal feedback from the ECA. The .zip file contains 4 videos of interactions between a human telling emotional stories and an embodied conversational agent responding to them: two different stories, each presented under mimicry and affect-matching interaction conditions.
Author: Yalçın, Özge
Date created: 2019-05-01
A collection of videos for the facial gesture experiments. Includes the video files for the facial gestures of the avatar. Each gesture differs in its speed, amplitude, and the type of emotion it conveys. The .zip file includes videos of several emotional facial expressions recorded from the Smartbody character animation system (http://smartbody.ict.usc.edu/) using ChrBrad. Each video is named starting with the name of the emotional category (happy, sad, anger, contempt, fear, disgust, upset), followed by three numbers indicating the amplitude (0.4, 0.7, 1), speed (0.3, 0.6, 1), and duration (0.5, 1, 2) of the gesture. So the video named "anger0.40.30.5" shows the anger emotion with an amplitude of 0.4, a speed of 0.3, and a duration of 0.5 seconds.
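Because the three numeric fields are concatenated without separators, a name such as "anger0.40.30.5" can be decoded by matching it against the known parameter values listed above. A minimal sketch in Python (the function name `parse_video_name` is hypothetical, and it assumes each value appears in the filename exactly as listed, e.g. "1" rather than "1.0"):

```python
from itertools import product

# Parameter values as listed in the dataset description (assumed to be
# written into the filename verbatim).
EMOTIONS = ["happy", "sad", "anger", "contempt", "fear", "disgust", "upset"]
AMPLITUDES = ["0.4", "0.7", "1"]
SPEEDS = ["0.3", "0.6", "1"]
DURATIONS = ["0.5", "1", "2"]

def parse_video_name(name):
    """Split a video name like 'anger0.40.30.5' into its components.

    The numeric fields carry no separators, so the remainder of the name
    is checked against every valid (amplitude, speed, duration) triple.
    Raises ValueError if no combination matches.
    """
    for emotion in EMOTIONS:
        if not name.startswith(emotion):
            continue
        rest = name[len(emotion):]
        for amp, speed, dur in product(AMPLITUDES, SPEEDS, DURATIONS):
            if amp + speed + dur == rest:
                return {"emotion": emotion, "amplitude": float(amp),
                        "speed": float(speed), "duration": float(dur)}
    raise ValueError(f"unrecognized video name: {name}")

print(parse_video_name("anger0.40.30.5"))
# {'emotion': 'anger', 'amplitude': 0.4, 'speed': 0.3, 'duration': 0.5}
```

Exhaustive matching against the small, fixed value sets avoids having to guess where one decimal number ends and the next begins.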
Author: Yalçın, Özge
Date created: 2018-04-26
Results for the facial gesture experiments. Includes the datasets used and the results gathered from the MTurk experiments. The .csv file includes anonymized Pleasure-Arousal-Dominance (PAD) ratings (https://link.springer.com/article/10.1007/BF02686918) together with categorical emotion ratings for each emotion video, with variations in amplitude and acceleration; the durations of the videos were held constant. The ratings were obtained in an Amazon MTurk experimental setting. The PAD ratings range from 0 to 5 (6-point Likert scale).
Author: Yalçın, Özge
Date created: 2019-03-05
Collection of videos, .bvh files, .skm files, and results of the gesture rating experiments. The videos include a set of gestures used in the gesture rating experiments as part of the M-Path project. The embodied conversational agent used in making the videos is ChrBrad in the Smartbody system (http://smartbody.ict.usc.edu/). Each video is 4 seconds long, with the peak of the gesture roughly in the middle of the video.
Author: Yalçın, Özge
Date created: 2018-04-26
A collection of results for the facial gesture experiments. This collection includes a set of emotional gestures created by refining the emotional expressions provided by the Smartbody system. All the gestures in this .zip file were hand-coded by Marie Louka (http://marielouka.com/). The emotions included are happy, sad, angry, shocked, and fear.
Author: Yalçın, Özge
Date created: 2018-04-26
Includes videos used in the M-Path facial gesture rating experiment using PAD values. The experiment can be replicated using the code at https://github.com/onyalcin/M-PATH.
Author: Yalçın, Özge
Date created: 2019-03-05