SIAT Faculty Publications

The noise of the world

Date created: 
2007
Abstract: 

This essay traverses a heterogeneous terrain, finding important links in the ideas of Jacques Derrida and John Cage, and relating these to diverse cultural topics such as film soundtrack design, audio art, Saussurian linguistics, the sound and light shows at the Egyptian pyramids, the analogic nature of digital information, and cybernetics. Furthermore, the essay attempts to create some bridges - through the concept of "perceptual differance" - between the divergent world pictures (to use Heidegger's term) of cognitive psychology (with its quantitative frame of analysis) and the more slippery domain of hermeneutics.

Document type: 
Article

3D natural emulation design approach to virtual communities

Date created: 
2010-05-31
Abstract: 

The design goal for OnLive’s Internet-based Virtual Community system was to develop
avatars and virtual communities where the participants sense a tele-presence – that
they are really there in the virtual space with other people. This collective sense of
"being-there" does not happen over the phone or with teleconferencing; it is a new and
emerging phenomenon, unique to 3D virtual communities. While this group presence
paradigm is a simple idea, the design and technical issues that must be addressed to begin
to achieve it on internet-based, consumer PC platforms are complex. This design approach
relies heavily on the following immersion-based techniques:
· 3D distance-attenuated voice and sound with stereo "hearing"
· a 3D navigation scheme that strives to be as comfortable as walking around
· an immersive first-person user interface with a human vision camera angle
· individualized 3D head avatars that breathe, have emotions and lip sync
· 3D space design that is geared toward human social interaction.
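The first of these techniques can be sketched in a few lines. The falloff curve, parameter names, and panning law below are illustrative assumptions, not OnLive's actual implementation:

```python
import math

def stereo_gains(listener_pos, listener_yaw, source_pos, ref_dist=1.0, max_dist=50.0):
    """Illustrative distance attenuation plus stereo panning for a 3D voice
    source. Returns (left_gain, right_gain). All constants are assumptions."""
    dx = source_pos[0] - listener_pos[0]
    dz = source_pos[2] - listener_pos[2]
    dist = math.sqrt(dx * dx + dz * dz)
    if dist >= max_dist:
        return (0.0, 0.0)                # out of earshot
    # Inverse-distance rolloff, clamped so very near sources don't blow up.
    gain = ref_dist / max(dist, ref_dist)
    # Pan by the source's bearing relative to the listener's facing direction
    # (yaw 0 = facing +z); equal-power panning keeps loudness roughly constant.
    bearing = math.atan2(dx, dz) - listener_yaw
    pan = math.sin(bearing)              # -1 = hard left, +1 = hard right
    left = gain * math.sqrt((1.0 - pan) / 2.0)
    right = gain * math.sqrt((1.0 + pan) / 2.0)
    return (left, right)
```

A source directly ahead yields equal left/right gains; one off to the side is both quieter (by distance) and panned toward that ear.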

Document type: 
Conference presentation

Programmatic Formation: Practical Applications of Parametric Design

Date created: 
2009-12
Abstract: 

Programmatic Formation explores design as a responsive process. The study
we present engages the complexity of the surroundings using parametric
and generative design methods. It illustrates that responsiveness of
designs can be achieved beyond geometric explorations. The parametric
models can combine and respond simultaneously to design and its
programmatic factors, such as performance-sensitive design decisions and
constraints. We demonstrate this through a series of case studies for a
housing tower. The studies explore the extent to which non-spatial
parameters can be incorporated into spatial parametric dependencies in
design. The results apply digital design and modeling, common to the
curriculum of architecture schools, to the practical realm of building
design and city planning. While practitioners are often slow to adopt
contemporary design and planning methods in their daily work, the
research illustrates how skills and knowledge acquired as part of a
university education can be effectively incorporated into everyday
design and planning.
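A non-spatial parameter entering a spatial parametric dependency can be sketched minimally. The function, names, and numbers below are illustrative assumptions, not taken from the study's housing-tower models:

```python
import math

def tower_floor_count(required_area_m2, floor_plate_m2, max_height_m, floor_height_m=3.5):
    """Toy parametric dependency for a housing tower: a non-spatial
    parameter (required total floor area) drives a spatial one (the
    number of floors), clamped by a zoning-style height constraint.
    All names and values are illustrative."""
    floors_needed = math.ceil(required_area_m2 / floor_plate_m2)
    floors_allowed = int(max_height_m // floor_height_m)
    return min(floors_needed, floors_allowed)
```

Changing either the programmatic input (required area) or the constraint (allowed height) re-derives the geometry, which is the kind of coupling the abstract describes.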

Document type: 
Article

Visual sensitivity analysis of parametric design models: Improving agility in design

Date created: 
2009
Abstract: 

Advances in generative and parametric CAD tools have enabled designers to create design representations that are responsive, adaptable and flexible. However, the complexity of the models and the limitations of the human visual system pose challenges to using them effectively for sensitivity analysis. In this prototyping study, we propose a method that aims to reduce these challenges by improving the visual analysis of a design model's sensitivity to changes. It adapts the Model-View-Controller approach from software design to decouple control and visualization features from the design model, while providing interfaces between them through parametric associations. Case studies are presented to demonstrate the applicability and limitations of the method.
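The decoupling idea can be sketched with an observer-style parametric association: controllers write a parameter, views subscribe to it, and the design model itself stays free of UI code. The class and method names are assumptions for illustration, not the paper's API:

```python
class Parameter:
    """Minimal sketch of a parametric association in an MVC split.
    Views register callbacks; setting the value notifies them."""

    def __init__(self, value):
        self._value = value
        self._observers = []

    def subscribe(self, callback):
        """A 'view' registers interest in this parameter."""
        self._observers.append(callback)

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, new_value):
        # A 'controller' edits the model; all views are notified.
        self._value = new_value
        for cb in self._observers:
            cb(new_value)

# Model: a wall height shared between the model and its visualizations.
height = Parameter(3.0)
seen = []
height.subscribe(lambda v: seen.append(v))   # view reacting to changes
height.value = 4.5                           # controller editing the model
```

Because the model only knows about abstract callbacks, control sliders and sensitivity visualizations can be attached or swapped without touching the design model itself.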

Document type: 
Article

Comprehending parametric CAD models

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2010-10
Abstract: 

In this study, we experimentally evaluated two GUI prototypes (named "split" and "integrated") equivalent to those used in the domain of parametric CAD modeling. Participants in the study were asked to perform a number of 3D model comprehension tasks, using both interfaces. The tasks themselves were classified into three classes: parameterization, topological and geometrical tasks. We measured the task completion times, error rates, and user satisfaction for both interfaces. The experimental results showed that task completion times are significantly shorter when the "split" interface is used, in all cases of interest: 1) tasks taken as a whole and 2) tasks viewed by task type. There was no significant difference in error rates; however, error rate was significantly higher in the case of parameterization tasks for both interfaces. User satisfaction was significantly higher for the "split" interface. The study gave us a better understanding of human performance in perceiving and comprehending parametric CAD models, and offered insight into the usability aspects of the two studied interfaces; we also believe that the knowledge obtained could be of practical utility to implementers of parametric CAD modeling packages.

Document type: 
Conference presentation

Socially communicative characters for interactive applications

Date created: 
2006
Abstract: 

Interactive Face Animation - Comprehensive Environment (iFACE) is a general-purpose software framework
that encapsulates the functionality of “face multimedia object” for a variety of interactive applications such as
games and online services. iFACE exposes programming interfaces and provides authoring and scripting tools to
design a face object, define its behaviours, and animate it through static or interactive situations. The framework
is based on four parameterized spaces of Geometry, Mood, Personality, and Knowledge that together form the
appearance and behaviour of the face object. iFACE can function as a common “face engine” for design and runtime
environments to simplify the work of content and software developers.

Document type: 
Conference presentation

Socially expressive communication agents: A face-centric approach

Date created: 
2005
Abstract: 

Interactive Face Animation - Comprehensive Environment (iFACE) is a general purpose
software framework that encapsulates the functionality of “face multimedia object”.
iFACE exposes programming interfaces and provides authoring and scripting tools to design a
face object, define its behaviors, and animate it through static or interactive situations. The
framework is based on four parameterized spaces of Geometry, Mood, Personality, and
Knowledge that together form the appearance and behavior of the face object. iFACE
capabilities are demonstrated within the context of some artistic and educational projects.

Document type: 
Conference presentation

Designing an adaptive multimedia interactive to support shared learning experiences

Date created: 
2006
Abstract: 

With the aid of new technologies, integrated design approaches
are becoming increasingly incorporated into exhibit design in
museums, aquaria and science centres. These settings share many
similar design constraints that need to be addressed when
designing multimedia interactives as exhibits. The use of adaptive
systems and techniques can overcome many of the constraints
inherent in these environments as well as enhance the educational
content they incorporate. Our main design goal was to facilitate a
process to create user centric, collaborative and reflective learning
spaces around the smart multimedia interactives. We were
interested in encouraging deeper exploration of the content than
what is typically possible through wall signage, video display or a
supplemental web page. We discuss techniques to bring adaptive
systems into public informal learning settings, and validate these
techniques in a major aquarium with a beluga simulation
interactive. The virtual belugas, in a natural pod context, learn and
alter their behavior based on contextual visitor interaction. Data
from researchers, aquarium staff and visitors was incorporated
into the evolving interactive, which uses physically based systems
for natural whale locomotion and water, and artificial intelligence
systems to simulate natural behavior, all of which respond to
user input. The interactive allows visitors to engage in educational
"what-if" scenarios of wild beluga emergent behavior using a
shared tangible interface controlling a large screen display.

Copyright ACM, 2006. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in ACM SIGGRAPH 2006 Educators program (p. 14). Boston, Massachusetts: ACM. doi:10.1145/1179295.1179310

Document type: 
Conference presentation

Emotional remapping of music to facial animation

Date created: 
2006
Abstract: 

We propose a method to extract the emotional data from a piece
of music and then use that data via a remapping algorithm to
automatically animate an emotional 3D face sequence. The
method is based on studies of the emotional aspect of music and
our parametric-based behavioral head model for face animation.
We address the issue of affective communication remapping in
general, i.e., the translation of affective content (e.g., emotions and
mood) from one communication form to another. We report on
the results of our MusicFace system, which uses these techniques
to automatically create emotional facial animations from
multi-instrument polyphonic music scores in MIDI format and a
remapping rule set.

© ACM, 2006. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in Proceedings of the 2006 ACM SIGGRAPH symposium on Videogames, 143-149. Boston, Massachusetts: ACM. doi:10.1145/1183316.1183337
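The two-stage remapping (music features to an emotion point, emotion point to facial parameters) can be sketched as follows. The feature set, thresholds, and output parameters are illustrative assumptions, not the MusicFace rule set:

```python
def music_to_emotion(tempo_bpm, mode):
    """Toy remapping from coarse musical features to a (valence, arousal)
    pair: faster tempo raises arousal, major mode raises valence.
    Constants are assumptions for illustration."""
    arousal = min(1.0, max(0.0, (tempo_bpm - 60.0) / 120.0))
    valence = 0.7 if mode == "major" else -0.5
    return valence, arousal

def emotion_to_face(valence, arousal):
    """Map the emotion point to two hypothetical facial parameters."""
    smile = max(0.0, valence)        # positive valence curves the mouth up
    brow_raise = arousal * 0.8       # arousal lifts the brows
    return {"smile": smile, "brow_raise": brow_raise}
```

In a full pipeline, the first stage would be driven by features extracted from the MIDI score, and the second by the behavioral head model's parameter space.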

Document type: 
Conference presentation

Multispace behavioral model for face-based affective social agents

Date created: 
2007-01
Abstract: 

This paper describes a behavioral model for affective social agents based on three independent but interacting parameter spaces:
knowledge, personality, and mood. These spaces control a lower-level geometry space that provides parameters at the facial feature
level. Personality and mood use findings in behavioral psychology to relate the perception of personality types and emotional
states to the facial actions and expressions through two-dimensional models for personality and emotion. Knowledge encapsulates
the tasks to be performed and the decision-making process using a specially designed XML-based language. While the geometry
space provides an MPEG-4 compatible set of parameters for low-level control, the behavioral extensions available through the
triple spaces provide flexible means of designing complicated personality types, facial expressions, and dynamic interactive scenarios.
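The way higher-level spaces drive feature-level geometry can be sketched minimally. The two-dimensional personality (dominance, affiliation) and emotion (valence, arousal) axes follow the paper's description; the blending weights and output parameter names below are illustrative assumptions:

```python
def face_geometry(personality, mood):
    """Sketch of high-level parameter spaces driving feature-level
    geometry: personality sets a baseline expression, mood modulates it.
    Weights and parameter names are assumptions for illustration."""
    dominance, affiliation = personality   # 2-D personality model
    valence, arousal = mood                # 2-D emotion model
    smile = 0.5 * max(0.0, affiliation) + 0.5 * max(0.0, valence)
    brow_lower = 0.6 * max(0.0, dominance) + 0.4 * max(0.0, -valence)
    eye_open = 0.5 + 0.5 * arousal
    return {"smile": smile, "brow_lower": brow_lower, "eye_open": eye_open}
```

In the actual framework these outputs would feed the MPEG-4 compatible geometry space rather than named features, but the layering is the same: behavior spaces on top, geometry parameters below.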

Document type: 
Article