This thesis presents the inquiry into, and development of, a new system for presenting audiovisual information and experiences, called Pixelphonics, which operates by colocating sound and image within the visual display area(s). Audiovisual colocation places sound and image cues in close spatial proximity to each other, so that the displayed media functions more analogously to natural perception. Most audiovisual systems spatially dislocate sound and image sources by placing visual information within the screen and audio information outside it, through headphones or speaker arrays. Audiovisual colocation strengthens the object-event correlations in mediated content and constitutes a new affordance for media production and reception.
Copyright is held by the author.
This thesis may be printed or downloaded for non-commercial research and scholarly purposes.
Thesis advisor: Paul Matthew St. Pierre