Resource type
Thesis
Thesis type
M.Sc.
Date created
2010
Authors/Contributors
Author: Couture-Beil, Alex Sebastien
Abstract
In this thesis, we present a novel real-time computer-vision-based system for facilitating interaction between a single human and a multi-robot system: a user first selects an individual robot from a group simply by looking at it, and then commands the selected robot with a motion-based gesture. We describe a novel multi-robot system that demonstrates the feasibility of using face contact and motion-based gestures as two non-verbal communication channels for human-robot interaction. Each robot first performs face detection using a well-known face detector. The resulting "score" of the detected face is used in a distributed leader-election algorithm to estimate which robot the user is looking at. The selected robot then derives a set of motion features based on blurred optical flow extracted from a user-centric region. These motion cues are used to discriminate between gestures (robot commands) with an efficient learned classifier.
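The score-based leader election described in the abstract can be sketched as follows. This is a minimal, hypothetical illustration, not the thesis's implementation: the `Robot` class, the `elect_leader` function, and the centralized simulation of what would really be a broadcast-and-compare protocol are all assumptions made for clarity.

```python
from dataclasses import dataclass


@dataclass
class Robot:
    robot_id: int
    face_score: float  # confidence reported by the on-board face detector


def elect_leader(robots):
    """Return the id of the robot with the highest face-detection score.

    In the distributed setting each robot would broadcast its own score
    and compare it against the scores it receives from its peers; here
    the outcome is simulated centrally. Ties are broken in favor of the
    lower robot id so the election is deterministic.
    """
    return max(robots, key=lambda r: (r.face_score, -r.robot_id)).robot_id


# The robot the user is facing tends to report the strongest detection.
robots = [Robot(0, 0.12), Robot(1, 0.87), Robot(2, 0.55)]
leader = elect_leader(robots)
```

Using the detector's confidence directly as the election key means no explicit gaze estimation is needed: a frontal face yields a stronger detection on the robot being looked at than on robots seeing the face at an angle.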
Copyright statement
Copyright is held by the author.
Language
English
| Download file | Size |
|---|---|
| etd5891.pdf | 5.86 MB |