Autonomous mobile robots are becoming more prevalent as they become more useful. The environments in which we would like robots to work are often dynamic, with objects such as people moving through them. This thesis explores methods to make autonomous mobile robots more effective in the presence of dynamic objects. Specifically, two applications are developed on top of existing computer vision techniques in which robots interact directly with moving objects. The first application involves multiple robots collaboratively sensing and following an arbitrary moving object. This is the first demonstration of its kind in which robots jointly localize themselves and an object of interest while planning motion that is sympathetic to the vision system. Live robot experiments demonstrate the efficacy of the proposed system. The second application is a novel human-robot interaction system, based on face engagement, applied to an unmanned aerial vehicle (UAV). A unique use of facial recognition software enables an uninstrumented user to command a UAV with only their face. A series of experiments demonstrates the effectiveness of the interaction system for sending UAVs on a variety of flight trajectories.
Copyright is held by the author.
This thesis may be printed or downloaded for non-commercial research and scholarly purposes.
Thesis advisor: Vaughan, Richard