A Brain-Inspired Multi-Modal Perceptual System for Social Robots: An Experimental Realization

Peer reviewed: 
Yes, item is peer reviewed.
Scholarly level: 
Final version published as: 

Alqaderi, Mohammad, & Rad, Ahmad (2018). A Brain-Inspired Multi-Modal Perceptual System for Social Robots: An Experimental Realization. IEEE Access, 6, 1-1. DOI: 10.1109/ACCESS.2018.2851841.

Date created: 
Keywords: 
Human-robot interaction
Machine perception
Multi-modal systems
Social robots
Spiking neural networks
Top-down influences

We propose a multi-modal perceptual system inspired by the inner workings of the human brain, in particular the hierarchical structure of the sensory cortex and spatial-temporal binding criteria. The system is context independent and can be applied to many ongoing problems in social robotics, including but not limited to person recognition, emotion recognition, and a multi-modal robot doctor. The system encapsulates the parallel, distributed processing of real-world stimuli through different sensor modalities, encoding them into feature vectors that are in turn processed by a number of dedicated processing units (DPUs) along hierarchical paths. DPUs are algorithmic realizations of the cell assemblies of neuroscience. A plausible and realistic perceptual system is obtained by integrating the outputs of these units with spiking neural networks. We also discuss other components of the system, including top-down influences and the integration of information through temporal binding with fading memory, and suggest two alternatives for realizing these criteria. Finally, we demonstrate the implementation of this architecture on a hardware platform as a social robot and report experimental studies on the system.
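The pipeline the abstract describes, modality-specific DPUs producing evidence that a spike-like integrator binds in time with fading memory, can be sketched minimally as follows. This is an illustrative assumption, not the authors' implementation: the class names `DPU` and `FadingMemoryIntegrator`, the toy linear classifiers, and the decay and threshold values are all hypothetical.

```python
class DPU:
    """Dedicated processing unit: maps one modality's feature vector to
    per-class evidence scores (here, a toy linear classifier)."""

    def __init__(self, weights):
        self.weights = weights  # one weight vector per class

    def process(self, features):
        return [sum(w * f for w, f in zip(ws, features))
                for ws in self.weights]


class FadingMemoryIntegrator:
    """Temporal binding with fading memory: accumulated evidence decays
    each step, so only near-simultaneous cues from different modalities
    can jointly push a class over the firing threshold."""

    def __init__(self, n_classes, decay=0.5, threshold=1.5):
        self.state = [0.0] * n_classes
        self.decay = decay          # fraction of evidence kept per step
        self.threshold = threshold  # spike when accumulated evidence crosses

    def step(self, evidence_per_modality):
        # fade old evidence, then add this step's multi-modal evidence
        self.state = [s * self.decay for s in self.state]
        for ev in evidence_per_modality:
            self.state = [s + e for s, e in zip(self.state, ev)]
        # emit "spikes" for classes whose evidence crosses the threshold
        return [i for i, s in enumerate(self.state) if s >= self.threshold]


# Two hypothetical modality channels (e.g. face and voice), each scoring
# the same two identity classes.
face_dpu = DPU(weights=[[1.0, 0.0], [0.0, 1.0]])
voice_dpu = DPU(weights=[[1.0, 0.0], [0.0, 1.0]])
binder = FadingMemoryIntegrator(n_classes=2, decay=0.5, threshold=1.5)
```

With these numbers, either modality alone stays below the threshold; only coincident evidence from both channels makes the integrator fire, which is the binding criterion, while the per-step decay plays the role of the fading memory.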

Document type: 
Copyright statement: 
Rights remain with the authors.