
Regularized representation learning with deep neural networks

Thesis type: Ph.D.
Author: Gong, Yu
The recent success of deep neural networks relies largely on their capacity to learn meaningful representations. A large number of parameters store the experience learned from the training data, and the representations, the activations of the hidden layers, encode the model's direct response to new data. Nonetheless, these highly performant models are sensitive to shifts in the data distribution or changes in the task. Deep neural networks are also increasingly deployed in settings where computational constraints are crucial, yet reducing model size or numerical precision can significantly degrade the quality of the learned representations. It is therefore vital to study deep representation learning further under such practical scenarios. In this dissertation, we first discuss and compare methods designed to regularize representation learning, and then introduce our proposed approaches for improving conventional deep representation learning in several practical scenarios. First, we focus on improving probabilistic representations learned from incomplete, heterogeneous data. Second, we present the challenge of learning from imbalanced data and offer a solution that regularizes training to acquire more effective representations. Third, we address the problem of fully exploiting the representations of deep neural networks under computational constraints. In summary, we provide solutions that regularize and enhance conventional deep representation learning in the face of changes in data distributions or model settings.
118 pages.
Copyright statement
Copyright is held by the author(s).
This thesis may be printed or downloaded for non-commercial research and scholarly purposes.
Supervisor or Senior Supervisor
Thesis advisor: Mori, Greg
Download file: etd22857.pdf (39.25 MB)