Tensor completion methods for collaborative intelligence

Resource type: Thesis
Thesis type: M.A.Sc.
Abstract
In the race to bring Artificial Intelligence (AI) to the edge, collaborative intelligence has emerged as a promising way to lighten the computation load on edge devices that run applications based on Deep Neural Networks (DNNs). Typically, a deep model is split at a given layer into edge and cloud sub-models. The deep feature tensor produced by the edge sub-model is transmitted to the cloud, where the remaining computationally intensive workload is performed by the cloud sub-model. The communication channel between the edge and the cloud is imperfect, which results in missing data in the deep feature tensor received at the cloud side, an issue that has mostly been ignored in the existing literature on the topic. In this thesis I study four methods for recovering missing data in the deep feature tensor. Three of the studied methods are existing, generic tensor completion methods, adapted here to recover deep feature tensor data, while the fourth method is newly developed specifically for deep feature tensor completion. Simulation studies show that the new method is 3–18 times faster than the other three methods, which is an important consideration in collaborative intelligence. For VGG16's sparse tensors, all methods produce statistically equivalent classification results across all loss levels tested. For ResNet34's non-sparse tensors, the new method offers statistically better classification accuracy (by 0.25%–6.30%) than the other methods at matched execution speeds, and second-best accuracy among the four methods when all are allowed to run until convergence.
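The setup described above can be illustrated with a small sketch: a synthetic deep feature tensor is "transmitted" over a lossy channel that drops elements at random, and the missing entries are then filled by a naive per-channel mean baseline. This is a minimal illustration of the problem setting only; the tensor shape, loss model, and completion rule here are assumptions for demonstration and are not the methods studied in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical deep feature tensor from an edge sub-model
# (channels x height x width); values are synthetic.
feature = rng.normal(size=(64, 14, 14)).astype(np.float32)

# Simulate an imperfect edge-to-cloud channel: each element is
# lost independently with probability p_loss.
p_loss = 0.3
mask = rng.random(feature.shape) >= p_loss  # True where data arrived
received = np.where(mask, feature, 0.0).astype(np.float32)

def complete_per_channel_mean(received, mask):
    """Naive completion baseline: fill each missing element with the
    mean of the received values in its channel. (Illustrative only;
    not one of the four completion methods studied in the thesis.)"""
    out = received.copy()
    for c in range(received.shape[0]):
        ch_mask = mask[c]
        if ch_mask.any():
            out[c][~ch_mask] = received[c][ch_mask].mean()
    return out

completed = complete_per_channel_mean(received, mask)
```

In a real collaborative-intelligence pipeline, `completed` would then be fed to the cloud sub-model in place of the damaged tensor; the quality of the completion determines how much of the end-task accuracy (e.g., classification) is preserved.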
Copyright statement
Copyright is held by the author.
This thesis may be printed or downloaded for non-commercial research and scholarly purposes.
Supervisor or Senior Supervisor
Thesis advisor: Ivan Bajić
Download file: etd20767.pdf (7.98 MB)