
LinearBoost for classification

Resource type
Thesis
Thesis type
(Thesis) M.Sc.
Date created
2023-08-21
Authors/Contributors
Author: Lin, Dekai
Abstract
AdaBoost is a well-studied and widely used ensemble method that improves classification performance iteratively by focusing on misclassified samples through reweighting. In each training iteration it assigns higher weights to misclassified samples, and it makes its final prediction by a vote over the base classifiers learned in all iterations. This reweighting, however, means that samples properly classified in early iterations receive disproportionately little attention later on, so many of the classifiers in the final vote are inferior for such samples. In this work, we propose LinearBoost, a competing ensemble approach that addresses this issue. Instead of voting over multiple classifiers, LinearBoost classifies a sample by the first "promising" classifier learned in the iterative process, where "promising" means the prediction has a high enough confidence of being correct. The next training iteration then focuses on the remaining samples that do not yet have a promising prediction. LinearBoost can therefore maintain the performance on samples properly classified in early iterations (by identifying their promising classifiers) while iteratively improving the performance on misclassified samples. LinearBoost is a general boosting strategy that can work with any type of base classifier. Experiments on datasets with different characteristics and with different types of base classifiers show that LinearBoost usually outperforms AdaBoost and its variants, achieving higher Macro and Weighted F1 scores.
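
To illustrate the strategy described in the abstract, below is a minimal sketch of a cascade-style boosting loop: each round trains a base classifier, samples with a "promising" (high-confidence) prediction are settled by that classifier, and the remaining samples are passed to the next round. This is not the thesis's implementation; the class name, the 0.9 confidence threshold, the decision-tree base learner, and the fallback to the last stage are all illustrative assumptions, written against scikit-learn-style classifiers that expose predict_proba.

# Sketch of a classify-by-first-promising-classifier boosting strategy
# (assumptions only; not the author's code).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

class LinearBoostSketch:
    def __init__(self, n_rounds=10, confidence=0.9):
        self.n_rounds = n_rounds
        self.confidence = confidence   # threshold for a "promising" prediction (assumed)
        self.stages_ = []

    def fit(self, X, y):
        X_rem, y_rem = np.asarray(X), np.asarray(y)
        for _ in range(self.n_rounds):
            if len(y_rem) == 0:
                break
            clf = DecisionTreeClassifier(max_depth=3).fit(X_rem, y_rem)
            self.stages_.append(clf)
            # Keep only the samples that do not yet have a promising prediction.
            unsure = clf.predict_proba(X_rem).max(axis=1) < self.confidence
            X_rem, y_rem = X_rem[unsure], y_rem[unsure]
        return self

    def predict(self, X):
        X = np.asarray(X)
        preds = np.empty(len(X), dtype=object)
        undecided = np.ones(len(X), dtype=bool)
        for clf in self.stages_:
            if not undecided.any():
                break
            proba = clf.predict_proba(X[undecided])
            labels = clf.classes_[proba.argmax(axis=1)]
            promising = proba.max(axis=1) >= self.confidence
            idx = np.flatnonzero(undecided)
            # Settle samples by the first classifier that is confident about them.
            preds[idx[promising]] = labels[promising]
            undecided[idx[promising]] = False
        if undecided.any():
            # Fallback (assumed): use the last stage for samples never confidently classified.
            preds[undecided] = self.stages_[-1].predict(X[undecided])
        return preds

For example, LinearBoostSketch(confidence=0.9).fit(X_train, y_train).predict(X_test) would mimic the rule of letting the first sufficiently confident base classifier decide each sample, while later rounds concentrate on the samples that remain unsettled.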
Document
Extent
42 pages.
Identifier
etd22687
Copyright statement
Copyright is held by the author(s).
Permissions
This thesis may be printed or downloaded for non-commercial research and scholarly purposes.
Supervisor or Senior Supervisor
Thesis advisor: Wang, Ke
Language
English
Member of collection
Download file
etd22687.pdf (556.91 KB)
