Experiments on phrasal chunking in NLP using exponentiated gradient for structured prediction

Date created: 
Natural Language Processing
Computational Linguistics
Machine Learning
Sequence Learning for NLP

Exponentiated Gradient (EG) updates were originally introduced by Kivinen and Warmuth (1997) in the context of online learning algorithms. Collins et al. (2008) showed that EG updates yield fast batch and online algorithms for learning a max-margin classifier. They showed that EG can converge quickly due to its multiplicative updates, and that the updates can be factored into tractable components for structured prediction tasks where the number of output labels is exponential in the size of the input. In this project, we implement EG for phrasal chunking (finding noun phrases and other phrases in text), a structured prediction task in Natural Language Processing, and we compare the performance of EG with other discriminative learning algorithms that achieve state-of-the-art results on this task.
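The core of the EG method mentioned above is a multiplicative weight update followed by normalization, which keeps the weights on the probability simplex. A minimal sketch of that update is shown below; the quadratic loss and target distribution are illustrative assumptions for the demo, not part of the project's chunking setup.

```python
import numpy as np

def eg_update(w, grad, eta=0.5):
    """One Exponentiated Gradient step (Kivinen & Warmuth, 1997):
    a multiplicative update followed by renormalization, so the
    weight vector stays on the probability simplex."""
    w_new = w * np.exp(-eta * grad)
    return w_new / w_new.sum()

# Illustrative use: minimize 0.5 * ||w - target||^2 over the simplex.
# The target distribution is an arbitrary example chosen for this sketch.
target = np.array([0.7, 0.2, 0.1])
w = np.ones(3) / 3.0                  # start at the uniform distribution
for _ in range(100):
    grad = w - target                 # gradient of the quadratic loss
    w = eg_update(w, grad, eta=0.5)
```

Because the update multiplies each weight by an exponential factor, mass shifts rapidly toward low-gradient coordinates, which is the source of the fast convergence noted in the abstract.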

Document type: 
Graduating extended essay / Research project
Copyright remains with the author. The author granted permission for the file to be printed and for the text to be copied and pasted.
Senior supervisor: 
Dr. Anoop Sarkar
Applied Science: School of Computing Science
Thesis type: 
M.Sc. Project (Computing Science)