Authors: Abrol, Farhan
Advisors: Blei, David
Department: Computer Science
Class Year: 2014
Abstract: Stochastic Variational Inference [7] has proven to be a fast and reliable framework for inferring posterior distributions over large corpora. One of its many applications has been to topic modeling with the Latent Dirichlet Allocation model. However, it is prone to getting stuck in local optima. Deterministic annealing has traditionally been applied to Expectation-Maximization algorithms to converge to better local optima by transforming the objective function with a temperature parameter. In this paper, I apply the idea of deterministic annealing to Stochastic Variational Inference to help it converge to better local optima. I motivate the use of annealing through a statistical physics analogy and derive a general annealed framework for stochastic variational inference. I then explore this algorithm in relation to the Latent Dirichlet Allocation model. The results show that across several large datasets, annealing reaches better optima more quickly. The annealing procedure has free parameters, and their impact on the convergence of the algorithm is also studied.
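The core idea described in the abstract, tempering the variational objective with a temperature that is gradually cooled, can be sketched as follows. This is a minimal illustration, not the thesis's actual algorithm: the exponential cooling schedule and the choice to scale the entropy term of the ELBO are assumptions, and all function names are hypothetical.

```python
def temperature(step, T0=10.0, decay=0.9, T_final=1.0):
    # Hypothetical exponential cooling schedule: start hot (T0) and decay
    # toward T_final = 1, where the standard variational objective is
    # recovered. The thesis's actual schedule may differ.
    return max(T_final, T0 * decay**step)


def annealed_elbo(expected_log_joint, entropy, T):
    # One common form of deterministic annealing scales the entropy term
    # of the ELBO by the temperature T. A high T flattens the objective,
    # helping optimization escape poor local optima; at T = 1 this is the
    # ordinary ELBO used by stochastic variational inference.
    return expected_log_joint + T * entropy
```

At `T = 1` the annealed objective coincides with the standard ELBO, so a run that cools to 1 ends by optimizing the usual variational bound, just from a better starting region.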
Extent: 42 pages
Type of Material: Princeton University Senior Theses
Language: en_US
Appears in Collections: Computer Science, 1988-2017

Files in This Item:
File: Abrol_Farhan_Thesis_1in (3).pdf (2.43 MB, Adobe PDF)

Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.