Title: A Deterministic Annealing Approach to Stochastic Variational Inference
Abstract: Stochastic Variational Inference (SVI) has proven to be a fast and reliable framework for inferring posterior distributions over large corpora. One of its many applications has been topic modeling with the Latent Dirichlet Allocation model. However, it is prone to getting stuck in local optima. Deterministic annealing has traditionally been applied to Expectation-Maximization algorithms to reach better local optima by transforming the objective function with a temperature parameter. In this paper, I apply the idea of deterministic annealing to Stochastic Variational Inference to help it converge to better local optima. I motivate the use of annealing through a statistical physics analogy and derive a general annealed framework for stochastic variational inference. I then explore this algorithm in relation to the Latent Dirichlet Allocation model. The results show that across various large datasets, annealing reaches better optima more quickly. The annealing procedure has free parameters, and their impact on the convergence of the algorithm is studied.
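The core mechanism the abstract describes can be illustrated with a small sketch: deterministic annealing tempers the variational distribution with a temperature T, flattening it at high T so the optimizer is less likely to commit early to a poor local optimum, then cools toward T = 1, recovering the ordinary objective. This is a minimal illustration under my own assumptions, not the thesis's actual derivation; the function names and the geometric cooling schedule are hypothetical.

```python
import numpy as np

def annealed_responsibilities(log_p, T):
    """Tempered softmax: raising the variational posterior to the power 1/T
    flattens it at high T (smoothing local optima) and recovers the ordinary
    softmax at T = 1. Computed stably by shifting before exponentiating."""
    z = log_p / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def temperature_schedule(T0=10.0, n_steps=100):
    """Geometric cooling from an initial temperature T0 down to 1,
    at which point the annealed objective equals the standard one."""
    return T0 ** (1.0 - np.arange(n_steps) / (n_steps - 1))

# At high temperature the distribution is close to uniform;
# at T = 1 it is the ordinary softmax of the log-probabilities.
log_p = np.log(np.array([0.7, 0.2, 0.1]))
hot = annealed_responsibilities(log_p, T=10.0)
cold = annealed_responsibilities(log_p, T=1.0)
```

In an annealed SVI loop, one would run a stochastic update per step while sweeping T along `temperature_schedule`, so that early noisy updates act on the smoothed objective and later ones refine the true posterior.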
Type of Material: Princeton University Senior Theses
Appears in Collections: Computer Science, 1988-2017
Files in This Item: Abrol_Farhan_Thesis_1in (3).pdf (2.43 MB, Adobe PDF)
Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.