Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp01j3860708m
Full metadata record
DC Field: Value (Language)
dc.contributor.advisor: Ramadge, Peter J (en_US)
dc.contributor.author: Eis, David Jeremy (en_US)
dc.contributor.other: Electrical Engineering Department (en_US)
dc.date.accessioned: 2014-06-05T19:45:27Z
dc.date.available: 2014-06-05T19:45:27Z
dc.date.issued: 2014 (en_US)
dc.identifier.uri: http://arks.princeton.edu/ark:/88435/dsp01j3860708m
dc.description.abstract: Many datasets can be viewed as signals over a graph structure, and the field of graphical models has accordingly been a fruitful area of research. This thesis examines a model for time series data called the Sticky Hierarchical Dirichlet Process Hidden Markov Model (SHDPHMM), proposed by Emily Fox. It is useful for clustering time series data when the number of hidden states is unknown, as is often the case in practice. The contribution of this thesis is the derivation of the deterministic variational inference update equations for performing inference on the SHDPHMM. This is an improvement over the Markov Chain Monte Carlo (MCMC) algorithm proposed by Fox, as it allows for direct assessment of convergence and can run faster. For a noisy signal over the nodes of a graph, the fused lasso can be used as a denoising method. The fused lasso is a particular case of the generalized lasso, a regularized regression problem that encourages sparsity in a linear transformation of the regression coefficients. This thesis completes the picture of equivalences among the generalized lasso, its dual problem, the subspace constrained lasso (SCL), and the SCL's dual problem. The sparsity is expressed directly in the SCL, and the structure of this problem allows for codeword screening. Many screening methods, of both the single-shot and sequential varieties, are derived in this thesis for the SCL; these derivations depend on the structure of the dual problem. Screening is not as effective here as it is for the lasso, where it can reduce very large dictionaries to a fraction of their size, but it remains an important tool that speeds up solving the SCL and hence the generalized lasso. (en_US)
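The fused-lasso denoising and dual-problem structure mentioned in the abstract can be illustrated with a minimal sketch for the simplest graph, a 1-D path: the signal is denoised by solving a box-constrained quadratic dual with projected gradient and recovering the primal solution. The function name, step size, and iteration count below are our own illustrative choices, not the thesis's notation or algorithm.

```python
# Sketch: fused-lasso denoising of a 1-D signal (a path graph), solved by
# projected gradient on the dual problem. Illustrative only; names and
# parameters are assumptions, not taken from the thesis.

def fused_lasso_denoise(y, lam, n_iter=5000, step=0.25):
    """Minimize 0.5 * sum((y[j] - b[j])**2) + lam * sum(|b[j+1] - b[j]|).

    The dual is a box-constrained quadratic:
        min_u 0.5 * ||y - D^T u||^2   subject to |u[i]| <= lam,
    where D is the first-difference operator of the path graph.
    The primal solution is recovered as b = y - D^T u.
    """
    n = len(y)
    u = [0.0] * (n - 1)  # one dual variable per edge of the path graph

    def primal(u):
        # b = y - D^T u, with (D^T u)[j] = u[j-1] - u[j] (boundary terms zero)
        return [
            y[j]
            - ((u[j - 1] if j > 0 else 0.0) - (u[j] if j < n - 1 else 0.0))
            for j in range(n)
        ]

    for _ in range(n_iter):
        b = primal(u)
        # gradient step on the dual, then projection onto the box [-lam, lam];
        # step = 0.25 <= 1/lambda_max(D D^T) (which is < 4 for a path graph)
        u = [
            max(-lam, min(lam, u[i] + step * (b[i + 1] - b[i])))
            for i in range(n - 1)
        ]
    return primal(u)
```

With lam = 0 the output equals y; as lam grows, neighboring values fuse toward a piecewise-constant signal, and for very large lam the output collapses to the mean of y, since the total-variation penalty dominates.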
dc.language.iso: en (en_US)
dc.publisher: Princeton, NJ : Princeton University (en_US)
dc.relation.isformatof: The Mudd Manuscript Library retains one bound copy of each dissertation. Search for these copies in the <a href=http://catalog.princeton.edu>library's main catalog</a> (en_US)
dc.subject: generalized lasso (en_US)
dc.subject: machine learning (en_US)
dc.subject: screening (en_US)
dc.subject: sticky hdp-hmm (en_US)
dc.subject: variational inference (en_US)
dc.subject.classification: Computer science (en_US)
dc.subject.classification: Information science (en_US)
dc.subject.classification: Electrical engineering (en_US)
dc.title: Machine Learning on Graphs (en_US)
dc.type: Academic dissertations (Ph.D.) (en_US)
pu.projectgrantnumber: 690-2143 (en_US)
Appears in Collections: Electrical Engineering

Files in This Item:
Eis_princeton_0181D_10995.pdf (744.15 kB, Adobe PDF)


Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.