Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp01fb494c071
Full metadata record
DC FieldValueLanguage
dc.contributor.advisorVanderbei, Robert-
dc.contributor.authorPang, Haotian-
dc.contributor.otherElectrical Engineering Department-
dc.date.accessioned2017-09-22T14:44:36Z-
dc.date.available2017-09-22T14:44:36Z-
dc.date.issued2017-
dc.identifier.urihttp://arks.princeton.edu/ark:/88435/dsp01fb494c071-
dc.description.abstractThe past decades have witnessed significant progress in machine learning, and solving these problems requires advances in optimization techniques. High-dimensional sparse learning poses a great computational challenge to large-scale data analysis. In this dissertation, the parametric simplex method is applied to solve a broad class of sparse learning approaches, which can be formulated as linear programs parametrized by a regularization parameter. Existing methods for solving these types of problems have serious drawbacks, as tuning the parameter to obtain the desired solution is very inefficient. A customized parametric simplex method is introduced; it uses the unknown weighting factor as the parameter and provides a powerful and efficient way to address these shortcomings. Although the simplex method has exponential complexity in the worst case, the parametric simplex method is shown to be an appropriate method for these cases when the expected solution is sparse. An R package named fastclime, which efficiently solves a variety of machine learning problems via the customized parametric simplex method, is developed. A convex optimization method named the Inexact Peaceman-Rachford Splitting Method (IPRSM) is studied. Like the Alternating Direction Method of Multipliers (ADMM), the strictly contractive Peaceman-Rachford Splitting Method (PRSM) is used to solve a convex minimization problem with linear constraints and a separable objective function. In many applications, it is quite expensive to obtain exact solutions to the subproblems; the inexact methods are intended to solve the iterative subproblems when exact solutions do not exist or are hard to obtain. Finally, the new graph Perceptron algorithm, a graph estimation method for online binary classification problems, is proposed.
The new graph Perceptron algorithm is a kernel-based Perceptron derived from online classification and extended to online graph estimation with a new kernel trick.-
dc.language.isoen-
dc.publisherPrinceton, NJ : Princeton University-
dc.relation.isformatofThe Mudd Manuscript Library retains one bound copy of each dissertation. Search for these copies in the library's main catalog: <a href="http://catalog.princeton.edu">catalog.princeton.edu</a>-
dc.subjectConvex Optimization-
dc.subjectLinear Programming-
dc.subjectMachine Learning-
dc.subjectOnline Learning-
dc.subjectParametric Simplex Method-
dc.subject.classificationElectrical engineering-
dc.subject.classificationOperations research-
dc.titleThe Application of Optimization Techniques in Machine Learning-
dc.typeAcademic dissertations (Ph.D.)-
pu.projectgrantnumber690-2143-
Appears in Collections:Electrical Engineering

Files in This Item:
There are no files associated with this item.


Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.