Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp018s45qc073
Full metadata record
DC Field: Value (Language)
dc.contributor: Singer, Amit
dc.contributor.advisor: Arora, Sanjeev
dc.contributor.author: Tu, Brian Chang
dc.date.accessioned: 2015-06-15T14:33:23Z
dc.date.available: 2015-06-15T14:33:23Z
dc.date.created: 2015-05-04
dc.date.issued: 2015-06-15
dc.identifier.uri: http://arks.princeton.edu/ark:/88435/dsp018s45qc073
dc.description.abstract: In the field of machine learning, kernel methods have become a popular tool for enabling learning algorithms to detect very general types of relationships. Kernels do this by implicitly lifting the data into a higher-dimensional feature space and then taking an inner product there. Despite this power, because of the implicit mapping that is performed, kernel methods usually scale poorly with the size of the input [1]. To address this problem we employ a method due to Rahimi and Recht [7], called random Fourier features, which computes random projections of the input data into a feature space that approximates the kernel, thereby making the runtime linear in the size of the input. This method allows us to analyze datasets that are too large for exact kernel methods. We explore the applications of this method on two datasets, MNIST and SVHN, both of which have sizes on the order of 10^5. (en_US)
dc.format.extent: 18 pages (en_US)
dc.language.iso: en_US (en_US)
dc.title: Practical Exploration of Randomized Features For Classification Tasks (en_US)
dc.type: Princeton University Senior Theses
pu.date.classyear: 2015 (en_US)
pu.department: Mathematics (en_US)
pu.pdf.coverpage: SeniorThesisCoverPage
Appears in Collections: Mathematics, 1934-2023
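
As a rough illustration of the random Fourier features method described in the abstract above (Rahimi and Recht [7]), the following is a minimal NumPy sketch for the Gaussian (RBF) kernel. The function name, parameter choices, and the RBF kernel itself are illustrative assumptions, not code or settings taken from the thesis.

import numpy as np

def random_fourier_features(X, n_features=500, gamma=1.0, seed=0):
    """Map X (n_samples x d) to random Fourier features whose inner
    products approximate the RBF kernel exp(-gamma * ||x - y||^2)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies are sampled from the kernel's spectral density, which for
    # this RBF kernel is a Gaussian with std sqrt(2 * gamma) per coordinate.
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    # z(x) = sqrt(2 / D) * cos(W^T x + b), so z(x) . z(y) ~ k(x, y).
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Hypothetical usage: lift the data once, then train any linear classifier
# on the result, giving runtime linear in the number of input points, e.g.
#   Z_train = random_fourier_features(X_train, n_features=1000, gamma=0.05)
#   followed by a linear SVM or logistic regression fit on Z_train.
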

Files in This Item:
File: PUTheses2015-Tu_Brian_Chang.pdf
Size: 544.5 kB
Format: Adobe PDF
Access: Request a copy


Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.