Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp01h415pd61v
Full metadata record
dc.contributor.advisor: Jha, Niraj K.
dc.contributor.author: Zuluaga, David
dc.date.accessioned: 2020-10-02T21:30:25Z
dc.date.available: 2020-10-02T21:30:25Z
dc.date.created: 2020-05-24
dc.date.issued: 2020-10-02
dc.identifier.uri: http://arks.princeton.edu/ark:/88435/dsp01h415pd61v
dc.description.abstract: AdaBoost is a well-known boosting algorithm that excels at building a strong classifier from an ensemble of weak learners by taking the weighted sum of their predictions. However, it has not been used extensively with neural networks because they are already strong classifiers. Existing combinations of the two concepts create a large ensemble of strong classifiers that does not provide a large enough improvement in accuracy to warrant its size. In this paper, we present a novel AdaBoost implementation for neural networks that creates a strong classifier from weak learners while also having a more compact design than existing implementations. This is done by using the SCANN [1] growing/pruning architecture along with feature compression to create weak artificial neural network (ANN) models. We also adopt the concept of random decision forests by subsampling a different set of features for each model in the ensemble. Finally, the AdaBoost algorithm is used to resample the training distribution in each iteration and bias it towards examples that were misclassified. Each model contributes weighted predictions that are summed to determine the final ensemble decision for each input. The results show that this implementation was effective in creating a strong classifier from weak learners for binary classification on the musk and epileptic datasets. However, the ensemble's accuracy is still lower than that of a generic model with a number of connections equal to the sum of all the connections in the ensemble's members. This indicates that the novel implementation is a good launching point, but further work is needed to make it practical. (A minimal illustrative sketch of the boosting loop described here follows the metadata record below.)
dc.format.mimetype: application/pdf
dc.language.iso: en
dc.title: Improved Binary Classification with Compact Neural Networks using AdaBoost, SCANN, & SMOTE Oversampling
dc.type: Princeton University Senior Theses
pu.date.classyear: 2020
pu.department: Electrical Engineering
pu.pdf.coverpage: SeniorThesisCoverPage
dc.rights.accessRights: Walk-in Access. This thesis can only be viewed on computer terminals at the Mudd Manuscript Library (http://mudd.princeton.edu).
pu.contributor.authorid: 920057468
pu.certificate: Applications of Computing Program
pu.mudd.walkin: yes
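
The abstract describes a boosting loop that combines per-member feature subsampling, resampling of the training distribution toward misclassified examples, and a weighted vote. Below is a minimal, hypothetical Python sketch of that loop under stated assumptions: scikit-learn's MLPClassifier stands in for the thesis's SCANN-grown/pruned ANNs, feature compression and SMOTE oversampling are omitted, and all names (boost_weak_anns, n_rounds, subset_frac) are illustrative rather than the thesis's actual code.

    # Hypothetical sketch: AdaBoost over weak neural-network learners with
    # random feature subsampling and distribution resampling, as outlined
    # in the abstract. MLPClassifier is a stand-in for SCANN-derived ANNs.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    def boost_weak_anns(X, y, n_rounds=10, subset_frac=0.5, seed=0):
        """Binary AdaBoost; X, y are NumPy arrays, y encoded as -1/+1."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        w = np.full(n, 1.0 / n)          # current training distribution
        learners, alphas, subsets = [], [], []
        for _ in range(n_rounds):
            # Random-decision-forest idea: each member gets its own feature subset.
            feats = rng.choice(d, size=max(1, int(subset_frac * d)), replace=False)
            # Resample the training set from the current (boosted) distribution.
            idx = rng.choice(n, size=n, replace=True, p=w)
            clf = MLPClassifier(hidden_layer_sizes=(4,), max_iter=500)
            clf.fit(X[idx][:, feats], y[idx])
            pred = clf.predict(X[:, feats])
            err = np.sum(w[pred != y])   # weighted training error
            if err >= 0.5:               # member no better than chance: stop
                break
            alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
            w *= np.exp(-alpha * y * pred)  # bias toward misclassified examples
            w /= w.sum()
            learners.append(clf)
            alphas.append(alpha)
            subsets.append(feats)
        return learners, alphas, subsets

    def boost_predict(X, learners, alphas, subsets):
        # Weighted sum of member predictions decides the ensemble output.
        score = sum(a * clf.predict(X[:, f])
                    for clf, a, f in zip(learners, alphas, subsets))
        return np.sign(score)

With labels encoded as -1/+1, calling boost_predict(X_test, *boost_weak_anns(X_train, y_train)) returns the ensemble's decision for each test input; the resampling step is where the abstract's AdaBoost-driven bias toward misclassified examples takes effect.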
Appears in Collections: Electrical and Computer Engineering, 1932-2023

Files in This Item:
File: ZULUAGA-DAVID-THESIS.pdf (509.59 kB, Adobe PDF)


Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.