Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp01s7526g34f
Full metadata record
dc.contributor.advisor: Mittal, Prateek
dc.contributor.author: Wagh, Sameer
dc.contributor.other: Electrical Engineering Department
dc.date.accessioned: 2020-07-13T03:32:27Z
dc.date.available: 2020-07-13T03:32:27Z
dc.date.issued: 2020
dc.identifier.uri: http://arks.princeton.edu/ark:/88435/dsp01s7526g34f
dc.description.abstract: Applications of machine learning have become increasingly common in recent years. For instance, navigation systems such as Google Maps use machine learning to better predict traffic patterns, and social media platforms such as Facebook and LinkedIn use it to customize users' news feeds. Central to all these systems is user data. However, the sensitive nature of the collected data has also raised a number of privacy concerns. Privacy-preserving machine learning enables systems that perform such computation over sensitive data while protecting its privacy. In this dissertation, we focus on developing efficient protocols for machine learning as a target analytics application. To incorporate privacy, we use an approach based on multi-party computation, in which a number of non-colluding entities jointly perform computation over the data; privacy stems from the fact that no single party has any information about the data being computed on. At the heart of this dissertation are three frameworks, SecureNN, FALCON, and Ponytail, each of which pushes the frontiers of privacy-preserving machine learning and proposes novel approaches to protocol design. SecureNN and FALCON introduce, for the first time, highly efficient protocols for computing non-linear functions (such as the rectified linear unit, maxpool, and batch normalization) using purely modular arithmetic. Ponytail demonstrates the use of homomorphic encryption to significantly improve over prior art in private matrix multiplication. Each framework provides significant asymptotic as well as concrete efficiency gains over prior work, improving both computation and communication performance by an order of magnitude. These building blocks (matrix multiplication, rectified linear unit, maxpool, batch normalization) are central to machine learning, so improving them substantially advances the state of the art in private machine learning. Furthermore, each of these systems is implemented and benchmarked to reduce the barrier to deployment. Uniquely positioned at the intersection of theory and practice, these frameworks bridge the gap between plaintext and privacy-preserving computation while contributing new directions for research to the community.
dc.language.iso: en
dc.publisher: Princeton, NJ : Princeton University
dc.relation.isformatof: The Mudd Manuscript Library retains one bound copy of each dissertation. Search for these copies in the library's main catalog: http://catalog.princeton.edu
dc.subject: Applied Cryptography
dc.subject: Homomorphic Encryption
dc.subject: Multi-Party Computation
dc.subject: Privacy Enhancing Technologies
dc.subject: Privacy-Preserving Machine Learning
dc.subject.classification: Computer science
dc.subject.classification: Computer engineering
dc.title: New Directions in Efficient Privacy-Preserving Machine Learning
dc.type: Academic dissertations (Ph.D.)
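
To illustrate the multi-party computation approach described in the abstract, the following is a minimal sketch of additive secret sharing, one standard technique by which non-colluding parties can compute on data while no single party learns anything about it. The modulus, party count, and helper names below are illustrative assumptions, not details taken from the dissertation.

# Illustrative sketch only (not code from the dissertation): additive
# secret sharing over a ring. Each party holds a uniformly random-looking
# share; only the sum of all shares (mod 2**64) reveals the secret, so
# no single party has any information about the underlying value.
import secrets

MOD = 2**64  # arithmetic ring; the modulus is an assumption for this sketch

def share(x, n_parties=3):
    """Split x into n additive shares that sum to x mod MOD."""
    shares = [secrets.randbelow(MOD) for _ in range(n_parties - 1)]
    shares.append((x - sum(shares)) % MOD)
    return shares

def reconstruct(shares):
    """Recombine shares to recover the secret."""
    return sum(shares) % MOD

# Linear operations can be performed locally on shares, with no
# communication: each party simply adds its own shares of a and b.
a, b = 42, 100
sa, sb = share(a), share(b)
sc = [(x + y) % MOD for x, y in zip(sa, sb)]
assert reconstruct(sc) == a + b

Non-linear functions such as the rectified linear unit and maxpool do not decompose this way, which is why dedicated protocols like those in SecureNN and FALCON are needed for them.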
Appears in Collections: Electrical Engineering

Files in This Item:
File: Wagh_princeton_0181D_13320.pdf (1.87 MB, Adobe PDF)

