Title: New Directions in Efficient Privacy-Preserving Machine Learning
Authors: Wagh, Sameer
Advisors: Mittal, Prateek
Contributors: Electrical Engineering Department
Keywords: Applied Cryptography
Homomorphic Encryption
Multi-Party Computation
Privacy Enhancing Technologies
Privacy-Preserving Machine Learning
Subjects: Computer science
Computer engineering
Issue Date: 2020
Publisher: Princeton, NJ : Princeton University
Abstract: Applications of machine learning have become increasingly common in recent years. For instance, navigation systems like Google Maps use machine learning to better predict traffic patterns, while Facebook, LinkedIn, and other social media platforms use machine learning to customize users' news feeds. Central to all these systems is user data. However, the sensitive nature of the collected data has also led to a number of privacy concerns. Privacy-preserving machine learning enables systems that can perform such computation over sensitive data while protecting its privacy. In this dissertation, we focus on developing efficient protocols for machine learning as a target analytics application. To incorporate privacy, we use a multi-party computation-based approach. In multi-party computation, a number of non-colluding entities jointly perform computation over the data, and privacy stems from no party having any information about the data being computed on. At the heart of this dissertation are three frameworks -- SecureNN, FALCON, and Ponytail -- each of which pushes the frontiers of privacy-preserving machine learning and proposes novel approaches to protocol design. SecureNN and FALCON introduce, for the first time, highly efficient protocols for computing non-linear functions (such as rectified linear unit, maxpool, and batch normalization) using purely modular arithmetic. Ponytail demonstrates the use of homomorphic encryption to significantly improve over prior art in private matrix multiplication. Each framework provides both asymptotic and concrete efficiency gains over prior work, improving computation and communication performance by an order of magnitude. These building blocks -- matrix multiplication, rectified linear unit, maxpool, batch normalization -- are central to machine learning, and improving them significantly advances the state of the art in private machine learning.
Furthermore, each of these systems is implemented and benchmarked to lower the barrier to deployment. Uniquely positioned at the intersection of theory and practice, these frameworks bridge the gap between plaintext and privacy-preserving computation while contributing new directions for research to the community.
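The multi-party computation approach described in the abstract rests on secret sharing: a value is split into random shares so that no single party learns anything, yet parties can still compute on the shared data. The following is an illustrative sketch only (it is not code from the dissertation, and the function names are hypothetical), showing additive secret sharing over the ring Z_{2^64}, a common choice in frameworks of this kind:

```python
# Illustrative sketch (not from the dissertation): additive secret sharing,
# the kind of building block underlying MPC frameworks such as SecureNN
# and FALCON. A secret x is split into random shares that sum to x modulo
# 2**64; any strict subset of shares reveals nothing about x, yet parties
# can add shared values by adding their local shares, with no interaction.

import secrets

MOD = 2 ** 64  # all arithmetic is in the ring Z_{2^64}

def share(x, n_parties=3):
    """Split x into n_parties additive shares summing to x mod MOD."""
    shares = [secrets.randbelow(MOD) for _ in range(n_parties - 1)]
    shares.append((x - sum(shares)) % MOD)
    return shares

def reconstruct(shares):
    """Recombine all shares to recover the secret."""
    return sum(shares) % MOD

def add_shared(a_shares, b_shares):
    """Each party locally adds its two shares; the result is a sharing
    of a + b, computed without any communication."""
    return [(a + b) % MOD for a, b in zip(a_shares, b_shares)]

# Example: two secrets are shared, added under sharing, then opened.
a, b = 20, 22
c_shares = add_shared(share(a), share(b))
print(reconstruct(c_shares))  # 42
```

Addition is the easy case since it is purely local; the dissertation's contribution lies in efficient protocols for the non-linear operations (ReLU, maxpool, batch normalization), which require interaction between the parties.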
Alternate format: The Mudd Manuscript Library retains one bound copy of each dissertation. Search for these copies in the library's main catalog:
Type of Material: Academic dissertations (Ph.D.)
Language: en
Appears in Collections: Electrical Engineering

Files in This Item:
File: Wagh_princeton_0181D_13320.pdf (1.87 MB, Adobe PDF)

Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.