Payman Mohassel & Peter Rindal ~ eprint/2018/403 ~ CCS’18
Machine learning is widely used to produce models for a range of applications and is increasingly offered as a service by major technology companies. However, the massive data collection these services require raises privacy concerns at both the training and prediction stages.
In this paper, we design and implement a general framework for privacy-preserving machine learning and use it to obtain new solutions for training linear regression, logistic regression, and neural network models. Our protocols operate in a three-server model wherein data owners secret-share their data among three servers, who train and evaluate models on the joint data using three-party computation (3PC).
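As a concrete illustration of the kind of sharing this style of 3PC builds on, here is a minimal C++ sketch of 2-out-of-3 replicated arithmetic secret sharing over Z_{2^64}. This is not the paper's code; the names `share3` and `reconstruct3` are hypothetical, and a real implementation would use correlated cryptographic randomness rather than `std::mt19937_64`.

```cpp
#include <array>
#include <cstdint>
#include <random>

using u64 = std::uint64_t; // ring Z_{2^64}; unsigned wraparound is the modular reduction

// Split x into x0 + x1 + x2 (mod 2^64); party i holds the pair (x_i, x_{i+1}),
// so any two parties can reconstruct but any single party learns nothing.
std::array<std::array<u64, 2>, 3> share3(u64 x, std::mt19937_64& prng) {
    u64 x0 = prng(), x1 = prng();
    u64 x2 = x - x0 - x1; // wraparound gives the reduction mod 2^64
    return {{{x0, x1}, {x1, x2}, {x2, x0}}};
}

// Parties 0 and 1 together hold (x0, x1) and (x1, x2): all three summands.
u64 reconstruct3(const std::array<u64, 2>& p0, const std::array<u64, 2>& p1) {
    return p0[0] + p0[1] + p1[1]; // x0 + x1 + x2  (mod 2^64)
}

int main() {
    std::mt19937_64 prng(42); // illustrative only; not cryptographically secure
    u64 secret = 123456789;
    auto shares = share3(secret, prng);
    return reconstruct3(shares[0], shares[1]) == secret ? 0 : 1;
}
```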
Our main contribution is a new and complete framework (ABY3) for efficiently switching back and forth between arithmetic, binary, and Yao 3PC, which is of independent interest. Many of the conversions are based on new techniques that are designed and optimized for the first time in this paper. We also propose new techniques for fixed-point multiplication of shared decimal values that extend beyond the three-party case, as well as customized protocols for evaluating piecewise polynomial functions. We design variants of each building block that are secure against malicious adversaries who may deviate arbitrarily from the protocol.
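To make the fixed-point idea concrete: decimals are encoded as integers scaled by 2^d, so a product carries 2d fractional bits and must be truncated by d bits afterwards. The C++ sketch below shows only this plaintext semantics (the constant `D` and helper names are illustrative); the paper's contribution is performing the truncation on *shared* values, where shares live in Z_{2^64} and wraparound is intended.

```cpp
#include <cstdint>
#include <cstdio>

constexpr int D = 13; // fractional bits; an illustrative choice

std::int64_t encode(double v) { return static_cast<std::int64_t>(v * (1LL << D)); }
double decode(std::int64_t x) { return static_cast<double>(x) / (1LL << D); }

// Multiply two fixed-point values, then truncate D bits to restore the scale.
// Right shift on a negative signed value is an arithmetic shift (well-defined
// since C++20); overflow is ignored here since real protocols work mod 2^64.
std::int64_t fx_mul(std::int64_t a, std::int64_t b) {
    return (a * b) >> D; // drops the extra 2^D factor picked up by the product
}

int main() {
    std::int64_t a = encode(1.5), b = encode(-2.25);
    std::printf("%f\n", decode(fx_mul(a, b))); // prints approx. -3.375
}
```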
We implement our system in C++. Our protocols are up to four orders of magnitude faster than the best prior work, significantly reducing the gap between privacy-preserving and plaintext training.