Technical Reports
Browsing Technical Reports by Subject "Data Privacy"
Now showing 1 - 1 of 1
- Item: Evaluation of Differentially Private Non-parametric Machine Learning as a Service
  DANDEKAR, Ashish; BASU, Debabrota; BRESSAN, Stephane
  Machine learning algorithms create models from training data for the purposes of estimation, prediction, and classification. While releasing a parametric machine learning model requires releasing only the parameters of the model, releasing a non-parametric machine learning model requires releasing the training dataset along with the parameters. Indeed, estimation, prediction, or classification with non-parametric models computes some form of correlation between new data and the training data. The release of the training dataset creates a risk of breach of privacy. An alternative to releasing the training dataset is to present the non-parametric model as a service. Still, the non-parametric model as a service may leak information about the training dataset. We study how to provide differential privacy guarantees for non-parametric models as a service. This cannot be achieved by perturbation of the output; it requires perturbation of the model functions. We show how to apply this perturbation to the model functions of the histogram, kernel density estimator, kernel SVM, and Gaussian process regression in order to provide (ε, δ)-differential privacy. We evaluate the trade-off between the privacy guarantee and the error incurred for each of these non-parametric machine learning algorithms on benchmarks and real-world datasets. Our contribution is twofold. We show that functional perturbation is not only pragmatic for releasing machine learning models as a service but also yields higher effectiveness than output perturbation mechanisms for specified privacy parameters. We present the practical steps needed to perturb the model functions of the histogram, kernel density estimator, kernel SVM, and Gaussian process regression. We evaluate the trade-off between the privacy guarantee and the error incurred for each of these non-parametric machine learning algorithms on a real-world dataset as well as a selection of benchmarks.
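To give a rough sense of functional perturbation for one of these models, the sketch below adds a scaled Gaussian-process sample path to a kernel density estimate evaluated on a grid, a standard RKHS-based mechanism for releasing functions with (ε, δ)-differential privacy. This is a minimal illustration under stated assumptions, not the authors' implementation: the function name dp_kde_release, the bandwidth default, the unnormalised Gaussian kernel, and the noise calibration constant are all hypothetical choices made for the example.

```python
import numpy as np

def dp_kde_release(train_x, eval_grid, epsilon, delta, bandwidth=0.5, rng=None):
    """Illustrative sketch: functionally perturbed kernel density estimate.

    Releases noisy values of the KDE on eval_grid by adding a scaled
    Gaussian-process sample path whose covariance is the same kernel,
    so the noise lives in the same function space as the estimator.
    """
    rng = np.random.default_rng(rng)
    train_x = np.asarray(train_x, dtype=float)
    eval_grid = np.asarray(eval_grid, dtype=float)
    n = len(train_x)

    def k(a, b):
        # Unnormalised Gaussian kernel, so k(x, x) = 1.
        return np.exp(-0.5 * ((a[:, None] - b[None, :]) / bandwidth) ** 2)

    # Non-private KDE values on the grid (up to a normalising constant).
    f_hat = k(eval_grid, train_x).mean(axis=1)

    # RKHS sensitivity: replacing one training point changes the KDE
    # by at most 2/n in RKHS norm for this kernel.
    sensitivity = 2.0 / n
    # Gaussian-mechanism style noise scale (assumed calibration constant).
    scale = sensitivity * np.sqrt(2.0 * np.log(2.0 / delta)) / epsilon

    # Sample a Gaussian-process path with covariance k on the grid.
    cov = k(eval_grid, eval_grid) + 1e-8 * np.eye(len(eval_grid))
    gp_path = rng.multivariate_normal(np.zeros(len(eval_grid)), cov)

    return f_hat + scale * gp_path
```

For instance, dp_kde_release(np.random.normal(size=500), np.linspace(-4, 4, 200), epsilon=1.0, delta=1e-5) returns noisy density values on the grid; because the noise is a sample path of a process with the same kernel, the released function stays in the model's own function class, which is the point of perturbing the model function rather than its outputs.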