Differentially Private Empirical Risk Minimization with Smooth Non-Convex Loss Functions: A Non-Stationary View
Similar resources
Differentially Private Empirical Risk Minimization
Privacy-preserving machine learning algorithms are crucial for the increasingly common setting in which personal data, such as medical or financial records, are analyzed. We provide general techniques to produce privacy-preserving approximations of classifiers learned via (regularized) empirical risk minimization (ERM). These algorithms are private under the ε-differential privacy definition du...
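The output-perturbation route to private ERM described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact algorithm: the function name is hypothetical, and the L2-sensitivity bound 2/(n·λ) assumes a 1-Lipschitz loss with λ-strongly convex regularization.

```python
import numpy as np

def output_perturbation_erm(X, y, lam=0.1, eps=1.0, steps=500, lr=0.1, seed=0):
    """Hypothetical sketch of output perturbation for DP-ERM: train
    L2-regularized logistic regression non-privately, then release
    w plus Laplace noise scaled to the sensitivity bound 2/(n*lam)
    that holds for 1-Lipschitz losses with strong convexity lam."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(steps):
        margins = np.clip(y * (X @ w), -30, 30)  # labels y in {-1, +1}
        sig = 1.0 / (1.0 + np.exp(margins))      # sigmoid(-y * x.w)
        grad = -(X * (y * sig)[:, None]).mean(axis=0) + lam * w
        w -= lr * grad
    sensitivity = 2.0 / (n * lam)
    return w + rng.laplace(scale=sensitivity / eps, size=d)
```

The noise scale grows as ε shrinks, trading accuracy for stronger privacy.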
Efficient Empirical Risk Minimization with Smooth Loss Functions in Non-interactive Local Differential Privacy
In this paper, we study the Empirical Risk Minimization problem in the non-interactive local model of differential privacy. We first show that if the ERM loss function is (∞, T)-smooth, then we can avoid a dependence of the sample complexity, to achieve error α, on the exponential of the dimensionality p with base 1/α (i.e., α^{-p}), which answers a question in (Smith et al., 2017). Our approach ...
Differentially Private Empirical Risk Minimization with Input Perturbation
We propose input perturbation, a novel framework for differentially private ERM. Existing differentially private ERM methods implicitly assume that the data contributors submit their private data to a database, expecting that the database invokes a differentially private mechanism for publication of the learned model. In input perturbation, each data contributor independently randomizes her/his dat...
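The local randomization step each contributor performs can be sketched as below. The function name, the L1 clipping norm, and the choice of the Laplace mechanism are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def randomize_record(x, eps, clip=1.0, rng=None):
    """Hypothetical sketch of input perturbation: the contributor
    clips her own record to L1-norm `clip` (bounding sensitivity)
    and adds Laplace noise locally, so the database only ever sees
    randomized data."""
    rng = rng if rng is not None else np.random.default_rng()
    norm = np.abs(x).sum()
    if norm > clip:
        x = x * (clip / norm)  # project onto the L1 ball of radius clip
    return x + rng.laplace(scale=clip / eps, size=x.shape)
```

Because the noise is added before submission, privacy holds even against an untrusted curator.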
Smooth minimization of non-smooth functions
In this paper we propose a new approach for constructing efficient schemes for non-smooth convex optimization. It is based on a special smoothing technique, which can be applied to functions with explicit max-structure. Our approach can be considered as an alternative to black-box minimization. From the viewpoint of efficiency estimates, we manage to improve the traditional bounds on the number...
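The smoothing of a function with explicit max-structure can be illustrated on |x| = max_{|u|≤1} u·x: penalizing the inner maximization by a strongly convex term (μ/2)u² yields the Huber function, a smooth surrogate within μ/2 of |x| everywhere. The helper below is an illustrative one-dimensional sketch, not the paper's general scheme.

```python
import numpy as np

def huber_smooth_abs(x, mu=0.5):
    """Smooth mu-approximation of |x| from its max-structure
    |x| = max_{|u|<=1} u*x, penalized by (mu/2)*u**2: the inner
    maximizer is u* = clip(x/mu, -1, 1), which gives the Huber
    function. Satisfies 0 <= |x| - f_mu(x) <= mu/2 for all x."""
    u = np.clip(x / mu, -1.0, 1.0)
    return u * x - 0.5 * mu * u ** 2
```

The resulting gradient is Lipschitz with constant 1/μ, which is what enables the faster first-order rates.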
Minimization of Non-smooth, Non-convex Functionals by Iterative Thresholding
Numerical algorithms for a special class of non-smooth and non-convex minimization problems in infinite dimensional Hilbert spaces are considered. The functionals under consideration are the sum of a smooth and non-smooth functional, both possibly non-convex. We propose a generalization of the gradient projection method and analyze its convergence properties. For separable constraints in the se...
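For a separable L1 penalty, the gradient-step-plus-thresholding idea reduces to classical iterative soft-thresholding (ISTA): a gradient step on the smooth part followed by the thresholding prox of the non-smooth part. This convex L1 instance is an illustrative stand-in for the (possibly non-convex) functionals and thresholding operators the paper actually treats.

```python
import numpy as np

def ista(A, b, lam=0.1, iters=300):
    """Iterative soft-thresholding for 0.5*||Ax - b||^2 + lam*||x||_1:
    each iteration takes a gradient step on the smooth quadratic term,
    then applies the componentwise soft-thresholding prox of the L1 term."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, L = Lipschitz const of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        z = x - step * (A.T @ (A @ x - b))                        # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # prox step
    return x
```

Replacing the soft-threshold with a hard-threshold gives the non-convex variant, at the cost of the convergence analysis the paper develops.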
Journal
Journal title: Proceedings of the AAAI Conference on Artificial Intelligence
Year: 2019
ISSN: 2374-3468, 2159-5399
DOI: 10.1609/aaai.v33i01.33011182