| field | value |
|---|---|
| title | On Convex Optimization with Semi-Sensitive Features |
| section | Original Papers |
| abstract | We study the differentially private (DP) empirical risk minimization (ERM) problem under the \emph{semi-sensitive DP} setting, in which only some of the features are sensitive. This generalizes the Label DP setting, where only the label is sensitive. We give improved upper and lower bounds on the excess risk for DP-ERM. In particular, we show that the error scales only polylogarithmically with the size of the sensitive domain, improving upon previous results that scale polynomially in the size of the sensitive domain (Ghazi et al., NeurIPS 2021). |
| layout | inproceedings |
| series | Proceedings of Machine Learning Research |
| publisher | PMLR |
| issn | 2640-3498 |
| id | ghazi24a |
| month | 0 |
| tex_title | On Convex Optimization with Semi-Sensitive Features |
| firstpage | 1916 |
| lastpage | 1938 |
| page | 1916-1938 |
| order | 1916 |
| cycles | false |
| bibtex_author | Ghazi, Badih and Kamath, Pritish and Kumar, Ravi and Manurangsi, Pasin and Meka, Raghu and Zhang, Chiyuan |
| author | |
| date | 2024-06-30 |
| address | |
| container-title | Proceedings of Thirty Seventh Conference on Learning Theory |
| volume | 247 |
| genre | inproceedings |
| issued | |
| extras | |