| Field | Value |
|---|---|
| title | Universal Rates for Regression: Separations between Cut-Off and Absolute Loss |
| section | Original Papers |
| abstract | In this work we initiate the study of regression in the universal rates framework of Bousquet et al. Unlike the traditional uniform learning setting, we are interested in obtaining learning guarantees that hold for every fixed data-generating distribution, but not uniformly across all of them. We focus on the realizable setting and consider two well-studied loss functions: the cut-off loss at scale $\gamma > 0$ and the absolute loss. |
| layout | inproceedings |
| series | Proceedings of Machine Learning Research |
| publisher | PMLR |
| issn | 2640-3498 |
| id | attias24a |
| month | 0 |
| tex_title | Universal Rates for Regression: Separations between Cut-Off and Absolute Loss |
| firstpage | 359 |
| lastpage | 405 |
| page | 359-405 |
| order | 359 |
| cycles | false |
| bibtex_author | Attias, Idan and Hanneke, Steve and Kalavasis, Alkis and Karbasi, Amin and Velegkas, Grigoris |
| author | Idan Attias, Steve Hanneke, Alkis Kalavasis, Amin Karbasi, Grigoris Velegkas |
| date | 2024-06-30 |
| address | |
| container-title | Proceedings of Thirty Seventh Conference on Learning Theory |
| volume | 247 |
| genre | inproceedings |
| issued | |
| extras | |