Add Flower Baseline: FedHT #3987
Labels
- good first issue (Good for newcomers)
- part: baselines (Add or update baseline)
- state: under review (Currently reviewing issue/PR)
- type: feature request (This issue or comment suggests an additional feature)
Paper
Tong, Q., Liang, G., Zhu, T. and Bi, J., 2020. Federated nonconvex sparse learning. arXiv preprint arXiv:2101.00052.
Link
https://arxiv.org/abs/2101.00052
Maybe give motivations about why the paper should be implemented as a baseline.
My coauthors and I used Fed-HT (and Fed-IterHT) in recent work to train sparse federated models. These two aggregation strategies were effective at producing highly predictive models while enforcing constrained sparsity (a fixed sparsity threshold) rather than sparsity induced only through regularization. We'd like to contribute this baseline so the method is easier to reuse through Flower in the future. The goal is to implement Fed-HT and Fed-IterHT as custom Flower aggregation strategies and to recreate the methodology of the paper, including experiments on both simulated and benchmark datasets. A rough sketch of what such a strategy could look like is included below.
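For reference, here is a minimal sketch of how the hard-thresholding aggregation could be expressed as a custom Flower strategy: it subclasses FedAvg and applies magnitude-based thresholding to the aggregated model on the server. The class name `FedHT`, the helper `hard_threshold`, and the parameter `num_keep` are illustrative assumptions, not part of the paper or of Flower's API, and the actual baseline would follow the paper's exact update rules.

```python
from typing import Dict, List, Optional, Tuple, Union

import numpy as np
import flwr as fl
from flwr.common import (
    FitRes,
    Parameters,
    Scalar,
    ndarrays_to_parameters,
    parameters_to_ndarrays,
)
from flwr.server.client_proxy import ClientProxy


def hard_threshold(arrays: List[np.ndarray], num_keep: int) -> List[np.ndarray]:
    """Zero out all but (approximately) the num_keep largest-magnitude entries."""
    flat = np.concatenate([layer.ravel() for layer in arrays])
    if num_keep >= flat.size:
        return arrays
    # Magnitude of the num_keep-th largest entry serves as the cut-off.
    cutoff = np.sort(np.abs(flat))[-num_keep]
    return [np.where(np.abs(layer) >= cutoff, layer, 0.0) for layer in arrays]


class FedHT(fl.server.strategy.FedAvg):
    """FedAvg-style aggregation followed by a server-side hard-thresholding step."""

    def __init__(self, num_keep: int, **kwargs):
        super().__init__(**kwargs)
        # Number of non-zero entries to retain in the global model (illustrative name).
        self.num_keep = num_keep

    def aggregate_fit(
        self,
        server_round: int,
        results: List[Tuple[ClientProxy, FitRes]],
        failures: List[Union[Tuple[ClientProxy, FitRes], BaseException]],
    ) -> Tuple[Optional[Parameters], Dict[str, Scalar]]:
        # Reuse FedAvg's weighted averaging of client updates.
        aggregated, metrics = super().aggregate_fit(server_round, results, failures)
        if aggregated is None:
            return aggregated, metrics
        # Project the averaged model onto the sparsity constraint.
        ndarrays = parameters_to_ndarrays(aggregated)
        ndarrays = hard_threshold(ndarrays, self.num_keep)
        return ndarrays_to_parameters(ndarrays), metrics
```

An instance of this strategy could then be passed to the Flower server or simulation entry point like any other strategy. The Fed-IterHT variant would additionally apply thresholding within the clients' local update steps, per the paper.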
Is there something else you want to add?
No
Implementation
To implement this baseline, it is recommended to do the following items in this order:

For first time contributors
- Read the first contribution doc

Prepare - understand the scope

Verify your implementation
- Follow the steps in the EXTENDED_README.md that was created in your baseline directory
- Ensure the README.md is ready to be run by someone that is not familiar with your code. Are all step-by-step instructions clear?
- Run the steps in the README.md and verify everything runs.