Federated Learning
Federated Learning studies how models can be trained without a centralized dataset, through the cooperation of multiple clients that each possess their own private dataset.
To this end, each client usually trains its model for a few epochs on its own dataset before synchronizing with a central server. The central server averages the weights of the client models and redistributes the synchronized model to the clients.
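A minimal sketch of this aggregation step, in the FedAvg style of a weighted average of client weights (PyTorch is my assumption here; `federated_average`, `client_models`, and `client_sizes` are placeholder names, not code from the projects below):

```python
from typing import Dict, List

import torch
import torch.nn as nn


def federated_average(client_models: List[nn.Module],
                      client_sizes: List[int]) -> Dict[str, torch.Tensor]:
    """Average client weights, weighted by each client's local dataset size."""
    total = sum(client_sizes)
    reference = client_models[0].state_dict()
    averaged = {}
    for key in reference:
        # Weighted sum of the same parameter across all clients.
        averaged[key] = sum(
            model.state_dict()[key].float() * (size / total)
            for model, size in zip(client_models, client_sizes)
        )
    return averaged


# The server then redistributes the averaged weights to every client:
# for model in client_models:
#     model.load_state_dict(averaged)
```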
I am interested in ways to improve this aggregation process: the updates from different clients can conflict with each other and degrade the performance of the aggregated model.
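To make the notion of conflicting updates concrete, one illustrative measure (not a method from the projects below) is the cosine similarity between two clients' flattened weight deltas: a negative value means the updates point in opposing directions, so plain averaging can cancel useful progress. All names here are hypothetical.

```python
import torch
import torch.nn.functional as F


def update_conflict(global_state: dict, state_a: dict, state_b: dict) -> float:
    """Cosine similarity between two clients' updates relative to the global model."""
    delta_a = torch.cat([(state_a[k].float() - global_state[k].float()).flatten()
                         for k in global_state])
    delta_b = torch.cat([(state_b[k].float() - global_state[k].float()).flatten()
                         for k in global_state])
    return F.cosine_similarity(delta_a, delta_b, dim=0).item()
```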
Work on the matter:
- Advising Alice Parodi on a student research project in collaboration with Frédéric Precioso and Diane Lingrand.
- Advised Kenza Roche during her end-of-Master's internship, in collaboration with Frédéric Precioso and Diane Lingrand.