




Federated Learning from Small Datasets


Fischer, Jonas
Databases and Information Systems, MPI for Informatics, Max Planck Society


Kamp, M., Fischer, J., & Vreeken, J. (2021). Federated Learning from Small Datasets. Retrieved from https://arxiv.org/abs/2110.03469.

Cite as: https://hdl.handle.net/21.11116/0000-0009-653B-4
Federated learning allows multiple parties to collaboratively train a joint
model without sharing local data. This enables applications of machine learning
in settings of inherently distributed, undisclosable data such as in the
medical domain. In practice, joint training is usually achieved by aggregating
local models, for which local training objectives must be similar, in
expectation, to the joint (global) objective. Often, however, local datasets
are so small that local objectives differ greatly from the global objective,
causing federated learning to fail. We propose a novel approach that intertwines
model aggregations with permutations of local models. The permutations expose
each local model to a daisy chain of local datasets, resulting in more
efficient training in data-sparse domains. This enables training on extremely small local
datasets, such as patient data across hospitals, while retaining the training
efficiency and privacy benefits of federated learning.
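The core idea of the abstract can be illustrated with a toy simulation: clients alternate between local training steps, rounds in which models are permuted among clients (the daisy chain), and occasional averaging rounds. The sketch below is a hypothetical illustration, not the authors' exact algorithm; the one-parameter least-squares model, the schedule, and names such as `daisy_chain_round` and `aggregate_every` are assumptions for demonstration only.

```python
import random

def average(models):
    # FedAvg-style aggregation: coordinate-wise mean of all local models.
    n = len(models)
    return [sum(m[i] for m in models) / n for i in range(len(models[0]))]

def local_step(model, data, lr=0.1):
    # One gradient step of least-squares regression y ~ w * x
    # on a client's tiny local dataset.
    w = model[0]
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return [w - lr * grad]

def daisy_chain_round(models, datasets, aggregate_every, round_idx):
    # Each model trains on the dataset of the client it currently sits at.
    models = [local_step(m, d) for m, d in zip(models, datasets)]
    if (round_idx + 1) % aggregate_every == 0:
        # Aggregation round: average and redistribute the joint model.
        avg = average(models)
        return [list(avg) for _ in models]
    # Daisy-chaining round: permute models among clients, so over time
    # each model is exposed to a chain of small local datasets.
    perm = list(range(len(models)))
    random.shuffle(perm)
    return [models[i] for i in perm]

random.seed(0)
w_true = 2.0
# Five clients, each holding only two samples (the data-sparse regime).
datasets = [
    [(x, w_true * x) for x in (random.uniform(-1, 1), random.uniform(-1, 1))]
    for _ in range(5)
]
models = [[0.0] for _ in range(5)]
for r in range(200):
    models = daisy_chain_round(models, datasets, aggregate_every=10, round_idx=r)
final = average(models)[0]
print(f"estimated w = {final:.3f}")
```

In this toy setting, no single client's two samples suffice to recover the true parameter reliably, but the permutations let every model train on the union of datasets over time while raw data never leaves its client.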