Paper

Meta-Aggregating Networks for Class-Incremental Learning

MPS-Authors

Liu,  Yaoyao
Computer Vision and Machine Learning, MPI for Informatics, Max Planck Society;

Schiele,  Bernt
Computer Vision and Machine Learning, MPI for Informatics, Max Planck Society;

Citation

Liu, Y., Schiele, B., & Sun, Q. (2020). Meta-Aggregating Networks for Class-Incremental Learning. Retrieved from https://arxiv.org/abs/2010.05063.


Cite as: https://hdl.handle.net/21.11116/0000-0007-80F3-5
Abstract
Class-Incremental Learning (CIL) aims to learn a classification model as the number of classes increases phase by phase. The inherent problem in CIL is the stability-plasticity dilemma between learning old and new classes, i.e., high-plasticity models easily forget old classes, while high-stability models struggle to learn new ones. We alleviate this issue by proposing a novel network architecture called Meta-Aggregating Networks (MANets), in which we explicitly build two residual blocks at each residual level (taking ResNet as the baseline architecture): a stable block and a plastic block. We aggregate the output feature maps of these two blocks and feed the result to the next-level blocks. We meta-learn the aggregating weights in order to dynamically optimize and balance the two types of blocks, i.e., the trade-off between stability and plasticity. Extensive experiments on three CIL benchmarks, CIFAR-100, ImageNet-Subset, and ImageNet, show that many existing CIL methods can be straightforwardly incorporated into the MANets architecture to boost their performance.
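
The aggregation idea in the abstract (a stable and a plastic block per residual level, mixed by meta-learned aggregating weights) can be illustrated with a rough sketch. The following is a minimal PyTorch-style example under assumed details: the class name AggregatedResidualLevel, the two-convolution block layout, the frozen stable block, and the scalar aggregating weights are illustrative choices, not the authors' released implementation.

import torch
import torch.nn as nn

class AggregatedResidualLevel(nn.Module):
    """One residual level with a stable and a plastic block whose output
    feature maps are mixed by learnable aggregating weights (illustrative
    sketch; block internals are placeholders, not the paper's exact layers)."""

    def __init__(self, channels):
        super().__init__()
        # Plastic block: parameters updated normally on new-class data.
        self.plastic = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
        )
        # Stable block: kept (near-)frozen here to preserve old-class knowledge.
        self.stable = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
        )
        for p in self.stable.parameters():
            p.requires_grad = False
        # Per-level aggregating weights, to be optimized in an outer (meta) loop.
        self.alpha_plastic = nn.Parameter(torch.tensor(0.5))
        self.alpha_stable = nn.Parameter(torch.tensor(0.5))

    def forward(self, x):
        # Aggregate the two blocks' feature maps, then apply the residual add.
        out = (self.alpha_plastic * self.plastic(x)
               + self.alpha_stable * self.stable(x))
        return torch.relu(out + x)

In a full model, one such level would stand in for each residual stage of a ResNet; the plastic blocks would be trained on new-class data in the inner loop, while the aggregating weights (alpha_plastic, alpha_stable) would be updated in a separate meta-optimization step to balance stability and plasticity.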