Meta-Aggregating Networks for Class-Incremental Learning

Liu, Y., Schiele, B., & Sun, Q. (2020). Meta-Aggregating Networks for Class-Incremental Learning. Retrieved from https://arxiv.org/abs/2010.05063.

Files

arXiv:2010.05063.pdf (Preprint), 2MB
 
File Permalink:
-
Name:
arXiv:2010.05063.pdf
Description:
File downloaded from arXiv at 2020-12-03 08:24. Code: https://github.com/yaoyao-liu/class-incremental-learning
OA-Status:
Visibility:
Private
MIME-Type / Checksum:
application/pdf
Technical Metadata:
Copyright Date:
-
Copyright Info:
-

Creators

Creators:
Liu, Yaoyao¹, Author
Schiele, Bernt¹, Author
Sun, Qianru², Author
Affiliations:
¹ Computer Vision and Machine Learning, MPI for Informatics, Max Planck Society, ou_1116547
² External Organizations, ou_persistent22

Content

Free keywords: Computer Science, Computer Vision and Pattern Recognition, cs.CV; Statistics, Machine Learning, stat.ML
Abstract: Class-Incremental Learning (CIL) aims to learn a classification model as the number of classes increases phase by phase. The inherent problem in CIL is the stability-plasticity dilemma between the learning of old and new classes: high-plasticity models easily forget old classes, while high-stability models struggle to learn new ones. We alleviate this issue by proposing a novel network architecture called Meta-Aggregating Networks (MANets), in which we explicitly build two residual blocks at each residual level (taking ResNet as the baseline architecture): a stable block and a plastic block. We aggregate the output feature maps of these two blocks and feed the result to the next-level blocks. We meta-learn the aggregating weights in order to dynamically optimize the balance between the two types of blocks, i.e., between stability and plasticity. Extensive experiments on three CIL benchmarks (CIFAR-100, ImageNet-Subset, and ImageNet) show that many existing CIL methods can be straightforwardly incorporated into the MANets architecture to boost their performance.
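
The aggregation scheme the abstract describes (a stable and a plastic residual block per level, whose output feature maps are mixed by learned weights) can be illustrated with a minimal PyTorch sketch. This is an illustrative reading of the abstract, not the authors' implementation (their code is linked above): the class name AggregatedResidualLevel, the per-channel weight shape, and the use of plain nn.Parameter in place of the meta-learned aggregating weights are all assumptions.

```python
import torch
import torch.nn as nn

class AggregatedResidualLevel(nn.Module):
    """One residual level with a stable and a plastic block whose output
    feature maps are mixed by learnable aggregating weights (hypothetical
    sketch; in MANets these weights are meta-learned)."""

    def __init__(self, block_ctor, channels):
        super().__init__()
        self.stable = block_ctor()   # intended to preserve old-class knowledge
        self.plastic = block_ctor()  # intended to adapt to new classes
        # One aggregating weight per channel for each branch (assumed shape).
        self.alpha_stable = nn.Parameter(torch.full((1, channels, 1, 1), 0.5))
        self.alpha_plastic = nn.Parameter(torch.full((1, channels, 1, 1), 0.5))

    def forward(self, x):
        # Aggregate the two blocks' feature maps; the sum is what the
        # next-level blocks would consume.
        return self.alpha_stable * self.stable(x) + self.alpha_plastic * self.plastic(x)

# Usage with a toy convolutional block (assumed architecture):
block = lambda: nn.Sequential(
    nn.Conv2d(64, 64, kernel_size=3, padding=1),
    nn.BatchNorm2d(64),
    nn.ReLU(),
)
level = AggregatedResidualLevel(block, channels=64)
out = level(torch.randn(2, 64, 32, 32))  # -> shape (2, 64, 32, 32)
```

In the paper's framing, the stable block would be kept close to its old-class weights while the plastic block trains freely, and the aggregating weights would be optimized in an outer (meta) loop rather than by ordinary backpropagation as in this sketch.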

Details

Language(s): eng - English
Dates: 2020-10-10, 2020
Publication Status: Published online
Pages: 14 p.
Publishing info: -
Table of Contents: -
Rev. Type: -
Identifiers: arXiv: 2010.05063
BibTex Citekey: Liu_arXiv2010.05063
URI: https://arxiv.org/abs/2010.05063
 Degree: -
