Continual Recognition with Adaptive Memory Update
Authors
Abstract
Class-incremental continual learning aims to improve the ability of modern classification models to continually recognize new classes without forgetting previously learned ones. Prior art in this field has largely considered using a replay buffer. In this article, we start from the observation that existing replay-based methods would fail when the stored exemplars are not hard enough to obtain a good decision boundary between a previously learned class and a new class. To prevent this situation, we propose a remedy from a new perspective for the first time. In the proposed method, a set of exemplars is preserved as a working memory, which helps distinguish the learned classes. When the working memory is insufficient to distinguish classes, more discriminating samples are swapped in from a long-term memory built up during the early training process, in an adaptive way. Our recognition model with adaptive memory update is capable of overcoming the problem of catastrophic forgetting over various classes coming in sequence, especially classes that are similar but different. Extensive experiments on real-world datasets demonstrate its superiority over state-of-the-art algorithms. Moreover, our method can be used as a general plugin for any existing algorithm to further improve its performance.
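The abstract only sketches the mechanism at a high level. The following is a minimal illustrative sketch of how a working/long-term exemplar memory with adaptive swapping might be organized; the class name `AdaptiveExemplarMemory`, the margin-style hardness score, and the swap rule are assumptions for illustration and are not taken from the paper.

```python
# Illustrative sketch only: a working memory for replay plus a long-term pool
# collected early in training, with harder exemplars swapped in on demand.
# The hardness criterion and all names below are hypothetical.
import numpy as np


class AdaptiveExemplarMemory:
    """Small working memory used for replay, refreshed from a larger
    long-term pool when two classes become hard to separate."""

    def __init__(self, per_class_budget=20):
        self.per_class_budget = per_class_budget
        self.working = {}    # class_id -> np.ndarray of exemplar features
        self.long_term = {}  # class_id -> list of candidate exemplars

    def add_long_term(self, class_id, features):
        """Store candidate exemplars observed while a class is first learned."""
        self.long_term.setdefault(class_id, []).extend(np.asarray(features))

    def init_working(self, class_id):
        """Seed the working memory with the first few long-term candidates."""
        pool = np.asarray(self.long_term.get(class_id, []))
        self.working[class_id] = pool[: self.per_class_budget]

    def refresh(self, class_id, hardness_fn, n_swap=5):
        """Replace the easiest working exemplars with the hardest long-term
        candidates; `hardness_fn` maps exemplars to a difficulty score,
        e.g. how confusable they are with a newly arrived class."""
        pool = np.asarray(self.long_term.get(class_id, []))
        work = np.asarray(self.working.get(class_id, []))
        if len(pool) == 0 or len(work) == 0:
            return
        pool_hard = hardness_fn(pool)          # higher = harder
        work_hard = hardness_fn(work)
        hardest = pool[np.argsort(-pool_hard)[:n_swap]]
        keep = work[np.argsort(-work_hard)[: max(len(work) - n_swap, 0)]]
        self.working[class_id] = np.concatenate([keep, hardest])[: self.per_class_budget]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    mem = AdaptiveExemplarMemory(per_class_budget=10)
    mem.add_long_term(0, rng.normal(size=(100, 8)))
    mem.init_working(0)
    # Treat closeness to a confusing new-class prototype as "hardness".
    proto = rng.normal(size=8)
    mem.refresh(0, hardness_fn=lambda x: -np.linalg.norm(x - proto, axis=1))
    print(mem.working[0].shape)  # (10, 8)
```

In this sketch the working memory keeps a fixed per-class budget, so swapping in harder exemplars never grows the replay cost; only the selection of which exemplars to replay adapts over time.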
Similar resources
Cryptography Against Continual Memory Leakage
Recall from last lecture that we have several ways to model leakage. One model is “only computation leaks” by Micali and Reyzin [11], which assumes a form of secure memory that does not leak as long as no computation is done on the data. Another one is “memory leakage” by Akavia, Goldwasser, and Vaikuntanathan [1], which assumes that everything can leak information. From an orthogonal dimension...
Episodic memory for continual model learning
Both the human brain and artificial learning agents operating in real-world or comparably complex environments are faced with the challenge of online model selection. In principle this challenge can be overcome: hierarchical Bayesian inference provides a principled method for model selection and it converges on the same posterior for both off-line (i.e. batch) and online learning. However, main...
Cryptography Resilient to Continual Memory Leakage
In recent years, there has been a major effort to design cryptographic schemes that remain secure even if part of the secret key is leaked. This is due to a recent proliferation of side channel attacks which, through various physical means, can recover part of the secret key. We explore the possibility of achieving security even with continual leakage, i.e., even if some information is leaked e...
Interactive Proofs under Continual Memory Leakage
We consider the task of constructing interactive proofs for NP which can provide meaningful security for a prover even in the presence of continual memory leakage. We imagine a setting where an adversarial verifier participates in multiple sequential interactive proof executions for a fixed NP statement x. In every execution, the adversarial verifier is additionally allowed to leak a fraction o...
Gradient Episodic Memory for Continual Learning
One major obstacle towards artificial intelligence is the poor ability of models to quickly solve new problems, without forgetting previously acquired knowledge. To better understand this issue, we study the problem of learning over a continuum of data, where the model observes, once and one by one, examples concerning a sequence of tasks. First, we propose a set of metrics to evaluate models l...
Journal
Journal title: ACM Transactions on Multimedia Computing, Communications, and Applications
Year: 2023
ISSN: 1551-6857, 1551-6865
DOI: https://doi.org/10.1145/3573202