Abstract

In contrast to batch learning, where all training data is available at once, continual learning denotes a family of methods that accumulate knowledge and learn continuously from data arriving in sequential order. Analogous to the human learning process, which learns, fuses, and accumulates new knowledge arriving at different time steps, continual learning is considered to have high practical significance and has been studied across a variety of artificial intelligence tasks. In this paper, we present a comprehensive review of recent progress in continual learning for computer vision. The surveyed works are grouped by their representative techniques: regularization, knowledge distillation, memory, generative replay, parameter isolation, and combinations of the above. For each category, we present both its characteristics and its applications in computer vision. We conclude by discussing several subareas where continuous knowledge accumulation is potentially helpful but continual learning has not yet been well studied.
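To make the flavor of these technique families concrete, the sketch below illustrates two of them in PyTorch: a regularization-based penalty in the spirit of Elastic Weight Consolidation (a quadratic term that anchors parameters deemed important for earlier tasks) and a memory-based episodic buffer filled by reservoir sampling. This is a minimal, self-contained illustration under our own assumptions, not the implementation of any specific surveyed method; all function and class names here are hypothetical.

```python
import random
import torch
import torch.nn.functional as F

# --- Regularization family: an EWC-style quadratic penalty (sketch) --------

def fisher_diagonal(model, loader, device="cpu"):
    """Estimate the diagonal Fisher information on a finished task;
    large entries mark parameters that were important for that task."""
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    for x, y in loader:
        model.zero_grad()
        loss = F.cross_entropy(model(x.to(device)), y.to(device))
        loss.backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2
    return {n: f / max(len(loader), 1) for n, f in fisher.items()}

def ewc_penalty(model, fisher, old_params, lam=100.0):
    """Quadratic term that pulls important weights back toward their
    values after the previous task; added to the new task's loss."""
    penalty = sum(
        (fisher[n] * (p - old_params[n]) ** 2).sum()
        for n, p in model.named_parameters()
    )
    return 0.5 * lam * penalty

# --- Memory family: an episodic buffer via reservoir sampling (sketch) -----

class ReservoirBuffer:
    """Fixed-size memory in which every example seen so far has an
    equal probability of being retained (classic reservoir sampling)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = []
        self.seen = 0

    def add(self, example):
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append(example)
        else:
            idx = random.randrange(self.seen)
            if idx < self.capacity:
                self.data[idx] = example

    def sample(self, k):
        return random.sample(self.data, min(k, len(self.data)))
```

In a replay-based learner, batches drawn from such a buffer are mixed with incoming task data, whereas a regularization-based learner instead adds a penalty like the one above to its task loss; the combination methods covered in the survey pair several such mechanisms.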
