
Abstract

Incremental learning is a form of online learning in which the parameters and structure of a deep model are updated so that it can acquire new knowledge without forgetting old knowledge; preventing catastrophic forgetting is its central challenge. Current incremental-learning methods, however, typically handle only a single type of input. When all input images belong to the same category type, existing models can learn new knowledge while retaining old knowledge, but when several new categories are added to the input, these models fail to cope and their accuracy drops sharply. This paper therefore proposes an incremental-learning method that adjusts the model's parameters by identifying prototype vectors and increasing the distance between them, so that the model can learn new knowledge without catastrophic forgetting. Experiments demonstrate the effectiveness of the proposed method.
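The abstract does not give the concrete formulation, but the core idea — represent each class by a prototype vector and push prototypes apart to make room for new classes — can be sketched minimally. The following NumPy snippet is an illustrative assumption, not the paper's actual method: prototypes are taken as per-class feature means, and a hypothetical hinge-style margin penalty discourages new-class features from falling within a margin of any old prototype.

```python
import numpy as np

def class_prototypes(features, labels):
    """Prototype of each class: the mean of its feature vectors."""
    return {c: features[labels == c].mean(axis=0) for c in np.unique(labels)}

def min_prototype_separation(prototypes):
    """Smallest pairwise Euclidean distance between class prototypes."""
    protos = list(prototypes.values())
    return min(
        np.linalg.norm(a - b)
        for i, a in enumerate(protos)
        for b in protos[i + 1:]
    )

def separation_loss(old_prototypes, new_features, margin=1.0):
    """Hypothetical hinge penalty: new-class features that lie closer
    than `margin` to any old prototype are penalized, which (when
    minimized) increases the distance between old and new classes."""
    loss = 0.0
    for p in old_prototypes.values():
        d = np.linalg.norm(new_features - p, axis=1)
        loss += np.maximum(0.0, margin - d).sum()
    return loss

# Toy usage: two old classes in a 2-D feature space.
feats = np.array([[0.0, 0.0], [0.0, 2.0], [4.0, 0.0], [4.0, 2.0]])
labels = np.array([0, 0, 1, 1])
protos = class_prototypes(feats, labels)          # class 0 -> [0,1], class 1 -> [4,1]
sep = min_prototype_separation(protos)            # 4.0
new = np.array([[0.5, 1.0]])                      # a new-class feature near class 0
penalty = separation_loss(protos, new, margin=1.0)
```

In this toy run the new feature sits 0.5 away from the class-0 prototype, inside the margin of 1.0, so the penalty is positive; training against such a loss would move the feature (or the prototypes) apart, which is the intuition behind "increasing the distance of the vector" in the abstract.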
