Wiwatcharakoses, Chayut and Berrar, Daniel (2021).
DOI: https://doi.org/10.1016/j.eswa.2021.115662
Abstract
Continual learning algorithms can adapt to changes in data distributions, new classes, and even completely new tasks without catastrophically forgetting previously acquired knowledge. Here, we present a novel self-organizing incremental neural network, GSOINN+, for continual supervised learning. GSOINN+ learns a topological mapping of the input data to an undirected network and uses a weighted nearest-neighbor rule with fractional distance for classification. GSOINN+ learns incrementally; new classification tasks do not need to be specified a priori, and no rehearsal of previously learned tasks with stored training sets is required. In a series of sequential learning experiments, we show that GSOINN+ can mitigate catastrophic forgetting, even when completely new tasks are to be learned.
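The classification rule described above, a weighted nearest-neighbor vote over the nodes of the learned network using a fractional Minkowski distance (exponent 0 < p < 1), can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the values p = 0.5 and k = 3, the inverse-distance weighting, and the toy node set are all assumptions made for the example.

```python
import numpy as np

def fractional_distance(x, y, p=0.5):
    """Fractional Minkowski distance: (sum_i |x_i - y_i|^p)^(1/p), with 0 < p < 1."""
    return np.sum(np.abs(x - y) ** p) ** (1.0 / p)

def weighted_knn_predict(nodes, labels, query, k=3, p=0.5, eps=1e-12):
    """Classify `query` from its k nearest network nodes, each vote
    weighted by the inverse of its fractional distance to the query."""
    dists = np.array([fractional_distance(n, query, p) for n in nodes])
    nearest = np.argsort(dists)[:k]
    votes = {}
    for i in nearest:
        votes[labels[i]] = votes.get(labels[i], 0.0) + 1.0 / (dists[i] + eps)
    return max(votes, key=votes.get)

# Toy usage: four network nodes standing in for learned prototypes of two classes.
nodes = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
labels = np.array([0, 0, 1, 1])
print(weighted_knn_predict(nodes, labels, np.array([0.2, 0.1])))  # -> 0
```

A common motivation for fractional distances (p < 1) is that they tend to preserve distance contrast between near and far points better than the Euclidean metric in high-dimensional spaces, which benefits nearest-neighbor rules like the one used here.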