Modern ways to overcome neural networks catastrophic forgetting and empirical investigations on their structural issues


This paper presents the results of an experimental investigation of several structural issues concerning the practical use of methods for overcoming catastrophic forgetting in neural networks. Current effective methods such as EWC (Elastic Weight Consolidation) and WVA (Weight Velocity Attenuation) are compared, and their advantages and disadvantages are discussed. It is shown that EWC is better suited for settings where full retention of learned skills is required across all tasks in the training queue, while WVA is more suitable for sequential learning under tight computational budgets, or when reuse of representations and acceleration of learning from task to task matter more than exact retention of skills. The attenuation in the WVA method must be applied to the optimization step, i.e. to the increments of the neural network weights, rather than to the loss-function gradient itself; this holds for any gradient-based optimization method except the simplest stochastic gradient descent (SGD). The choice of the weight-attenuation function, between a hyperbolic function and an exponential, is also considered. Hyperbolic attenuation is shown to be preferable: although the two functions yield comparable quality at the optimal value of the WVA hyperparameter (which balances preservation of old skills against learning a new one), the hyperbolic function is more robust to deviations of the hyperparameter from its optimal value. Empirical observations are presented that support the hypothesis that the optimal value of this hyperparameter does not depend on the number of tasks in the sequential learning queue; consequently, the hyperparameter can be tuned on a small number of tasks and then used on longer sequences.
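To make the placement of the attenuation concrete, the following is a minimal sketch, not the authors' reference implementation, of an Adam-style update in which a WVA-style attenuation factor scales the weight increment (the optimization step) rather than the raw gradient. The function names, the per-weight importance values, and the hyperparameter `lam` are illustrative assumptions; how the importances are accumulated is omitted here.

```python
import numpy as np

def hyperbolic_attenuation(importance, lam):
    # Hyperbolic attenuation: the step on important weights is scaled down
    # as 1 / (1 + lam * importance); lam trades off retention of old skills
    # against learning the new task.
    return 1.0 / (1.0 + lam * importance)

def exponential_attenuation(importance, lam):
    # Exponential alternative considered in the paper: exp(-lam * importance).
    return np.exp(-lam * importance)

def adam_step_with_wva(w, grad, m, v, t, importance, lam,
                       lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    # Adam moments are computed from the RAW gradient, unattenuated.
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad**2
    m_hat = m / (1 - b1**t)
    v_hat = v / (1 - b2**t)
    step = lr * m_hat / (np.sqrt(v_hat) + eps)  # the Adam weight increment
    # The attenuation is applied to the step itself; attenuating `grad`
    # before the moment updates would give a different (incorrect) result
    # for any optimizer with internal state, i.e. anything beyond plain SGD.
    w = w - hyperbolic_attenuation(importance, lam) * step
    return w, m, v

# Illustrative usage: more important weights receive smaller updates.
w, m, v = np.zeros(3), np.zeros(3), np.zeros(3)
importance = np.array([0.0, 1.0, 10.0])  # assumed per-weight importances
grad = np.ones(3)
w, m, v = adam_step_with_wva(w, grad, m, v, t=1, importance=importance, lam=1.0)
```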

Keywords: catastrophic forgetting, elastic weight consolidation, EWC, weight velocity attenuation, WVA, neural networks, continual learning, machine learning, artificial intelligence
Citation in English: Kutalev A.A., Lapina A.A. Modern ways to overcome neural networks catastrophic forgetting and empirical investigations on their structural issues // Computer Research and Modeling, 2023, vol. 15, no. 1, pp. 45-56
DOI: 10.20537/2076-7633-2023-15-1-45-56
