Replay for Online Continual Learning in Spiking Neural Networks

Date

2024

Authors

Patel, Raghav

Abstract

Continual learning, also called lifelong learning, is the process of integrating newly learned information with prior knowledge. A major challenge neural networks face in continual learning is remembering what was learned previously, an issue referred to as catastrophic forgetting: when a trained network is subsequently trained on a new batch of data, it performs poorly when tested on previously learned data. This problem is far from solved, and tackling it is a difficult balancing act: the network must be plastic enough to learn new information, yet stable enough not to forget what it has already learned. This investigation explores different replay approaches within spiking neural networks (SNNs). We show that SNNs benefit from simple replay in much the same way as traditional ANNs, and that adding regularization can improve accuracy when small replay buffer sizes are used.
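The replay approach described above mixes stored past examples into each new training batch. As an illustration only, the following is a minimal sketch of one common way to maintain such a buffer in an online setting, using reservoir sampling; the thesis does not specify its buffer policy, so the class name and details here are assumptions:

```python
import random

class ReservoirReplayBuffer:
    """Fixed-size replay buffer filled via reservoir sampling.

    Hypothetical sketch: keeps a uniform random subset of the stream
    seen so far, a common choice for online continual learning.
    """

    def __init__(self, capacity):
        self.capacity = capacity
        self.buffer = []   # stored (input, label) pairs
        self.seen = 0      # total examples observed so far

    def add(self, example):
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        else:
            # Each observed example is retained with probability capacity/seen.
            idx = random.randrange(self.seen)
            if idx < self.capacity:
                self.buffer[idx] = example

    def sample(self, batch_size):
        # Replayed examples are mixed into each new training batch.
        k = min(batch_size, len(self.buffer))
        return random.sample(self.buffer, k)
```

During training, each incoming example would be passed to `add`, and each gradient step would combine the current batch with a call to `sample`; the regularization term studied in the thesis would be added to the loss separately.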

Keywords

Continual Learning, Lifelong Learning, Regularization, Replay, Spiking Neural Networks

Department

Computer Science