Title: NEO: Neuron State Dependent Mechanisms for Efficient Continual Learning
Authors: Daram, Anurag; Kudithipudi, Dhireesha
Type: Article
Date issued: 2023-04-12
Date accessioned: 2023-11-28
Date available: 2023-11-28
Citation: Daram, A., & Kudithipudi, D. (2023). NEO: Neuron State Dependent Mechanisms for Efficient Continual Learning. Paper presented at the 2023 Annual Neuro-Inspired Computational Elements Conference, San Antonio, TX, USA. https://doi.org/10.1145/3584954.3584960
ISBN: 978-1-4503-9947-0
DOI: https://doi.org/10.1145/3584954.3584960
Handle: https://hdl.handle.net/20.500.12588/2251
Language: en
Rights: Creative Commons Attribution 3.0 United States (http://creativecommons.org/licenses/by/3.0/us/)
Keywords: catastrophic forgetting; task agnostic; task incremental learning; domain incremental learning; neuron importance

Abstract: Continual learning (the sequential learning of tasks) is challenging for deep neural networks, mainly because of catastrophic forgetting: the tendency for accuracy on previously trained tasks to drop when new tasks are learned. Although several biologically inspired techniques have been proposed for mitigating catastrophic forgetting, they typically require additional memory and/or computational overhead. Here, we propose a novel regularization approach that combines neuronal activation-based importance measurement with neuron state-dependent learning mechanisms to alleviate catastrophic forgetting in both task-aware and task-agnostic scenarios. We introduce a neuronal state-dependent mechanism driven by neuronal activity traces and selective learning rules, whose storage requirements for regularization parameters grow more slowly with network size than those of schemes that compute per-weight importance, whose storage grows quadratically. The proposed model, NEO, achieves performance comparable to other state-of-the-art regularization-based approaches to catastrophic forgetting, while operating with a reduced memory overhead.
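
Note on the storage argument: the abstract contrasts per-neuron importance (one scalar per unit, O(N) storage) with per-weight importance schemes such as EWC (one scalar per connection, roughly O(N^2) for dense layers). The sketch below is a minimal, hypothetical PyTorch illustration of that idea and is not the authors' NEO implementation; the class name NeuronImportanceRegularizer, the exponential activity trace, and the drift penalty weighted by post-synaptic neuron importance are assumptions made purely for illustration.

```python
# Hypothetical sketch of neuron-level importance regularization, loosely
# following the abstract's description. NOT the authors' actual NEO code.
import torch
import torch.nn as nn


class NeuronImportanceRegularizer:
    """Keeps one activity trace per neuron (O(N) storage, versus O(N^2)
    for per-weight schemes) and penalizes drift in the weights that feed
    important neurons."""

    def __init__(self, layer: nn.Linear, decay: float = 0.9, strength: float = 1.0):
        self.layer = layer
        self.decay = decay              # exponential decay of the activity trace
        self.strength = strength        # regularization coefficient
        self.trace = torch.zeros(layer.out_features)  # one scalar per neuron
        self.anchor = layer.weight.detach().clone()   # weights after last task

    def update_trace(self, activations: torch.Tensor) -> None:
        # Accumulate mean absolute activation as a proxy for neuron importance.
        batch_mean = activations.abs().mean(dim=0).detach()
        self.trace = self.decay * self.trace + (1.0 - self.decay) * batch_mean

    def penalty(self) -> torch.Tensor:
        # Squared weight drift, scaled by the importance of the
        # post-synaptic neuron each row of weights feeds into.
        drift = (self.layer.weight - self.anchor) ** 2
        return self.strength * (self.trace.unsqueeze(1) * drift).sum()

    def consolidate(self) -> None:
        # Call at a task boundary: freeze current weights as the new anchor.
        self.anchor = self.layer.weight.detach().clone()
```

Usage would follow the standard regularization pattern: during training on a task, call update_trace() on each layer's activations and add penalty() to the task loss; at a task boundary, call consolidate(). The key design point mirrored from the abstract is that the only persistent state per layer is the trace vector and a weight anchor, so the regularization metadata scales with the number of neurons rather than the number of weights.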