An article on metaplasticity and catastrophic forgetting in artificial neural networks, co-authored by our colleague Matúš Tomko, is featured on the cover of the September issue of Trends in Neurosciences.
Artificial neural networks suffer from so-called catastrophic forgetting: when they learn multiple tasks in sequence, training on a new task tends to overwrite what was learned for earlier ones. Our brains, by contrast, largely avoid catastrophic forgetting. One trick for reducing forgetting in continual learning is so-called metaplasticity. The term was originally coined to describe neurobiological phenomena in which neural activity at one point in time influences the direction, amplitude, and/or duration of subsequently induced synaptic plasticity.
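The core idea can be sketched in a few lines of code. The following toy model is my own illustration, not the method described in the review: each synapse carries a metaplastic state `m` that grows with the weight changes it has already undergone and damps its future plasticity, so a synapse consolidated by an earlier task resists being overwritten by a later one.

```python
# Minimal sketch of a metaplasticity-inspired update rule (illustrative only).
# A per-synapse state m accumulates past changes and shrinks the effective
# learning rate, protecting consolidated weights during later tasks.

def metaplastic_update(w, grad, m, lr=0.1):
    """One gradient step whose effective learning rate shrinks with m."""
    step = (lr / (1.0 + m)) * grad      # more past change -> smaller step
    return w - step, m + abs(step)      # fold the change into the state

# Simulate one synapse: "task A" drives it toward w = 1,
# then a brief "task B" pulls it toward w = 0.
w, m = 0.0, 0.0
for _ in range(200):                    # task A: gradient of 0.5 * (w - 1)^2
    w, m = metaplastic_update(w, w - 1.0, m)
w_after_a = w

w_plain = w_after_a                     # same starting point, plasticity never damped
for _ in range(20):                     # task B: gradient of 0.5 * w^2
    w, m = metaplastic_update(w, w, m)  # metaplastic synapse
    w_plain = w_plain - 0.1 * w_plain   # plain SGD synapse

print(w_after_a, w, w_plain)
```

After the interfering task, the metaplastic synapse stays much closer to its task-A value than the plain one, at the cost of adapting to the new task more slowly; this stability-plasticity trade-off is exactly what metaplasticity-based continual-learning methods try to manage.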
In the review article, the authors summarize strategies for preventing catastrophic forgetting in artificial neural networks and show how combining these strategies with metaplasticity-based methods improves the performance of artificial neural networks in continual learning.
Article: Contributions by metaplasticity to solving the Catastrophic Forgetting Problem (Trends in Neurosciences)