Lifelong Machine Learning (LML) extends neural networks so that models can learn from sequential data incrementally, in small portions, while continually transferring acquired knowledge to new tasks. One of the persistent problems in LML is catastrophic forgetting, whereby networks unlearn prior tasks upon exposure to new ones. This review explores these challenges in detail and presents fundamental neural-network-based approaches for addressing them in lifelong learning systems. Regularization methods such as Elastic Weight Consolidation (EWC) and Learning without Forgetting (LwF) constrain updates to key network parameters so that knowledge from earlier tasks is retained. Although useful, such strategies must be applied with care, since they place as much emphasis on preserving previous tasks as on acquiring new ones. Rehearsal methods, including Partition Reservoir Sampling (PRS) and Optimizing Class Distribution in Memory (OCDM), retain a portion of previous data for retraining, which can prove memory-intensive in large-scale applications. Architectural approaches, such as the Compact, Picking, and Growing (CPG) method, grow the network structure with each new task, extending existing neurons or layers without interfering with previously learned information. These methods, however, limit scalability, since computational complexity increases with network size. Problems of data imbalance and label shift also remain open, particularly in settings where the data distribution changes over time. Lifelong learning in neural networks thus continues to face challenges in catastrophic forgetting, scalability, and efficient knowledge transfer, motivating further research. Progress on these issues will be crucial for applying neural networks in settings that must learn over time without forgetting what was learned earlier.
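To make the regularization idea concrete, the following is a minimal NumPy sketch of the quadratic penalty used by EWC-style methods: important parameters (large diagonal Fisher values) are anchored near their post-task values while unimportant ones remain free to change. The function names and the diagonal-Fisher representation here are illustrative assumptions, not the implementation of any specific system discussed above.

```python
import numpy as np

def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    """Quadratic EWC-style penalty: (lam / 2) * sum_i F_i * (theta_i - theta*_i)^2.

    theta      -- current parameter vector (1-D array)
    theta_star -- parameter vector saved after training on the previous task
    fisher     -- diagonal Fisher information estimate, one entry per parameter
    lam        -- strength of the anchor toward the old solution
    """
    return 0.5 * lam * np.sum(fisher * (theta - theta_star) ** 2)

def total_loss(task_loss, theta, theta_star, fisher, lam=1.0):
    """Loss on the new task plus the regularizer protecting the old task."""
    return task_loss + ewc_penalty(theta, theta_star, fisher, lam)
```

In use, the gradient of this penalty pulls each weight back toward its previous value in proportion to how much that weight mattered for earlier tasks, which is exactly the trade-off between revisiting old tasks and acquiring new ones noted above.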
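The rehearsal methods discussed above (PRS, OCDM) maintain a bounded memory of past examples for replay. As a simplified stand-in, the sketch below implements plain uniform reservoir sampling (Vitter's Algorithm R); the class and method names are hypothetical, and the class-balancing refinements of PRS and OCDM are omitted.

```python
import random

class ReservoirBuffer:
    """Fixed-capacity rehearsal memory filled by uniform reservoir sampling.

    After n examples have streamed past, each one is held in the buffer
    with equal probability capacity / n, regardless of arrival order.
    """

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.buffer = []
        self.n_seen = 0
        self.rng = random.Random(seed)

    def add(self, example):
        """Offer one streamed example to the reservoir."""
        self.n_seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        else:
            j = self.rng.randrange(self.n_seen)  # uniform index in [0, n_seen)
            if j < self.capacity:
                self.buffer[j] = example  # replace a stored example

    def sample(self, k):
        """Draw a rehearsal minibatch of up to k stored examples."""
        return self.rng.sample(self.buffer, min(k, len(self.buffer)))
```

Replaying batches drawn from such a buffer alongside new-task data is what makes rehearsal effective against forgetting, and the fixed capacity is also the source of the memory cost noted above for large-scale applications.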
IRJIET, Volume 9, Issue 8, August 2025, pp. 12-21