21st AIAI 2025, 26–29 June 2025, Limassol, Cyprus

Clustering by Limit-Cycles in Hopfield Neural Network

Das Jhelum, Ladwani Vandana, Ramasubramanian V

Abstract:

  The Hopfield Neural Network (HNN) is a classic auto-associative memory formulation with a content-addressable property, allowing storage of multiple patterns and retrieval of a stored pattern when presented with a cue that is a partial/noisy version of it. In this work, we examine in detail two kinds of ‘attractors’ in the HNN, namely, a) Point Attractors (PA) and b) Limit Cycles (LC), and demonstrate their ability to handle ‘clustered’ data in the form of vector patterns (for PA) and vector-sequence patterns (for LC). Specifically, we note: a) both attractor types exhibit high performance in modeling ‘clustered’ data in their ‘basins of attraction’; b) the LC attractor type offers superior retrieval performance over PA, retrieving a stored ‘sequence’ (whose individual stored vector-states form the LC) more accurately than when the vector patterns are stored as PAs. We posit that this is due to the state-to-state neurodynamical cohesion in limit-cycle attractors, brought about by our proposed two-stage firing rule with a State-Transition-and-Stabilization retrieval mechanism that binds vector-states more accurately.
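As a minimal illustration of the two attractor types discussed in the abstract (a sketch of the standard constructions, not of the authors' specific two-stage firing rule): point attractors arise from the symmetric Hebbian outer-product rule, while a limit cycle over a pattern sequence can be obtained with an asymmetric hetero-associative rule that maps each stored state to its successor. All sizes and the noise level below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64                                         # bipolar (+1/-1) neurons
patterns = rng.choice([-1, 1], size=(3, N))    # three random patterns to store

# --- Point attractors (PA): symmetric Hebbian outer-product storage ---
W_pa = np.zeros((N, N))
for p in patterns:
    W_pa += np.outer(p, p)
np.fill_diagonal(W_pa, 0)                      # no self-connections

def retrieve_pa(cue, W, steps=20):
    """Synchronous sign updates until a fixed point (point attractor) is reached."""
    s = cue.copy()
    for _ in range(steps):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

# Noisy cue: flip ~10% of the bits of pattern 0; well below the ~0.14N
# capacity limit, so the dynamics should fall back into the stored basin.
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
cue[flip] *= -1
recalled = retrieve_pa(cue, W_pa)

# --- Limit cycle (LC): asymmetric rule mapping each state to the NEXT one ---
seq = patterns                                 # treat the patterns as a cyclic sequence
W_lc = np.zeros((N, N))
for k in range(len(seq)):
    W_lc += np.outer(seq[(k + 1) % len(seq)], seq[k])

def step_lc(s, W):
    """One state transition along the stored cycle."""
    s_new = np.sign(W @ s)
    s_new[s_new == 0] = 1
    return s_new

s = seq[0].copy()
visited = [s.copy()]
for _ in range(3):
    s = step_lc(s, W_lc)
    visited.append(s.copy())
# After one full pass the trajectory should close the cycle back at seq[0].
```

With the symmetric weights the network settles into a fixed point near the cued pattern; with the asymmetric weights the same patterns are revisited in order as a cycle, which is the sequence-retrieval behavior the abstract attributes to LC attractors.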

*** Title, author list and abstract as submitted during Camera-Ready version delivery. Small changes that may have occurred during processing by Springer may not appear in this window.