Memory Chaos Before Forgetting
A new model of information storage in the brain suggests how memories degrade, and are eventually lost, as they age.
Theoretical constructs known as attractor networks are a model of memory in the human brain. A new study of these networks traces the path by which memories are stored and eventually forgotten [1]. The mathematical model, along with simulations, shows that as memories age, the associated patterns of neural activity become chaotic and impossible to predict before disintegrating into random noise. It is not known whether this behavior occurs in the brain, but the researchers suggest testing the prediction by monitoring how neural activity changes over time during memory retrieval tasks.
In both artificial and biological neural networks, memories are stored as patterns of the signals that are sent among the nodes of a network. In an artificial network, the output value of each node is determined by the inputs it receives from other nodes. Similarly, in a biological network, a neuron's inputs determine the probability and frequency with which it fires (sends an electrical pulse). Continuing the analogy, the links between nodes (which represent synapses) have "weights" that can increase or decrease the signal they transmit. The weight of any given link depends on the degree of synchronization between the nodes it connects, and it can change as new memories are encoded.
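To make this weight-based picture of storage concrete, here is a minimal sketch of a Hopfield-style attractor network in Python. It is a standard toy model, not the specific model used in the study: memories are written into the link weights with a Hebbian rule, and a noisy cue then relaxes toward the stored pattern (the attractor).

```python
# Minimal Hopfield-style attractor network: an illustration of storing
# "memories" in link weights and retrieving them from noisy cues.
# This is a generic toy model, not the model from the cited study.
import numpy as np

rng = np.random.default_rng(0)

n_nodes = 100      # number of nodes (model neurons)
n_memories = 3     # number of stored patterns

# Memories are random +/-1 activity patterns across the nodes.
memories = rng.choice([-1, 1], size=(n_memories, n_nodes))

# Hebbian storage: a link's weight grows when the two nodes it connects
# tend to be active together across the stored patterns.
weights = memories.T @ memories / n_nodes
np.fill_diagonal(weights, 0)   # no self-connections

def retrieve(cue, steps=20):
    """Repeatedly update each node from its weighted inputs until the state settles."""
    state = cue.copy()
    for _ in range(steps):
        state = np.sign(weights @ state)
        state[state == 0] = 1
    return state

# Start from a corrupted version of the first memory (30% of nodes flipped).
cue = memories[0].copy()
flip = rng.choice(n_nodes, size=30, replace=False)
cue[flip] *= -1

recalled = retrieve(cue)
overlap = (recalled @ memories[0]) / n_nodes   # 1.0 means perfect recall
print(f"overlap with stored memory: {overlap:.2f}")
```

Running the sketch typically prints an overlap near 1.0, showing that the network's dynamics pull a degraded cue back to the stored pattern; the study's model builds on this kind of attractor behavior to ask what happens to such patterns as memories age.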
Source:
https://physics.aps.org/articles/v16/14