Why Machines Need to Dream
The ingenious neurobiology of mammalian sleep has been mathematically modeled to expand the memory and storage capacity of A.I.
--
Dreams are, for the most part, delightful. As we sleep, visual and auditory fragments combine into nonsensical snippets and epic narratives. Loosely recalled moments merge with vivid, imagined scenes; we interact with characters known and characters conjured up; we explore our fantasies and, sometimes, face our fears. Yet sleeping and dreaming do not exist for our nocturnal pleasure alone.
As we slumber, our brains filter the information collected in waking hours. Neurological processes spring into action: they discard what is irrelevant and consolidate what is important; they form and store memories. These mechanisms — found throughout the mammalian world — are so effective that a team of Italian researchers has mathematically modeled them for use in artificial intelligence.
The result is an algorithm that expands the storage capacity of artificial networks by forcing them into an off-line sleep phase during which they reinforce relevant memories (pure states) and erase irrelevant ones (spurious states).
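In the team's published model, this sleep phase acts on the connection strengths of a Hopfield-style network (introduced below): a "sleep time" parameter interpolates between the classic Hebbian couplings and a cleaner kernel in which interference between memories — the source of spurious states — is progressively suppressed. The Python sketch below is a minimal illustration of that idea, not the researchers' code; the network size, pattern count, noise level, and the simple retrieval loop are my own illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 40                            # neurons and stored patterns (illustrative)
xi = rng.choice([-1, 1], size=(P, N))     # random +/-1 memories

C = xi @ xi.T / N                         # pattern-overlap (correlation) matrix

def coupling(t):
    """Synaptic couplings after 'sleep time' t.

    t = 0 reduces to the classic Hebbian matrix; large t damps the
    interference between patterns that produces spurious states.
    """
    kernel = (1 + t) * np.linalg.inv(np.eye(P) + t * C)
    J = xi.T @ kernel @ xi / N
    np.fill_diagonal(J, 0)                # no self-couplings
    return J

def recall(J, state, steps=25):
    """Simple synchronous retrieval dynamics (my choice, for brevity)."""
    for _ in range(steps):
        state = np.sign(J @ state)
        state[state == 0] = 1             # break ties deterministically
    return state

# Corrupt one stored memory, then test recall before and after "sleep".
probe = xi[0].copy()
probe[rng.choice(N, size=N // 5, replace=False)] *= -1   # flip 20% of the bits

for t in (0.0, 100.0):                    # awake (Hebbian) vs. well-slept network
    overlap = recall(coupling(t), probe.copy()) @ xi[0] / N
    print(f"sleep time t = {t:g}: overlap with the true memory = {overlap:+.2f}")
```

With t = 0 the couplings are purely Hebbian and, at this loading of memories, retrieval of the corrupted pattern is unreliable; as t grows, the overlap with the true memory typically climbs toward 1, mirroring the claim that sleep expands storage capacity.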
When math meets mammals
Adriano Barra is a theoretical physicist at the University of Salento in Italy. Barra is eager and animated when he speaks to me, frequently using a pack of Marlboro cigarettes as an unlikely prop to illustrate the finer points of A.I.
Along with his colleagues Elena Agliari and Alberto Fachechi, Barra studies complex systems, such as brains, and makes mathematical models of their neurobiology. “We theoretical physicists have a teeny advantage over engineers,” says Barra. “As the mathematics is the same, almost for free, we can apply our results in artificial intelligence. We are a bridge between neurobiology and engineering.”
The classic blueprint for artificial neural networks is the Hopfield model. Developed by John Hopfield in 1982, it describes how artificial networks learn and retrieve information using mechanisms, such as pattern recognition, that mimic real brains. The most popular learning rule for a Hopfield network is Hebbian learning, which proposes how the synapses between neurons change: connections strengthen between neurons that activate together, a principle often summarized as "neurons that fire together, wire together."
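To make the Hebbian rule concrete, here is a minimal, self-contained sketch; the network size, the number of memories, and the noisy probe are illustrative choices of mine, not details from the article. Each stored pattern adds the product of two neurons' activities to the coupling between them, so neurons that are active together across memories end up strongly linked.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100                                       # +/-1 neurons (illustrative size)
patterns = rng.choice([-1, 1], size=(3, N))   # three memories to store

# Hebbian rule: each memory adds xi_i * xi_j to the coupling w_ij, so
# neurons that fire together across memories become strongly connected.
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0)                        # no neuron drives itself

# Retrieval: start from a corrupted memory and let the network settle.
state = patterns[0].copy()
state[rng.choice(N, size=15, replace=False)] *= -1   # flip 15% of the bits

for _ in range(10):                           # synchronous updates
    state = np.sign(W @ state)
    state[state == 0] = 1

print("overlap with the original memory:", state @ patterns[0] / N)  # ~1.0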