Unlocking the Potential of Computing with Chemicals: A Closer Look at ECRAM for Faster, Leaner AI
Computers that use chemicals can make AI faster and more efficient
How far off is an artificial mind? A full artificial brain may still be a long way away, but the synapse – the key connective element in the brain’s network – is now closer than ever to being replicated in hardware.
That’s because a battery-inspired device for running artificial neural networks is proving surprisingly effective. Called electrochemical RAM (ECRAM), it is giving transistor-based AI hardware a run for its money. Researchers announced a number of advances this week at the 2022 IEEE International Electron Devices Meeting (IEDM) and elsewhere, including ECRAM devices that use less energy, hold their memory longer, and take up less room.
Artificial neural networks, which power today’s machine-learning algorithms, are software models of a large number of “neurons” and their synapses. Researchers believe that faster and more energy-efficient AI could be achieved by representing the components of neural networks, particularly the synapses, with real physical devices. This concept is called analog AI. It requires a memory cell with a hard-to-obtain combination of characteristics: it must store a wide range of analog values, switch between them reliably and rapidly, hold its value for a long time, and be scalable.
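To make the idea concrete, here is a minimal sketch (not a real ECRAM model) of why analog memory cells map so naturally onto neural networks: each synaptic weight becomes a device conductance, Ohm’s law does the multiplications, and summing currents along a wire does the additions, so an entire matrix-vector product happens in one analog step. The function names, the number of conductance states, and the noise level below are all illustrative assumptions.

```python
import numpy as np

def quantize_weights(weights, num_states=64):
    """Map ideal weights onto a device's finite set of analog
    conductance states (illustrative assumption: evenly spaced levels)."""
    levels = np.linspace(weights.min(), weights.max(), num_states)
    idx = np.abs(weights[..., None] - levels).argmin(axis=-1)
    return levels[idx]

def crossbar_matvec(weights, x, num_states=64, noise_std=0.0, rng=None):
    """One analog matrix-vector multiply: Ohm's law (I = G * V) performs
    each multiplication; summing currents on a column wire performs the
    addition. Quantization and noise model device imperfections."""
    if rng is None:
        rng = np.random.default_rng(0)
    g = quantize_weights(weights, num_states)
    if noise_std > 0:
        g = g + rng.normal(0.0, noise_std, g.shape)  # device variation
    return g @ x

# Compare the ideal digital result with the imperfect "analog" one.
rng = np.random.default_rng(42)
W = rng.standard_normal((4, 8))   # 4 neurons, 8 inputs
x = rng.standard_normal(8)

exact = W @ x
analog = crossbar_matvec(W, x, num_states=64, noise_std=0.01, rng=rng)
print(np.abs(exact - analog).max())  # small quantization + noise error
```

The sketch also shows why the paragraph’s requirements matter: too few stable conductance states (small `num_states`) or too much drift and noise (`noise_std`) and the analog result diverges from the intended computation.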