Researchers have developed the first silicon-integrated ECRAM to be used in a practical AI Accelerator
Deep learning and artificial intelligence are responsible for a number of transformative changes. However, these transformations come at a cost: OpenAI's ChatGPT, for example, reportedly costs $100,000 per day to run. Accelerators, computer hardware designed to perform deep learning operations efficiently, could reduce this cost. Such devices will only be viable, however, if they can be seamlessly integrated with silicon-based computer hardware.
A University of Illinois Urbana-Champaign research team has achieved the first material-level integration of ECRAMs onto silicon transistors. Researchers led by professor Qing Cao and graduate student Jinsong Cui of the Department of Materials Science & Engineering recently published a report in Nature Electronics on an ECRAM designed and fabricated with materials that can be deposited directly onto silicon during fabrication, realizing the first practical ECRAM-based deep learning accelerator.
"Other ECRAM devices have been made with many of the difficult-to-obtain properties needed for deep learning accelerators, but ours is the first to achieve all these properties while being integrated with silicon without compatibility issues," Cao said. "This was the last major barrier to the technology's widespread adoption."