Reinvigorating Hardware Through AI: Cerebras and SambaNova Systems Take on Supercomputing

AI is making hardware more interesting

Lawrence Livermore National Laboratory is one of the largest supercomputing users in the world. The U.S. Department of Energy lab's supercomputers have a combined computing power of about 200 petaflops, or 200 quadrillion floating-point operations per second.
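For scale, the petaflop figure converts as follows (a quick sketch using the article's 200-petaflop number; one petaflop is 10^15 floating-point operations per second):

```python
# One petaflop = 10**15 floating-point operations per second (FLOPS).
PETAFLOP = 10**15

system_petaflops = 200  # aggregate performance cited in the article
total_flops = system_petaflops * PETAFLOP

# 200 petaflops works out to 2 x 10**17 FLOPS,
# i.e. 200 quadrillion operations per second.
print(f"{total_flops:.1e} FLOPS")
```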

In the last two years, two newcomers have joined this lineup: Cerebras Systems Inc. and SambaNova Systems Inc. The two startups have raised a combined $1.8 billion and are attempting to disrupt a market dominated by off-the-shelf x86 central processing units and graphics processing units with hardware designed specifically for training artificial intelligence models and running inference on them.

Cerebras claims that its WSE-2 chip packs 2.6 trillion transistors and 850,000 compute cores for training neural networks, roughly 500 times as many transistors and 100 times as many cores as a high-end GPU. The company says the architecture, which pairs 40 gigabytes of on-chip memory with the ability to connect up to 2.4 petabytes of external memory, can process AI models too large to be practical on GPU-based computers. Cerebras has raised $720 million at a valuation of $4 billion.

Source:
https://siliconangle.com/2022/11/14/ai-made-hardware-interesting/
