Cerebras partners with Dell on enterprise AI compute
Chip design startup Cerebras has announced a collaboration with Dell Technologies to deliver AI compute infrastructure for generative AI, positioning the pair as a challenger to Nvidia.
Cerebras' core product is the Wafer Scale Engine 3 (WSE-3), a chip featuring 4 trillion transistors and 900,000 AI cores that is purpose-built for training the industry’s largest models. Built on a 5 nm process, the WSE-3 powers the Cerebras CS-3 AI supercomputer, which delivers 125 petaflops of peak AI performance.
The collaboration will see Cerebras add a new memory storage solution, built on Dell servers with AMD EPYC CPUs, to its AI supercomputers, which the companies say will allow enterprises to train models “orders of magnitude larger than the current state of the art.”
“Our new collaboration with Dell is a turning point for Cerebras,” says Andrew Feldman, co-founder and CEO, Cerebras Systems. “This opens up our global sales distribution channels in a meaningful way, while providing customers with the additional AI hardware, software and expertise needed to enable full-scale enterprise deployments.”
Cerebras says its AI acceleration technology provides 880 times the memory capacity of GPUs, requires 97 percent less code to build large language models, and offers push-button model scaling and strong data preprocessing tools. The technology is designed to make compute and memory scaling straightforward, to rely on data parallelism so there is less distributed code to debug, and, through its simpler structure, to lower total cost of ownership.
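The “push-button model scaling” claim refers to growing a model by editing configuration values rather than rewriting the training code to partition the model across devices. The sketch below is purely illustrative and assumes nothing about Cerebras’ actual software stack or APIs; the config names and the parameter formula are hypothetical, standing in only for the config-driven pattern the claim describes.

# Illustrative only: hypothetical config-driven scaling, not Cerebras' actual API.
from dataclasses import dataclass

@dataclass
class ModelConfig:
    hidden_size: int   # transformer width
    num_layers: int    # transformer depth
    num_heads: int     # attention heads

    def parameter_estimate(self) -> int:
        # Rough transformer parameter count: ~12 * layers * hidden^2
        return 12 * self.num_layers * self.hidden_size ** 2

# Scaling up is a matter of editing these numbers; the rest of the training
# script (data loading, optimizer step, checkpointing) stays unchanged.
CONFIGS = {
    "1.3B-class": ModelConfig(hidden_size=2048, num_layers=24, num_heads=16),
    "70B-class":  ModelConfig(hidden_size=8192, num_layers=80, num_heads=64),
}

if __name__ == "__main__":
    for name, cfg in CONFIGS.items():
        print(f"{name}: roughly {cfg.parameter_estimate() / 1e9:.1f}B parameters")

On a GPU cluster, the same jump in model size typically also forces a choice of tensor-, pipeline- and data-parallel strategies; that extra distributed code is the debugging burden Cerebras says its data-parallel approach avoids.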