Artificial intelligence startup Cerebras Systems recently unveiled a new version of its AI processor, which the company says delivers twice the performance of its previous model at the same price.
The processor is developed by the Santa Clara, California-based startup, known for its unique approach to AI chip design. Unlike traditional methods that rely on combining thousands of smaller chips, Cerebras bets on a single, dinner-plate-sized processor to outdo the competition, particularly Nvidia's advanced hardware clusters.
A bold claim by Cerebras CEO
Cerebras CEO Andrew Feldman made a bold statement about the company's track record: "So the largest chip that we made was our first generation. People said we couldn't make it."
He highlighted the rapid technological advancements Cerebras has made, moving from a seven-nanometer process to announcing a five-nanometer part within just three years. "This is the largest part by more than three and a half trillion transistors," Feldman added, underscoring the scale of their latest achievement.
Tackling the power consumption challenge
One critical challenge in AI processing is managing power consumption, especially as the costs to build and run AI applications have surged. Cerebras said that its third-generation chip, the Wafer-Scale Engine 3 (WSE-3), addresses this issue head-on.
Despite its greater performance, the company says the chip draws the same amount of power as its predecessor, making it a more sustainable option for AI development.
Plans to sell WSE-3 systems
The WSE-3 chip boasts 4 trillion transistors and a claimed capacity of 125 petaflops of computing. Built on Taiwan Semiconductor Manufacturing Co's five-nanometer process, the chip represents a significant leap forward in AI hardware.
Cerebras announced plans to sell its WSE-3 systems alongside Qualcomm AI 100 Ultra chips. The partnership aims to improve the running of artificial intelligence applications, with a focus on inference, the stage where trained models generate responses.