In this paper, we survey existing approaches used to scale training to clusters of compute units and explore the limitations of each in the face of giant models.
November 17, 2021
Cambrian AI Research principal analyst Karl Freund explores Cerebras Systems' approach to brain-scale AI and the new technologies that enable it.
September 21, 2021
Deep learning has become one of the most important computational workloads of our generation, advancing applications across industries from healthcare to autonomous driving. But it is also profoundly computationally intensive.
June 29, 2021
Natural language processing has revolutionized how data is consumed, meaning that computational demand has skyrocketed. Companies in every industry are using GPU clusters to keep up. But is this really the best solution?
June 24, 2021
Despite overwhelming evidence that training large BERT-type models on enormous, domain-specific datasets produces higher accuracy results, few organizations do it.
May 24, 2021
An introduction to Cerebras as a company, including a discussion of the core innovations behind the Cerebras CS-2. What is it? How does it work? What does it enable for machine learning practitioners?
April 6, 2021