Cerebras Systems

Develops AI accelerators for efficient computing

About Cerebras Systems

Simplify's Rating
Why Cerebras Systems is rated
A+
Rated A on Competitive Edge
Rated A+ on Growth Potential
Rated A on Rating Differentiation

Industries

Data & Analytics

Hardware

AI & Machine Learning

Company Size

501-1,000

Company Stage

Series F

Total Funding

$720M

Headquarters

Sunnyvale, California

Founded

2016

Overview

Cerebras Systems accelerates artificial intelligence (AI) with its CS-2 system, which replaces traditional clusters of graphics processing units (GPUs) to simplify AI computations. This system allows clients, including pharmaceutical companies and research labs, to achieve faster results for critical applications like cancer drug response predictions. By selling its proprietary hardware and software solutions, Cerebras aims to reduce the costs and complexities associated with AI research and development. The company's goal is to enable quicker AI training and lower latency in AI inference.


Simplify's Take

What believers are saying

  • Cerebras's data center expansion increases inference capacity to 40 million tokens per second.
  • Partnerships with Hugging Face and AlphaSense enhance Cerebras's market reach and capabilities.
  • Record-setting molecular dynamics performance positions Cerebras as a leader in scientific computing.

What critics are saying

  • Over-reliance on key partnerships like Hugging Face could pose risks if they dissolve.
  • Rapid data center expansion may lead to operational challenges and increased costs.
  • Intensifying competition from companies like OpenAI could reduce Cerebras's market share.

What makes Cerebras Systems unique

  • Cerebras's CS-2 system replaces traditional GPU clusters, simplifying AI computations.
  • The company offers the largest processor in the industry for faster AI training.
  • Cerebras's AI inference service promises dramatic performance and cost efficiency improvements.


Funding

Total Funding

$720M

Above Industry Average

Funded Over

6 Rounds


Growth & Insights and Company News

Headcount

6 month growth

2%

1 year growth

-3%

2 year growth

-6%

VentureBeat
Mar 11th, 2025
Cerebras Just Announced 6 New AI Datacenters That Process 40M Tokens Per Second - And It Could Be Bad News For Nvidia

Cerebras Systems, an AI hardware startup that has been steadily challenging Nvidia’s dominance in the artificial intelligence market, announced Tuesday a significant expansion of its data center footprint and two major enterprise partnerships that position the company to become the leading provider of high-speed AI inference services. The company will add six new AI data centers across North America and Europe, increasing its inference capacity twentyfold to over 40 million tokens per second. The expansion includes facilities in Dallas, Minneapolis, Oklahoma City, Montreal, New York, and France, with 85% of the total capacity located in the United States.

“This year, our goal is to truly satisfy all the demand and all the new demand we expect will come online as a result of new models like Llama 4 and new DeepSeek models,” said James Wang, director of product marketing at Cerebras, in an interview with VentureBeat. “This is our huge growth initiative this year to satisfy [the] almost unlimited demand we’re seeing across the board for inference tokens.”

The data center expansion represents the company’s ambitious bet that the market for high-speed AI inference — the process where trained AI models generate outputs for real-world applications — will grow dramatically as companies seek faster alternatives to GPU-based solutions from Nvidia. Cerebras plans to expand capacity from 2 million to over 40 million tokens per second by Q4 2025 across eight data centers in North America and Europe. (Credit: Cerebras)

Strategic partnerships that bring high-speed AI to developers and financial analysts: Alongside the infrastructure expansion, Cerebras announced partnerships with Hugging Face, the popular AI developer platform, and AlphaSense, a market intelligence platform widely used in the financial services industry. The Hugging Face integration will allow its five million developers to access Cerebras Inference with a single click, without having to sign up for Cerebras separately


VentureBeat
Feb 11th, 2025
Cerebras-Perplexity Deal Targets $100B Search Market With Ultra-Fast AI

Cerebras Systems and Perplexity AI are joining forces to challenge the dominance of conventional search engines, announcing a partnership that promises to deliver near-instantaneous AI-powered search results at speeds previously thought impossible. The collaboration, announced today in an exclusive VentureBeat report, centers on Perplexity’s new Sonar model, which runs on Cerebras’s specialized AI chips at 1,200 tokens per second — making it one of the fastest AI search systems available. Built on Meta’s Llama 3.3 70B foundation, Sonar represents a significant bet that users will embrace AI-first search experiences if they’re fast enough.

“Our partnership with Cerebras has been instrumental in bringing Sonar to life,” said Denis Yarats, Perplexity’s CTO, in a statement. “Cerebras’s cutting-edge AI inference infrastructure has enabled us to achieve unprecedented speeds and efficiency.”

AI search just got faster — and Big Tech should pay attention: The timing is notable, coming just days after Cerebras made headlines with its DeepSeek implementation, which demonstrated speeds 57 times faster than traditional GPU-based solutions. The company appears to be leveraging this momentum to establish itself as the go-to provider for high-speed AI inference. According to Perplexity’s internal testing, Sonar outperforms both GPT-4o mini and Claude 3.5 Haiku “by a substantial margin” in user satisfaction metrics, while matching or exceeding more expensive models like Claude 3.5 Sonnet

VentureBeat
Jan 30th, 2025
Cerebras Becomes The World's Fastest Host For DeepSeek R1, Outpacing Nvidia GPUs By 57X

Cerebras Systems announced today it will host DeepSeek’s breakthrough R1 artificial intelligence model on U.S. servers, promising speeds up to 57 times faster than GPU-based solutions while keeping sensitive data within American borders. The move comes amid growing concerns about China’s rapid AI advancement and data privacy. The AI chip startup will deploy a 70-billion-parameter version of DeepSeek-R1 running on its proprietary wafer-scale hardware, delivering 1,600 tokens per second — a dramatic improvement over traditional GPU implementations that have struggled with newer “reasoning” AI models.

Response times for various AI platforms, measured in seconds to first token generation: Cerebras leads with the lowest latency at 0.18 seconds, while Amazon’s platform takes nearly a full second to respond
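The two figures quoted above combine in a simple way: total response time is roughly the time to first token plus the number of generated tokens divided by throughput. A minimal sketch, using the article's Cerebras figures (0.18 s first-token latency, 1,600 tokens per second); the 800-token response length is a hypothetical assumption, not from the article:

```python
def estimated_response_time(ttft_s: float, tokens_per_s: float, n_tokens: int) -> float:
    """Rough total time: time to first token plus generation time for the rest.

    ttft_s: time-to-first-token latency in seconds
    tokens_per_s: sustained decode throughput
    n_tokens: length of the generated response (hypothetical here)
    """
    return ttft_s + n_tokens / tokens_per_s

# Figures quoted above for Cerebras hosting DeepSeek-R1 70B:
t = estimated_response_time(ttft_s=0.18, tokens_per_s=1600, n_tokens=800)
print(f"{t:.2f} s")  # 0.68 s for a hypothetical 800-token answer
```

This is why both numbers matter: for short answers the 0.18 s first-token latency dominates, while for long "reasoning" outputs the 1,600 tokens-per-second throughput is the bigger factor.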

VentureBeat
Jan 14th, 2025
Cerebras Systems Teams With Mayo Clinic On Genomic Model That Predicts Arthritis Treatment

Cerebras Systems has teamed with Mayo Clinic to create an AI genomic foundation model that predicts the best medical treatments for people with rheumatoid arthritis. It could also be useful in predicting the best treatment for people with cancer and cardiovascular disease, said Andrew Feldman, CEO of Cerebras Systems, in an interview with GamesBeat.

Mayo Clinic, in collaboration with Cerebras Systems, announced significant progress in developing artificial intelligence tools to advance patient care today at the JP Morgan Healthcare Conference in San Francisco. As part of Mayo Clinic’s commitment to transforming healthcare, the institution has led the development of a world-class genomic foundation model, designed to support physicians and patients. Like Nvidia and other semiconductor companies, Cerebras is focused on AI supercomputing

There are no jobs for Cerebras Systems right now.
