
Cerebras Systems Raises $1 Billion Series H

Daniel Nenni

Sunnyvale, Calif. — February 3, 2026 — Cerebras Systems today announced the closing of a $1 billion Series H financing at a post-money valuation of approximately $23 billion. The round was led by Tiger Global, with participation from Benchmark, Fidelity Management & Research Company, Atreides Management, Alpha Wave Global, Altimeter, AMD, Coatue, and 1789 Capital, among others.

For more information on Cerebras, visit www.cerebras.ai.

About Cerebras Systems
Cerebras Systems builds the fastest AI infrastructure in the world. We are a team of pioneering computer architects, computer scientists, AI researchers, and engineers of all types. We have come together to make AI blisteringly fast through innovation and invention because we believe that when AI is fast it will change the world. Our flagship technology, the Wafer Scale Engine 3 (WSE-3), is the world’s largest and fastest AI processor. 56 times larger than the largest GPU, the WSE uses a fraction of the power per unit compute while delivering inference and training more than 20 times faster than the competition. Leading corporations, research institutes, and governments on four continents have chosen Cerebras to run their AI workloads. Cerebras solutions are available on premises and in the cloud. For further information, visit cerebras.ai or follow us on LinkedIn, X, and/or Threads.

Media Contact

PR@zmcommunications.com

 
According to reporting connected with the company’s IPO filings and amendments, Cerebras projected full-year 2025 revenue in the range of $300–$350M, driven by accelerated demand and large multi-year orders such as the one from G42. How does that get to a $23B valuation? AI continues........
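As a quick back-of-the-envelope check on that question, here is a minimal Python sketch of the implied forward revenue multiple. The ~$23B post-money valuation and the $300–$350M projection are the figures quoted above; the rounding is mine.

```python
# Implied forward revenue multiple for the Series H round.
# Inputs are the figures quoted above; nothing else is assumed.
valuation = 23e9                           # ~$23B post-money valuation, USD
revenue_low, revenue_high = 300e6, 350e6   # projected FY2025 revenue range, USD

multiple_high = valuation / revenue_low    # multiple at the low end of the revenue range
multiple_low = valuation / revenue_high    # multiple at the high end of the revenue range

print(f"Implied multiple: {multiple_low:.0f}x to {multiple_high:.0f}x projected 2025 revenue")
# -> roughly 66x to 77x
```

So even at the top of the projected range, the round prices Cerebras at roughly 66x projected 2025 revenue.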
 
And what happened to the IPO plan? Such a large round cannot be followed by an IPO right away unless short-term revenue magically grows by orders of magnitude.
 
Far more revenue than Groq, and much farther along in data center build-out. Groq "sold" for $20B, so that deal helped price this round.

Google tells me that we should expect a WSE-4 in the future that will address the need for greater memory bandwidth and capacity for massive GenAI models.
  • 3D Stacked SRAM: Reports suggest the WSE-4 might move towards 3D stacking to increase memory bandwidth and capacity.
  • Enhanced Data Types: The WSE-4 is expected to feature dedicated support for FP8 and FP4 processing to accelerate inference for large language models.
  • Focus on Inference: Given the $10B+ OpenAI deal for inference, the WSE-4 is heavily tailored for running, rather than just training, models with trillions of parameters (see the rough sizing sketch after this list).
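To put the memory and data-type bullets in context, here is a rough, purely illustrative sizing sketch. The 2-trillion-parameter figure and the bytes-per-weight values are my assumptions, not Cerebras specifications.

```python
# Rough weight-memory footprint of a very large model at different precisions.
# Illustrative assumptions only: a hypothetical 2T-parameter model, weights only
# (no KV cache or activations), and standard byte sizes per format.
PARAMS = 2e12  # hypothetical 2-trillion-parameter model

bytes_per_weight = {"FP16": 2.0, "FP8": 1.0, "FP4": 0.5}

for fmt, nbytes in bytes_per_weight.items():
    terabytes = PARAMS * nbytes / 1e12
    print(f"{fmt}: ~{terabytes:.1f} TB of weights")

# FP16: ~4.0 TB   FP8: ~2.0 TB   FP4: ~1.0 TB
# Halving the bytes per weight halves both the capacity needed to hold the model
# and the bandwidth needed to stream weights for every generated token, which is
# why lower-precision data types and more memory bandwidth matter for inference.
```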
 