🤖 AI: Computing Infrastructure

OpenAI Partners with Cerebras in Historic $10 Billion AI Computing Infrastructure Deal

📅 January 24, 2026 ✍️ AI News Team ⏱️ 8 min read

💰 Historic Deal

OpenAI signs a multi-year deal worth $10 billion with Cerebras Systems, securing access to the world's most powerful AI chips for training GPT-5 and future models.

The Largest Compute Deal in AI History

OpenAI, the company behind ChatGPT, has announced a multi-year agreement with Cerebras Systems, a startup specializing in wafer-scale AI processors. At $10 billion, it is the largest single investment in computing infrastructure in the history of artificial intelligence.

This move marks a strategic shift for OpenAI, which has until now relied primarily on NVIDIA GPUs. The partnership gives the company access to an entirely different computing architecture.

  • Deal Value: $10B
  • Partnership Duration: 5 Years
  • WSE-3 Chips: 100+
  • Compute Increase: 10x

📖 Read more: AI Energy: Reducing Power Consumption

Who is Cerebras?

Cerebras Systems was founded in 2016 with a bold goal: to create the world's largest processor. The result was the WSE (Wafer-Scale Engine), a chip that occupies an entire silicon wafer — 56 times larger than the biggest GPUs.

🤖 OpenAI

  • Founded: 2015, San Francisco
  • CEO: Sam Altman
  • Products: ChatGPT, GPT-4, DALL-E, Sora
  • Valuation: ~$100B
  • Users: 200+ million

🔧 Cerebras Systems

  • Founded: 2016, Sunnyvale CA
  • CEO: Andrew Feldman
  • Product: WSE-3 Wafer-Scale Chip
  • Valuation: ~$8B
  • Employees: 450+

[Image: Side-by-side comparison of OpenAI and Cerebras company logos and market positions]

📖 Read more: Elon Musk Sues OpenAI for $134 Billion: Epic AI Battle

What is the WSE-3?

Cerebras' Wafer-Scale Engine 3 (WSE-3) is the largest and most powerful AI processor ever built. Instead of cutting a silicon wafer into many small chips, Cerebras uses the entire wafer as a single unified chip.

[Image: Cerebras WSE-3 wafer-scale processor, the world's largest AI chip]

🔧 WSE-3 Technical Specifications

  • Transistors: 4 trillion
  • AI Cores: 900,000
  • On-Chip Memory: 44 GB SRAM
  • Memory Bandwidth: 21 PB/s
  • Chip Size: 46,225 mm² (56x larger than an NVIDIA H100)
  • Peak Performance: 125 PetaFLOPS (FP16)
[Image: Technical specifications and performance metrics of the WSE-3 AI processor]
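For context, the article's WSE-3 figures can be set against NVIDIA's H100 in a quick back-of-the-envelope script. This is a sketch: the H100 values below are approximate public datasheet numbers (not from this article), and real-world ratios depend heavily on workload.

```python
# Rough spec comparison: Cerebras WSE-3 (figures from the article above)
# vs. NVIDIA H100 SXM (approximate public datasheet values; assumptions).
wse3 = {"die_mm2": 46_225, "fp16_pflops": 125.0, "mem_bw_pb_s": 21.0}
h100 = {"die_mm2": 814,    "fp16_pflops": 1.0,   "mem_bw_pb_s": 0.00335}
# H100: ~814 mm² die, ~1 PFLOPS dense FP16 tensor, ~3.35 TB/s HBM3.

for metric, label in [("die_mm2", "die area"),
                      ("fp16_pflops", "peak FP16 throughput"),
                      ("mem_bw_pb_s", "memory bandwidth")]:
    print(f"{label}: WSE-3 is ~{wse3[metric] / h100[metric]:.0f}x the H100")
```

The die-area ratio lands right around the 56x quoted above; the bandwidth gap is the starkest one, since the WSE-3 keeps its working memory on-chip as SRAM rather than reaching out to off-chip HBM.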

This partnership will allow us to train models that were previously impossible to imagine. Cerebras' architecture is ideal for next-generation AI.

— Sam Altman, CEO of OpenAI

Why Did OpenAI Choose Cerebras?

🎯 Strategic Reasons

⚡ Speed: WSE chips train models 10x faster than traditional GPUs.
🔋 Energy Efficiency: 5x better performance per watt compared to NVIDIA H100.
🏗️ Simplicity: Less networking complexity between chips.
💡 Diversification: Reducing dependence on NVIDIA.
🚀 Future: Capability for models with trillions of parameters.
💰 Cost: Better long-term TCO despite the high upfront cost.
[Image: Six key reasons why OpenAI selected Cerebras over competing AI hardware providers]
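The efficiency claim can be sanity-checked with a naive peak-throughput-per-watt calculation. This is a sketch under stated assumptions: the power figures (roughly 23 kW for a Cerebras CS-3 system, 700 W for an H100 SXM) are approximate public ratings rather than numbers from this article, and delivered efficiency depends heavily on utilization.

```python
# Naive performance-per-watt comparison using peak FP16 throughput and
# rated power. Power figures are approximate public ratings (assumptions,
# not from the article); real efficiency varies with workload.
cs3_pflops, cs3_watts = 125.0, 23_000.0    # CS-3 system: one WSE-3, ~23 kW
h100_pflops, h100_watts = 1.0, 700.0       # H100 SXM: ~1 PFLOPS dense FP16, 700 W TDP

cs3_tflops_per_w = cs3_pflops * 1000 / cs3_watts
h100_tflops_per_w = h100_pflops * 1000 / h100_watts
advantage = cs3_tflops_per_w / h100_tflops_per_w

print(f"CS-3:  {cs3_tflops_per_w:.2f} TFLOPS/W")
print(f"H100:  {h100_tflops_per_w:.2f} TFLOPS/W")
print(f"Naive advantage: ~{advantage:.1f}x")
```

On these peak numbers the naive ratio comes out closer to 4x than 5x; the quoted 5x presumably reflects delivered training throughput, where the H100's off-chip memory traffic costs it more than peak specs suggest.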

📖 Read more: Mira Murati Loses 2 Co-Founders - They Return to OpenAI

Implementation Timeline

Q1 2026
Agreement Signed
Official announcement of the partnership and installation of the first systems begins.
Q2-Q3 2026
Data Center Installation
Building new data centers with 50+ Cerebras CS-3 systems in the US and Europe.
Q4 2026
GPT-5 Training Begins
The first GPT-5 experiments will run on Cerebras clusters.
2027
Full Operation
100+ WSE-3 chips in full operation, training the most advanced AI models.
[Image: Project implementation roadmap showing the 2026-2027 deployment timeline for compute infrastructure]

What Does This Mean for Competition?

This deal changes the AI hardware landscape. NVIDIA, which dominates the market with an 80%+ share, now faces a new competitor. Companies like Google, Microsoft, and Amazon will likely reassess their own strategies.

Meta is already developing its own chips, while Google is investing in TPUs. OpenAI's move toward Cerebras shows that the AI infrastructure market is becoming increasingly diverse.

🔮 Conclusion

The OpenAI-Cerebras deal marks a turning point for the AI industry. With a $10 billion investment in wafer-scale computing, OpenAI is betting on the future of computation.

The big question: Will Cerebras' architecture be able to meet the demands of GPT-5? The next 18 months will provide the answer.
