Microsoft Announces Maia 200 AI Accelerator

Microsoft today announced the Maia 200 AI accelerator, successor to the Maia 100, built on TSMC's 3nm process with more than 100 billion transistors. Microsoft claims Maia 200 delivers three times the FP4 performance of Amazon's Trainium Gen3 and higher FP8 throughput than Google's TPU v7. The chip will host OpenAI's GPT-5.2, is said to improve inference cost-efficiency by about 30 percent, and begins deployment in Azure's US Central region.
Scoring Rationale
Official announcement of a high-performance 3nm in-house accelerator with cloud deployment; independent benchmarks remain limited.
Sources
- Microsoft's latest AI chip goes head-to-head with Amazon and Google (theverge.com)
- Microsoft unveils Maia 200: the new AI "monster" that already beats Google and Amazon (tugatech.com.pt)
- Microsoft debuts Maia 200 AI chip promising 3x inference performance (interestingengineering.com)
- Microsoft unveils Maia 200 AI chip, claiming performance edge over Amazon and Google (commstrader.com)
- Microsoft unveils Maia 200 chip to supercharge AI inference (dataconomy.com)
- Microsoft's Maia Chip Targets A.I. Inference as Big Tech Rethinks Training (observer.com)

