Today, we’re proud to introduce the next major milestone in our end-to-end AI infrastructure: Maia 200, a breakthrough inference accelerator engineered to dramatically shift the economics of large-scale AI.
Maia 200 is the most efficient inference system Microsoft has ever deployed, delivering 30% better performance per dollar than existing systems.
Watch the videos
Microsoft Azure Maia 200 with Scott Guthrie
Introducing Maia 200 with Jessica Hawk
Silicon to Systems: What powers Microsoft’s AI infrastructure
More resources
- Deep dive into the Maia 200 architecture
- AI chips are getting hotter. A microfluidics breakthrough goes straight to the silicon to cool up to three times better.
- Announcing Cobalt 200: Azure’s next cloud-native CPU
- Azure Maia for the era of AI: From silicon to software to systems
- With a systems approach to chips, Microsoft aims to tailor everything ‘from silicon to service’ to meet AI demand