Microsoft unveils 2 custom-designed chips to drive AI innovations
Heating up the AI race, Microsoft has unveiled two in-house, custom-designed chips and integrated systems that can be used to train large language models.
The Microsoft Azure Maia AI Accelerator is optimised for artificial intelligence (AI) tasks and generative AI, and the Microsoft Azure Cobalt CPU, an Arm-based processor, is tailored to run general purpose compute workloads on the Microsoft Cloud.
The chips will start to roll out early next year to Microsoft's data centres, initially powering the company's services such as Microsoft Copilot or Azure OpenAI Service, the company said at its Microsoft Ignite event late on Wednesday.
"Microsoft is building the infrastructure to support AI innovation, and we are reimagining every aspect of our data centres to meet the needs of our customers," said Scott Guthrie, executive vice president of Microsoft's Cloud + AI Group.
Microsoft sees the addition of homegrown chips as a way to ensure every element is tailored for Microsoft cloud and AI workloads.
The end goal is an Azure hardware system that offers maximum flexibility and can also be optimized for power, performance, sustainability or cost, said Rani Borkar, corporate vice president for Azure Hardware Systems and Infrastructure (AHSI).
"Software is our core strength, but frankly, we are a systems company. At Microsoft we are co-designing and optimising hardware and software together so that one plus one is greater than two," Borkar said.
"We have visibility into the entire stack, and silicon is just one of the ingredients," she added.
At Microsoft Ignite, the company also announced the general availability of one of those key ingredients: Azure Boost, a system that makes storage and networking faster by taking those processes off the host servers onto purpose-built hardware and software.
To complement its custom silicon efforts, Microsoft also announced it is expanding industry partnerships to provide more infrastructure options for customers.
By adding first-party silicon to a growing ecosystem of chips and hardware from industry partners, Microsoft will be able to offer customers more choice in price and performance, Borkar said.
Additionally, OpenAI has provided feedback on Azure Maia, and Microsoft's deep insights into how OpenAI's workloads run on infrastructure tailored for its large language models are helping inform future Microsoft designs.
"Since first partnering with Microsoft, we've collaborated to co-design Azure's AI infrastructure at every layer for our models and unprecedented training needs," said Sam Altman, CEO of OpenAI.
"Azure's end-to-end AI architecture, now optimized down to the silicon with Maia, paves the way for training more capable models and making those models cheaper for our customers," Altman added.