Microsoft reveals second generation of its AI chip in effort to bolster cloud business

Scott Guthrie, executive vice president of cloud and enterprise at Microsoft, speaks at the Microsoft Build developer conference in Seattle on May 7, 2018. The Build conference, marking its second consecutive year in Seattle, is expected to put emphasis on the company’s cloud technologies and the artificial intelligence features within those services.

Grant Hindsley | Bloomberg | Getty Images

Microsoft announced the next generation of its artificial intelligence chip, a potential alternative to leading processors from Nvidia and to offerings from cloud rivals Amazon and Google.

The Maia 200 comes two years after Microsoft said it had developed its first AI chip, the Maia 100, which was never made available for cloud clients to rent. Scott Guthrie, Microsoft’s executive vice president for cloud and AI, said in a blog post Monday that, for the new chip, there will be “wider customer availability in the future.”

Guthrie called the Maia 200 “the most efficient inference system Microsoft has ever deployed.” Developers, academics, AI labs and people contributing to open-source AI models can apply for a preview of a software development kit.

Microsoft said its superintelligence team, led by Mustafa Suleyman, will use the new chip. The Microsoft 365 Copilot add-on for commercial productivity software bundles and the Microsoft Foundry service, for building on top of AI models, will use it as well.

Cloud providers face surging demand from generative AI model developers such as Anthropic and OpenAI and from companies building AI agents and other products on top of the popular models. Data center operators and infrastructure providers are trying to increase their computing prowess while keeping power consumption in check.

Microsoft is outfitting its U.S. Central region of data centers with Maia 200 chips, and they’ll arrive at the U.S. West 3 region after that, with additional locations to follow.

The chips are built on Taiwan Semiconductor Manufacturing Co.'s 3-nanometer process, with four connected inside each server. They communicate over Ethernet rather than InfiniBand, the networking standard whose switches Nvidia sells following its 2020 acquisition of Mellanox.

The chip offers 30% higher performance than alternatives at the same price, Guthrie wrote. Microsoft said each Maia 200 packs more high-bandwidth memory than Amazon Web Services' third-generation Trainium AI chip or Google's seventh-generation tensor processing unit.

Microsoft can achieve high performance by networking up to 6,144 Maia 200 chips together, reducing energy usage and total cost of ownership, Guthrie wrote.

In 2023, Microsoft demonstrated that its GitHub Copilot coding assistant could run on Maia 100 processors.
