It was only a matter of time before Microsoft took a systems approach to chips, tailoring everything from silicon to services.
At its annual Ignite conference on Wednesday, Microsoft unveiled its first artificial intelligence (AI) chip and cloud-computing processor -- an attempt to take greater control of its technology and support efforts in AI.
Microsoft also announced new software that lets customers build their own AI assistants, and rebranded Bing Chat, which launched as part of the new Bing in February, as Copilot.
Bing Chat has served more than 1 billion prompts and queries since it launched, the company explained in a blog post.
Copilot with commercial data protection will become generally available, with an expansion to Microsoft 365 F3 on December 1.
The chips, according to Microsoft, are the last piece of the puzzle in delivering infrastructure systems that span silicon, software, servers, racks, and cooling, designed from top to bottom and optimized with both internal and customer workloads in mind.
"There is a complete new way to relate to information, whether it's web information or information inside the enterprise," Microsoft CEO Satya Nadella told CNBC. "And that's what we mean by
streaming with our copilot approach, and that definitely has caught the imagination. It's becoming the new UI pretty much for everything or the new agent to both not just get the knowledge but to act
on the knowledge."
The more important change, Nadella said, involves machines with reasoning capabilities, which perform neural algebra rather than relational algebra.
Chips are the "workhorses" of the cloud and of the many other devices that support a variety of services, including advertising. They contain billions of transistors that process the vast streams of ones and zeros flowing through data centers and servers.
That work ultimately allows users to do just about everything on a computer or mobile phone screen, from sending an email to generating an image in Bing with a sentence.
Microsoft is testing the Maia 100 chip with its Bing and Office AI products. It will provide Microsoft Azure cloud customers with a new way to develop and run AI programs that generate content.
Maia, along with the server chip Cobalt, will debut in some Microsoft data centers early next year.
The initiative will protect Microsoft from becoming overly dependent on any single supplier, such as Nvidia, for AI chips. Building its own processors also lets a company like Microsoft program the chips however it likes, without restrictions, even for workloads such as Microsoft Advertising.
The Maia 100 AI accelerator, named after a bright blue star, will run cloud AI workloads such as large language model training and inference. The company has been collaborating with OpenAI on the design and testing phases.
Microsoft’s multi-year investment shows how critical chips have become to gaining an edge in AI and cloud computing.
Making them in-house lets companies capture the performance and price benefits of the hardware for themselves.