This might be wild speculation but hey, who ever thought we would be in Intel's ecosystem? We've got plenty of Synapses to share around.
Microsoft CEO Nadella: 'Expect us to incorporate AI in every layer of the stack'
During Microsoft's fiscal first-quarter conference call Tuesday, CEO Satya Nadella told Wall Street that the company's deal with privately held OpenAI will pervade Microsoft's cloud computing operations. It's so big, he said, that it's going to "transform" the company's Azure business.
OpenAI, he said, represents part of the next wave in computing. "The next big platform wave, as I said, is going to be AI, and we strongly also believe a lot of the enterprise value gets created by just being able to catch these waves and then have those waves impact every part of our tech stack and also create new solutions and new opportunities," said Nadella.
To that end, Nadella said, "[We] fully expect to, sort of, incorporate AI in every layer of the stack, whether it's in productivity, whether it's in our consumer services, and so we're excited about it."
Of the partnership with OpenAI, Nadella remarked, "There's an investment part to it, and there's a commercial partnership, but, fundamentally, it's going to be something that's going to drive, I think, innovation and competitive differentiation in every one of the Microsoft solutions by leading in AI."
Right now, the exemplary application developed with OpenAI is GitHub Copilot, in which neural nets assist programmers by completing coding tasks. "GitHub Copilot is, in fact, you could say the most at-scale LLM-based product out there in the marketplace today," said Nadella, using industry jargon for neural nets that handle language, so-called large language models.
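Copilot's suggestions come from an OpenAI code model that reads the surrounding file and proposes a continuation. As a rough, hypothetical illustration of that interaction (the function below is invented for this article, not actual Copilot output), the developer types a signature and docstring and the model proposes the body:

```python
# Illustrative only: the kind of completion an LLM coding assistant might propose.
# The signature and docstring are what the developer typed; the body is the
# model's suggestion, which the developer can accept, edit, or reject.

def median(values: list[float]) -> float:
    """Return the median of a non-empty list of numbers."""
    # --- suggested completion below this line ---
    ordered = sorted(values)
    mid = len(ordered) // 2
    if len(ordered) % 2 == 1:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2.0
```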
OpenAI's GPT-3, part of the technology underpinning ChatGPT, is one of the world's largest large language models, as measured by the number of parameters, or neural "weights."
Nadella suggested the company will further embed the technology in Microsoft's wares, including something called Synapse. Synapse is Microsoft's catch-all database offering, which provides things such as a "data warehouse" and a "data lake," common ways of grouping data for analysis, and lets customers run queries against those curated databases.
"You can see us with data services beyond Azure OpenAI Service," Nadella told the analysts, "Think about what Synapse plus OpenAI APIs can do," without elaborating.
After telling analysts that customers are tightening their belts with respect to cloud spending, he pivoted and noted that Microsoft is continuing to invest in OpenAI and other AI capabilities.
In particular, Microsoft Azure has to invest in building out not just the computing facilities that develop OpenAI's models, what's known as "training," but also the vast infrastructure needed to respond to millions of queries from users of the software, known in the trade as "inference."
"We're working very, very hard to build both the training supercomputers and now,
of course, the inference infrastructure," he said. "Because once you use AI inside of your applications, it goes from just being training heavy to inference."
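The distinction is straightforward in code: training runs the model forward and backward over and over to update its weights, while inference runs it forward once per incoming request. A minimal sketch with a toy PyTorch model (an illustration only, nothing to do with Microsoft's actual systems):

```python
# Minimal sketch of the training-vs-inference split, using a toy PyTorch model.
import torch
from torch import nn

model = nn.Linear(16, 1)  # stand-in for a vastly larger network
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

# Training: repeated forward and backward passes that update the weights.
# This is the compute-heavy work done on the "training supercomputers."
for _ in range(100):
    x, y = torch.randn(32, 16), torch.randn(32, 1)
    loss = loss_fn(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Inference: a forward pass only, repeated for every user request.
# This is the part of the infrastructure that has to scale to millions of queries.
model.eval()
with torch.no_grad():
    prediction = model(torch.randn(1, 16))
```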
That change in design, said Nadella, will further drive customers to Azure. "I don't think any application start that happens next is going to look like the application starts of 2019 or 2020," he said.
"They're all going to have considerations around how is my AI inference, performance, cost, model is going to look like, and that's where we are well-positioned again."
As a result of the pervasive creep of AI, said Nadella, "I think, core Azure itself is being transformed, the infrastructure business is being transformed."