From theCUBE Studios in Palo Alto and Boston, bringing you data-driven insights from theCUBE and ETR. This is Breaking Analysis with Dave Vellante.

Microsoft Ignite 2023 was one part celebration of a year of technological innovation, one part general availability of previously announced products, one part vision, two parts ecosystem, and a massive shot of copilots everywhere. Perhaps for the first time in industry history, we're seeing huge demand for software coincide with the ability to make it easier to actually write software. Just as AWS turned the data center into an API, copilots are turning software development into natural language. The implications for productivity are much greater than what we saw with the cloud and could surpass the impact of a PC on every desk.

Hello and welcome to this week's theCUBE Insights powered by ETR. In this Breaking Analysis, we give you our first impressions of Microsoft Ignite 2023. theCUBE Research analyst George Gilbert and CUBE Collective contributor Sarbjit Johal both weighed in for this episode, and we'll also share some recent ETR data that shows the progression of some of the major AI players over the past 12 months.

So let's start with the keynote. As usual, Satya was a strong presenter. His main focus was on the AI copilot stack that is going to supercharge the next wave of innovation. There were several "we're announcing the general availability of..." types of announcements: Azure Boost, which is Microsoft's server offload engine; Fabric, which is Microsoft's modern data platform; Copilot for 365 and Copilot Studio; et cetera. In addition, there were a hundred or more new updates. Jensen was on stage doing his thing, talking about the NVIDIA supercomputer clusters that NVIDIA has jointly built with Microsoft. Now, this is not Azure infrastructure. It's not Azure Boost. This is NVIDIA systems infrastructure with a thin layer of Azure software to help OpenAI train and run its models.
Now, Microsoft is winning in training and inference of LLMs. It might not be so much foresight; they were plugging internal holes and had to outsource the infrastructure to NVIDIA, which has put Microsoft in a really good position. While Jensen was on stage with Satya, Sam Altman, CEO of OpenAI, was not. We found that interesting.

So Azure now has its version of a Nitro, a Graviton, an Inferentia and a Trainium. They might not like me saying that, but these are basically ARM-based systems and ARM-based processors, which we knew were coming, but they're finally here. Satya spent most of his time double-clicking into the AI stack, which looks like this. I don't have a ton of time today, so I can't dig in too deep, but let me say a few things here.

Satya gave this great commercial for Azure. He called it the world's computer and talked about all their data centers around the world and how they have more than anybody. And that's cool. We obviously didn't get into the architecture and the infrastructure, and AWS will give you a masterclass on how they're different, but we'll leave that for another day. He reiterated their commitment to have 100% of energy usage be renewable by 2025. He talked about the network and the fact that they're manufacturing hollow core fibers; talked about their servers and their chips, which we're going to get to in a moment, along with the alternative chips from AMD and NVIDIA; and went all the way up the stack into the data layer. And he spent a lot of time on the copilots.

Now, one of the most interesting things we saw is the Microsoft Graph, which connects all apps, services, and the infrastructure that supports them. Think of it as essentially a semantic layer that makes all these elements, and all the data feeding them, coherent.
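To make that semantic-layer idea a little more concrete, here's a minimal sketch of what programmatic access to the Microsoft Graph looks like. The `/v1.0/me/messages` route and the `$select`/`$top` query parameters are real Graph REST conventions, but the helper function, the placeholder token, and the field choices are our own illustration; a real app would acquire the token via an auth library such as MSAL.

```python
# Illustrative sketch of a Microsoft Graph REST call, the kind of context
# fetch a copilot-style app might make. Token acquisition is omitted; the
# helper name and placeholder token are hypothetical.
from urllib.parse import urlencode

GRAPH_BASE = "https://graph.microsoft.com/v1.0"

def build_graph_request(resource, select=None, top=None):
    """Build the URL and headers for a Graph v1.0 call."""
    params = {}
    if select:
        params["$select"] = ",".join(select)  # limit returned fields
    if top:
        params["$top"] = str(top)             # limit result count
    url = f"{GRAPH_BASE}/{resource}"
    if params:
        url += "?" + urlencode(params)
    # In practice the bearer token comes from an OAuth flow (e.g. MSAL)
    headers = {"Authorization": "Bearer <ACCESS_TOKEN>"}
    return url, headers

# e.g. the five most recent messages, just subject and sender
url, headers = build_graph_request("me/messages", select=["subject", "from"], top=5)
# requests.get(url, headers=headers) would return JSON the copilot can reason over
```

The point of the sketch is that every app surface, mail, files, calendar, chats, hangs off one coherent API, which is what lets a copilot ground its actions in organizational data.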
The reason this is important is that all these copilots work on the Graph, so it allows them to actually take action. In theory, anyway, they know what to do with confidence because the data is all coherent and trusted. Satya said, quote, "The way to think about this is copilots will be the new UI that helps us gain access to the world's knowledge and your organization's knowledge. But most importantly, it's your agent that helps you act on that knowledge." What Satya is describing here is enabled by this knowledge graph. He didn't call that out as the key connection, but those are the dots we're connecting. The services underneath in Azure support the upper layers of this stack and drive consumption of compute, storage, networking, database and all those platform services that power not only all of the productivity software but also these copilots that consume those resources. So this is a massive flywheel for the consumption of Azure services, and it points to the next decade of momentum that Microsoft is looking at.

Let's briefly talk about the chips. There were two new chips announced. Maia, the AI chip for inference and training, is manufactured on a five-nanometer process, pretty sure it's TSMC, has 105 billion transistors, and is being packaged in a data center config with closed-loop cooling that can be retrofitted into existing data center infrastructure. Now look, that's not unique to Microsoft, by the way; I've seen other vendors take a similar approach. But it's cool, no pun intended, I guess pun intended. The Azure Cobalt CPU is on the right here. It's a 128-core chip built on an ARM Neoverse platform, and it's designed for general cloud services on Azure. So as I said earlier, we have the Azure version of Nitro, the virtualization, storage and networking offload. We have AI chips similar to Amazon's Inferentia and Trainium, and we get a Graviton in the form of Cobalt.
The big question is how much of a lead does AWS have? Think about it: AWS announced Graviton in 2018 and was working with Annapurna long before that. It announced its AI chips in 2019 and 2021, respectively. It's all ARM-based, so time to tape-out is going to be compressed, and if Microsoft can line up foundry capacity, which it sounds like it has, perhaps it can narrow that gap to AWS. I don't know for sure, but that's a topic for another day.

So really quickly, some of the high-level themes we saw at Ignite: copilots for professional developers with GitHub Copilot; Copilot Studio for end users and platforms for citizen developers; Copilot for 365; a search copilot, sort of replacing Bing search. They changed the name; I don't know if that's going to help get any more market share, but might as well try, because nobody's using Bing. There are Azure ops copilots, including a security copilot, which is new. Copilots everywhere, with the promise of even vertical market copilots. There was lots of emphasis at Ignite on ecosystems as well, from infrastructure partners to ISVs. We talked about the resource graph, which is really powerful, and the takeaway is that the dynamics of the LLM market and the differentiation Microsoft is driving are notable: the integration of services and apps via that semantic graph, and the juxtaposition of their messaging relative to AWS's very diverse, choice-oriented ethos.

Look, we know this: AWS gets the last word at re:Invent the week after Thanksgiving, and they have to combat the narrative that they're the old guard cloud. Microsoft is depositioning them, and my guess is we're going to see a really good showing from Amazon as usual, but the pressure is on and the clock is ticking. To underscore that point, I want to share some ETR data. This is an XY graphic we've shown in the past.
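As context for the chart, ETR's Net Score, as publicly described, nets out spending intentions: the percentage of accounts adopting a platform or increasing spend, minus the percentage decreasing spend or replacing it. Here's a rough sketch of that arithmetic; the category labels and sample figures are our own illustration, not ETR data.

```python
# Hedged sketch of ETR's Net Score methodology as publicly described:
# Net Score = (% adopting + % increasing) - (% decreasing + % replacing).
# Category names and the sample below are illustrative only.
from collections import Counter

def net_score(responses):
    """responses: list of 'adopt', 'increase', 'flat', 'decrease', or 'replace'."""
    n = len(responses)
    c = Counter(responses)
    pct = lambda k: 100.0 * c[k] / n
    return (pct("adopt") + pct("increase")) - (pct("decrease") + pct("replace"))

# Illustrative sample of 100 responses:
# 20 adopting, 50 increasing, 20 flat, 7 decreasing, 3 replacing
sample = (["adopt"] * 20 + ["increase"] * 50 + ["flat"] * 20
          + ["decrease"] * 7 + ["replace"] * 3)
print(net_score(sample))  # (20 + 50) - (7 + 3) = 60.0
```

Flat spend doesn't count either way, which is why the metric captures momentum rather than absolute spend levels.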
It shows machine learning and AI spending patterns among some of the leading platforms. The vertical axis is Net Score, or spending momentum, and the horizontal axis is an indicator of presence in the data set, essentially determined by the N in the data set. It's a survey of roughly 1,700 respondents.

In the upper right you can see OpenAI. ETR started tracking OpenAI in July of 2023, and you can see where it is today: off the charts, literally behind the October '23 watermark, if you will, and way, way ahead. This is in the stratosphere in terms of spending momentum. You can see Microsoft just below OpenAI and actually more ubiquitous. Look where Microsoft was in October 2022. Compare that with AWS, which made moves, but not as significant as the move Microsoft made, and same with Google. Look where Google was in October 2022 and the move it made. So the moves that Google, Microsoft and of course OpenAI made were more dramatic, and they moved further to the right, which is a proxy for greater market presence. We also plot IBM Watson and Oracle. Remember, last week we published on IBM Watson; they made a big move up after being below Oracle last quarter. We're going to be watching that in the January survey.

So the point is that this is really a race, and that race is on. A lot of folks say, hey, this is a marathon, implying we've got plenty of time, it's early innings. While that's true, getting a head start in this race is going to confer competitive advantage, and it certainly already has from the standpoint of mind share and initial revenue, but the market is still small. So again, we're watching that very, very closely.

Okay, let me make some quick wrap-up comments and then we'll finish up. So, this whole idea of copilots everywhere, everyone becomes a developer.
If that's the new interface to technology, it's going to give us a massive productivity boost. It's starting with laptop and desktop user interaction; it's rapidly moving to developers so they can develop software faster; and it's going to extend into many, many different use cases, vertical markets, domain-specific LLMs and the like, as we've talked about. Again, it's a flywheel of productivity. Erik Brynjolfsson said at a recent conference that he'd be disappointed if productivity doesn't grow to three to four percent annually, up from its current tepid 1.2%.

The second point is that accelerated demand for software is meeting the ease of building software. This is the first time we've ever seen that in the industry, and it's going to create an interesting dynamic. John Furrier brought it up on theCUBE Pod today: will that have the unintended consequence of disrupting Microsoft? In other words, could AWS customers, partners, or even Microsoft users leverage LLMs to develop better software than Microsoft and compete with it? It's an out-of-the-money long shot, but I thought it was an interesting thought.

The third point is the Microsoft Graph: all apps, services and the supporting infrastructure becoming connected and coherent in a data layer. That's another massive flywheel for Azure. And the key point there is that not only is it your assistant, it also allows the AI to take action, that system-of-agency notion.

So look, end user productivity is king, and as we've said, 2023 was the year of technology innovation and a lot of technology hype. 2024 has to be the year of showing ROI and productivity, and the companies that show it are going to distance themselves from their competitors.

All right, we're going to leave it there for now. Many thanks to George and Sarbjit for their contributions this week. Alex Myerson is on production and manages the podcast, Ken Schiffman as well.
Kristen Martin and Cheryl Knight help get the word out on social media and in our newsletters, and Rob Hof is our EIC over at SiliconANGLE.com. Remember, all these episodes are available as podcasts; wherever you listen, just search "Breaking Analysis podcast." I publish each week on wikibon.com, which is being rebranded to cuberesearch.com, you'll see that shortly, and on siliconangle.com. You can email me at david.vellante@siliconangle.com, DM me at dvellante, or comment on our LinkedIn posts. If you're still watching, check out thecubeai.com. It's out of public beta, so anybody can check it out now. And don't forget to check out etr.ai for the best survey data in the enterprise tech business. We've got some really exciting news coming up on that front: we're deepening our partnership with ETR and making their services available together with our research services. Super psyched about that; more to come.

This is Dave Vellante for theCUBE Insights powered by ETR. Thanks for watching, everybody. We'll see you next time on Breaking Analysis.