Welcome back everyone to our AWS coverage here on location in Las Vegas. This is theCUBE's 11th year covering re:Invent. It's always our favorite event. It's our largest event ever here at re:Invent for theCUBE. I'm John Furrier, your host with Dave Vellante. And the tradition is, at the end of every show Andy Jassy would come by, but since Andy's not the CEO anymore, and Adam's busy, Swami, thanks for coming by and holding the line, representing AWS, to come on and join theCUBE for our close and wrap-up. My pleasure, really glad to be here. And we're huge fans. You're a real big supporter of theCUBE. You come on when you have news. The Hugging Face news was big in February, and other showcases we've done. And the work you're doing is great. The keynote was amazing. So you gave a keynote on day two. A lot of people were blown away. Like, okay, wait a minute. Generative AI strategy? There it is. You laid it all out. So congratulations and thanks for coming on theCUBE. I know, thank you. We are actually super excited about all the things we launched as well, because you see, AWS's strategy with generative AI is really resonating with customers in a big way. And we are already seeing customers in every industry, from healthcare with Pfizer, to FinTech with Intuit, to travel with Booking.com, to so many industries; they are already innovating, and not just at the model level, but at the infrastructure level up to the application level. So I'm very, very excited to see it. I have to ask, because it's something we've been talking a lot about. In February when you were on theCUBE, when you announced your expanded Hugging Face relationship, I asked you, because we've been following you guys. We know you've been doing machine learning and AI for a very long time. So we were out there holding the line for you. But given the generative AI hype that happened, you guys had to kind of go, okay, this is game time.
So game on, a little bit of a challenge for the industry. A lot's been happening. You guys have infused generative AI into almost everything, if not everything. Can you scope out the size of the infusion? I mean, your keynote was pretty much all existing services, now with gen AI. It wasn't a pivot, but you kind of had to retool. How much of that was going on? Can you give us a sense of the scope, the magnitude of what happened? Because last year we talked about LLMs a little bit, but it wasn't on the stage, it wasn't in the announcements. And this year you've got a stack, every product's got it, you've got Q, I mean, everything's been, I wouldn't say, transformed. Sure, I mean, first of all, as you rightly pointed out, we have been doing ML for like 25 years, and we have been actually using LLMs internally for quite a while. If you look at Amazon.com, we rolled out an LLM-powered personalization and search experience even before generative AI was cool, and it did real wonders for improving our customer search relevance. And in 2020 we rolled out QuickSight Q to do natural language-based querying of dashboards. But as we were exploring these models internally and they were getting more and more powerful, we started investing in all layers of the stack. I mean, more than six years ago we launched custom silicon. So this stack is not something we just suddenly invented this year. This has been in the making, but the key thing, and this is what sets AWS apart, is that we first launch stuff when it is ready, and we also don't just give an answer to only one customer persona, which is like, hey, I have a model and an API, we are done. We are actually innovating in every layer of the stack: at the infrastructure layer, we are investing with our strategic collaboration with NVIDIA, which we expanded, with our GPU-based instances, to custom silicon with Trainium and Inferentia.
Let alone all the innovations we are doing in the software stack with SageMaker, with things like HyperPod, which decreases the training time by up to 40%. So this has really been a game changer for the likes of Perplexity AI, Stability AI, or AI21, to even Autodesk and many others. So many things. Just going back six months, I want to ask you: the stability and some of the image stuff that's going on, six to eight months ago that kind of computation would have been mind-boggling. What's the step-function increase that's been happening? Has there been a massive improvement on, say, image generation? I know. So one of the things is, even two years ago, when people talked about what LLMs are, that typically meant a training run on the order of maybe one or two weeks. Now, a typical training run goes probably on the order of a few months. And that is another reason why, as the scale of these training jobs increases, the complexity also increases, which is where I talked yesterday about why you need to do constant checkpointing, why you need to do failure detection, why you need to do automatic distribution, which led to things like HyperPod as an example with SageMaker. And this is, again, an example of focusing on just that persona and what you need to innovate. And now the same thing happens in Bedrock as well. I mean, you all have built your own gen AI app with theCUBE AI. So you know, building a gen AI app is a lot more than just taking an LLM and plugging it in. You still need to worry about contextual relevance. You need to worry about guardrails. You need to pick the right tool for the right job. So that's where all the features we launched, like Bedrock model evaluation, guardrails, and embeddings and vector databases, really play a big role in that. And embeddings, that's a great launch. We're getting that in there. And having the right data platform as well is critical to this.
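The "contextual relevance" piece Swami describes, grounding a model's answers in your own documents via embeddings and a vector store, can be sketched minimally. This is a toy stand-in for illustration only: the `embed` function below is a made-up bag-of-words vector so the sketch is self-contained, whereas a real application would call an embedding model (for example, through Bedrock) and a managed vector database; the document strings are invented.

```python
import math

# Toy "embedding": a bag-of-words count vector. A real app would call an
# embedding model; this stand-in keeps the retrieval logic self-contained.
def embed(text: str) -> dict:
    vec = {}
    for word in text.lower().split():
        vec[word] = vec.get(word, 0) + 1
    return vec

def cosine(a: dict, b: dict) -> float:
    # Cosine similarity between two sparse vectors.
    dot = sum(a[w] * b.get(w, 0) for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, documents: list, k: int = 1) -> list:
    """Rank stored documents by similarity to the query embedding."""
    q = embed(query)
    return sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

# Hypothetical document store.
docs = [
    "Reset your password from the account settings page.",
    "Training runs are checkpointed every hour.",
    "Invoices are emailed at the end of each month.",
]
print(retrieve("how do I reset my password", docs))  # password-reset doc ranks first
```

The retrieved snippets would then be stuffed into the model's prompt, which is what makes a small, cheaper model viable for narrow internal Q&A.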
Can you explain, if you had your own LLM, why it's so important to have all this optionality? Sure. When you think about this space, we are strong believers that no one model will rule the world, because different models are absolutely going to be right for different use cases, and even at different points in time, because different models have different performance, latency, accuracy, and cost characteristics, and every application has its relevant ROI. For instance, for an internal IT support chat application where you are answering questions over maybe 200 documents, you don't need the most capable model, where the inference cost is something like $10 for 10 requests. You probably need a highly customizable small model. Whereas when you're trying to do complex reasoning and automation, you probably need a model that is more capable on those fronts. So that is why Bedrock was the first to come out with the notion that, hey, you do need model choice, and it is paramount. And we will help you deal with this complexity, and we will provide access to the leading FMs. And if you look at the history, when we built RDS, and I was the engineer who built it, it had the same roots. At that time, every enterprise vendor used to say, oh, the answer to all your database needs is X; it doesn't matter what your question is. And we said, no. Actually with RDS we had multiple engines, and then we still went on to build our own. It's very similar here. I was talking with Tom Soterson, he came on earlier, and he says the people who are going to get a lot of love in this next wave with generative AI are the data engineer, the chief data officer, and the network engineer.
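The model-choice argument above, match the model to each task's capability, latency, and cost profile, amounts to a simple routing decision. Here is a minimal sketch of such a router; the model names, capability scores, prices, and latencies below are entirely made up for illustration, and a real system would use measured benchmarks per foundation model.

```python
# Hypothetical model catalog -- every number and name here is invented.
MODELS = [
    {"name": "small-fast",    "capability": 2, "cost_per_1k_tokens": 0.0004, "p50_latency_ms": 120},
    {"name": "mid-balanced",  "capability": 5, "cost_per_1k_tokens": 0.0030, "p50_latency_ms": 400},
    {"name": "large-capable", "capability": 9, "cost_per_1k_tokens": 0.0300, "p50_latency_ms": 1500},
]

def pick_model(min_capability: int, max_latency_ms: float = float("inf")) -> dict:
    """Return the cheapest model meeting the task's capability and latency needs."""
    eligible = [m for m in MODELS
                if m["capability"] >= min_capability
                and m["p50_latency_ms"] <= max_latency_ms]
    if not eligible:
        raise ValueError("no model satisfies the constraints")
    return min(eligible, key=lambda m: m["cost_per_1k_tokens"])

# An internal IT chatbot over ~200 documents needs little reasoning power,
# so the cheapest model wins...
print(pick_model(min_capability=2)["name"])   # small-fast
# ...while complex reasoning and automation justify a more capable model.
print(pick_model(min_capability=8)["name"])   # large-capable
```

The design point matches the RDS analogy in the interview: rather than one engine for every question, you keep several options behind one interface and select per workload.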
He says, because his point was, now that the developers will have a feeding frenzy on the foundation models at that developer layer, they're going to want to have kind of that data as code, AI as code, abstracted away from them, and all the pipelining and the data world will be changed. And we've been saying on theCUBE that data engineering has to get, not reset, but rethought through to feed AI. So building governance in from day one. You had a bunch of announcements, we're talking about zero-ETL for instance, other things. So if this continues, that means there's got to be an engineering exercise under the hood. How do you see that? Are we overthinking this? Or do you see radical change around governance and how people should be thinking about their data strategy to ride this next 10-, 20-year wave? Yeah, I mean, if anything, generative AI makes the data strategy even more important, because at the end of the day, without being able to customize with your data, generative AI applications are not really going to be useful for your customers and your business. But the key aspect is, you still have to solve the traditional problems of having a strong data foundation. You still need to break down your data silos and have your data in high quality. You still need to make sure you are storing it in a vector database to be able to index it. You still need to ensure that you're picking the right tools for the job. But essentially now, and that's why yesterday I was talking about the symbiosis, how can gen AI help make data engineers even more productive in improving data quality, in doing zero-ETL, and also in querying and... Do you see that as a flywheel? I mean, you guys love flywheels at Amazon, so the flywheel of gen AI interacting with the data, that's a new flywheel, and human intellect, you mentioned the human. We have data in our heads. Actually, you phrased it really well.
That is another way to talk about the symbiosis, and it's a very Amazon way of talking about it. I feel like we're going Amazonian after all these years. I do agree, because I actually think data and gen AI and humans all facilitate each other to create these remarkable experiences, because data fuels gen AI, and cloud and data really led to gen AI being what it is, and gen AI is also going to make data management a lot easier, and humans, we are the facilitators. On that point in your keynote about the human and AI relationship, you and I have talked about this a lot: the greatest chess player in the world, it's not a machine, it's not a human, it's a machine and a human. Human plus AI is greater than AI alone; you saw that in chess as well. We'll get argued with, by the way, by the chess people; they love to talk and chat. But Dave and I were talking, and when I asked this question, we noticed that the security team at Amazon is now one team, and we think the security wave and now this data engineering wave are going to have the same trajectory, somewhat similar in the sense of creating a developer experience where the developers can just code inline in the IDE. We heard that with CodeWhisperer, we saw that with Q; of course, on the business side, that's more of the answer to Copilot. But if you believe that the data has to be highly available, horizontally scalable, but vertically specialized, which we've been saying for five years, then you kind of rethink the unification of the data. And because you now have complex workloads, as you mentioned, and a complex topology, core cloud, cloud on premises, cloud at the edge, cloud in space, you've got to have, we see HealthLake, you've got data lakes forming, centralized things. So is the organization going to be changing around how you organize your teams and your data? Because it almost seems like a reset has to happen.
I mean, that is something we are starting to see even among companies. Titles like CDO didn't exist even four years ago, and now CDO is one of the most important jobs in every organization, where they are in charge of breaking down data silos in their organization and figuring out how you get a unified data platform story, because they know that it's absolutely critical to drive ML strategy and create net new experiences, not only to optimize cost structure, but to create net new customer experiences. And I think that is now more important than ever. Will these verticals have data lakes? I mean, you can imagine HealthLake is an interesting answer. I like that one, because that opens up things. Security Lake. Security Lake, you know. Is there an analog to, like, the- Telecom Lake? Well, the Amazon data platforms, I mean, it is a bigger challenge, right? Does DataZone become that sort of- I mean, DataZone already actually helps facilitate a lot of this governance, especially because one of the challenges internally within an enterprise is to first ensure you are able to catalog what data sets you have, let alone govern them, and then also share accordingly and keep the flywheel going. And with this we are converging not just the analytics, but also the database and the machine learning aspects, so that it becomes an end-to-end data platform. So the show's ending with the big replay party tonight. As you look back and zoom out, chilling out here on theCUBE, looking at your keynote, what do you think? What's your take on how it went? How do you feel? And what was the coolest, most exciting thing that came out of all this? I mean, first of all, re:Invent is one of my favorite times of the year, because, looking at it purely selfishly, from my team's perspective, the teams work incredibly hard throughout the year, and then they get to showcase everything they do.
And it's always exciting to not only launch these things, but get customer reactions, and we also get to see how customers are innovating with it. And when I hear some of the stories, like what you heard from Hurone AI, how they are democratizing cancer care in sub-Saharan Africa, that makes the whole thing worth it, right? Because we all work incredibly hard, but when you have that kind of impact, that's what I get out of it. Even though, during re:Invent, I probably sleep only four hours a day, I still get that. But I mean, you're a data guy. You're DynamoDB, that's your heritage from when you were an intern and started at Amazon. But for the data world, this is prime time. If you're a data person, I mean, it's a whole other level, the next level. Yeah, I mean, that is what makes it exciting, that now data is more important than ever, AI is more important than ever, and builders are more important than ever. That's why it's such a beautiful symbiosis of a flywheel, like you put it as well. So will there be a day when we just say, hey, rewrite my schema, and it just does it? Actually, that day may not be very far away, honestly. I mean, the demo was great, 1,000 apps in two hours, and it got me into Linux. I can imagine self-building pipelines, automatic intelligence and reasoning around infrastructure provisioning, I mean. I actually think those are the kinds of things, almost like when the world moved from assembly to C to C++ to Java. Now, the undifferentiated heavy lifting constantly keeps improving. We are in yet another technological era jump in removing the undifferentiated heavy lifting, so we can augment intelligence for these developers. Well, we appreciate you. Thanks for coming on theCUBE. I know you're super busy, taking time out. Record crowds, pretty much back to the steady state of re:Invent. I know. 60-plus thousand people, you've got the keep right, go left, trying to cross over. I know, man.
It is such an exciting thing to see so many people, and it's always good to see you both as well, so thanks for having me. Thanks, Swami, thanks for coming on. Okay, we'll be back with more live coverage here on theCUBE at re:Invent. Now back to the Palo Alto studios, where they can take it from here. We'll be right back after this short break.