Hello and welcome back to theCUBE's live coverage here at SAS Innovate 2024. I'm John Furrier, host of theCUBE with my co-host and co-founder Dave Vellante, extracting the signal from the noise. Got a great return guest, Jay Upchurch, CIO at SAS, who was on yesterday. He's got a guest with him, Sri Raghavan, who's the principal for technology partnerships, data and analytics at Amazon Web Services, AWS. Sri, great to see you. It's great to be here. Jay, great to come back, good to see you. Thank you. So this is a fun one, a CIO and AWS. I mean, this is going to be, I think, a nice cloud conversation. Lots going on in the managed service. So welcome back. And a lot of work going on with AWS and the partnership. Let's talk about the partnership update with AWS. So, fantastic partnership. In fact, an award-winning partnership, I would add, as of yesterday. Thank you very much for that. We're thrilled. Yeah, tell us a little bit about it real quick. Yeah, so we were called up and given an award, the day before yesterday actually, as the cloud and technology partner of the year with SAS, and we couldn't be more grateful, obviously, to get an award like this. It takes a village, and there's a tremendous team on the AWS side as well as on the SAS side who've been working hand in glove for the past, I think, four years or so. And we're here at the culmination of that with an award and with all kinds of signed agreements, contractual agreements, in place for us to invest in the SAS business. So long story short, if I haven't repeated myself many, many times, I'm going to repeat myself once again: we're thrilled to be in this partnership. This is a great partner to be in business with, and we're extremely thankful and grateful for that. Thank you. Jay, we actually had Brian on and talked about that. You do the hard work, get the right product mix, and the results will be on the scoreboard. Those are my words, he didn't say that directly. We said revenue will be a result of that. 
The SAS managed offerings are doing well, okay? And so the proof is in the pudding; something is going on there. Give us the update on what the key to success is. Key to success. Well, let's talk about the results first. So yes, absolutely, SAS managed services as a portfolio continues to excel. We're incredibly proud of it. It's probably the fastest growing part of the SAS umbrella. Last year we enjoyed 30% growth in the managed services. Today's announcement with AWS expands our hosted managed services offering within that portfolio through AWS. And we are really excited about it because we're meeting our customers where they want to be. There have been strong demand signals for the last few years from our customers that said, hey, not only will we take Viya, because it's cloud portable and cloud agnostic, and run it where we want, they enjoyed having it there, but they wanted SAS's expertise to install, configure, run, and operate it for them. And so we're meeting them where they are and empowering that choice that we've given them. So what is the strategy for moving customers to the cloud? I mean, as a CIO, you could lift and shift. You could lift and modernize. What would you do if you were one of your customers? And what does that mean for Viya? Well, in my experience, it really depends; every customer is different. There are customers who are running from a problem, an end-of-life, end-of-support hardware or software issue, maybe a compliance issue, and time may require you to lift and shift. And there may be others where we have the luxury of time and we can modernize and migrate at the same time. In our case, we have a lot of customers with massive SAS 9 installations. They're hungry for the Viya innovation that you've heard about at the show. They want to go. Some can make that leap by modernizing and migrating in one step. Others need to pace it. So we try to meet our customers, again, where they are in their cloud journey. 
The good news is, we're a couple of years into this, right? We've learned a lot as we've gone, and so our ability to get our customers there safely, so we de-risk the move, has been really key to our success and, frankly, our customers' success. And the drivers for that growth are just the need for more analytics in the cloud. Is there an AI driver here? What's the tailwind? Absolutely, there's an AI need, there's an analytics need. Jay said it right, we meet the customers where they are, which is why the two companies get along so very well, because it's not just about us or the technology or infrastructure, it's about what the customer's needs are. And there are two parts to it, right? One is being cloud native and one is being cloud ready. Cloud ready is relatively easier because you can lift and shift and make sure they take advantage of the cloud. Cloud native is a little bit more difficult in the sense that a lot more engineering effort needs to happen. But behind all of it, particularly with SAS, with the customer base that SAS has across healthcare, financial services, manufacturing, you name it, the use cases are as diverse as the analytic needs. And to serve that, many different kinds of analytics come into play, what we typically call multi-genre analytics. SAS provides a phenomenal base of those kinds of analytics, which are complemented by analytics available from us, like SageMaker models and whatnot, as well as, let's not forget in this day and age, the generative AI piece. Take customer 360, for instance: we're providing support with generative AI to deliver content, which SAS then takes and passes on to customers, who are now able to have a very, very near real-time experience as far as their journey with their retailers or their healthcare providers or their banks is concerned. 
This is a classic example of two technologies coming into play. So yes, analytics is a part of it, infrastructure is a part of it. SAS's ability to manage very complex installations with security, with the ability to serve up data in real time, all those things become very crucial. And diverse, you mentioned diverse, but also expanding, because with Viya Workbench, you're going to see a growth of users and developers, both on the democratization side as well as on the developer side. So I think that's a great call-out there. The thing that interested me yesterday on stage was that the customer had Amazon on the chart. I took a screenshot of it. It says Amazon Web Services at the bottom. Okay, and then the rest of that AI stack was SAS and the customer. No flashy, shiny new toy in there from anything else. Pure cloud and a stack built by the customer. That's production workload. So again, this is the year of, let's see who can bring the production workloads to the table. Yeah, so in that case, that customer was Georgia-Pacific, one of our wonderful customers, shared with AWS. That technology stack has evolved between all three parties, right? We've continued to evolve and push. They have over 10,000 models in production today, saving them millions of dollars in manufacturing. It's a fantastic story. And in that piece of the stack at the bottom, SageMaker got the call-out, which was interesting. I was with Matt Wood a couple of weeks ago in New York City. I don't want to give the exact number because I'm not sure I remember it correctly, but a very large proportion of the workloads and the problems can be solved with SageMaker was the statement he made, and it was more than 50%. And so Gen AI is maybe the orchestra, and in this case Viya is actually the orchestrator, bringing some tooling that is really more use-case specific for your customers. That's right. 
That's Viya as well as the integration with our ESP product, so that event stream processing piece has been incredibly important to our manufacturing vertical. And so those two things together kind of initiated so much of where Georgia-Pacific was going, and it came in with the AWS foundational piece. And I think we've enjoyed tremendous success together, and a great customer. That's why I like that, because one of the things I love about Gen AI is, I mean, I like hype, by the way, as long as reality comes in behind it. And it is; the reality is matching the hype on a very compressed timescale. So I think cloud helps that too, and having the data will do that. But I like that. There's one point that Jay made, I didn't mean to interrupt, really quickly: 10,000 models. It's a lot of models, right? It's a massive number. Doing model management is not easy. Anybody can create these models and bring them in and put them in production. But over time, you have to look at things like champion and challenger models, you need to look at how model decay happens. I cannot think of a better partnership where you bring two technologies together to manage this massive number of machine learning models, which are in place for you to address things like demand, or customer journey, fraud detection, what have you. I mean, that's something stupendous we're looking at. I'm glad you mentioned security. There was a tweet by Adam today. Did you see it? No, I did not. He's basically pounding AWS's chest, which is the right thing to do about security. It's always been that way. I'll be at re:Inforce in June. This year it's in Philly. And it is something. You guys, AWS, have never been a big security monetizer. It's just there. You've got some products that you can buy, but it's always been about it being built in. And so I was glad to see that tweet. Glad you brought that up, Sri. It is a big part of what we do; we have to. 
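Since Sri calls out champion/challenger testing and model decay by name, here is a minimal sketch of what that bookkeeping can look like at small scale. This is a toy illustration only, not SAS Model Manager or any AWS API; every function name and threshold below is a hypothetical choice.

```python
# Toy champion/challenger promotion and decay check, the kind of
# bookkeeping needed once you have thousands of models in production.
# Illustrative only; names and thresholds are hypothetical.

def accuracy(model, rows, labels):
    """Fraction of holdout rows the model classifies correctly."""
    return sum(model(r) == y for r, y in zip(rows, labels)) / len(labels)

def pick_champion(champion, challenger, rows, labels, margin=0.01):
    """Promote the challenger only if it beats the incumbent champion
    by at least `margin` on the same holdout set."""
    if accuracy(challenger, rows, labels) > accuracy(champion, rows, labels) + margin:
        return challenger
    return champion

def has_decayed(baseline_acc, live_acc, tolerance=0.05):
    """Flag a deployed model whose live accuracy has drifted more than
    `tolerance` below the accuracy it had at deployment time."""
    return live_acc < baseline_acc - tolerance
```

In practice the holdout data itself drifts, so production systems re-score both models on fresh labeled data on a schedule rather than once, which is exactly the operational burden Sri is describing at 10,000-model scale.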
Simply because, you know, we have many regulated industries. Security has never been an afterthought, and now we're making sure people understand that. Swami also tweeted about Llama 3 as well. I saw that tweet. That brings up the whole diversity piece of it. So model diversity is a reality. We published on this last year; I think we were the first ones to actually publish it and call it out: the power law of models. You're going to have large models and you're going to have specialized models come in. Turns out that's where the action is. And you guys are actually doing that with the lightweight models as well, because if you have a small model, it'll be highly valuable. Everyone knows that now. Okay, now you have model interaction. So what's going on in the Georgia-Pacific use case, coming back to that, is that they built their own AI stack based on what they needed, the latency requirements. Okay, a huge end-to-end requirement. That's net new; that was generated because of generative AI. So generative AI is a whole other category of new things that can be different. It's okay to be different from one workload to the other. Why? Because the workloads are different. So there's no general-purpose AI, in my opinion. That's my view. So that comes back now to cloud. Cloud operations is now standard. Cloud, on-premise, edge, they've all got to work together. So when we get back to the end-to-end workloads, and I've got to manage that data, how do you guys see your customers implementing the new stack, the new generative AI models, on existing workloads, on net-new workloads that were built from scratch because of generative AI, and on the full, I'll call it, tear-the-house-down-and-rebuild? So there's full rebuild, retrofit, net new. Those are three use cases we're seeing. Can you guys unpack that a little bit for us? Yeah, go ahead, Sri. Go ahead. So this obviously is a point for a much, much larger discussion than we have time for here on theCUBE. 
But I think one of the first things we've seen with a lot of our customers is they always work back from the use case that they're trying to resolve. I was talking to a customer not too long ago, just a couple of days ago; one of the things they were trying to do was understand patient outcomes in healthcare. Now, they had a very clear idea of what it was that they wanted to measure in terms of metrics, but what they weren't clear about was, okay, what are all the intervening steps that happen before somebody goes into, I don't know, diabetic shock, for instance? Okay, what's the data that they have? So to do things like retrofitting, or building upon existing use cases, one of the first things I look at is, you know, what does my data look like? Number one, what kind of infrastructure is the data currently stored in? What kind of cleansing do I have to do? What kind of analytics do I have to put in place? Can I do things with the regular analytics I have from the machine learning world, or do I have to go with large language models? And if so, what's the price-to-performance comparison I have to deal with to get those things up? So, as you can imagine, the highly unsatisfactory answer I'm going to give you right now is that there are a number of these components that first need to be looked at: data, infrastructure, types of models, how you do the model management, how you curate the models over time, how you look at the performance of models over time. These are some of the things they look at from the standpoint of whether to retrofit, whether to do a brand-new use case, or whether to continue some of the existing work. You need system architecture. It's you that brings systems thinking to the table. You get to look at everything. 
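The price-to-performance question Sri raises can be made concrete with a toy selector: given a measured quality score and cost for each candidate approach (classical ML, a small model, a large LLM), pick the cheapest option that clears the quality bar. The names and numbers below are entirely hypothetical, a sketch of the trade-off rather than anyone's real benchmark.

```python
def best_value(candidates, min_quality):
    """Return the name of the cheapest candidate whose measured quality
    clears `min_quality`, or None if nothing qualifies.

    candidates: iterable of (name, quality_score, cost_per_1k_requests).
    """
    viable = [c for c in candidates if c[1] >= min_quality]
    if not viable:
        return None
    return min(viable, key=lambda c: c[2])[0]

# Hypothetical measurements for one use case:
options = [
    ("classical_ml", 0.82, 0.40),   # cheapest, but below a 0.85 quality bar
    ("small_llm",    0.88, 3.00),   # good enough, moderate cost
    ("large_llm",    0.91, 12.00),  # best quality, highest cost
]
```

With a quality bar of 0.85, `best_value(options, 0.85)` picks `"small_llm"`: the large model's extra quality is not worth four times the cost once the bar is met, which is the working-back-from-the-use-case logic in miniature.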
Yes, you do, but you also have the struggle of figuring out how to make it all mesh, which is what we've been working with SAS to do. No, that's perfect. I totally agree with what you said. I think the other interesting thing is that the partnership together brings a complete, cohesive tool set to the table, regardless of whether it's a new use case, a retrofit use case, or something, again, that has yet to be explored. I think together we can bring tools that help meet the needs of our customers in the moment they need them. I think that's part of the compelling reason why the partnership's been so successful. You know, a big theme of the show, or maybe not a theme, but a message that we've heard loud and clear, is that, look, it's not just about the LLM. There are so many other components around the LLM. I think both your firms understand that quite well. Having said that, LLMs are getting a lot of attention these days. So AWS is making the assumption that model diversity is the right strategy for customers. Absolutely. Working backwards from the customers. There are two ends of the spectrum on this discussion. One end says, well, all the models are going to be commoditized. The AWS end of the spectrum, I would say, not to put words in your mouth, is no, that's not the case. John, you've even said, if innovation continues, they're not going to be commoditized. The combinatorial effects of being able to understand model management, right fit, price performance, energy, et cetera, are going to insulate those models from being commoditized. If I'm a CIO, maybe I wouldn't mind if they're commoditized. It makes your life easier. Where do you guys land on that? Give me as objective an answer as possible on the commoditization of LLMs. Look, at some point in time, I'm sure, the number of firms developing LLMs is going to increase, that's for sure. But to develop an LLM is no simple affair. 
It takes billions of dollars to develop an LLM of any kind. So, you know, Amazon's come a long way, not just with Bedrock, which is our managed service. We have our own model called Titan. We have a bunch of other models from Anthropic and Hugging Face and whatnot. You saw the announcement today about Llama. So I don't believe we're anywhere near commoditization. These models are extremely purpose-built. They are very use-case-driven. It's going to be a while before, if ever, we get to the point where we find the magic to commoditize them. Today, AWS has made strategic investments in the generative AI space, and we're continuing to do so at a very, very high pace. And we believe these are models that are going to be used for specific reasons. They can, of course, be used in combination, depending on the use case you're looking at. But I don't see them being commoditized for a while. And how do you place bets? Yeah, I mean, I think there are more players entering, for sure. You're exactly right, it's incredibly expensive to continue to build. There's also a moment in time that challenges some of their usefulness in the enterprise. So I think one of the places SAS is going is smaller models, perhaps some that augment or can help ensure the integrity of the answer that might come back from an LLM. We saw that yesterday on stage in SAS's announcements. So will models be commoditized? I don't know that they will. Perhaps on the LLM side, because, again, there will be more; that somewhat naturally happens with more players. But those industry-specific models, those models that are a little bit more built for purpose for a particular enterprise use case, absolutely, I think, are going to have sustained value, in my opinion. We're aligned with you on that, by the way. We don't believe in the commoditization thesis. I think you called that one very well. Actually, Matt Wood jumped into my LinkedIn on that one time. 
And having said that, consolidation is probably likely because it's so expensive, and mass customization for particular models. And you almost have to do it, right? With the contextual data you put in place, customization is inevitable. And we're fine with that. We expect that to happen and we want that to happen. But in terms of models being commoditized, I don't see that. They might become obsolete. That, I think, obsolescence, is a bigger conversation. Maybe, yeah. If you're not good, you're out. If you don't meet the SLA, as Brian and I just talked about in the last segment, if you don't meet certain thresholds, you're not even in the conversation to be considered. I agree. I mean, this is where the game is changing. It's not like you can just hang around. There's no technical debt anymore; it's called being out of business. Yeah. Right. My point is, I'd love to hear a CIO's perspective on this. I think it's a dangerous assumption to sit back and say, well, it's going to be commoditized, so I don't really have to invest in the customization for my business. I think that's a mistake. You can't wait. Yeah, right? You can't wait. If you're sitting on the sidelines and you're waiting for that price to come down, or waiting for someone else to figure it out so you can inherit it, your competition's going to pass you by. Well, the CIO could look at it and say, hey, if it's not commoditized, it means it's valuable. I'm going to invest in it. That's right. So the commodity question, to me, goes the wrong way. Commodity means it doesn't cost that much; I can get a lot of it anywhere. I would say from that standpoint, the big models are more commoditized than anything else. So I think the investment is not about the commodity, but about how the value is extracted. And keep in mind, that intellectual capital is yours. You don't have to share it. With a commodity, it's different. And so to that extent, it is to SAS's advantage. 
It is to AWS's advantage for us to invest heavily, which we are. Yeah, for sure. In the last three minutes we have, I want to get two things out. One, I want to quote Swami, who runs the data division over there. This kind of illustrates why I think you guys are aligned. I want to get your reaction to this. This is a tweet from today, this morning: customers need assurance that their generative AI workloads, which contain highly valuable, sensitive data, are private and protected. He goes on to say some Amazon-related things, but that's a core thesis of this show: that the workflow is the customer's intellectual property. And that workflow and data, which you both have a lot of experience with, with customers, and experience managing and applying math and statistics and compute to, is the new asset. Absolutely. Yeah, I mean, and I think what you're seeing in our portfolio, and the way we're going to market with our customers and trying to enable their success, is 100% trying to help them from the bottom up. So we have the tooling, again, whether through partnerships or within our own portfolio, and we want to help them imagine what that looks like. And that does become their IP at the end of the day. That is their secret sauce for why their business will succeed. And Swami's a legend at Amazon, as you know, a CUBE alumni as well. Yes. And he's hardcore on this too. They want more open data. Absolutely. They want transparency, but the privacy is huge. Yes. This is important for the workloads. When people start rethinking how to apply generative AI, whether it's a retrofit, a rebuild, or net new, you've got to understand the role the data plays. And he's actually worried about the ethics of it, right? What happens if it gets into the wrong hands? What happens if that privacy is abused? And what happens if there are certain bits of information that you should never be in possession of? 
And I think, you know, Swami's putting very thoughtful, innovative effort toward making sure security and privacy are front and center of that whole approach. I thought Reggie did a great job on the main stage this morning around it, especially when it came to Data Maker, SAS's offering right around creating synthetic data. So if data is highly sensitive, is there a way to augment it, change it, mask it in a certain way that still makes it usable in the models? And it's also a great way to represent underrepresented data, if you will, in the data sets that you're going to model. I think this point is really why not only do you guys have a good partnership with the managed service, it also shows where the money's going to be made. The monetization, the results, are going to go to whoever can make their business run better, not just have tech. The tech's just the glue; it's pervasive, it's everywhere, it's ubiquitous. So, final question before we wrap up real quick: where does this partnership go next? Talk about the relationship. Obviously, Amazon's got a lot of customers. I was speculating in our analyst segment that you'd put SAS in the marketplace. That's dangerous, that's going to be, whoo, revenue, you know. We dipped our toe in that water. So Viya in the marketplace was a key decision for us. I think you'll continue to see more innovation there. It's something we're excited about. Obviously, the tremendous reach of their customer portfolio we're excited about. We have, I'll say, a compute-intensive install base ourselves. And then I think we're solving real-world problems with artificial intelligence today at the enterprise level. And so our customers are naturally going to come to the two kings in the arena. Jay, Sri, thanks for coming on theCUBE, really appreciate it. Thanks for having us here. It's a pleasure to be here and much appreciated. Thank you for everything. All right, awesome. Thank you so much. 
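To make the masking idea concrete, here is a toy sketch. It is not how SAS Data Maker actually works; real synthetic-data tools model the joint distribution of the data rather than perturbing rows one at a time, and every field name below is hypothetical.

```python
import random

def mask_record(record, seed=0):
    """Toy anonymization: redact direct identifiers and jitter a
    sensitive numeric field so the row stays usable for modeling.
    Illustrative only; this is not a real privacy guarantee."""
    rng = random.Random(seed)
    masked = dict(record)            # leave the original row untouched
    masked["name"] = "REDACTED"
    masked["ssn"] = "***-**-" + record["ssn"][-4:]   # keep last 4 digits
    # Multiplicative noise within +/-5% preserves rough magnitude.
    masked["income"] = round(record["income"] * rng.uniform(0.95, 1.05), 2)
    return masked
```

Row-level perturbation like this leaks more than it appears to; the stronger approaches the segment alludes to, fully synthetic records or differentially private generation, are what make sensitive data safe to model while also letting you oversample underrepresented groups.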
Live here on the show floor, I'm John Furrier, Dave Vellante, you're watching theCUBE, SiliconANGLE's leading tech coverage. Covering enterprise tech, we'll be right back after this short break.