Good afternoon, everyone. Welcome back to theCUBE's day-one coverage of Snowflake Summit 2023, live from Caesars Forum in toasty Las Vegas. Though you wouldn't know it, because we're all freezing inside Caesars Forum. Lisa Martin with Dave Vellante. Dave, we always talk about how every company, regardless of industry, has to be a data company. They have to be a software company. Last night, Jensen Huang from NVIDIA said there have to be AI factories, but intelligent data management is still at the core of organizations really extracting value from their data to deliver the insights and the outcomes that everybody wants. Well, Frank Slootman said last night we've had this long, you know, challenging relationship with data. Because data's ugly, you know? It's everywhere and it's never consistent. Yeah, a lot of people trying to make it pretty and extract insights from it. One of our alumni is back with us: Rik Tamm-Daniels, Global VP of Ecosystems and Technology at Informatica. It's great to have you back, Rik. Yeah, great to be here. So, Informatica and Snowflake, a long partnership, over eight years now. That's right. You guys made some big announcements this week. We've got to talk about Superpipe, which I know does not mean the Olympics and the flying tomato. Sounds like SuperCloud. It does sound like SuperCloud. Talk to us a bit about some of the things that you guys announced this week. Well, you're absolutely right. We have a long-standing partnership with Snowflake, and a big component of that is our joint innovation agenda, right? As Informatica, we have so many customers around the globe that depend on us for critical data management, and when they're using platforms like Snowflake, they're looking to us to stay ahead of the curve, right? To bring the latest and greatest innovations. And so one of the big ones you mentioned is Superpipe for Snowflake.
And we think about the different types of needs for data and data integration. Enterprise data sources, like ERP systems, CRM systems, they're critical for analytics, right? But those systems are typically kind of hard to get data out of. And at Informatica, that's our expertise, right? Getting data out of hard-to-get-to places and being able to bring that data into Snowflake as quickly as possible. Because the other piece of it is that information's always changing, and you want to have it up to date. It's mission-critical data. And so reducing the latency of data, making it more real-time, that's what Superpipe's all about. So we see up to about three and a half times faster performance than our previous change data capture, data replication technology. So it's a huge leap forward, leveraging some of the latest Snowpipe Streaming capabilities from Snowflake. So is it a replacement for change data capture approaches, would you say? So it's actually an optimized version of change data capture specific to Snowflake, right? Because of the engineering and product development partnership, we're leveraging some new capabilities in Snowflake that let us load data much faster and merge data in more quickly. So it's part of our change data capture family, but it is such a huge leap forward, you know, it really deserved its own moniker. Hence Superpipe. So it was joint engineering. Yes. Basic building blocks from Snowflake, and then you built on top of those. Is that right? Yeah, and as those blocks were in development, we were developing side by side, giving feedback back and forth, and continuing to make improvements and innovate together. So is that kind of, well, what if you did it this way? What if you did it that way? Exactly, yeah. How do you take advantage of it? Well, this is the trade-off. Exactly, and that's the depth of our partnership, right? It's a great example of collaboration. And where are customers in that collaboration?
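The change-data-capture pattern described here — landing change events in a staging table and merging them into the target in one pass — can be sketched roughly like this. This is an illustrative example only, not Informatica's or Snowflake's actual implementation; the table names, column names, and `_op` convention are all hypothetical:

```python
# Illustrative CDC-apply sketch: build a single Snowflake-style MERGE that
# upserts inserts/updates and applies deletes from a staging table whose
# _op column marks each row as 'I' (insert), 'U' (update), or 'D' (delete).
# All identifiers below are hypothetical.

def build_merge_sql(target, staging, key_cols, data_cols):
    """Generate one MERGE statement that applies a CDC batch to the target."""
    on = " AND ".join(f"t.{c} = s.{c}" for c in key_cols)
    sets = ", ".join(f"t.{c} = s.{c}" for c in data_cols)
    cols = ", ".join(key_cols + data_cols)
    vals = ", ".join(f"s.{c}" for c in key_cols + data_cols)
    return (
        f"MERGE INTO {target} t USING {staging} s ON {on} "
        f"WHEN MATCHED AND s._op = 'D' THEN DELETE "
        f"WHEN MATCHED THEN UPDATE SET {sets} "
        f"WHEN NOT MATCHED AND s._op != 'D' THEN INSERT ({cols}) VALUES ({vals})"
    )

sql = build_merge_sql("orders", "orders_cdc_stage", ["order_id"], ["status", "amount"])
print(sql)
```

Applying a whole batch with one MERGE, rather than row-by-row statements, is the usual way to cut latency in this kind of replication; the streaming-ingest piece (getting rows into the staging area continuously) is what Snowpipe Streaming addresses.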
I imagine, knowing Informatica and Snowflake as I do, they're integral to this collaboration. Oh yeah, absolutely. So right now we're in public preview. We've had a private preview set of customers who gave us early feedback. A lot of excitement though, right? Because I think the demand for the type of data we're bringing in is just huge, and the reception from customers has been fantastic, right? Seeing the performance gains, and the ease of use as well, of just kind of activating this in their environment. So, we heard last night from Frank Slootman and Jensen that GPUs aren't free, but evidently data integration is. Wait, how does that work? Yeah, so one of our other big announcements is our cloud data integration free service tier. So last year at Summit we announced our free data loader, giving access to about 40 different data sources. And the whole concept here is just to lower the barrier to entry on that first step of the journey. At Informatica, we're all about the complete data management journey: not just loading your data, but also cleansing your data, making sure it's trusted and usable and governed and cataloged. And so for that first step of, I just need to get my data into Snowflake, we're making that as easy as possible. That's why we have the free service offerings, to get folks onboarded and starting to use our Intelligent Data Management Cloud. How does that play into Informatica's overall strategy? Well, it's a big part of it. I think you look at how customers are adopting technology in the cloud today. I mean, at Informatica, we've been on a consumption-based model for quite a while now, which is really important. And we look at customers and how they want to onboard: they want to try things quickly, they want to get quick results. And that's what this service is all about. It's just super easy to sign up, you just put your email address in, that's it, right?
And your access to the service is instantaneous and you can start loading data. So it's a big part of: let's get started fast, but then when you hit those other challenges that can sometimes hold back scaling, like data quality, like data governance, our platform's there, and it's all integrated, and it's got that AI foundation underneath it as well, which is so important. We love consumption-based models. I advise a lot of software companies; like, years ago, this is the future, you've got to go consumption-based, for most companies. But they also bring a challenge. In fact, Slootman actually talked about this last night: you've got to be disciplined, you've got to pay attention, and it's great because it aligns with the value. But the other thing he said, or maybe it was Jensen who said it, is that LLMs create a whole new equation there. So how do you think about that consumption-based model in the context of the LLMs that you've introduced? Well, you're absolutely right: in the consumption-based world, customer success is paramount, right? And from an Informatica perspective, we're doing great along those lines. We're up to, I think, 54 trillion transactions a month running through our cloud platform, right? So that's about 69% growth year-over-year, so plenty of consumption coming onto the platform. So I think that's a big element of it. When you think about the LLM space, there are really two angles for us in generative AI. The first is that those models need data, right? That has not gone away, and they need trusted data, they need high-quality data. So we see a huge demand and opportunity for our services on the data management side there. But we're also investing heavily in using generative AI to really revolutionize data management. And so we announced our CLAIRE GPT and CLAIRE Copilot at Informatica World back in early May to address those kinds of opportunities. So tell us more about both of those products.
Yeah, so with CLAIRE GPT, the idea is, I think one of the big transformative things about generative AI is it actually lets you take some very complex and nuanced requests, express them in pretty rich, descriptive English, and then actually turn them into something actionable, executable, right? So you think about our data management cloud interface. Instead of sitting on a canvas drawing a mapping or a pipeline, I have a text box and I want to tell it: I need data from these three systems, the target is an executive dashboard my CEO needs to look at, I need these metrics, this rollup. Being able to turn that into a pipeline of integration, connection, quality and everything is hugely transformative. So that's a big focus for us: helping customers have that next-generation experience, and it's really a huge productivity enhancement. And then CLAIRE Copilot is all about, in our regular product experience, how do we bring the power of generative AI to help make better decisions, to have assistive technology that recommends data quality transformations or items that you should be concerned about. Again, it's all about efficiency and acceleration of the work when it comes to data management. Now, are those products that you've announced shipping yet, or are they in beta? Yeah, so we're in private preview with those right now, so early customers are there. And what we announced at Informatica World we've been working on for quite a while, and it is a big area of focus for us. And how do you, everybody talks about guardrails, how do you make sure the thing doesn't hallucinate or give bad answers? Is that part of what you're working on now, to sort of put those frameworks in place? Yeah, absolutely. I mean, trust and governance is critical, right? So governance is going to be important.
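The text-to-pipeline idea — a descriptive request becoming an ordered, executable sequence of integration, quality, and load steps — can be illustrated with a toy planner. This is a sketch of the concept only, not CLAIRE GPT's actual design; every name below is invented:

```python
# Conceptual sketch: a natural-language request ("data from these three
# systems into an executive dashboard, with these rollups") ultimately has
# to become a concrete, ordered pipeline spec like this one.
# All names here are hypothetical.

from dataclasses import dataclass, field

@dataclass
class PipelineSpec:
    sources: list
    target: str
    steps: list = field(default_factory=list)  # ordered (kind, detail) pairs

def plan_pipeline(sources, target, metrics):
    """Assemble the canonical extract -> integrate -> quality -> load sequence."""
    spec = PipelineSpec(sources=sources, target=target)
    for s in sources:
        spec.steps.append(("extract", s))
    spec.steps.append(("join", tuple(sources)))
    for m in metrics:
        spec.steps.append(("aggregate", m))
    spec.steps.append(("quality_check", "null/range/dedup rules"))
    spec.steps.append(("load", target))
    return spec

spec = plan_pipeline(["erp", "crm", "billing"], "exec_dashboard", ["revenue_rollup"])
print([kind for kind, _ in spec.steps])
```

The point of the structured spec is exactly the validation concern raised next in the conversation: a generated pipeline is something you can inspect and verify step by step, unlike free-form model output.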
So when I have a result from these generative AI tools that generate a mapping or a pipeline or a definition, how do I make sure it's high quality? How do I make sure it's producing the right thing? So that validation and verification is absolutely part of it. The nice thing is the domain we're in, in terms of what we're creating and generating, is not the Wild West of the public domain where you're just searching everything on the internet. So it's a little more constrained, controlled, and a little more manageable from that perspective. But absolutely, governance is top of mind, like it always is for us. In your world though, even though it's more narrow, is explainability critical? So how do you create that transparency? Is that something that you guys designed in? Did you partner for that? Yeah, so we have a whole team that's just focused on this area. And yeah, explainability, and the user experience of what information needs to come with the output, right? When you look at some of the public GPT examples out there, explaining the sourcing of data is really important, right? What are the references? Where does that actually come from? Because that helps you, again, verify and make sure it's not hallucinated. Is it more than a link? Yeah, exactly. Is that fair to say? Absolutely, and we've always had very detailed lineage about what's going on with data. So when you create a mapping, we'll give you a lot of transparency, always, into what's being done in our platform. And that's going to be absolutely critical for customers having confidence in the outputs of these capabilities. That confidence that you mentioned isn't trivial. It's incredibly important for organizations to have confidence not just in the data, but in the systems, and in the quality of the data and the trust, so that they can use that data, extract insights from it as quickly as possible, and deliver to the end users. I'm curious, who's CLAIRE? Is CLAIRE an acronym?
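The lineage capability described here — tracing any output back through its transformations to the sources that fed it — can be sketched minimally as a graph walk. This is illustrative only; a real data catalog tracks far richer metadata than these invented names suggest:

```python
# Minimal lineage sketch: record (source, transform, output) edges as data
# moves through pipelines, then walk backwards from any output to find every
# upstream source. Dataset names are hypothetical.

class Lineage:
    def __init__(self):
        self.edges = []  # (source_dataset, transform_name, output_dataset)

    def record(self, source, transform, output):
        self.edges.append((source, transform, output))

    def upstream(self, output):
        """Breadth-first walk backwards to every dataset that feeds `output`."""
        found = set()
        frontier = [output]
        while frontier:
            node = frontier.pop()
            for src, _, out in self.edges:
                if out == node and src not in found:
                    found.add(src)
                    frontier.append(src)
        return found

lin = Lineage()
lin.record("crm.contacts", "cleanse", "stg.contacts")
lin.record("stg.contacts", "join", "mart.customer_360")
lin.record("erp.accounts", "join", "mart.customer_360")
print(sorted(lin.upstream("mart.customer_360")))
# -> ['crm.contacts', 'erp.accounts', 'stg.contacts']
```

This is the "more than a link" point in the exchange: the answer to "where did this come from?" is the full chain of sources and transformations, not just a single reference.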
Yes, it's a good question; there are a couple of different stories about what CLAIRE stands for. But it conveniently has AI in it. It's actually an engine that we've had for quite a while. We've been invested in machine learning and AI for accelerating data management. But I think the difference is, previously with CLAIRE, you could do things like express a data quality rule in English, right? Like saying, okay, this value needs to be within these ranges, right? But the complexity of what you can express with generative AI is just a huge leap forward. So we're excited to build on that practice, that foundation, and really accelerate into the generative AI space as well. And in customer conversations, who are you talking to these days? Is it the chief data officer, the chief information officer, lines of business? I imagine all of them need to align, but where are you finding customers coming to you saying, help us out? Yeah, definitely the chief data officer sits front and center in the conversation, but you have CIOs, you have chief analytics officers, and you have lines of business. Lines of business have become increasingly sophisticated on technology and data, and they have much bigger mandates there as well. But we do see this idea of a data community through data governance getting tighter, and the closer that community works together and collaborates around data, the more effective and efficient they are at transforming the business. So the native application framework and the container services have gotten a lot of the buzz today. I think you guys are taking advantage of at least some of that with the Snowflake Native Application. What's that experience been like? You guys are early on; I think there are maybe, I don't know, a couple dozen at this point in time, and knowing Snowflake, that's going to scale. But what's that experience been like? It's been great.
I mean, it's similar to the work we're doing with Superpipe: highly collaborative, very close engineering, with tight feedback loops about what works, what doesn't work, and how we want to bring it to market. But at the end of the day, our native application is all about, again, lowering the barrier to entry, right? I want to give customers the ability to use Superpipe without ever having to leave the Snowflake experience, but with the power of the IDMC platform on the back end, right? The power that drives those 54 trillion transactions a month. Customers benefit from that, and they'll be able to acquire it from the marketplace. Snowflake announced today the ability to use your Snowflake credits and spend commitment to actually purchase those third-party applications. So we're really excited about what's going to be a very frictionless offering for customers and a new way of acquiring our technology. Yeah, that was an interesting announcement that I think maybe is a little bit misunderstood, but talking to a couple of customers, they're like, yeah, we get it, we're into it. This is a beautiful thing for us to just dial more optimization out of the platform. Yeah, and it makes a lot of sense if you look at the cloud platforms themselves and how they've approached marketplaces for the last several years. That concept of the spend commitment, and being able to leverage it as a third party, has been huge in helping customers acquire all the pieces they need, but it is all about aligning with that consumption-driven world. So what people might not know is, even though we talk about consumption, there's an area under the curve: you commit to a certain amount of spend. You don't have to spend it all in one month, but over the term, you do have to spend it. And so giving customers more options and more value vectors is a good thing. Yeah, absolutely.
For example, at Informatica we also launched our new ISV Innovate Program for our ecosystem of ISVs who want to bring things like data security and data quality into our IDMC platform as well. Yeah, similar models, right? Do you have a favorite customer story or example that really shines a light on the value that Informatica is delivering to customers in Snowflake environments? Absolutely. Actually, I just got done with a session with Amaha Financial Credit and their great transformation story, right? They were an Informatica on-premises customer. We have a really strong, automated modernization program for customers to move to our cloud offering. But they chose our IDMC platform not just because they were familiar with Informatica technology, but also because they wanted things like application integration. They wanted data governance and data catalog. And so we're really aligned in supporting their entire journey. They don't have to do it all on day one, but they know what the next steps are, right, as they get there. And they've already had a huge impact on their business in terms of efficiency and their ability to leverage data as an asset. Well, everybody's got to become a data company, and if they aren't already, they're on their way. Rik, thank you so much for joining us and sharing what's new with Informatica: Superpipe, clarifying what that is, CLAIRE GPT, and all the things that you guys are doing together. We appreciate your insights. Thank you. All right, our pleasure. For our guest and for Dave Vellante, I'm Lisa Martin. Up next, stick around: fresh from the keynote stage, Christian Kleinerman joins us, the SVP of Product Management here at Snowflake. He's going to be having a couple of conversations with Dave and George about Snowflake's innovations and announcements, bringing Gen AI to the enterprise at cloud scale, all that good stuff.
I want to remind you that all of our Snowflake sessions and all of our CUBE events can be found on thecube.net, and of course all of our analysis is on siliconangle.com. You're watching theCUBE, the leader in live tech coverage.