Good afternoon, everyone. Welcome back to theCUBE's coverage. Day two of Snowflake Summit '23, live from Caesars Forum in Las Vegas. Lisa Martin with Dave Vellante. Dave, yesterday there was, I like to liken it to those T-shirt guns at an NBA game, a T-shirt cannon of content and launches that came out yesterday. Snowpark Container Services was one of them.

Yes. That was the star, you know.

And the star is here on theCUBE. We're going to be digging into that next. We've got two guests with us. Caitlin Colgrove joins us, the co-founder and CTO of Hex Technologies. And Torsten Grabs, one of our alumni, is back, Senior Director of Product Management at Snowflake. It's great to have you both. Welcome.

Thank you so much for having us.

Yeah, thanks for having us on.

Snowpark Container Services, big news. We're going to get into that. First, Caitlin, tell us a little bit about you. You are the CTO and co-founder of Hex Technologies, a 75-person-strong organization. Young organization, and big accomplishments. Tell us a little bit about you and about Hex.

Yeah, my background is in software engineering, but I always skewed towards user experience and product. Over my career, I've built dozens of different data and data analytics experiences, both as a software engineer at Palantir for many years, and then eventually leading teams building those products. And I think that really is the origin story of Hex. Barry and I and our third co-founder, Glen, all worked together on some of these things, both as builders and as users, and we experienced the deep pain of data science workflows, especially back four or five years ago. What we were seeing in the market was that all of the tools available were deeply fragmented, and fragmented along lines that didn't make any sense. If you wanted to use SQL, you had to go to one place.
If you wanted to use Python, you had to go to another place. So you end up with these really painful workflows across half a dozen different tools. And these tools were also fundamentally not collaborative in the way that modern data teams work. You were working in a silo on a local notebook, and I know we made a big deal about silos earlier in the conference. You were not able to share that work or make it useful to the rest of the organization. So those were really some of the problems that we were trying to tackle when we set out as co-founders. And it's been so exciting to see all the excitement at the conference about Hex, to really see how we've been able to deliver on a lot of those original goals that we had when we set out.

And Hex was a design partner working with the product engineering team on Snowpark Container Services from day one. Talk to us a little bit about the collaboration, and then, Torsten, from your perspective, why Hex?

Yeah, so we're super thrilled to have Hex as one of our launch partners for Snowpark Container Services. I think what intrigued us was exactly what Caitlin just described. We also saw that silo thing happening, right? And we really looked at Hex as a way to break down the silos. We're all about breaking down data silos. And what was particularly fun was that, I would argue, Hex was one of the more complicated offerings that we had to stand up on Snowpark Container Services. So working that deeply with Hex actually helped us find those rough edges and make sure that Snowpark Container Services really performs well, really scales, and can run the most challenging applications that are out there today.

So what you guys announced, if I understand it correctly, dramatically simplifies the way in which I can do things inside of Snowflake. I think you're even containerizing Jensen's GPUs, right? Making that consumption easier.
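For readers curious what "standing up" an application like Hex on Snowpark Container Services involves: a service is declared with a short specification file. The sketch below is a hypothetical, minimal shape; the database path, names, image tag, and port are all placeholders, and Snowflake's own documentation is the authority on the exact schema.

```yaml
# Hypothetical minimal service spec for Snowpark Container Services.
# All identifiers below are placeholders for illustration.
spec:
  containers:
    - name: app                                   # container name within the service
      image: /my_db/my_schema/my_repo/app:latest  # image in a Snowflake image repository
  endpoints:
    - name: ui          # endpoint exposing the container's web UI
      port: 8080        # port the container listens on
      public: true      # reachable outside the Snowflake account
```

The point of the model is that the container runs inside the customer's Snowflake account, next to the data, rather than in a vendor's separate cloud environment.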
Where are you today, and how far can you take this? In other words, I'm trying to get an understanding: I presume you want your customers to do the data engineering and the data pipeline work inside of Snowflake, and you want to make that as efficient and cost-effective as possible. I presume that's been your experience at Hex, right?

Yeah, absolutely. Over half of our customers use Snowflake as their preferred data warehouse, and honestly, I think that's just a testament to the power and the flexibility of the product that Snowflake provides. And really, what we're seeing more and more is that people are using Hex as the developer experience, the interface on top of a lot of the infrastructure that Snowflake has built. And what I think Snowpark Container Services really does is make that completely seamless and transparent. We're all running in the same environment, so there's really almost no distinction in terms of, oh, Hex is running over here and Snowflake's running over here, and how do they talk to each other? It's all just one amazing data science experience running on Snowflake.

So that's the value. But help me understand this, because some customers have said, well, I don't want to do all that data pipelining inside of Snowflake. It's too expensive; I've got to pay the Amazon markup that Snowflake's charging, I pay Snowflake. But I'm hearing from you that the experience is so much better, it makes it up in business value in other ways. And I'm trying to square that circle. Can you help me do that?

Well, I can let Torsten speak to the data pipelining side on Snowflake. With Hex, it's actually really beneficial for Hex customers to be running inside Snowflake, because originally you'd be paying for Snowflake, you'd be paying for Hex, you'd have a couple of separate licenses, you'd have to go through procurement twice.
But now, with Hex running inside Snowpark, you can actually use Snowflake credits, draw down against those, and have one single budget for all of your data science and data engineering needs. So from our perspective, that's a huge win for the customer. But I don't know if you can speak to the data engineering workloads.

That's a convenience factor, and it allows you to move faster, but yeah, can you?

There are time-to-market considerations that follow straight from what Caitlin just mentioned, right? If you can stay in the same simple Snowflake environment, creating a solution that works just has much better time to market for your specific use case, and there's tremendous value in that for our customers. The other one is around risk. If you have a siloed solution with different separate pieces, you typically have data that's being sent out of your Snowflake account and stored somewhere else as a redundant copy. As soon as you do that, you're introducing risk into your organization, because you have a separate data asset that you need to govern and secure, right? So that also has implications for organizations that we can literally remove right now, because everything, even for our customers with the most sensitive proprietary data, runs in their Snowflake account, within their security perimeter in Snowflake.

So the best practice that you're recommending is: get the data into Snowflake, and then once it's there, do your full end-to-end operations on that data.

Yeah, move your work to where you manage and store your data, which is Snowflake. Pretty simple.

Okay. So how far can you take this? It's early days. What else do you need to do to really get people to say, okay, I'm now going to move that data in? It's a great sales pitch, but you've got to start shipping all these products and everything else. How far can you take this?
So I think we are already taking it very far with Snowpark Container Services. Hex is a great example, where we are now essentially running a full-stack application end-to-end in the customer's Snowflake account, literally from the physical storage of the data in the Snowflake account all the way up to the code that generates the UI, the UX experience, right? And that's tremendous value. And I think the next big milestone for us is around the data applications that we're powering through that stack, through that experience. I would argue we can do that in our current infrastructure, in our current experience. As soon as you remove data from data applications, then I think it becomes interesting to see whether we want to take that on as well.

So you've got a lot of different ways to query. You were sort of describing this before; I've been writing it down the whole conference. You've obviously got SQL and data frames. With Neeva, you've got search. I've got documents in there, I don't know if that's a query method, but whatever. Supervised machine learning models you can do, and there's probably some stuff you did with NVIDIA that affects that, maybe to take it to unsupervised. And I have a lot of different data types. I've got analytic OLAP data, I've got transactional data, once you ship Unistore, and I know a lot of people are excited to get that. You've got streaming. All in one place, right? What does that mean to a developer like you?

Honestly, what it really does is empower us and make the experiences that we can build on top of that so much richer. I think one really great example, and another partnership that we've been investing a lot in over the last year, is with the Snowpark team, on the Snowpark library side, Snowpark ML, things like that, in addition to the container services. And this is a technology that really wasn't possible not that long ago.
This ability to run Python at your cloud data warehouse's scale. And then on top of that, what we've been able to do with Hex, which is actually one of my favorite things that we've done, because it seems so simple, but the technology is actually quite tricky, and so it's fairly unique in terms of how we've built it, is this ability to move seamlessly back and forth between different types of querying. Some tasks are much easier to do in a language like SQL, and some, like model training or regressions, are much easier to do in Python. At the end of the day, most analytics and data science workflows use a combination of both. And before Snowpark, you really had to make some hard choices: either I'm going to use SQL and run it at scale, or I'm going to use Python, but I'm not going to have that same compute power available to me. Snowpark takes all of that away. And this is what we've been really excited to ship and launch with the Snowpark team: being able to treat Snowpark DataFrames as a native, first-class object in Hex, so that you can not only write Snowpark code directly and take advantage of all the compute power that Snowflake has, but you can query those data frames with SQL and build visualizations on top of them. And really, all of the new capabilities that Snowflake is bringing out, whether it's different types of models or anything else, those are things that we're really excited to bring to users in the Hex interface and take advantage of going forward.

So that's the magic, right? You can do all that, you can join all those different data types and return a coherent set of data, irrespective of the data type, the storage type, the database you're using, or the data format you're using.

That's what I want to enable. A term that I like to use is polyglot Snowflake, right?
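The back-and-forth Caitlin describes, SQL for set-based steps and Python for the statistical step, is easy to illustrate. Real Snowpark code needs a live Snowflake session, so this is a stdlib-only stand-in using SQLite; the table and numbers are invented for the sketch, but the shape of the workflow is the same.

```python
import sqlite3
import statistics

# An in-memory table standing in for a warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("east", 100.0), ("east", 140.0), ("west", 90.0), ("west", 110.0)],
)

# Step 1: the set-based aggregation is most natural in SQL.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()

# Step 2: hand the result set to Python for the statistical step,
# which would be awkward to express in SQL.
totals = {region: total for region, total in rows}
spread = statistics.stdev(totals.values())

print(totals)  # {'east': 240.0, 'west': 200.0}
```

What Snowpark adds on top of this pattern is that the "SQL half" and the "Python half" both execute on Snowflake's compute, so neither side gives up scale.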
Different audiences, different personas have different preferences around their programming languages, for example. So you have the data scientist, the ML practitioner, operating in Python. But that persona needs to collaborate with other parts of the business, like, for instance, an analyst who is much more comfortable operating in a SQL-based environment, right? How do you actually orchestrate a workflow that creates a machine learning model in Python in the data science team, which is then handed over to the analysts who run that machine learning model in production in a SQL-based environment, or that runs as part of a data pipeline, where, again, a data engineer may build that pipeline in SQL? That's what we're enabling through that polyglot Snowflake experience: a virtuous cycle between different teams, collaborating in their programming languages of choice.

That collaboration is critical, especially when we talk about lines of business with data science and data engineers. What are some of the outcomes? We've talked about what's in it for the data scientists, the developers, the engineers. What are some of the business outcomes, any projections, maybe by industry, based on all the industry clouds, the data clouds that Snowflake has, that you're expecting organizations to be able to achieve with this technology?

I'll maybe use an overused term here again. There's this democratizing-ML aspect that everybody's obsessing about. I think we're making a big step forward by enabling that collaboration I was just hinting at, because there are so many data science projects that fail after a powerful model has been created, but it never gets productionized into the mission-critical environments of the data stack. And we are simplifying the heck out of this.
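The handoff Torsten describes, a model built in Python that SQL users then call in production, corresponds in Snowflake to registering a Python UDF through Snowpark. Since that requires a live account, here's a stdlib-only sketch of the same pattern using SQLite's ability to expose a Python function to SQL; the "model" and the data are deliberately trivial placeholders.

```python
import sqlite3

# "Train" a deliberately trivial model in Python: a per-unit rate learned
# from history. A real data science team would fit a proper model here.
history = [(2, 10.0), (4, 20.0), (6, 30.0)]  # (units, revenue) pairs
rate = sum(r for _, r in history) / sum(u for u, _ in history)  # 5.0

def predict_revenue(units: int) -> float:
    """The Python-side model, exposed to SQL callers below."""
    return rate * units

conn = sqlite3.connect(":memory:")
# Register the Python function so analysts can call it from plain SQL,
# mirroring the Python-to-SQL handoff described above.
conn.create_function("PREDICT_REVENUE", 1, predict_revenue)

conn.execute("CREATE TABLE pipeline (units INTEGER)")
conn.executemany("INSERT INTO pipeline VALUES (?)", [(3,), (7,)])

# The analyst's side of the workflow: pure SQL, no Python in sight.
preds = conn.execute(
    "SELECT units, PREDICT_REVENUE(units) FROM pipeline ORDER BY units"
).fetchall()
print(preds)  # [(3, 15.0), (7, 35.0)]
```

The design point is that each persona stays in their language of choice: the data scientist owns `predict_revenue`, while the analyst only ever sees a SQL function.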
You stay within your Snowpark environment, be it Snowpark on a virtual warehouse or Snowpark Container Services. It makes it seamless; it's the same environment in which you create these artifacts and run them in production. And then with Streamlit, it also becomes much more approachable for end users, business users, in the organization.

Yeah, and we're seeing this a lot with Hex too, where with previous sets of tooling, you would get these beautiful analyses or models that were locked up in the data team, and they really had nowhere to go. There wasn't a shared data warehouse that they could connect to, and there wasn't a shared artifact that they could give to someone who was maybe less technical. But as we've come into organizations, we've actually seen Hex rapidly expand to more or less the entire organization, because it's so accessible for everybody at different technical levels. At whatever level of technicality they have, they're actually able to collaborate on those underlying data workflows themselves with these shared Hex projects.

But they don't have to be super technical.

Right, so Hex has a couple of modes it can operate in. There's a more development-oriented mode, a notebook-UI style, but the notebooks are deeply interactive, so you can also build experiences on top of them that you can share with other folks in the organization, building that connection, as we were saying, between the more technical data scientists, maybe writing Python, and the analysts, maybe writing SQL. Hex also has more granular sharing capabilities, where you can share pieces of logic with each other, all the way down to the non-technical users, with interactive applications that you can build using the Hex product.

So you started the company to attack this data workflow hell.
Okay, and so I'm envisioning, and I know organizations that have, I don't know, 30 hyper-specialized roles: they've got data engineers, they've got quality engineers, they've got data scientists, they've got analysts, and they spend their entire day trying to come up with dashboards that are consistent, and it's really painful. One person makes a change, everything stops, and they're waiting for that one person, and they're in these stovepipes. And I've seen companies get started just to try to compress that by 20 percent, because it's so painful. It seems to me like you're blowing that away completely. So what happens to those hyper-specialized roles? Do they blend together? Do they get simplified? Do they just get more productive and do more work? How do you see that playing out?

I would say a lot of that time is being spent just managing infrastructure problems. And I think we are now creating an opportunity for those teams to let go of those infrastructure problems and have us solve them for them. Those teams can now focus much more on the business question they need to answer, and I think that will create tremendous value for those organizations.

What's an example of that? You're not talking about spinning up an EC2 instance, right?

Well, we are, actually. A lot of the time, in order to enable some of the workflows you were talking about, collaborating between different teams and having things that different people in the organization needed to access, it really was as hard as spinning up your own EC2 instance, in the case of something like the open-source notebook product, Jupyter, in order to host Jupyter internally. We were seeing data teams fighting with that level of infrastructure. And these people are really smart and talented, and that is something they can do.
But at the end of the day, what they're trying to do is answer questions that drive the business forward, and spending so much time managing infrastructure, managing data cleanliness, and all of those things takes away from their ability to answer the really interesting, creative questions in the business. And that's what they want to be doing.

Yeah, it's exactly what they want to be doing.

That's what they want to be doing. Guys, great stuff. Hex and Snowpark Container Services: secure analytics and machine learning for the enterprise. When can folks get their hands on this? Now?

Now. Yeah, come talk to us.

All right, we're ready. And what's in the name, Hex? I forgot to ask you, what's in the name?

Yes, it actually comes very simply from the hexagon, originally, because we thought that was a really creative element for building logos and designs and things like that. But then, as we've developed our brand, we've moved more into Hex as magic, because that's really the aesthetic we want to go for with our product. It should feel like magic.

Awesome website. Spell it for us.

Yeah, hex.tech.

Thank you so much for joining us, guys. Congratulations. Go have that well-deserved cheers with the Snowpark team.

Will do, thank you.

Appreciate your insights and your time. For our guests and for Dave Vellante, I'm Lisa Martin. You're watching theCUBE. Up next, we're going to be talking about the telecom data cloud. You can find all of our content on theCUBE.net, and all of our analysis and editorial content on siliconangle.com. You're watching theCUBE, the leader in live tech coverage.