Hey everyone, welcome back to theCUBE. Lisa Martin with Dave Vellante. We're here in Las Vegas with Snowflake at Snowflake Summit 22. This is the fourth annual; there are close to 10,000 people here, and lots going on: customers, partners, analysts, press and media, everyone talking about all of this news. We've got a couple of guests joining us, and we're going to unpack Snowpark: Torsten Grabs, director of product management at Snowflake, and Joe Nolte, AI and MDM architect at Allegis Group. Guys, welcome to the program.

Thank you so much for having us.

Isn't it great to be back in person?

It is. Wonderful. Yes, indeed.

Joe, talk to us a little bit about Allegis Group. What do you do? And then tell us a little bit about your role specifically.

Well, Allegis Group is a collection of operating companies that do staffing. It's one of the biggest staffing companies in North America, and we have a presence in EMEA and in the APAC region. So we work to find people jobs: we help companies find people, and we help individuals get staffed.

Incredibly important these days.

Excuse me?

Incredibly important these days.

It is, it really is right now.

Tell me a little bit about your role. You are the AI and MDM architect. You wear a lot of hats.

Okay, so I'm an architect, and I support both of those verticals within the company. I have a set of engineers and data scientists who work with me on the AI side, and we build data science models and solutions that help support what the company wants to do. We build them to make business processes faster and more streamlined, and we really see Snowpark and Python helping us accelerate that delivery. So we're very excited about it.

Torsten, explain Snowpark for people. I mean, I look at it as this wonderful sandbox you can bring your own developer tools into, but explain in your words what it is.
Yeah, so we got interested in Snowpark because increasingly the feedback was that not everybody wants to interact with Snowflake through SQL. There are other languages that they would prefer to use, including Java, Scala, and of course Python. So that led to our work on Snowpark, where we're building an infrastructure that allows us to host other languages natively on the Snowflake compute platform. And what we just announced here is Snowpark for Python in public preview. So now you have the ability to natively run Python code on Snowflake and benefit from the thousands of packages and libraries that the open source community around Python has contributed over the years. That's a huge benefit for data scientists, ML practitioners, and data engineers, because those are the languages and packages that are popular with them. So we very much look forward to working with the likes of you and other data scientists and data engineers around the Python ecosystem.

Yeah, and Snowpark helps reduce the architectural footprint. It makes the data pipelines a little easier and less complex. We had a pipeline that works on D&B data, and we converted that entire pipeline from Python running on a VM to running directly on Snowflake. We were able to eliminate code because you don't have to worry about multi-threading anymore: we can just set the warehouse size through a task. No more multi-threading; throw that code away, you don't need it. We get the same results, but the architecture to run that pipeline gets immensely simpler, because it's a stored procedure that's already there, and calling that stored procedure is very easy. The architecture we use today takes six different components just to be able to run that Python code on a VM within our ecosystem and make sure it runs on time and on schedule.
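Joe's before-and-after can be sketched in plain Python. The data, names, and transformation below are invented for illustration; the real pipeline would express the "after" side as Snowpark DataFrame code inside a stored procedure, with parallelism coming from the warehouse size named on the scheduling task rather than from threads.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical records standing in for D&B-style rows to be scored.
records = [{"id": i, "revenue": i * 100} for i in range(1, 101)]

def score(row):
    # Toy transformation standing in for the real pipeline step.
    return {**row, "score": row["revenue"] / 1000}

# BEFORE (VM-style): the pipeline hand-rolls parallelism with threads
# to get acceptable throughput on a single machine.
def run_on_vm(rows, workers=8):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sorted(pool.map(score, rows), key=lambda r: r["id"])

# AFTER (Snowpark-style): the same logic written as one set-based pass.
# On Snowflake the warehouse supplies the parallelism, so the threading
# code above simply disappears from the pipeline.
def run_on_warehouse(rows):
    return [score(r) for r in rows]

assert run_on_vm(records) == run_on_warehouse(records)
```

The point of the sketch is architectural, not computational: both functions produce identical results, but only the first one carries concurrency machinery that has to be maintained and reviewed.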
But with Snowpark and Python on Snowflake, it's two components: the stored procedure and our ETL tool calling it.

Okay, so you've simplified that stack and eliminated all the other stuff you had to do, because now Snowflake's doing it. Am I correct that you're actually taking the application development stack and the analytics stack and bringing them together? Are they merging?

I don't know. I'm not real sure how I would answer that question, to be quite honest. I think with Streamlit there's a little bit of application that's going to be down there, so you could maybe start to say that. I'd have to see how that carries out and what we actually produce to really give you an answer, but yeah, maybe in a little bit.

Well, the reason I ask is because we always talk about injecting data into apps, injecting machine intelligence and ML and AI into apps, but they're two separate stacks today, aren't they? Certainly the two are getting closer.

With Python it gets a little better.

Explain that.

Just like in the keynote the other day with Shree, when she showed her sample application, you can start to see that: you can do some data pipelining and data building, feed that into a training module within Python right down inside Snowflake, and then use something like Streamlit to expose it to your users. We were talking the other day about how you get ML and AI in front of people once you have it running. We have a model right now that is a predictive and prescriptive model for one of our top KPIs, and right now we can show it to everybody in the company, but only through a Jupyter notebook. How do I deliver it? How do I get it in front of people so they can use it? Well, what we saw with Streamlit is a perfect match, and then we can deploy it.
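The delivery problem Joe describes, a model stuck in a Jupyter notebook, is easy to see in miniature. Below, a toy least-squares model for a KPI is reduced to a plain `predict` function; the numbers and names are invented, and the Streamlit lines are shown only as comments to suggest how thin the front end becomes once the model lives in ordinary Python.

```python
# Toy "predictive model" for a KPI: fit y = a*x + b by least squares.
def fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx          # (slope, intercept)

def predict(model, x):
    a, b = model
    return a * x + b

# Hypothetical KPI history by week.
weeks = [1, 2, 3, 4, 5]
kpi   = [10.0, 12.0, 14.0, 16.0, 18.0]
model = fit(weeks, kpi)

# A Streamlit front end would be little more than:
#   import streamlit as st
#   week = st.slider("Week", 6, 12)
#   st.metric("Forecast KPI", predict(model, week))
assert abs(predict(model, 6) - 20.0) < 1e-9
```

The design point is that the app layer adds no modeling logic; it only collects input and renders the output of the same function the notebook already calls.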
It's right down there on Snowflake, and it's a much easier path to production, because since it's already part of Snowflake, there's no architectural review. As long as the code passes code review, meaning it's not poorly written and isn't using a dangerous library, it's a simple deployment to production. Because it's encapsulated inside of that Snowflake environment, we have approval to use it however we see fit. It's faster delivery.

That code review has to occur irrespective of, you know, whatever you're running it on. But okay, so I get that. It's a frictionless environment, you're saying. What would you have had to do prior to Snowflake that you don't have to do now?

Well, one, it's a longer review process to get a solution into production, because I have to explain it to my infosec people. It's not trusted...

Well, don't use that word.

Got it. There are checks and balances in everything that we do. It has to be verified.

Yeah, that's all it is. It's part of what I like to call the good bureaucracy. Those processes are in place to help all of us stay protected. It's the checklist.

Yeah, that's all it is. It's like flying a plane. But that checklist gets smaller, and sometimes it's just one box now, with Python through Snowpark running down on the Snowflake platform. And that's the real advantage, because we can do things faster and easier. We're doing some mathematical data science right now, and we're doing it through SQL, but Python will open that up and allow us to deliver faster and more accurate results, more easily. Not to mention, we're going to try to bolt on the hybrid tables afterwards.

Oh, we're going to talk about that. So can you, and I don't need an exact metric, but when you say faster, are we talking 10% faster, 20% faster, 50% faster?

That really depends on the solution.
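To make "mathematical data science through SQL" concrete: some statistics that are awkward to express in pure SQL are a few lines in Python. A common example is outlier detection via median absolute deviation, sketched below on made-up numbers; this is an illustration of the kind of work Joe means, not his actual workload.

```python
import statistics

# Outlier flagging via median absolute deviation (MAD): a few lines in
# Python, but clumsy in pure SQL because it needs a median of deviations
# from a median. The values are invented.
values = [12.0, 11.5, 12.2, 11.8, 30.0, 12.1]

def mad_outliers(xs, cutoff=3.5):
    med = statistics.median(xs)
    mad = statistics.median(abs(x - med) for x in xs)
    # 0.6745 scales MAD to be comparable with a standard deviation.
    return [x for x in xs if abs(0.6745 * (x - med) / mad) > cutoff]

assert mad_outliers(values) == [30.0]
```

Running the same computation warehouse-side via Snowpark keeps the data where it lives while still letting the team write it in Python.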
Give me a range: worst case, best case.

I really don't have that. I know, I wish I did. I wish I had that for you, but I really don't.

Obviously it's meaningful. It has a business impact.

It is meaningful. I think what it will do is speed up the work inside of our iterations, so we can look at the code sooner, evaluate it sooner, and measure it sooner and faster.

So is it fair to say that, as a result, you can do more?

Yeah, that's fair to say. We'll be able to do more, and it will enable more of our people, because they're used to working in Python.

Can you talk a little bit about it from an enablement perspective? Let's go up the stack to the folks at Allegis who are on the front lines helping people get jobs. What are some of the benefits of having Snowpark for Python under the hood? How does it help them get access to the data they need to deliver for their clients?

Well, I think where we would use Snowpark for Python there is when we're building them tools, for example to let them know whether or not a piece of talent is already within our system. Things like that. That's how we would leverage it. But again, it's also new; we're still figuring out which solutions we would move to Python. We have some targeted. I have developers who are waiting for this. They're in private preview now, they're playing around with it, and they're ready to start using it. They're ready to start doing some analytical work with it, to get some of our analytics out of GCP, because that's where they are right now. All the data is in Snowflake now, but we need to move the analytics down too. The data wasn't in Snowflake before, so the dashboards are up in GCP.
But now that we've moved all of that data down into Snowflake, the team that built those analytical dashboards wants to use Python, because that's how the code is written right now. So it's an easier transformation and an easier migration off of GCP, getting us to do everything in Snowflake, which is what we want.

So you're saying you're doing the visualization in GCP, is that right?

It's just some dashboarding, that's all. Not even visualization.

You won't even give me that, okay.

No, because it's not visualization, it's just some dashboards of numbers and percentages and things like that. There are no graphics or anything like that.

And it doesn't make sense to run that in GCP when the data's in Snowflake. You could just move it into AWS or...

Well, what we'll be able to do now: all that data before was in GCP, and all that Python code was running in GCP. We've moved all that data out of GCP into Snowflake, and now we're going to work on those Python scripts that we thought we were going to have to rewrite differently because Python wasn't available. Now that Python is available, we have an easier way of getting those dashboards back out to our people.

Okay, but you're taking it out of GCP and putting it into Snowflake. Where? Anywhere?

Well, we'll build those dashboards, and they'll actually be displayed through Tableau, which is our enterprise tool for that.

Okay, and then when you operationalize it, it'll go...

The idea is it's an easier pathway for us to migrate our existing Python code down into Snowflake and have it run against Snowflake, because all the data's there. It's not going out and coming back in; it's all integrated. We want our people working on the data in Snowflake. That's our data platform, and that's where we want our analytics done.
We don't want them done in other places. Over our data cloud journey, we've worked really hard to move all of the data we use out of existing on-prem systems, and now we're attacking the data that's in GCP and making sure it comes down. It's not a lot of data, and we fixed it with one data pipeline that exposes all of that data in Snowflake now. We're just migrating our code down to work against the Snowflake platform, which is what we want.

Why are you excited about hybrid tables? What's the potential?

I'm excited about hybrid tables because some of the data science we do inside of Snowflake produces a set of results, and they're recommendations. We have to get those recommendations back to our people, back into our talent management system, and right now there's about an hour of delay in delivering that data back to that team. With hybrid tables, I can just write to the hybrid table, and that hybrid table can be directly accessed from our talent management system, so the recruiters and the hiring managers can see those recommendations in near real time. And that's the value.

That's the value.

Yep, we've learned in recent years that access to real-time data is no longer a nice-to-have; it's a huge competitive differentiator for every industry, including yours. Guys, thank you for joining Dave and me on the program, talking about Snowpark for Python, what that announcement means, and how Allegis is leveraging the technology. We look forward to hearing what comes when it's GA.

Yeah, we're looking forward to it.

All right guys, thank you. For our guests and Dave Vellante, I am Lisa Martin. You're watching theCUBE's coverage of Snowflake Summit 22. Stick around, we'll be right back with our next guest.
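The hybrid-table pattern Joe describes, one table written by the analytical job and point-read by the operational system, can be sketched in plain Python. The dictionary below merely stands in for a Snowflake hybrid table, and every identifier is invented; the real system would use SQL writes and keyed reads against the same table, removing the hourly export hop.

```python
# In-memory stand-in for a hybrid table keyed by job requisition id.
recommendations = {}

def publish(job_id, candidate_ids):
    # Analytical side: the data-science job writes its output row once.
    recommendations[job_id] = candidate_ids

def lookup(job_id):
    # Operational side: the talent-management system does a point read
    # on the same table, so results are visible as soon as they land.
    return recommendations.get(job_id, [])

publish("REQ-1001", ["cand-7", "cand-42"])
assert lookup("REQ-1001") == ["cand-7", "cand-42"]
assert lookup("REQ-9999") == []
```

The latency win comes from collapsing two stores into one: there is no batch job moving recommendations back out, only a write followed by keyed reads.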