Joining me is Aaron Suzuki, founder and CEO of Prowess. Aaron, welcome.

Thank you. Thanks so much for having me.

Absolutely. Thanks for joining us. So let's dive right in: tell us about Prowess.

Prowess has been around for quite a while; we've been serving the technology industry from the very beginning, almost 20 years. We've always been able to bridge the gap between the story of a product and what it actually does. A lot of times there's a pretty fundamental disconnect between what engineering says and what marketing wants to claim, and that's how we got down this road of testing and validating products, which we do quite extensively today.

What we're focusing on right now is this idea of your independence as a lab, and in this particular case, a series of tests you've done using Dell hardware combined with Broadcom cards. So talk a little more about that concept of independence and what it means.

Yeah, it's important to us that we stay vendor agnostic and platform agnostic. There are a lot of things happening concurrently in the industry: a lot of people want to get a lot of work done really fast, and most customers are not vendor exclusive. In fact, we're not sure we know of any. We always try to keep an objective point of view, which is to say we don't allow our customers to buy results when we're doing quantitative testing. We really are out there trying to come up with a story or a narrative. That seemed to be the missing link in all of this: on one side are the quantitative houses that do traditional benchmark testing, and on the other extreme are the agencies that do the narrative and the system integrators that build out a solution but can't tell you how it will perform. Reconciling those two things really became challenging. So having a source that can give you insight beyond just transactions per unit of time, and finding metrics in between that are more relevant to people's jobs, was really the inspiration for creating this unique practice we call Prowess Labs.

So Aaron, when I think about performance testing, it's very easy to think of it as a bunch of hardware slapped together in a rack, with some engineer or scientist running tests. Why Prowess? What do you specifically bring to the table that's meaningful?

Performance testing is usually done in one of two ways. One is a very academic approach, which says this specific benchmark test, run this specific way, gives us X. The other is more narrative in nature and more demonstrative. There's a huge gap in between, and that's really the gap Prowess Labs exists to fill.

So Aaron, give us an idea of the scale of Prowess. How many of these projects have you worked on? How many customers have you worked with over time?

We do this work with most of the leading global hardware and software manufacturers, and a select number of emerging providers as well. For us, year to year, it's dozens of projects of varying scope and scale. Some projects also run in programmatic form, where we're iterating constantly throughout the year. It's really a lot of fun for our team members, and some of them have been doing it for 10 or 12 years in continuity.
Aaron, thanks for joining us to talk about Prowess today. My pleasure. Thanks for having me.