So, two points, Max. First off, ideas aren't worth a damn. Everybody's got ideas, right? I could give a holy hoot about ideas. I've got people who throw ideas at me all the friggin' time. I don't give a shit. I just really don't give a shit, right? I want actions. Show me how I'm gonna turn something into an action. How am I gonna make something better, right? And I wanna know ahead of time what that something is. Am I trying to improve customer retention? Am I trying to improve recovery time for an athlete who's got back-to-back games? I know what I'm trying to do, and I wanna focus on that. Where ideas become great, and you said it really well, Max, is that ideas are something I wanna test. But I know what I wanna test vis-à-vis what outcome I'm trying to drive. So it isn't ideation for the sake of ideation. It's ideation around the fact that I need to drive an outcome. I need athletes who are better prepared for the next game, who can recover faster, who are stronger and can play deeper into the season. Here we are in March Madness, and we know, by the way, that the teams that tend to rise to the top are the teams that have gone through a more rigorous schedule and played tougher teams, right? They're better prepared for this. And it's really hard for a mid-major team to get better prepared, because they're playing a bunch of lollipop teams in their own conference. So ideas really don't excite me. Ideation does, around an environment that allows me to test ideas quickly, fail fast, in order to find those variables or metrics, those data sources, that just might be better predictors of performance.

Yeah, I like the idea of acting quickly, failing quickly and learning quickly, right? You have this loop. And what happens is, and I think every strength coach in the world is probably guilty of this, is we get an idea and we just apply it. You go, oh, I think eccentric training is this great idea, so we're going to do an eccentric training block, and I just apply it to my athletes, and you don't know what the hell happened because you don't have any contextual metrics that you base your process on to actually learn from. So at the end of the day you go, I think it worked, they jump higher. But you're not comparing that to anything, right? They've been in the weight room for three months. My God, I hope they jump higher. I hope they're stronger. I could sit in the weight room for three months and probably get stronger. My thought is, let's have context, and I call these anchor data points that we're always reflecting back on. So for example, if I have a key performance metric where I want to jump high, I'll always track jumping high, but then I can apply different interventions, eccentric training, power training, strength training, and I can see the stress response in these KPIs. So now I've set up an environment where our charter is still there. My charter being: I'm going to improve my athletic development. That's my goal. And I'm basing that charter on the KPI of jumping high, a key performance indicator. Now I can apply different blocks and interventions against that anchor point over and over again. And the example I give is, I don't come home and ask my girlfriend how she's doing once a month. I ask her every day. That's my anchor point, right? And I might try different things. I might try cooking. I might try making dinner. I might do the dishes. I might stop forgetting our dates. I might actually buy groceries for once. And if she gets happier, then I'll continue to buy groceries. Maybe I'll remember it's her birthday, March 30th. I remember that. That's why I put it on there, right?
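As a concrete illustration of the anchor-point idea Max describes, here is a minimal Python sketch: one KPI (jump height) is measured continuously while different intervention blocks rotate through, so every block is judged against the same yardstick. The block names and numbers are invented for illustration, not anything either speaker actually uses.

```python
# A minimal sketch of the "anchor data point" idea: one KPI measured
# all the time, while different intervention blocks rotate through.
# Block names and data here are hypothetical.
from statistics import mean

# Jump-height tests (cm), each tagged with the block that was running.
kpi_log = [
    ("eccentric", 61.2), ("eccentric", 61.8), ("eccentric", 62.5),
    ("power",     62.9), ("power",     63.6), ("power",     64.1),
    ("strength",  63.8), ("strength",  63.9), ("strength",  64.0),
]

def block_response(log):
    """Average the anchor KPI within each intervention block."""
    blocks = {}
    for block, jump_cm in log:
        blocks.setdefault(block, []).append(jump_cm)
    return {block: mean(vals) for block, vals in blocks.items()}

for block, avg in block_response(kpi_log).items():
    print(f"{block:>9}: mean jump {avg:.1f} cm")
```

Because the same KPI anchors every block, each intervention's "stress response" is compared against the same reference point rather than a vague memory of how things used to feel.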
And so the idea is that in life, the way life works, we have these nodal points that we call anchor points, where we self-reflect and we reflect off of others. We understand our progress in our own life environment based on these anchor points, and we progress and apply different interventions. I want this job. Maybe I'll try something outside of work. Maybe I'll play in a softball league. And we're always reflecting: is that making me happier? Is that making me feel fulfilled? I don't understand why we don't take what we do every day in life subconsciously and apply it in the sports science world. But a lot of it is because it happens subconsciously, because that's how our body has learned to evolve. We have anchor points. I want to survive. I want to have kids, lots of kids, strong kids, and die. And I die so my kids can have my food. That's what we want as a body. Your body doesn't care about anything else. That's why you walk with a limp after you get hurt. You don't walk perfect again; it's a waste of energy to walk perfect. You can still have kids with a limp. I hate to break it to you, we're not running from animals anymore. So we have all these anchor points in life. Let's apply that same model now. And like you said, it's like design thinking, actually having that architecture to outline it, whether it's in that hypothesis canvas, to force us to now do it consciously. Because we're not just interacting with ourselves anymore. We're interacting with other systems, other nodes of information, that have to work together in unison to achieve our company's charter.

Interesting, Max. There are a lot of key points in there. The one that strikes me is measurement. John Smale at Procter & Gamble, back when I was there, said: you are what you measure, and you measure what you reward. That was his way of saying that as an organization, the compensation systems are critical. And the story you just walked through about what you and Kelsey are doing, and how you increase your happiness level, your worth-a-damn-ness, I mean, that is how you're rewarded. If you're rewarded by happiness, then you learn to measure, if you're smart, that you don't miss birthdays, that you do dishes, that you help out around the house. And when you do those things, the happiness meter goes up, and when you don't, the happiness meter goes down. And you know, because you're probably polling not just once a day, but as you walk by her throughout the day on a weekend. You're constantly knowing, right? It's like with your mom, right? You know when mom's not happy. You don't need to be a data scientist to know mom's not happy. And so then you reverse-engineer: okay, what did I do wrong that caused the unhappiness, right? So there are a lot of life lessons we can apply to our business, our operations or sports, whatever profession you're in, about the importance of capturing the right metrics and understanding how those metrics really drive you towards a desired outcome and the rewards you're gonna receive from those outcomes.

Yeah, and with those, it's the right metrics, right? Not just metrics, the right metrics.
If I wanted to know whether someone was happy, I wouldn't go look at the weather. I wouldn't check gas prices, especially if I'm curious whether they're happy with me. Well, maybe those might reflect whether they're happy in general, but are they happy with me, right? Now I'm contextualizing what I'm actually trying to look at. I know a little bit more about what I should look at, but I don't know everything. And so you might have metrics where you say, I know science says this metric is good, this metric is good. And maybe we wanna explore these couple of metrics over here, because we think that either they're related to one of those metrics, or they're related to the main outcome itself. That gives you a way to say: I have these key anchor metrics. It's not stacking the deck, but it's knowing you're gonna get insights out of them. And then I have these exploratory metrics over here, which are gonna allow me to dive in and explore elsewhere. If you're a company, those can be trade secrets, proprietary information. If you're a trainer, they can be ways to learn how different athletes adapt, to make yourself better. And again, whether we're talking about a company or a trainer, there's no difference when it comes to trade secrets, right? Trainers keep their trade secrets and companies keep theirs. As we talk about this, it's really easy to see how these environments, whether we're talking about a company, athletic development, sports science, personal training, health and wellness, are really universally governed by the same concepts, because life itself is typically governed by these concepts. And when you apply these kinds of iterations, you can really begin to quickly learn what's going on, whether or not the metrics you thought were good actually are good, and whether or not you can learn new metrics from that.

Max, you raised an interesting question, or made a point, about something that might be very different in the sports world than it is in the business world. And that is the ability to test. What I mean by that is that the business world is full of concepts like A/B testing and placebo testing and simulations and things like that. When you're dealing with athletes individually, I would imagine it's really hard to test athlete A with one technique and athlete B with another technique when both these athletes are trying to maximize their performance capabilities in order to maximize the money they can generate. How do you deal with that?

So, yes, no one wants to get the shitty program. That's correct. And for the most part, people don't test like that. But here's my solution to it, because I think being a critic without a solution is called being an asshole. My solution is making it very agile. We're not going to be able to test group A versus group B in the classic sense. But here's what you can do, if you're a coach and you have faith in your programs. There are a lot of programs coaches use; every off-season they might try a new one. So there's no real difference, in all honesty, between trying a new program on these seven athletes and trying a different one that you also trust on those seven athletes. Part of that comes from the fact that we have science and evidence to show that both these programs are really good, right? But no one's actually broken down the minutiae of it. So yes, you probably could do A/B testing, because you have faith in both programs.
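A sketch of what that "seven athletes versus seven athletes" comparison might look like, assuming both groups are measured on the change in the same anchor KPI. The numbers are invented, and with samples this small a p-value is a nudge, not a verdict.

```python
# Comparing two programs you already trust on the change in an anchor
# KPI (vertical-jump gain in cm). All data are hypothetical.
from statistics import mean
from scipy import stats

program_a = [2.1, 3.4, 1.8, 2.9, 3.1, 2.5, 2.2]  # seven athletes
program_b = [3.8, 4.1, 2.9, 3.5, 4.4, 3.2, 3.9]  # seven athletes

print(f"mean gain A = {mean(program_a):.2f} cm, "
      f"mean gain B = {mean(program_b):.2f} cm")

# Welch's t-test: does not assume the two groups vary equally.
t_stat, p_value = stats.ttest_ind(program_a, program_b, equal_var=False)
print(f"Welch's t = {t_stat:.2f}, p = {p_value:.3f}")
```

The point isn't statistical rigor at n=7; it's that both groups get a program the coach trusts, and the comparison is set up before the block starts rather than reconstructed afterward.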
So it's not like either athlete is getting the wrong program. They're both getting programs that are probably going to elicit an outcome of performing better, but which one performs the best? The second aspect would be the kind of longitudinal data you can collect very easily to understand the typical progression of athletes. For example, if you coach for eight years, you'll have eight different freshman classes, theoretically. And you'll begin to understand how a freshman typically progresses to a sophomore, and how their key performance indicators typically trend. So you can now say, okay, last year we did this, this year we do this, and I'm going to see if my freshman class responds differently. Is this going to give us the perfect answer? Absolutely not. But without data, you're just another person with an opinion. That's not my quote, I stole that quote, but it's true. Because if we don't try to audit ourselves and understand the process of how someone is developing, then we're strictly relying on confirmation bias: my program was great, pat some guys on the back, they jump higher, we did awesome. But if we're truly into understanding what's best, then we'll actually try to measure some of these KPIs over time. And the example I give, and it's unfortunate and fortunate, I don't mean anything bad by this either: we're on a salary, right? What happens when you're on a salary is that no matter what happens, assuming you're doing your job, you're going to keep your job. But if you look at a startup, a startup has one option, and that's to make money or go out of business. They don't really have the luxury of, oh, we're just gonna hang out, and I'm not saying coaches hang out, or we're just gonna keep going down this path we're on. As a coach, how do I apply a similar model? Well, the thing about a startup is you can go from worth $0 to worth $100 million to a billion dollars in one year. As a coach, we don't have that same environment, because I'm not producing something tangible. We don't have the same capitalistic drive, the invisible hand pushing us the way the free market does. And so we don't always follow the same path that these startups have taken, yet that same path and same model might provide better insights.

So Max, you've hit on something I find very interesting: confirmation bias. If you don't take the time, before you execute a test, to understand the variables you're gonna test, then what happens is that after the test is over, you go back and try to triage what the drivers of that impact were, and confirmation bias and revisionist history and all these other things that make humans really poor decision makers get in the way. So as a coach, I would imagine what you'd wanna do is set up ahead of time: we're gonna test the following things to see if they have impact, right? Like the hypothesis development canvas, right? Thoroughly understand what you're really gonna test. And then when you've done that test, you have much more confidence in the results, versus trying to say, wow, Jimmy jumped two inches higher this year. God, what did he do? Let's figure it out after the fact. Was it what he ate? Was it where he slept? Oh, he played a lot of video games. That must be it. The video games made him jump higher, right?
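Putting those two ideas together, write the hypothesis down first, then judge this year's class against the longitudinal baseline, might look something like the sketch below. The eight historical class gains and this year's number are made up for illustration.

```python
# Pre-register the question, then compare against the longitudinal
# baseline instead of revisioning after the fact. Numbers are invented.
from statistics import mean, stdev

HYPOTHESIS = "New off-season program raises freshman jump gains"  # stated up front

# Mean freshman-to-sophomore vertical-jump gain (cm), last eight classes.
past_class_gains = [2.8, 3.1, 2.6, 3.0, 2.9, 3.3, 2.7, 3.2]
this_year_gain = 4.1  # the class that got the new program

baseline, spread = mean(past_class_gains), stdev(past_class_gains)
z = (this_year_gain - baseline) / spread
print(f"{HYPOTHESIS}: baseline {baseline:.2f} +/- {spread:.2f} cm, z = {z:.1f}")
# A big z flags a different response: a reason to dig in, not proof
# that the program (and not the video games) caused it.
```

The historical classes play the role the control group can't: they tell you what a "normal" year looks like, so an unusual year stands out for reflection rather than after-the-fact storytelling.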
So I think a lot of sports, even more than business, is based on heuristics and gut feel. It's run by a priesthood of former athletes who were great because of their own skills and capabilities, and that maybe had very little to do with their development. I don't wanna pick on Michael Jordan, but Michael Jordan was notoriously a poor coach and a poor judge of talent. He made some of the worst draft choices the industry's ever seen. And that's because he mistakenly thought that everybody was like him. He revised history: well, what made me great were the following things, so I'm gonna look for people like that. Instead of reversing the course and saying, okay, let's figure out ahead of time what will make you a better ballplayer, and then run these tests across a number of different players to figure out which of these things actually had impact. Sports, I think, has gotten much better. Moneyball sort of opened people's eyes to it. Now we're seeing more and more teams realize that data science is a discipline. It's not something you apply after the fact. In order to really uncover the real drivers of performance, you have to sit down before you do the test to really understand what it is you're testing, because then you can learn from the tests.

And let's be honest, right? Learning is a process of exploring and failing. And if you don't try and fail enough times, if you don't have enough of those failure moments, you'll never have any breakthrough moments. I think what people misunderstand is that they hear the word fail and assume, oh, we did a six-month program and failed. But failure can occur in one day, and that's okay, right? For example: I'm going to use this piece of technology as biofeedback motivation to increase my athletes' intent and the amount of effort they put into the weight room. That's your hypothesis. You can test that in one day. You put out that piece of technology and the athletes don't respond. Well, you've learned something now. Okay, that technology didn't bring about the motivation I thought it would. Why was that? You can reflect, not revision, because you had the infrastructure beforehand: maybe notes you scribbled down on your pad, or observations from the coach's eye. It might have been, you know what? The athletes weren't very invested because the technology took too long to set up, right? It wasn't the technology's fault. It was the process of making the technology available to act on and utilize. So maybe you retest with it set up beforehand, or with a piece of technology that's much easier to use, and the intent increases. So now you say, okay, it's not the technology's fault. It's the application of how we're using the technology.

At the same time, to take a little bit of a pivot, not too far though: in the baseball world, you see technology being used more and more as a tool, and it's helping guide immediate actions on the field, whether it's spin rates or arm velocities, with accelerometers or whatever measurement they decide to use. But that's not necessarily collecting data. That's using technology as a performance tool. And I think there's a distinction between the two. The two are not mutually exclusive.
You can still use it as a performance tool, but if the infrastructure isn't there to store, file, reflect on and analyze that performance data, it's only being used one-sided. And so people think, oh, we're doing sports science, we're doing data science, because we're collecting data. Well, no. I can go count ants; that's collecting data. It's not data science unless I count ants every day and say, oh, my ant population's decreasing, right? Here's a really easy way to think of it, in my opinion. You have cookies in the fridge, right? And every week, we'll say, my mom makes cookies. This doesn't actually happen, I wish it did, it would be very cool, and I love you, mom, but we didn't eat cookies every week. But say I go to the fridge and count how many cookies there are, right? Using data, I'd say, oh, 12 cookies. If there are any cookies at all, I can eat, right? That's using technology in the moment. But doing data science is: well, you know what, she's going to make 12 more in a couple of days, I have two days left and there are six cookies, so I can eat three today and three tomorrow. Now you're doing prescriptive analytics, right? Because you are prescribing an action based on the information you collected, and it's based on historical data, because you know that every seventh day the cookies are coming. If I just take it as, I'm using technology as a tool, I might only eat one cookie and forever be leaving cookies on the table, right?

Don't wanna do that.

No, we don't, but we trick ourselves. I think we see this, and I'm not saying baseball does it, but we see it in all domains. When we use technology, we say, oh, technology, good. We had someone use technology, that's data science. No, that's not data science. That's using technology to help augment training. Data science is understanding the information that came out of the training process, looking at it contextually, to then prescribe: I'm gonna do this exercise or that exercise based on the collection and maturation of the information. So instead of, the cookies are here, I eat one cookie, it's: oh, historically I know there are gonna be 12 cookies every seven days, I have two days left, I can eat three cookies now. I can hide two and tell my sister Amelia, oh, there's only one left, very weird, I don't know who ate the other two.
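Max's cookie arithmetic, as a tiny sketch: the historical cadence (a batch of 12 every seven days) plus the current state (six cookies, two days to go) turns into a prescribed ration instead of a one-cookie guess. The function name is just an illustration.

```python
def cookie_ration(cookies_left: int, days_until_next_batch: int) -> int:
    """Prescriptive analytics, fridge edition: how many cookies per day
    so none get stranded when the next batch arrives."""
    if days_until_next_batch <= 0:
        return cookies_left  # fresh batch today: finish the old ones
    return cookies_left // days_until_next_batch

# Historical data says a new batch lands every 7th day; it's day 5.
print(cookie_ration(cookies_left=6, days_until_next_batch=2))  # -> 3
```

The in-the-moment tool answers "are there cookies?"; the historical model answers "how many should I eat today?", which is the tool-versus-data-science distinction in miniature.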
Well, Max, let me wrap up with a very interesting challenge that I think all data scientists face, well, maybe all citizens of data science. And by citizens of data science, I mean people who understand how to use the results of data science, not necessarily people who are creating it. Here's the challenge: if you make your decisions based on the numbers alone, you're likely to end up with suboptimal results. The reason is that there are lots of outside variables that have huge influence, especially when it comes to humans, and even machines to a certain extent. Let me give you an example. Baseball is infatuated with sabermetrics and numbers. Everybody is making decisions with them. We're seeing this now in the current off-season, in who's signing contracts and who's getting money. They're using the numbers to show how much a player is really worth, and organizations are getting really surgical in their ability to figure out that a player is not worth a six-year contract for $84 million, he's worth a two-year contract for 36, and that's the most I'm gonna pay, to minimize my risks.

So the numbers are really driving a lot of that. But it isn't just the big data that helps make decisions. In fact, I would argue the insights gleaned from the small data are equally important, especially in sports, and I think this is a challenge in other parts of business too: the numbers themselves, the data itself, don't tell the full story. In particular, think about how an organization leverages the small data, the observed data, to really help make better decisions. Right now in baseball, for example, in this off-season, the teams became infatuated with using numbers to figure out who they were gonna offer contracts to, how much they were gonna pay them, and for how long. And we saw the contracts in most cases really shrinking in value and size, because people are using the numbers and comparing: oh, so-and-so only got this, so you're only gonna get this. Numbers are great, but they miss some of the smaller aspects that really differentiate good athletes from great athletes. Those are things like fortitude, heart, effort, resilience, the kinds of things you can't find in the numbers. Take a closer, right? Someone who goes out there in the eighth inning and just has a shit performance, gets beat up all over the place, and still has to come back and lead. Does that person have the guts, the fortitude, to go back out there after a bad eighth inning and do it again? Who can fight through when they're tired? It's late in the game, it's a 48-minute game, you've been playing 40 minutes already, you've hardly had a break, you're down by two, the ball's in your hand and a three-pointer's gonna win it. What are you gonna do? The numbers don't measure that. There are these other metrics out there, like fortitude and heart, that you actually can start to measure. They don't show up in the numbers; they come from the insights of subject matter experts who say, yeah, that person has fight. In fact, there's one pro team that, in the minor leagues, actually puts their players into situations that are almost no-win, because they wanna see what they're gonna do. Do they give up or do they fight back? And again, batting average doesn't tell you whether somebody's gonna get up or gonna give up in the ninth inning when you think you've lost. You know what, I don't want the person who gives up out there. So think about, in sports, how you complement the data you can see coming off of devices with the data from an experienced coach who can say that person's got something extra. They've got the fight, they have the fortitude, they have the resilience. When they're down, they keep battling, they don't give up. And you know from experience, from playing and coaching, I know from playing and coaching, which guys are gonna give up. You know who they are. I don't want them on the court, right? They may be the best player from a numbers perspective. Hell, if that was the case, Carmelo Anthony would be an all-star every time. His numbers are always great. The guy lacks heart, right? He doesn't know how to win. So think about how, as a sporting organization, you use the metrics to give you a baseline, but don't forget about the soft metrics, the observable things that tell you somebody has something special.
That is an awesome way to bring this together, because subject matter experts are the people who have been in the trenches, who see it firsthand. Data is here to augment you and your decisions. It's not here to override you. It's not here to take your place. So when coaches fear data, it's the silliest thing ever, because it's giving more ammo to a gunslinger. That's all it does, right? The data isn't gonna win the battle; it's just the bullets. You still gotta aim and fire. And when we look at it in terms of performance and athletic development, all these numbers will never be right, ever. They'll never be 100% perfect, but neither will you. What we're trying to do is help your decisions with more information, information you might otherwise not be able to quantify. It's giving you the paintbrush and not just the color red, but all the colors. Now you can make whatever painting you want, and you're not constrained by the things you can't measure yourself.

If I could add one point, Max, to build on that: data won't make a shitty coach good, but it'll make a good coach great.

Yeah, I couldn't agree more. Well, Dad, thank you for being on here. Really appreciate it. For everyone who's listening, this is going out during prime March Madness time. So to pull the Dean of Big Data away from March Madness, who, for people listening, made his bracket on Google Cloud using AI, I was just thanking him for coming on. Only he would be the one to, I don't want to say take the fun out of it, but try and rig the family bracket with all the augmented decision-making he possibly can.

Like I said, data won't make somebody shitty good, and I'm still not good. Google Cloud couldn't help me. I'm still at the bottom of the family pool.

Oh no, it's great to have you on. I guess every minute of the Dean's time is worth double during March Madness.

Thanks, Max, for the opportunity. It was a fun conversation.

All right, thank you guys for listening. Really appreciate it. And I hope you guys join us next time.