Hey, hey, this is Carlos. I'm the founder and CEO at Product School. Today I'm here with Che Sharma, who's the CEO and founder at Eppo. Hey, Che.

Hey, great to be here, Carlos. Thanks for having me.

I'm happy to have you on the show. Now, I was taking a look at your background before this interview, because I don't come across many data scientists that turn into CEOs. So I'd like to learn a bit more about that part of you before we jump into your current company.

Yeah, definitely. So as you said, my background is in data science. Most of that time was at Airbnb, where I was the fourth data scientist, quite early on. I stayed there for about five years and then worked as an early data scientist at other growth companies. The transition to CEO was honestly out of a desire to build better tools for data teams. So in that regard, it's a pretty natural transition: you're building for the sort of practice you came from. For anyone who's been in the data space or adjacent to it, there have been huge secular trends in how people do analytics work with the rise of the modern data stack. And that implies that we need a different set of tools for the way people work today.

And here's the interesting thing that I've seen from my perspective of working in product. Those tools used to be for the data teams and engineers; you had to be somewhat technical to get value out of them. A lot of them are becoming much more visual, and we are seeing marketers, product people, and other stakeholders using them. So I just want to learn more about that. What is your angle, and how do you see the opportunity to create a tool that is flexible enough for other people, so the data team doesn't become a bottleneck?

Yeah, completely. The way the trends have gone is that it started off with just the basic ability to deal with large volumes of data. The birth of the internet and cloud infrastructure meant large amounts of data.
And then you had the basics of distributed compute with Hadoop, to be able to do anything at all. Now, the problem was that doing anything at all still required a bunch of DevOps chops, data engineering skills, scientist skills, and then evangelism skills. And so you ended up with these full-stack data scientists as the only ones who could use it. But the different parts of the supply chain have just gotten so much easier. Now you don't have to run your own Hadoop cluster; you can just get Snowflake, and it's all quite easy. Orchestrating jobs used to be hard; we literally had to build Airflow at Airbnb. Now there's DBT and other tools to make it easy. So you now have the ability to build a tool stack without starting from the ground up, which I think has led to all these great other tools coming in. And one of the big things is that so many of these capabilities used to exist in-house at places like Airbnb. Now that they're turning into actual companies, they're getting front-end resources and designers, and self-serve becomes a lot more possible.

Yeah, so let's talk about that. I feel that passion. You've definitely been part of the problem as a data scientist for different companies. So how did you experience that old-school way of building data stacks, and how did you arrive at the solution that you're building today with Eppo?

Yeah, completely. So again, my most formative data science career was at Airbnb, where we started off with just MySQL replicas and crontabs, which, for anyone who deals with data infra, is not a great place to scale a practice. We eventually adopted Hive, which is a way to run large jobs, and then we built Airflow to manage it. But along the way, the story I always like to tell is that the thing about Airbnb, in comparison with other companies, is that it was founded by designers. It wasn't founded by people who were naturally quantitative, data people.
It was founded by people who were much closer to a Steve Jobs, Walt Disney approach of intuition and design. And so as a data team, even while we were spinning up all this infra, we also had to drive impact in a way that would win over the org. We didn't just come in with a halo that let us win arguments. And the thing that always struck me was that we did a lot of great things at Airbnb, but the thing that made the data team into what it is today, which is probably one of the most sophisticated orgs on the planet, was the experimentation infrastructure. A/B experiments are this fascinating workflow that builds an intimacy with metrics where you don't have to know about ETL and databases and things like that. Instead, you just use this fairly simple paradigm you've probably been doing since the fifth grade: divide people into two groups randomly, have one group do one thing and the other do another, and see who does better. You're engaging with science without having to know all the ETL or data. And that ended up being a huge fit for Airbnb. Airbnb was such an entrepreneurial place, so the idea of being able to prove the quality of ideas without political processes was a great fit. And we ended up seeing some incredible impact during those years. I would say right around 2014 to 2016, there were some huge changes coming from small interventions and large interventions. It was an incredible moment.

And I remember those moments, because this concept of A/B testing was really revolutionary, and there were pioneers like Airbnb that made it tangible for the non-technical user. I remember the first time I was exposed to an A/B test: I was able to literally tweak a couple of variations, put them in market within minutes, and then start getting insights and making decisions on that. Now, today, we see in product management that A/B testing, or experimentation, is a huge component. It's a huge skill that is being required in interviews.
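The "fifth grade" paradigm Che describes, randomly split users into two groups and see which one does better, can be sketched with a standard two-proportion z-test. This is a generic Python illustration, not Eppo's actual statistics engine, and the conversion numbers are invented:

```python
import math
import random

def ab_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-proportion z-test: did variant B convert better than A?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF via erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# the random assignment step: a coin flip per user
group = "treatment" if random.random() < 0.5 else "control"

# made-up numbers: 4.8% vs 5.6% conversion on 10k users each
z, p = ab_test(conv_a=480, n_a=10_000, conv_b=560, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these sample sizes the lift clears the usual 0.05 significance threshold; in practice the hard part is everything upstream of this arithmetic, which is the point Che makes next about trustworthy data.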
So from your perspective, how much knowledge of experimentation or A/B testing do you expect from a PM in order to be successful?

Yeah. This is a big point of passion, because if you consider the state of tooling for most of history, the assumed knowledge of a PM is actually quite high. There are a lot of ways you can shoot yourself in the foot with an experiment: not tracking enough data, having biased data, having improper setups, using statistics improperly. And these tools never really helped you there. They were happy to let you set up experiments and give you positive or negative results, but you could never quite trust them. I think that leads to a situation you see a lot today, which is that if you want to be a PM running A/B experiments, you purchase some third-party tool for hundreds of thousands of dollars, and then you still have to hire some data scientists who are actually going to give you trustworthy results. That was really a big part of what led to Eppo being built: how can you have something really trustworthy, as if you had a data scientist pulling data from the warehouse, giving you the ability to storytell, to root-cause, and to socialize, all while being reliable? So the answer is, up until Eppo, I would say a PM was responsible for knowing the data lifecycle and statistics. But with Eppo, you basically just need to understand the scientific process. What do you want to learn? What is the best way to learn it? What are you looking for?

So what are some of those best practices for PMs to collaborate effectively with data scientists, and vice versa?

Definitely. This is something I've felt on both sides of the aisle, both as a data person and as a PM. There's a kind of startup cost of just making sure your data house is in order. So it's worth your org having at least one or two data people just to check: do you trust your metrics? When you say you're counting the number of purchases, is that actually an accurate count of purchases?
So if you're running an experiment that's meant to increase revenue or increase purchases, is the underlying data actually high quality or not? So I would say there's a startup cost, which is: do you actually trust your metrics? Once you have that, the interaction between the PM and the data team becomes: do you have what you need to root-cause things? For example, you have a purchase. Do you know the marketing channel the user came in on, or the subscription tier they're on, or the device they're on at that moment? The sort of things that might be explanatory for why a certain conversion happened or did not, or why some effects are happening or not. Making sure that's all in place. So basically, making sure you're well instrumented and that you trust your underlying data. That's something a data person and a PM should invest in early, to make sure you're in a good state there. Because instrumentation is something where, once your product is out in the world and you did not instrument it, you can't go back in time and get the data. You have to do it up front.

And I agree. I've seen this evolution in product and tech in general, but specifically with data tools, where a lot of companies start with a spreadsheet, literally just putting in numbers and creating formulas. And then suddenly they have these elaborate architectures with a bunch of different tools, to the point where it's also unclear, right? What tool to use, who to ask, and why did we get here? And there are a lot of tools and examples out there in the market; some of those companies are public today. I mean, there's clearly value, but at the same time, there is this confusion. So I'm hoping you can help us identify, from a product perspective, what are some of those key layers of a data stack? And how do you see Eppo positioned in the whole stack?

Completely.
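The two startup costs Che names, "do you trust your metrics?" and "are you well instrumented?", can be sketched as simple checks. This is an illustrative sketch only; the dimension names and the 2% tolerance are assumptions, not Eppo's actual validation logic:

```python
# Reconcile purchase counts from event data against the transactional
# source of truth, and verify each event carries the dimensions needed
# for root-causing (channel, tier, device, per the conversation above).

REQUIRED_DIMENSIONS = {"marketing_channel", "subscription_tier", "device"}

def reconcile(event_count: int, transactional_count: int,
              tolerance: float = 0.02) -> bool:
    """Event pipelines drop data; flag when the two sources disagree
    by more than the tolerance (2% is an arbitrary example threshold)."""
    if transactional_count == 0:
        return event_count == 0
    return abs(event_count - transactional_count) / transactional_count <= tolerance

def well_instrumented(event: dict) -> bool:
    """Does this purchase event carry the fields needed to explain results?"""
    return (REQUIRED_DIMENSIONS <= set(event)
            and all(event[d] is not None for d in REQUIRED_DIMENSIONS))

event = {"marketing_channel": "paid_search",
         "subscription_tier": "pro",
         "device": "ios"}
print(reconcile(9_850, 10_000), well_instrumented(event))  # → True True
```

The point of checks like these is exactly what Che says next: they have to run before the experiment, because missing instrumentation can't be backfilled.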
So lately, as in the past five years, you've seen a huge consolidation onto a common stack. The good news is there's basically a right answer now. If you look at any sort of growth-stage company, so post-product-market fit, and even earlier, you'll see this data warehouse model. Basically, there's a big three of databases, Snowflake, BigQuery, and Redshift, that almost every company uses. My favorite of the three is Snowflake, for a variety of reasons. And then a lot of different SaaS tools will write to Snowflake out of the box. If not, you need some sort of ETL tool, like a Segment or a Fivetran, to get the data into Snowflake. And then, once your data team is in place, you can bring on DBT to actually calculate what you need. So when we say the modern data stack, you're basically talking about Snowflake with DBT on top, with some tools to get the data in there. And the benefit of that is that the most trusted metrics are usually pulling from a few different data sources. For example, your event data might not be very reliable, but your engineering transactional tables are actually quite reliable. Maybe you have multiple points of sale, and so you have purchases coming in from different sources. So basically the paradigm is: get everything into one place so that you can make logical choices about how to best model your data.

And you're saying that that one place, if it's high-level enough, can be accessible by people who are not just on the data team?

It should be made to be. SQL ends up being the lingua franca for a high-quality data store. But a lot of companies will get started by just using the raw data and pulling from it. The engineers, if they were pulling purchases, would just query a Postgres table or something like that. The benefit with Snowflake is that your PM may or may not have access to a Postgres table, but Snowflake can be given broad access for pulling things.

Yeah.
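The "get everything into one place, then model it" paradigm can be shown in miniature. Here an in-memory SQLite database stands in for the warehouse, and a view stands in for the modeled layer DBT would build; all table and column names are invented for illustration:

```python
import sqlite3

# Land purchases from two sources (event stream, point-of-sale export)
# in one place, then define a single trusted `purchases` model on top.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE raw_events (user_id TEXT, amount REAL);
CREATE TABLE raw_pos    (user_id TEXT, amount REAL);
INSERT INTO raw_events VALUES ('u1', 20.0), ('u2', 35.0);
INSERT INTO raw_pos    VALUES ('u3', 15.0);

-- the "modeled" layer: one logical definition of a purchase,
-- regardless of which source it arrived from
CREATE VIEW purchases AS
  SELECT user_id, amount, 'events' AS source FROM raw_events
  UNION ALL
  SELECT user_id, amount, 'pos' AS source FROM raw_pos;
""")

# anyone with SQL access queries the model, not the raw tables
total, = db.execute("SELECT SUM(amount) FROM purchases").fetchone()
print(total)  # → 70.0
```

The design point is the one Che makes: because everything lands in one store, the "what counts as a purchase" decision is made once, in the modeled layer, instead of separately by every person pulling data.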
One trend, and thank you for clarifying, because I appreciate those types of specific answers; we hear so many options, and I think laying out these layers helps inform that decision. One of the trends: everyone's talking about AI and ChatGPT and how it's going to change the world. We've been hearing it for quite some time already. How are you thinking about really using AI and other technologies to get to insights faster, to not just present the data, but help a person make a decision with that data?

Yeah. Let's see. If you're trying to say, I want to uncover insights into how to best improve my product with AI, I'll say I haven't seen it yet with ChatGPT or AI. That level of meta-analysis, no; I've seen more shallow queries being done quite readily, or authoring a blog post or something like that. You can use ChatGPT for things like, how do I query this type of data, and it'll give you some sort of example, but it's usually not something to completely operate on. The thing with data is that the more consequential the decision, the more precise and clean the data has to be. If you're just making some sort of directional call, it's like, okay, plus or minus whatever, it's fine if it's 10 or 20% off. If you're making calls like, should I launch this product, or should I decommission the code and undo the last month of work, which is basically the call you're making with an A/B experiment, then you really want the right answer. You want to make sure you're making the right call.

Yeah. It's reminding me of what you said first about, do you trust your data?

Yeah. That's a big part. If you want data to be part of the decision process, you have to first trust it.

So, kind of shifting gears a little bit into your team and how you grew your company. Can you give me an idea of how big you guys are, using any metrics that you feel comfortable with?

Yeah.
I mean, we're around 20-odd people, a bunch of data practitioners from across the industry. Again, my formative career was at Airbnb, and we have Snowflake, Uber, Stitch Fix, and Pinterest DNA in here. So quite a lot of different experimentation capabilities. I would say one place we tend to really spike is data engineering, because experimentation is one of the more intense data engineering applications out there. That's usually what people find when they start building this in-house. And the other is design, because again, if you want trust, if you want people to make consequential decisions on things, then it has to be really clear what's going on. You need to not only feel empowered yourself to understand what you're seeing, but you need to have the tools to storytell to your org. So I think those are two places where I take particular pride in the quality of our team.

And Che, for you personally, how have you evolved as a CEO, from when you were a smaller company to today, where there are more data people and designers and others helping you realize your vision?

Completely. Well, the first thing is just actually making your vision and strategy really concrete. I think a big part of what the CEO needs to do is set the direction of the company and say, what are the things we will be doing, what are the things we won't be doing, and especially in the next quarter, or two or three, what is the very particular focus that underlies the strategy? Painting a really clear picture of where the company needs to go. You can do this in collaboration with your executives and others. And then the next stage would be making sure you have the right team to execute that strategy, hiring the appropriate people. Once you have the people, making sure they're in the right positions, they're unblocked, they can operate really fluidly. And then from there, so much of culture stems down from the CEO.
So also making sure you're extremely intentional about the culture you set.

I come from a technical background as well, and speaking from experience, talking about strategy, culture, recruiting, it's not something that came naturally. So I'm just curious to know, how were you able to acquire those skills, while also probably letting go of some of your data skills to empower your data team to grow?

Yeah, I mean, I have always found that as a CEO and founder, I learned the most from other founders. I'm very fortunate to have a nice cohort of contemporary founders, and also people who are much further ahead, who I can constantly learn from. But honestly, a lot of it is also just learning with your team. You're all in it together, you're all in the thick of it, working through problems together. And everyone's got a job to do; some are directly building the product, some are selling it. As a CEO, you have to take stock of, as everyone's doing their jobs, does the whole add up to more than the sum of its parts? And if not, then what's missing? Is there some process? Is there some direction? Is there something else to add?

How is your product team structured?

We structure our product team on the life cycle of experiments. Every experiment starts from some leadership mandate on what the priorities for the business are. Is it growth? Is it profitability? Is there some other strategic outcome? From there, that translates into metrics, and from there, product teams need to come up with ideas for how to drive those metrics. There's a planning and setup stage. From there, you give birth to certain experiments, you set them out in the wild, and you've got to make sure that they run properly, that they execute as you expect them to, and that no issues arise along the way.
And then you make decisions, and you evangelize those decisions. So each step of the life cycle is something that we've organized teams around, to make sure those moments are always highly trusted and always appropriately collaborative. And in the end, they lead to Eppo champions getting promoted. I always say our ultimate success criterion is that the teams using Eppo are rising up within their organizations. They're always seen as reliable. They always seem to know what's going on. And the org just really wants to put more power into their hands.

I love asking this question of what I call product CEOs, people who come from a building background, who understand what it is to get something done. At the same time, we were talking before about how a company is bigger than product. Product is a huge component, and at the very beginning stage of a company, it's probably almost everything. But for you now, where obviously you have a strong product and a strong product team, you also have other things to take care of. How do you structure your time day to day or week to week? Where do you put most of your energy these days?

Of my personal energy? Yeah. I think a big thing is to say, of the different initiatives going on, which are the ones that need further investigation, correction, effort added? Perhaps the org is a little bit unevenly bent towards one side, and you're going to have to make a staffing decision in this place. And so you get your hands dirty in it and start doing the operating, just to get a feel for who you need to actually be in there. Sometimes you get a sense that a certain piece, whether sales or marketing or product or eng or whatever, is having trouble working through some blockers.
And so you kind of get in there and, again, get a feel for how it's going, working with your executive there in parallel, of course. And then other times, you might have a strategic decision to make. There's not some direct doing to happen, but you want to work with the executive and actually go through the decision together. So in how I decide to personally spend my time, there's a mixture of reasons why I might focus on marketing or sales or product. But the most important thing is that it's done intentionally, right? You start out the week, you look at your calendar, and you say, what are my priorities, and does my calendar match my priorities?

Yeah. And you mentioned recruiting and having a strong executive team a few times. I want to double-click on that, because I agree. The only way I feel comfortable, say, moving on to different areas is by trusting a team. And I know how hard it is to find these types of people. So what's your philosophy around hiring?

Yeah, I would say the big thing, especially with an executive team, is you have to trust them. These are the people essentially taking care of your kid, right? You have to feel very good about putting this set of work in their hands. And so that requires, one, doing the proper research, talking with other people about hiring these executives, and synthesizing your own experience, to intentionally lay out what sort of background you want. The other is to just psychologically unpack yourself as a CEO and convey to someone: what do I want to see from you to make sure I trust you? So that when I delegate work to you, I'm not going to be helicoptering over you or anything like that.
So when you make the outcome of trust a first-order goal, then it becomes: okay, you need to look within yourself, you need to look within the function, you need to learn what it takes to get there. What sort of handshake agreement can you make with the executive to make sure you get there? And then once you have that initial trust, it's around aligning on outcomes, to say, what do I expect to see from this practice, such that if you hit these milestones, these outcomes, then you get broad deference, you get empowered, you get celebrated? And if you don't hit those outcomes, what else would you want to see? So I think a big thing as a CEO is that you can't just keep a bunch of criteria or evaluations in your head; you need to get it on paper, you need to get it really clear and transparent.

I'm smiling because that's a part where data can help, but sometimes it's not enough, right? The interpersonal dynamics, and really making sure there's a strong commitment and trust, that's critical, at least in my experience as well. And obviously, as someone who comes from a data background who's building a data product, I can imagine you guys drink your own champagne and leverage some of Eppo internally.

Yeah, yeah, completely. We've set up some experiments here and there. We use our own feature flagging. And I think beyond just literal A/B experiments, we just believe in running experiments, right? The idea that, let's just try it out and look for some signals that it's working or not, even if they're qualitative signals. As a culture, we really embrace the idea that if you have some idea and you want to try it out, go try it out.

Tell me more about that, because obviously I believe in that, and I've heard from other CEOs how important it is.
The counter is, if there is not enough psychological safety, what's going to happen if that experiment actually doesn't work? So I want to know more about how you are able to create those conditions for people to feel empowered to actually try things.

Yeah, completely. I think a lot about the Jeff Bezos framework, the type one, type two decisions, if you've heard of that. Basically, every decision is reversible or irreversible, right? Reversible decisions are like, okay, you did this thing, you decided it wasn't good, and you decided to go the other direction completely. Does it actually matter? If you change your mind on this decision, is it something you can take back? And if it's the case that you can take back the decision, it's reversible: you can start doing this other process instead of this process, this other workflow instead of this workflow. You should just try it out, right? If the cost of making the wrong decision isn't that high, you should just try it out. Alternatively, there are other decisions which are closer to one-way doors. Once you go down this route, it's actually very, very messy to undo. And that's where you actually have to align and treat the decision a little more seriously. You may still run an experiment just to de-risk it, but it's going to be something much more intentional. You're probably going to write a one-pager, you're probably going to do a pre-mortem, that sort of thing. But the vast majority of decisions are actually reversible.

So what's next for you? What's the vision for Eppo?

Yeah, I think a big thing we want to do is, we're not just selling experimentation infrastructure, we're trying to sell an experimentation culture.
For all of the companies we work with, we want to make sure that everyone in those companies is able to test out ideas, show impact, and be able to show ideas are good without having to win political processes. And so for that, one starting point is to just say, let's make Eppo intensely collaborative, so that for decisions big and small, you can all get together, make those decisions, document them, and turn them into something that's knowledge-generating. And then the other part of it is to increase the scope of the types of decisions that are being made with Eppo. There are a lot of things that are not just A/B experiments but still have that shape, where you should be able to try them out and see how it goes. So I don't want to say too much more until we roll it out, but I'll just tease that the idea that we're trying to change a culture, and that involves bringing many more people into a collaborative decision-making framework and handling a lot more types of decisions, is very much on our radar.

Well, I really enjoyed chatting and learning from you. Thanks so much for your time.

Absolutely. Thanks for having me, Carlos.