Hello everyone, and welcome to this week's Product School webinar. Thanks for joining us today. Just in case you didn't know, Product School teaches product management, product leadership, coding, data analytics, digital marketing, and UX design courses online and at our 16 campuses worldwide. On top of that, every week we offer some amazing local product management events and host online webinars, livestreams, and Ask Me Anything sessions. Head over to productschool.com after this webinar to check them out. Today we have an awesome guest presenting. I'd like to introduce you to Allie O'Connell. Allie's background is in early stage and growth stage startups, working in roles from CEO to growth marketing to product. She's an advocate for lean startup and agile methodology to find product market fit in her work as a product manager at CarbonFlight. On top of this, Allie has also co-founded an AR startup called Carbon Objects. Feel free to leave any questions for Allie in the Facebook comments and I'll ask her at the end. Without further ado, let's welcome Allie. Thanks for joining us today. Thank you, yeah. Thanks for having me at Product School. Just one second and I'll get some slides started for us. Cool. Is that full slide showing up? Yeah, looks good. Awesome. All right, thanks so much. So yeah, as Dan mentioned, I'm Allie O'Connell and I'm gonna be talking with you guys today about what it means to be data-driven as a product manager. Yeah, so who am I? Dan gave me a great intro. But yeah, I think people find their way to product management through lots of different avenues. I was an international and global studies major. I was really interested in whitewater kayaking coming out of school. I ended up working in technology and arts administration as the director of a startup incubator, and eventually started a startup that had to do with art and augmented reality and married those two passions.
From there, I really lived the startup lifestyle and ended up working as the director of growth at another startup called Bellhawks. I consulted with some other folks and then ended up doing product management at Carbon Five, where I've been for three years. And in this role, I do consulting, which I love because we get to change projects every few months, and I've been able to work in so many different markets and industries and with different teams. And I think it's really shown me how variable the role can be in different organizations and how to flex and make sure we're building the right thing for the right people. So why am I here today? Today, we're gonna talk a little bit about what it means to be data-driven, because it can look very different in different product stages. And I'm here to give some snapshots of what I've experienced in utilizing data to drive decision making for software and products. So this will be pretty high level and kind of give you a snapshot of different moments. But I think in any one of these topics, you could dive really deeply, and there's lots of rich content on these subjects too. All right, so I wanna talk about what being data-driven definitely is, and this is how this talk is gonna break down. It's starting with a baseline so you know if you're going up or down on certain metrics. It's tracking the vitals so you know if something changes. Defining a North Star, which helps align your business and your product vision, and that can change. And defining and prioritizing assumptions and questions, creating and tracking experiments to vet those assumptions or answer those questions. And then reviewing and restarting this whole process as needed as your product grows and changes. All right, what being data-driven is not. These are things that I've seen people talk about as being data-driven, and I think that this is kind of the wrong way to approach it. So it's not using metrics to hide behind because the team can't debate and decide together.
This is not a replacement for a healthy working relationship in product, design, and development with the stakeholders. There's just no substitute for that; you can't lean on metrics to replace it. It's not valuing quantitative metrics over qualitative metrics; they're both very important. And making decisions based on data without considering product and business goals is not being data-driven. So you can't just say, this metric tells us we should do this. It's really more holistic than that, and it should always roll up to the goals that the product and the business have. And yeah, some people talk about qualitative versus quantitative metrics, and I argue that it needs to be qualitative and quantitative metrics. So yeah, there we go. That's our list here. All right, so the first step is starting with the baseline. Where you start will look a little different based on product phase. So I'm gonna go over some different product phases and what getting a baseline looks like when I step into a new project with a new company. These are some of the things that will help you get your bearings so you can start. So for an existing product. All right, for an existing product you wanna start with: what are the measures of success? So you're gonna wanna find out how does the business define success for their business model? What does the product team do to measure success for their users and user value? And do these things align? We'll talk in a little bit about creating a North Star metric, but this is the question you want to start off with: is there any kind of disparity between these two things? And what metrics are used? And you have some tools for those things. So I love doing stakeholder interviews one-on-one with a script, trying to figure out what success looks like to different stakeholders in the company. Could be marketing and sales, the CEO, the development team.
Any folks you can talk to that are gonna have an opinion about what success looks like to the business. Find out what metrics they're using right now and create an artifact so that you can come back to that thing over time and make sure that everyone's still on the same page. And then work on resolving differences between the business and product, and we'll talk about that a little more later. And user level behavioral data. That's the next thing you wanna get a baseline on after you understand the measures of success. So you wanna start tracking and recording existing user level data if they have any. Often the company you're stepping into doesn't have a ton of behavioral metrics at the user level. So that you can actually segment users and understand how people are using the product differently, you wanna get something set up pretty quickly so you can start tracking that. And you're gonna wanna make that another baseline. Make an artifact, record those things. You're gonna keep tracking it in the future, but it's great to have a jumping off point. And you wanna create dashboards so that anyone on the team can have transparency into what users are doing on the platform. The toolkit for this is anything they have existing; implement a track-everything tool really quickly if you can. It may or may not be the thing you want to use long term, but Heap is a great example of a really low-dev-effort way to start tracking all your user behavior. And then you can go back and query it and create dashboards, but it only starts once you install it. So you wanna make sure that as early as possible you start tracking, because it will take a little while to get enough data to create a baseline. All right, so now we have user behavior. We have the definition of success for an existing product. Now we're gonna talk about a qualitative baseline. So this is talking to users one-on-one. There's really no substitute for this work, and surveys and focus groups don't really cut it.
To do this you're gonna wanna flesh out your user personas, conduct user interviews with at least five people in the most highly prioritized segments you wanna work with, and make sure that the personas you've defined are really the ones that show what the user pain points are and are a good example of who the users are you wanna focus on. And you wanna know: do your assumptions about who the users are line up with their behavior in the product? So you can marry that with your user behavior data. And you wanna have a dialogue while people are using the product; you can observe them one-on-one and ask them questions about what they're doing and why they're doing it, that's great. Toolkits for this are creating personas, like I mentioned, and observation interviews. You can use session recording tools like FullStory, but I've found it is a lot better when you can actually ask people in person about their experience and what they're doing, and watch them use the tool in case there's something you can learn that they are not telling you. And then synthesize and create an artifact of learnings, and this is your third baseline. You wanna make sure you've got a qualitative understanding of who the users are in addition to these other pieces, and from there you'll have a great jumping off point. I wanted to mention a little bit about a scaling product baseline as well. This could be an existing product, but companies that are in growth stage are really interested in optimizing. There's a growth focus in these companies, so there's a lot of dovetailing with growth marketing, but these metrics are extremely important to the company and so they're extremely important to you as a product manager. And I wanted to mention it here because in this stage of the company you might not be adding a lot of new features. You found product market fit and now you're trying to get as many users into the funnel as possible and scale your company.
So you're gonna be focusing on lifetime user value and retention. What's the conversion rate? You're gonna be looking at your funnel: at each step of the funnel, what's the conversion rate there? And here's an example of what a conversion rate funnel might look like in a very simplified way. This is a screenshot from Amplitude. So this is a product where someone signs up for something, they have to go to an email, and then pay for a subscription. So you see that the actual conversion rate is pretty small, but that's not unusual from the top of the funnel to the bottom of the funnel. A product might have many more steps than this, and as a product manager you might be doing A/B tests and refining things to make sure that there's less friction for people to get what they want. And these are tools that help with that: using Optimizely to A/B test things, doing the same thing in email marketing, path analysis kind of like what we just looked at, considering BI tools, and unit economics to understand how this is all rolling up to the business metrics and the bottom line for the company. So this might be something you'd see. It's not in every company that I've worked with, but it's definitely a focus in startups around the three-year mark, when it's grow or die. All right, and a note about a new product baseline. This one can be tough because you don't have a lot of users to come in and start measuring things. So for a new product baseline, the most important thing is: do we have product market fit? For a brand new product or a brand new company we wanna know what is the problem we're solving, and for whom, and will that work? So the things we wanna use here are user interviews. We wanna think about the lean canvas and business model canvas, prototype testing, and conducting landing page experiments to test demand. So here are some of the tools we might also use. Create personas, do user interviews to make sure those are correct.
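To make the funnel math from the sign-up example concrete, here's a minimal sketch, not from the talk, of computing step-by-step conversion rates from a raw event log. The event names and data are invented for illustration, and unlike a real tool such as Amplitude, this ignores event ordering and conversion windows:

```python
# Hypothetical event log: (user_id, event_name) pairs. Names and data are
# made up for illustration, not taken from the talk's Amplitude chart.
events = [
    ("u1", "sign_up"), ("u2", "sign_up"), ("u3", "sign_up"), ("u4", "sign_up"),
    ("u1", "confirm_email"), ("u2", "confirm_email"),
    ("u1", "start_subscription"),
]

def funnel_conversion(events, steps):
    """For each funnel step, count users who completed it and every prior
    step, plus their share of the top-of-funnel cohort."""
    users_at = {step: {u for u, e in events if e == step} for step in steps}
    reached = []
    current = set(users_at[steps[0]])
    for step in steps:
        current &= users_at[step]  # must also have completed all earlier steps
        reached.append((step, len(current)))
    top = reached[0][1] or 1  # avoid dividing by zero on an empty funnel
    return [(step, n, n / top) for step, n in reached]

steps = ["sign_up", "confirm_email", "start_subscription"]
for step, n, rate in funnel_conversion(events, steps):
    print(f"{step}: {n} users ({rate:.0%} of top of funnel)")
```

With this toy data the funnel drops from four users at sign-up to one paying subscriber, the kind of top-to-bottom falloff the talk describes as normal.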
Use vision prototypes, paper prototypes, and design sprints from that great book from Google. And here's an example of what you're vetting. The questions that you're asking at this stage aren't really about conversion rate or testing a complementary offering; they're really core to the business. So I like to think of it in this funnel shape, because it allows you to say: what are the biggest, riskiest problems to this whole business model, and start solving them from there. So this is an example for a made up product. Is this a problem that our users actually have? Is it a big enough problem that they'll pay for a solution? Is this the solution that they'll pay for? How are we gonna tell people about our business? So as you work through this funnel, your business model gets less risky and you can find alignment between the product and the market. So yeah, what about quantitative metrics? That's a good question, because a lot of what you're doing at this stage is on-the-ground, pretty scrappy lean startup stuff, but you can find some quantitative stuff to help you along your way. One of those things is existing market studies. I do a little bit of mentoring at a startup accelerator still, and this is the thing that is such low-hanging fruit. If people are curious what the market opportunity is, or how much people use smart speakers in their home, or which platforms people are on, or whether we'll be able to get access to users on the platform we wanna go to market with, you can find existing market studies, and often they're free or very cheap. You can even find some of them on LinkedIn. Academic research is another thing; you can use Google Scholar and find some academic research to get more insight into your potential user base. You can conduct surveys at scale yourself and get more of a quantitative metric. You can do tons and tons of user interviews at scale.
And then landing page A/B tests can also funnel more users your way to actually get a quantitative metric about whether your value proposition is desirable or not. So you can do the landing page test to test copy and then kind of throw some ad traffic at it and see what comes up. So there is an opportunity to get some quantitative metrics in there even when you're very early stage. All right, so we have tracked the baseline. We have a jumping off point. We understand, with an existing product, a scaling product, or a very early stage product, how we can make sure that we have some place to move from. So now we're gonna talk about the things you're gonna track on an ongoing basis that are pretty common. You wanna know your active users. And depending on the product, you might want daily active users, or it might be fine to have quarterly active users if it's a data product that finance folks are using and they're only interested in getting in there and getting out once a quarter. So you wanna know what health looks like for your active users. There may be a conversion rate. You wanna know how often users are coming back; that's retention. You wanna understand which features are the most popular, so you can track feature use and user paths: which features are they using, in what order, and session length. So these are things where you might not necessarily have a goal to move one of these markers yet, but it's good to just have a pulse; create a dashboard. And then you wanna know when a change happens to the product and see how it affects usage. So you start tracking these vitals, and then you make a change to the product. It may or may not be part of an experiment; it might be some kind of release, or it could be from a market change. You can track these things, and this is just kind of a basic image of what that might look like on any dashboard.
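As a sketch of what computing those vitals might look like once events are flowing, here's a minimal example. It assumes a simple (user, day) activity log rather than any particular analytics tool, and the data is invented:

```python
from collections import defaultdict
from datetime import date

# Hypothetical activity log: one (user_id, day) row per day a user showed up.
activity = [
    ("u1", date(2024, 1, 1)), ("u2", date(2024, 1, 1)), ("u3", date(2024, 1, 1)),
    ("u1", date(2024, 1, 2)), ("u2", date(2024, 1, 2)),
    ("u1", date(2024, 1, 3)),
]

def users_by_day(activity):
    """Group distinct active users under each day."""
    by_day = defaultdict(set)
    for user, day in activity:
        by_day[day].add(user)
    return by_day

def daily_active_users(activity):
    """The DAU vital: distinct users per day."""
    return {day: len(users) for day, users in sorted(users_by_day(activity).items())}

def retention(activity, cohort_day, later_day):
    """Share of users active on cohort_day who came back on later_day."""
    by_day = users_by_day(activity)
    cohort = by_day[cohort_day]
    if not cohort:
        return 0.0
    return len(cohort & by_day[later_day]) / len(cohort)

print(daily_active_users(activity))  # DAU trend over three days
print(retention(activity, date(2024, 1, 1), date(2024, 1, 2)))  # day-one users back on day two
```

The same grouped sets would let you segment users by which features they touch, in the spirit of the clustering mentioned next.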
You would have a date and then you would see some change in traffic, and it's nice to be able to understand why, or have some theories about why. Monitoring vitals is also important because you can start segmenting users and really fine tune your understanding of different user groups based on usage and cluster them. Also, you wanna make sure you have an avenue for users to contact you in addition to this. You're getting vitals on what people are doing, you're understanding user segments, but they also need to be able to reach out to you and tell you if something's wrong or if they have an idea for the product. And it's often good to have an avenue that's not just your review page, because then you can mitigate some of that before you get a really bad review. Sometimes people just really love your product but wanna help it be a little bit better, and you wanna make sure it's easy for them to find that avenue to do so. I'm gonna touch on the North Star metric just really briefly. It's important because, like we mentioned, when you come into an existing company, especially if there's not alignment between what the business thinks of as success and what the product team thinks of as success, that can be really tough. So it's worthwhile spending some time understanding what is that core metric that defines the relationship between customer problems and the revenue the business wants to generate. And this is pretty complex; there's a whole talk just on this. So you wanna make sure it focuses on customer value, and you wanna assess it over time. An example of what this could be for a startup might be how many client accounts are created through users sharing. That gives you a lot of different levers you can push: you might wanna help users share, or you might wanna make the client account sign-up faster. And so you wanna have a few input metrics involved so that you can push on those different levers.
For growth phase, it might be how many weekly users are uploading files through the platform. An existing product North Star might be how many monthly active users complete some task in a user flow. So yeah, this is how we marry the product and business goals and find alignment. And this takes time, and it takes a lot of input from stakeholders. And it's important to know this thing can also change. This is something that Amplitude put out about the North Star metric that I really like and wanted to share. When you're vetting whether your North Star is a good one, you wanna answer these questions. Does it definitively signal that you're executing on the product strategy? Does it speak to the goals you've already talked about with your team and agreed on? Does it represent a customer getting some unique value from your product, so it really hammers in on your unique value proposition? Is it a leading indicator of an outcome that your business success is measured on? Often that's monetary. And can it be broken into actionable input metrics? You wanna make sure it's not just one thing that's very hard to act on. All right, so next we're gonna talk about how you're going to answer questions you don't have the answers to, or vet your assumptions. So you wanna keep track as you go if there are things that you're not really sure about. Like, okay, we're making assumptions about what users want; let's record them. And then maybe come up with a way later to test and see if those assumptions were correct or not. Same thing with just general questions you have about your market or your user base. And you wanna prioritize those things. So if you have a question or an assumption that you're pretty sure you know the answer to, and it's pretty low impact, then you don't have to test that assumption.
But if it's something that's kind of a blank space and you're not 100% sure, you might not even be 50% sure, and the decision is gonna impact all of your users and be really core to the value proposition of your product, I would probably prioritize that really high at the top. So I've redacted some things that would identify what client we were working with, but this is an example of how, as a project went on, we kept track of some assumptions and then prioritized them. And the reason I say align these with releases is you can have a goal at different milestones in your backlog and say, okay, these are the things we're gonna vet with this release, and we're going to see how users react. And those can be some of your tests as you go. So more about tests. You wanna develop a process and cadence, now that you're moving along, of build, measure, learn with clearly defined experiments. So this allows you to do two things. You can test new ideas like new offerings or new features, and you can also validate some assumptions that you might have about your market or how people are currently using the product. So both of those things could roll in. This is an example of a way to track experiments on a project. Again, I redacted some things so you couldn't tell which client this is we're working with, but this is the very beginning of a project where we had some kind of big questions about an existing, kind of legacy product. It had lots and lots of features but only usage on a few features. So we wanted to understand how people were using it, from metrics but also from speaking with clients. So we have this flow set up. What's really important is what we wanna learn and what we've learned, and then the rest is kind of just a kanban board of what we're going to do next. So we have metrics around what we think we can measure for each qualitative or quantitative experiment, show its progress, and explain what we've learned.
And if we have some ideas, then those will go into the last column, and then it will cycle back around to another experiment. And so in each of these cards we want to show the hypothesis. What we wanna learn is also the title of the card, so I think it helps the group stay focused without having to have an opinion to start with. The method: is it gonna be a user test, or is it gonna be something we launch in the product? To whom: that's the audience. The metrics would be: which metric do you expect to move based on this? Like, should it increase sign-ups? Should it decrease people falling off at that stage in the funnel? Or often it might be a qualitative feedback metric, like a binary "this is useful, this is not useful," or something that we have to analyze more verbally. And then we'll record results and the next steps when it's done. All right, for an existing product, these are the kinds of things we might be testing. We might say: how can we improve UX? How can we simplify this offering? We might change the information architecture in a way that people might understand a little better, especially with a legacy product. Thinking about what integrations users might want, so trying to understand what are they using now, and what are they using upstream and downstream of this product. Test new offerings, of course, and maybe attract a new market with a new product offering that's similar to what we have. For a scaling product, again, the growth focus is gonna be really important, so a lot of what you might be testing here is optimization related, or trying to add complementary offerings to your core product. And then for a startup or, I'm sorry, I have a slide typo, startup or MVP, you're really focusing on the core value proposition, like that funnel we looked at. You could use those questions in this board to try and work through. Those are pretty big questions; you could break them down into even smaller ones and vet them here.
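Since many of those experiment cards come down to "did this variant move a conversion rate," here's a rough sketch of the statistics that A/B testing tools handle for you: a pooled two-proportion z-test. The numbers are invented, and in practice you'd also fix your sample size and significance threshold before running the test:

```python
from math import erf, sqrt

def two_proportion_z(conversions_a, n_a, conversions_b, n_b):
    """Two-sided p-value for the difference between two conversion rates,
    using the pooled two-proportion z-test."""
    p_a, p_b = conversions_a / n_a, conversions_b / n_b
    pooled = (conversions_a + conversions_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Standard normal CDF via erf gives the two-sided p-value.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Invented example: control A converts 120/2400, variant B's new copy 156/2400.
z, p = two_proportion_z(120, 2400, 156, 2400)
print(f"z = {z:.2f}, p = {p:.3f}")  # p below your chosen threshold suggests a real lift
```

For the qualitative, "analyze more verbally" experiments mentioned above, no formula like this applies; those results go on the card as synthesized notes instead.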
And even messaging: are we explaining our value prop very well? You could do tests around what's most successful in the way you're talking about your product. All right, and lastly, you wanna be able to review what you have and re-kick off as you need to. If you need to reassess your North Star metric, you should. If you need to reassess whether you're aligned with stakeholders and their expectations of success, just re-kick off. So I like to start off projects saying: we're here, we're gonna work a few months, and then we're gonna have to reassess where we are, look at our goals again, and maybe reassess what metrics we wanna talk about for success, or maybe even reassess the questions and assumptions we wanna solve and test. And that's it. Thanks so much for joining this Facebook Live talk. Thank you so much for coming to talk to our audience. That was an awesome presentation, really cool examples as well. So I don't see any questions here in our Facebook comments just yet. If our audience has any questions, feel free to post them. But one thing we like to ask all of our featured speakers is: if there was one piece of advice you could give an aspiring product manager, what would that be? Sure, I would say resist the urge to let tools define your process for you. I think it's important to be really thoughtful about the process you're putting in place, evolve it over time, and find tools that fit it, and even bend tools or make a board on the wall or use Trello or something to make it fit your process, so that you're always optimizing for the work you're doing and the team you're working with and not bending to a tool. Awesome, that's great advice. Well, I don't see any comments yet, so let's see, I'll give it another minute. Let's see, do I have another question for you? What's your favorite part about product management? Sorry, no warning on that one, but what do you like about product management? Sure, I like a lot about product management.
I like that every day is pretty different, and often we get curveballs and challenges we haven't dealt with before. So there's a lot of research and study and experimenting just to figure out what to do next, which can be fun. And probably my favorite part is the teams we work with. It's really fun to work with people to make things, and everything we make is better because we're coming at it from different vantage points. That's pretty fun. Awesome, that's really cool. Well, I think that about wraps it up. So before we go, I wanted to give our audience some more information about Product School's upcoming courses and events. Our product management, product leadership, coding, data analytics, UX design, and digital marketing courses are taught by industry experts working at companies like Google and Facebook. In addition to that, we offer weekly online and onsite events at our 16 campuses worldwide. And if you're located near a campus, make sure you stop by one of our weekly events every Wednesday and Thursday. Also, you can find us on social media at Product School, and be sure to keep up with the latest product management content on The Product Blog at productschool.com. Thank you all for joining us today. Enjoy the rest of your day, and I hope to see you next week. Thank you so much, Allie. Have a great day.