This is a quick intro to product management, designed for people who are early in the role or trying to get into it. I'm going to touch briefly on what the role is and what the actual work looks like. One of my pet peeves when I was first starting out was that everything I found about the role seemed to just be a laundry list: here's a laundry list of things PMs do. Part of that is the nature of the role, since PMs tend to bring structure to an otherwise unstructured organization or project plan. But part of it, I think, is that there is a way of looking at all of the work in a structured manner, so I'm going to walk through at least how I picture the role. I'll also touch briefly on the difference between smaller companies and larger companies. I worked at a startup before this, and before that at a company that got acquired by Microsoft, so I've been through a few different places and I'll touch on the differences I've seen across them. Then at the end, I'll touch briefly on the interview process. There's a lot of similarity across interview processes, at least for B2C roles; B2B roles tend to be a little different, but across B2C roles you'll find some commonality. I've interviewed at a ton of places myself, and through the process I've definitely noticed a pattern, so I'll touch on that. So that's the outline: I'll touch on my background, then on what a PM does, where I'll cover three things: setting up a roadmap for your project, how the execution process works, and how you do the analytics and iterate on your project. And then I'll touch on getting a PM job: what skills companies look for, what Google's grading rubric is, and the same for some of the other places I've worked in the past.
And then what questions Google tends to ask, and other places I've seen, for each of those areas. So that's me on the left; this is a little bit about me. I went to Duke. I don't know if anybody here is from the East Coast, but I went to school there. That photo is from the 2010 finals, when we won the NCAA championship in Indianapolis. Pretty awesome. Right after Duke, I worked on PowerPoint in Office, in the early days of Office 365, on PowerPoint Online, the web editor. If you use PowerPoint on either the client or the web, you might have seen those alignment guides that come up when you're moving shapes around: I put those there. In fact, my team and I have a patent on some of the logic behind how that works, because when you have 200 shapes on a slide, you need to figure out which ones matter and which ones don't. Then I went to business school at Harvard; that's where I met my wife, who's actually sitting back there working, not listening. We have a cat; the cat's name is Mochi. After that, I worked at Yammer. For those of you who aren't familiar, it's basically an online work collaboration tool; think Slack or Facebook for work. It got acquired by Microsoft and is now part of Office. Then I worked at Earnest, a student loan refinancing and personal loan startup in the fintech space, probably the biggest in that space after SoFi. They just got acquired by a servicer called Navient, which had split off from Sallie Mae, if you're aware of that space. I've been at Google since last year. That's the hat you get the first day you join; they're called Nooglers, new Googlers. I've been on the payments team, specifically on Android Pay and consumer payments. If you saw the rebrand we announced Monday, we'll be rebranding to Google Pay and merging a bunch of things across the company. That's what I've been working on for the past year.
All right, so let's get started. What does a PM actually do? This is a common diagram; I'm sure many of you have seen it. I kind of hate it, because it's like answering "where is the Statue of Liberty?" with "it's between Manhattan, Jersey City, and Governors Island." That's a very unhelpful explanation. But you've probably seen it a lot, so I wanted to put it out there. The way I think about it is: a PM owns the what and the why. Specifically, what problems are worth solving, and what features or products should we be building to solve those problems. Engineering and design often end up owning the how. The boundaries are soft, so you'll work across the two, but at a high level, in principle, that's what the PM role does in a software company. That's great at a high level, but which PM owns what? Depending on your level in the company, your scope changes and the breadth of the problems you're tackling changes. This is true at large companies and at small companies; at large companies it might be a VP, at small companies it might be the founder. I'll talk through an example of how this works out. At my last company, Earnest, before I moved over to Google, the founder decided what problem we wanted to solve at the broadest level, what market we were trying to attack. There, the thought was: the student loan process is broken, let me hire a team and raise money to go solve it. At a company like Google, it might be the head of Google Photos, the VP, saying: photo sharing and storage is broken, let me get a hundred engineers to go fix it. That person is basically planting a stake in the ground and nailing down the broad market you're tackling.
Then at the level below that, the head of product says: all right, I've been told we're going to tackle the student loan process; how should we do it? We'll build an online application, or a phone app, or something else to address this problem. Below that, you'll have the individual PMs, senior and junior. A senior PM might think: I'm working on this online application; let's build a feature so people can refer their friends to it, because we think our acquisition costs can go down that way. And a junior PM might say: I'm working on this referral feature; let's make sure inviting friends is easy, so you can upload contacts and things like that. So at every stage you're owning the what and the why; it's the scope that changes as the level changes. I'm going to focus on this last one, since the earlier roles are the target of this presentation, and we'll talk through how the day-to-day work looks at that level. By the way, if you have questions or something's not clear, just stop me in the middle; I'm happy to answer, and we can also do a round of questions at the end. All right. As a PM, your job is to ship great products, ship great features. The way I look at it, every feature goes through this structure: there's a pre-building phase, an execution phase, and a phase where you launch and iterate. Any one feature moves pretty neatly along this line. If you look at a PM's calendar, though, it's all over the map, and I'll show you what mine looks like; that's just because you'll have a multitude of projects at every stage at any point in time. But basically you have these three stages, and there are key deliverables at each stage.
So, the stuff in white. Initially you have a roadmap; the goal is to figure out what you're building and how you're prioritizing. Then in the execution phase you'll have a PRD, and you'll work with engineers and designers to design and build it. The exact term at your company will differ; some companies have an FAQ doc, some have a PRD, some have something else, but conceptually it's the same thing. Then you launch, and I'll talk through some of what goes into actually launching; then you analyze the results of the launch and iterate based on that. All right, so, pre-building. The output of this phase is a roadmap. That's the most important thing that you as a PM should control. You're getting paid to prioritize what features to build, so your job should always be to have a clear, prioritized list of the things you want to do. Small companies and big companies do this slightly differently. At big companies, individual teams will prioritize things and this rolls up into OKRs; I'll show you what that looks like at Google. It's just a way of having measurable results that are tracked across the company. Startups will have a backlog that they use in backlog grooming, if they're doing agile. So that's the output; the question is how you get there. There are a lot of inputs, but you're basically trying to figure out what features to build and how to prioritize them. There's vision and strategy coming from the top, from the people above you that I just mentioned. Within that, you'll think through what the market looks like and what your competitors are doing. You'll do customer and user research; I call this the soft stuff, but you're looking for qualitative data that will guide you and bring out new feature ideas.
Depending on what stage of company you're at, this can be easier or harder. I actually find it a lot easier to do when the company is earlier; you get locked into the machine when the company is bigger, and it's harder to get out and do it yourself. But it's important to get out there and do this. In fact, if you're on a product that's pre-product-market fit, it's one of the best times: you can sit down with every single user and understand what their pain points are and what they're doing, because once you're at thousands of users, everybody is just a number on a dashboard. So it's actually a pretty nice place to be. So there's the customer and user research aspect, and then there's the analytics, the hard data; as you start to scale, this becomes easier and easier to gather. You'll get cross-functional asks, especially at a larger place: BD, marketing, and sales will all have ideas, and you want to pull them all in. And then there's your own understanding. Part of the reason you get hired is your gut sense, and once you're in the industry you'll have a sense of pain points and of what works and what doesn't. The important thing is to validate that with the data and the research: use it as a guide, but always validate it. All of that goes in. So here's a roadmap; this is what my roadmap looked like at Earnest. I put it in a Google Sheet, but the format really doesn't matter. These are real projects we did, and the ideas came from all of the sources I just mentioned. If you look at this list, there was the idea of launching a new student loan product: we did refinancing, and the idea was to launch a product for new student loans. At that point I was also doing copy experiments on an application page to improve conversion, and I was starting on a basic referral program.
And then there was a request to build a new blogging platform for our content team so they could improve our SEO. So this was an actual roadmap I had at some point. One thing you'll notice is I try to keep both big bets and small quick wins on it at any point in time, and I categorize things that way. I've found it's nice to always have a mix of those on your roadmap. If you only shoot for the moon, you miss out on a lot of little things you should be improving; but if you only go for quick wins, you get stuck in a local maximum and it's really hard to take on the big projects. You also don't want to go two quarters without any wins, so the quick wins help balance things out: if you're taking big risks and nothing pans out, you've still made headway by accumulating a bunch of quick wins. The potential impact column is backed by user research and analytics. You'll run the data and the user research and try to figure out, all right, a new student loan product can open us up to an entirely new market, so huge impact. Copy experiments: medium; actually, copy tends to be one of the strongest quick wins you can get. So medium impact across those, and referrals similar. Then you want to estimate your cost. Here's where a basic understanding of how your product's technical infrastructure works helps a lot: you should be able to fill out this column at a high level without having to run to engineering for every single row. Ideally you still validate it with engineering, and if you're not sure, you have a conversation with them. But when this list gets really long, it lets you screen out a bunch of things quickly, because you know instantly: the system can't support this today, this will be a lot of work.
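To make that concrete, here's a toy sketch of how impact and cost columns might turn into a ranked, capacity-bounded roadmap. The scores, the simple impact-divided-by-cost formula, and the capacity number are all invented for illustration; real prioritization is messier than this.

```python
# Hypothetical backlog entries, loosely modeled on the roadmap above.
# Impact and cost are rough 1-10 scores a PM might assign.
features = [
    {"name": "New student loan product", "impact": 9, "cost": 8},
    {"name": "Copy experiments on application page", "impact": 5, "cost": 1},
    {"name": "Referral program", "impact": 5, "cost": 3},
    {"name": "Blogging platform for SEO", "impact": 4, "cost": 5},
]

# Crude ROI: impact per unit of engineering cost
for f in features:
    f["roi"] = f["impact"] / f["cost"]

roadmap = sorted(features, key=lambda f: f["roi"], reverse=True)

# "Draw the line": greedily take what fits in the quarter's capacity
capacity = 6
this_quarter, used = [], 0
for f in roadmap:
    if used + f["cost"] <= capacity:
        this_quarter.append(f["name"])
        used += f["cost"]

print(this_quarter)
```

Note how the mix falls out: the copy experiments and the referral program (quick wins) float to the top on ROI, while the big bet gets deferred until there's capacity for it, which is exactly the balance the roadmap discussion above is after.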
You combine the impact and the cost to get your ROI, come up with a priority, and then you basically just draw the line somewhere: you figure out what you can do this quarter or this period and say, all right, that's my roadmap. How does this differ between small and big companies? At small companies, you basically just pop the next thing off your roadmap once you have engineers free, and then you typically work agile; every company differs, but that's essentially how it works. And where does the urgency come from? At small companies, everything's always urgent: you'll have aggressive growth targets you're trying to meet, and launch dates for big projects that your investors and board want to see. So there's plenty of urgency, and you're just popping things off the top of your queue the moment you can and putting them in the hopper. A big company like Google does this differently; a lot of big companies do. We have OKRs, which stands for objectives and key results. They're basically measurable results; the important thing is that they're measurable. And they cascade top to bottom. This is genuinely how the payments OKRs rolled up into the company's last year; I've obviously removed the numbers. At the top level, the company OKR would say something like: Google wants to do $Y million or billion of gross transaction volume in payments, and we want to launch in-store payments in X markets. That level is owned by the CEO of Google. The product area, which here is the actual payments team, would then own the next level: we're going to launch this product in these markets, we're going to launch online on Y websites.
We're going to ensure some percentage of people are in a ready-to-pay state, which is what we call it when you have a card on file and you're ready to go. The idea is that all of this should move the top-level metric. Then individually, as a PM, you own features that roll up into that. You go through your backlog, figure out what you want to do and what would contribute the most, put it on your OKR list, and make sure each item ties to an OKR above you. It might be things like improving card-add conversion or launching campaigns with stores and merchants, all with the goal of increasing transactions or gross volume or whatever the top-line metric is. Hopefully all of that has been straightforward, yes? All right, so, execution. You have your roadmap and you know what features you want to build; now you're writing your requirements doc. This is where you write down what the feature is trying to achieve, and really the details of it. It should honestly be a collaborative process between you, engineering, and design. What you don't want to do is write a PRD and throw it over the fence; that rarely works out. So you work with your engineers to understand what questions they really need answered, and you make sure you lay those out. These are the sections of the PRDs I write these days. It changes depending on the company, because different things matter to different teams: here you'll see sections for risks and legal and security, because we're in payments and those things matter. High-level use cases and detailed design, right here, that's the meat of it; that's where all the key content goes. Again, this should feel like a team effort. The important thing to remember is that you, your designer, and your engineer are on the same team.
If it feels like you're not, if it feels like you're pulling teeth and you all want to go different ways, you need to step back and figure out how to align the team. There are basically two reasons a team may not be aligned: you may not be aligned on the goal, or you may not be aligned on how to get there. You need to figure out which of those two it is, because if you're not aligned on the goal, trying to align on how to get there is a bad idea; it just won't work. So figure out where the mismatch is, align on it, and then write the PRD. Once your PRD is done, it often gets broken down into tasks. If you're somewhere that's agile, it'll go into some sort of backlog. My current team at Google is not so great about this; we just go with quarterly OKRs, though I know teams at Google that are. At my startup, we followed agile pretty closely. If you have a TPM or an eng lead, they may drive a lot of this; in that case, I just recommend staying involved in the stand-ups so you know how much progress you're making and can sense earlier when there are going to be issues. But yeah, we track what's defined, what's in progress, and what's done. We actually used JIRA, but I don't have access to JIRA right now, so I just mocked up a screenshot in Trello. Okay, so you've executed: you've broken everything down into tasks, you've worked with your UX team, you've put it all together, and now it's getting close to launch time. How do you do that? The first thing you're going to run into is bugs: which bugs are launch-blocking? This is my day right now, because we're working through this big rebranding project, pulling everything together under Google Pay. So every day I'm working through: what are the bugs?
Triage them and figure out what's launch-blocking; that's the most important thing at this point. You're looking to classify bugs as launch-blocking or not. Beyond that you'll prioritize a bunch of things, but ultimately that's the decision that matters. Then: are all the cross-functional teams ready? For us, that's legal, risk, customer service, that sort of thing; it might be marketing, BD, or other teams at your company. And we'll do something like this. We use a tracker called Ariane, or LaunchCal; it goes by a couple of names within the company. Basically, you tag which teams should be involved in the project, and they each have to check it off; other teams just get marked as FYI. The feature doesn't launch until everyone has checked it off, and the last person to check tends to be the VP. Our VP won't look at it until every other box is checked; then he puts in the final check and we start rolling out the feature. All right, so you launch it, and now it's time to measure the results. You either launch it as an experiment or as a plain feature; not everything can be an experiment, but let's say you launch it as one. Your next job is to analyze the results of the experiment. You could do a whole course on this, but I'll give a quick two-minute, philosophy-level view of what you look at. Generally, some stuff is going to go down and other stuff is going to go up, and you're going to have to figure out what to do. There aren't a ton of experiments where everything is positive; there are usually trade-offs in a product. Make something bigger, and something else gets de-emphasized. So you need to figure out what to do. Here's an example of an actual feature we launched at Yammer.
For those of you who are not familiar with Yammer, think of it as a Slack or a Facebook for work; this was my job two jobs ago. Basically an enterprise communication tool. We had groups in Yammer, kind of like Slack channels or Facebook groups, and we launched a feature that let users search for a message within a group. This is effectively what happened; I've changed the numbers a little, but generally this is it. These are the metrics that were on my dashboard. We had one-day retention: how many users come back the following day. That seemed to go up. The number in brackets is the p-value, a measure of how statistically strong your result is; the lower the better, because it's the chance the result happened purely by chance. 0.03 means there's a 3% chance this positive result is noise: if you ran this experiment a hundred times with no real difference between the groups, about three of those times you'd see a 1.5% lift just by chance. That's why you want the number to be low. And the percentage is the lift, the delta between the test and control groups, the two groups you launched to. So anyway, one-day retention went up, and 1.5% is actually a fairly strong boost in retention for a mature product. Days engaged, another metric we measured, is tied to that: how many days a week did you engage? That went up slightly. The number of likes on posts went down, actually substantially. Messages that people wrote went up, but the p-value was a bit large, so don't read too much into that. Posting, a binary for whether a user posted at all, was flat, and so were thread starters. The difference between messages and thread starters: thread starters counts only initial posts, while messages includes replies. When the p-value is large, we just attribute the result to chance.
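For the curious, here's a minimal sketch of the kind of statistics that could sit behind those bracketed p-values: a two-proportion z-test comparing test and control rates. The function name and the counts are invented for illustration, and real experimentation platforms use more sophisticated methods; this is just the textbook version.

```python
import math

def two_proportion_p_value(hits_a, n_a, hits_b, n_b):
    """Two-sided p-value for the difference between two conversion rates,
    using a pooled z-test under the normal approximation."""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    pooled = (hits_a + hits_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Probability of seeing a gap at least this big purely by chance
    return math.erfc(abs(z) / math.sqrt(2))

# Say control retained 10.0% of 10,000 users and test retained 11.5% of 10,000
p = two_proportion_p_value(1000, 10000, 1150, 10000)
print(round(p, 4))  # a small p-value: this lift is unlikely to be pure chance
```

The intuition matches the dashboard reading above: the bigger the lift relative to the noise in the samples, the smaller the p-value, and bigger sample sizes shrink the noise.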
So when the p-value is large, we attribute it to chance and say the number doesn't matter; it's flat. So, question: does anybody want to venture how they would interpret this? Would you ship it? Would you not? Any ideas? Yeah? Yeah, why? Yeah, that's a good answer. Oh, the N here is the number of people in the experiment, the sample size. And once again, what's p? The equation to calculate a p-value is complex; it comes from a statistical distribution. But the number is basically the percent chance that the result is purely by chance. For example, this one means that if I ran this experiment a hundred times with no difference between the test and control groups, 12 out of the hundred times I would see a lift of 0.4% out of pure statistical chance. So you want that to be lower. We typically consider p-values of 0.05 or less good, 10% or less okay, and 15% or less fine, though you're stretching it. Higher than that, we would just say the result is chance. If you have a bigger sample size, you can demand even lower; Facebook, I'm guessing, shoots for really low p-values. Retention is good, that's the main reason why. Yeah, so that's the right answer; the comment was that retention is good. So let me talk about how to look at this generally. The wrong answer is "more things are green than red, so we should ship it." You can always add more metrics to your dashboard and end up with more green than red or more red than green; that's not helpful at all. The correct answer is to figure out what the global metric is. There are global metrics and there are local metrics. Global metrics are the metrics that most closely represent the top-level goal of the company. I'll show a cheat sheet on this in a second, but the first question is: what happened to your global metric?
Then for local metrics: do they support what happened to the global metric? If not, you need to think through your hypothesis for why they might have changed. A quick cheat sheet on what the global metric is: companies tend to fall into two buckets. I'm sure there are exceptions, but they're typically either transactional or engagement-driven. If you're transactional, you're trying to get people through a funnel. For example, at Earnest we gave out loans, so loan applications, or the number of loans we gave out, was the metric; that's transactional. Amazon is transactional: it's how many people make it through a funnel. Sure, searching on Amazon is great, and you might measure some engagement metrics as local metrics, but if you're not moving orders, that's a problem. Something that gets searches up but moves orders down is probably not a good thing. Or you could be engagement-driven, where you want people coming back to your product or spending time on it. Facebook is engagement-driven, and Yammer in this example was engagement-driven. If you're an engagement-driven company, the top things you look at are time spent on site, number of days people come back, things like that. So given that retention and engagement went up, that's your signal to ship this. Likes went down; you should think through why. It could be coincidence, since there's always a little chance; the p-value is still not zero. But our hypothesis here was: we let you search, so you're doing less scrolling, and you come across fewer random posts that you might like. So likes went down. Then we said, all right, let's validate that: what happened to actual scrolling? Scrolling wasn't in our dashboard, so we ran a custom query.
And yeah, it looked like scrolling did go down a little, which validates why likes may have gone down. So that was our hypothesis, and we said: it seems like people are getting to what they want quicker, and they're posting more and engaging more. Good thing. Right now I'm on Android Pay, about to be Google Pay, and a lot of our metrics have changed; but last year one of our key metrics was taps: how many times people tap at a store to make an in-store purchase. That's a transactional product, so that would be your North Star. All right, I started my presentation with this: a single project may follow this nice schedule, but your schedule doesn't. This is what my schedule looks like. It starts at 8 or 9 AM and sometimes runs to a meeting with my manager at 9 PM, or Thursday-night meetings with Singapore, so it's a little all over the map. When I looked at a week in November and tried to figure out how I spend my time, it was roughly a day overall on product strategy and planning, a day on user research and meeting with partners, and then time on engineering, design, metrics and analysis, and general coordination. General coordination is obviously higher at a bigger company; you're sort of the glue, so there's a lot more of that. Since you have products in every state, that's what it ends up looking like. All right, so how do you interview? Let's talk about that. This is what Google will grade you on, and Earnest too: our lead PM was from Google, so Google and Earnest were basically the same; we had just taken Google's grading rubric. And I've put up the list of what we looked at while I was at Yammer.
Google doesn't really break it into hard skills and soft skills; that's just how I look at it, so I've put my lens on it. On the hard-skill side, we look at analytical ability, technical ability, and product and strategic insight; then communication, creativity, and culture fit for soft skills. You can probably figure out what communication and creativity are. Culture fit is basically: does the interviewer get along with you? And for Google, it's also a heavy dose of: does this person feel like a team player? I remember when I started, one of the folks I worked with at Google told me: you're not going to get ahead at Google by working really hard, because plenty of people here work really hard and you'll burn yourself out. You're not going to get ahead by being really smart, because we hire really smart people and there are plenty of people smarter than you. You're going to get ahead by being a team player. That was one of the key things I took away from my first few days, and it's one of the things we look for on the culture side. Then I've mapped those to what Yammer looks at. They're phrased in a lot of different ways, but a lot of the same elements are there. So my point is there's a level of consistency that you'll notice across the different companies, at least the consumer-facing, B2C ones. All right. Yeah, I'll come to that; I have examples. So here are some questions that I would put in the hard bucket for each of these. For product, there are two very common types of question, and the nature of the company actually affects which one you're more likely to be asked, so if you know people at the company, I'd try to figure that out. Does this laser pointer work? No.
Okay, so the two types are: design X for Y, like "design an alarm clock for a blind person"; I call that the blue-sky question. Or: how would you improve product X? I call that the improvement question. There you're given an existing product, or you're told to pick your favorite app on your phone and asked how you'd improve it. I'll walk through how to answer those and what we're looking for. I think there's a structured approach that generally makes sense; everybody has their own style, and I'll give you mine. Then some analytically focused ones. Metrics-focused: some of the things I just talked about around metrics, like "we ran an experiment, this happened, what do you do?" Math-focused: I forget whether Google still asks these, or whether I was asked one, but we certainly asked them a lot at Earnest. How many people fly out of SFO a day? How many bottles of shampoo are sold every year? Basic estimation questions. I think they're still asked at Google. Technical questions come in two types, typically. High-level technical insight: for example, what factors would you consider when deciding which videos to show, and how to rank them, in the related-videos section on YouTube? And SQL basics: given the following tables, how would you produce this output? Depending on the company there may be more or less of this, but I think it's good to brush up on basic SQL queries. You can do that on your own; I know Stanford has a great online class. That's where I first picked it up, before I started getting into interviews where this was a thing, and it's been useful in all my jobs. As for the soft skills, you're not going to get asked a question about them directly; there will be general behavioral questions where they'll judge some of this, but there won't be a specific question for it.
They'll try to get at the soft skills as you answer everything else. So I'll talk through these four question types quickly — the two product ones and the two analytical ones — and that will be the end of the presentation, since those tend to be the most common and the most structured from an approach standpoint. All right: the blue-sky question, where you're given something to design. The best piece of advice I can give you — advice I got when I was interviewing for my first PM job — is this: regardless of what you're asked to design, think ahead of time about the first five questions you'll ask the interviewer. For example, if I tell you to design a toothbrush, I haven't really told you anything. Who is using this toothbrush? Are they using it traveling, or at home? Is the person disabled or not? How much are they willing to spend? Are you designing on a time crunch? What's going on here? So think through five generic clarifying questions you can ask right away. That's useful because when you get the question, there may be a deer-in-the-headlights moment where you're not sure what to do — it's an interview, it's high stress — and if you're prepared, you'll get the ball rolling quickly. The most important thing, generally, is to start with the user. As a PM, you're the voice of the user; you've probably heard that. So start with the user: figure out who the user is and what their needs are. If you're not given a user, list some options, prioritize, and quickly pick one — I'll give an example of that in a second. Then think through ideas and features; try to have at least four or five things you could do.
And try to have at least a couple that are out there — this is where you get your creativity points. You want some things that are obvious improvements, and some that are genuinely creative. And then — this is where your communication skills come in — you should always be structuring. Anytime you give an answer: no laundry lists; structure whenever you can. That's super important. So if you're prioritizing, say, "I'm prioritizing based on these two factors: I think this will drive usage, and this will be easier to implement." Have some logic or structure behind each section of your answer. Here's a question I got when I was interning at Microsoft back in 2008 or 2009: design a music system for a car. Start with clarifying questions: who is the user? Where do they live? What type of car do they have? Let's say you get those answers but you're still not told who the user is. List some options: a student in college, a working dad, somebody driving an SUV. Pick one, and tell them how you're picking it — you think the market's bigger for this segment, or these are the people who tend to own cars or drive them more, whatever. Have some logic; pick one. Then feature ideas: what features should this music system have? Where does it get its music from — your phone, local storage, syncing with home Wi-Fi? How do you control it? Things like that. And for each of those questions, you should prioritize and then pick an option. For example, if you want to decide where it gets music from, you might say it could come from your phone, from local storage, or from syncing with your house.
What are the pros and cons of each? Based on that, you pick an option and move forward, and at the end you'll have an answer — a thing you've designed. Next: "how would you improve X?" — what I call the improvement question, or "pick your favorite app, how would you improve it?" It's very similar to the previous one, with some added complexity because it's a product that's already out there. You need to know the current vision or goal of the product, and then the top-level metric based on that — once you know the vision or goal, what is the thing they should be trying to improve anyway? Then: who are the users? What are some unmet needs? And how would you structure an experiment to address them? When you're improving an existing product, you can experiment; that's harder when you're designing something entirely new. A quick tip here: there are lots of interviews where you'll be asked to pick a favorite product and improve it, so have thought through at least three or four products ahead of time — a few phone apps, a few web apps, a few B2B apps — and come up with three or four ideas to improve each. The important thing is that even if you've thought through apps and arrive with ideas, you still need to go through the steps. Don't say, "Oh yeah, I love this product; here are four ideas." That laundry list will not work. You need to start with, "All right, this is the user, and this is what they need." So I'll give an example. Say I'm told: pick a favorite app and improve it. I might say: I've recently been using Uber Eats. I think it's a great app. It's designed to help people get the food they want quickly. I'm smirking because I have a friend back there who works on Uber Eats.
So I might start with the user: this app is designed to help you get food you want quickly — who might use it? There might be single working men and women who get home for dinner, don't want to cook, and just order. There might be office workers during lunch. List a few, pick one. Say we pick single working men and women: they care about time and convenience. So what are some improvement ideas? You want to help them get their food quicker and make the experience more convenient. Maybe there's a pre-selected menu of top nearby items you can check out with one click — quick to check out, and quick to make and deliver, because they care about time. Or maybe there's a one-click reorder option for items they've ordered before. Then you rank and prioritize and pick an idea. Again: I might come up with ideas I'd already thought of, just from having done this exercise before, but it's important to go through the process — don't just jump into ideas. The metrics piece is often a follow-up to the improve-X question: all right, great, you thought of a great idea — one-click ordering of popular items on Uber Eats. How would you test whether that's a good idea? You could obviously build the full feature, but one thing to consider is whether you can do an MVP test — a minimum-viable-product test. Is there a piece you can pull out, test, and use to check your hypothesis and see if you're down the right path? For example, the goal here is convenience and speed. So maybe you ship a one-click "repeat last order" button; that way you're not building curated quick-order menus and things like that — you're just adding a single button that repeats the last order.
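Once an MVP like that repeat-order button is in front of a test group, checking the hypothesis usually comes down to comparing order rates between test and control. Here's a minimal sketch with entirely made-up numbers, using a standard two-proportion z-test — my choice of method for illustration, not something the talk prescribes:

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test on conversion counts."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf; doubled for a two-sided p-value
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Invented numbers: 10,000 users per arm; 8.0% of control placed an
# order, versus 8.9% of the one-click group.
z, p_value = two_proportion_z(800, 10_000, 890, 10_000)
# Bigger samples shrink `se`, which is why testing on your full
# population reaches statistical significance faster.
```

With these made-up counts the lift comes out significant at the usual 5% level; halve the sample sizes and the same observed rates would not be.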
That would be an example of an MVP experiment. At the end of the day, think about what the top-line metric is. Uber Eats is a transactional product; they probably care about orders, so that's probably going to be your top-line metric. Then: who would you test on? To the extent you can, you want to test on your entire population. If the product is local to certain markets, you may want to test location by location; otherwise, test on your full population, because you'll reach statistical significance quicker — you'll have more people in test and more people in control. Here's an example of an analytics, math-based question: how many planes fly out of SFO every month? I was asked this when I was joining Earnest. The most important thing is to lay out the complete formula first. Don't do a single calculation until you've laid out the full formula. So you might say: flights per month = days per month × minutes per day × takeoffs per runway per minute × number of runways at SFO. Then you make some guess for how many runways SFO has. But lay out the formula first. Then do the math, then gut-check it: does that look right or not? Then stress-test it. I'm surprised by how few candidates in interviews actually do this. It's one of the reasons laying out the formula first is so useful: if you end up with a number like "a hundred flights fly out of SFO every month," you can say, well, that can't be right — something's clearly wrong. Then you go back to the formula, ask which assumption you probably got wrong, stress-test it, and come up with a range.
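That lay-out-the-formula-first discipline can be sketched in a few lines. Every number below is a guess an interviewee might state out loud, not a real SFO figure:

```python
# Every input is an assumption to state explicitly, not a real SFO stat.
DAYS_PER_MONTH = 30
ACTIVE_MINUTES_PER_DAY = 18 * 60        # guess: ~18 operating hours/day
TAKEOFFS_PER_RUNWAY_PER_MINUTE = 1 / 3  # guess: a takeoff every ~3 minutes

def flights_per_month(runways):
    # The full formula, written down before any arithmetic happens
    return (DAYS_PER_MONTH
            * ACTIVE_MINUTES_PER_DAY
            * TAKEOFFS_PER_RUNWAY_PER_MINUTE
            * runways)

estimate = flights_per_month(2)                         # point estimate
low, high = flights_per_month(1), flights_per_month(4)  # stress-test bounds
# estimate ≈ 21,600 flights/month; if that fails the gut check, revisit
# whichever assumption looks shakiest and recompute the range.
```

Because the formula is a single function of its assumptions, swapping one guess for a range immediately gives you lower and upper bounds instead of a lost chain of arithmetic.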
So you can say: all right, let's assume there aren't two runways but four to six. That gives you a lower bound and an upper bound, and the range of your final answer changes accordingly. Laying out the formula helps a lot — otherwise, at some point you'll get lost in the calculations and the stress test won't work. Ideally, you stress-test the answer yourself before the interviewer does it for you. Here are some resources I've found helpful for the interview process. Most of these are pretty common knowledge now, which is really awesome — when I was interviewing for the first time, none of these books existed. And by the way, I have no vested interest in any of these resources; they're just things I personally found helpful. Cracking the PM Interview, by Gayle McDowell: I found this very useful early on for understanding what the PM role is, and good for the technical parts — if you want to brush up on basic algorithms and things like that you've forgotten, it's good for that. Decode and Conquer, by Lewis Lin: this is good for practice questions, just going through question after question. I found Lewis Lin's book better for basic Q&A practice, and Gayle's book better for the technical material and the overview of the PM role. And definitely do a Quora, Medium, and Google search to prep on the specific company — not just interview questions, but see what their head of product has said in interviews and elsewhere. You'll understand how they think about their product at a high level, what metrics they look for, and what their vision and goals are. One of the nice things about the PM interview process is that it's actually a very preparable role.
If you put in the time, you can really out-prepare the competition. Beyond that, it's a numbers game. You'll never be a perfect fit for every company, and there's always going to be maybe 40% luck in the process regardless of how well you've prepared: the interviewer doesn't like you, the interviewer is having a bad day, you stumble, you blank out — a lot can happen. So once you've prepared, it's just a numbers game. That's it — questions? [Audience question about the best way to get into the role.] That's a great question, and it depends on what you're trying to do. Typically I've found that changing industry and function at the same time is hard. If you're already in tech, it's easier to make it over to PM, assuming you're not a PM already. If you're not in tech, it's often easier to move first into a function in tech similar to what you're doing, and then move into the role. A couple of options: if you go to a startup, there's typically enough for everyone to do, regardless of how young the company is, so I've seen a lot of lateral transfers at startups — if you show an interest, you can take on PM-type projects and then move over, or interview elsewhere and point to all the things you did at that company. Larger companies like Google also do lateral transfers. One of my teammates worked on the customer-facing side — what we call the partnerships team, which helps partners implement Google's technical solutions — and just moved over. The way he did it was by working with me for a while: he started doing more and more of what a PM would typically do, which was very useful for me at times when I had a lot on my plate. He worked closely with partners, put up the dashboards to analyze metrics, and started doing things like that.
From there it was a pretty clear process to interviewing for the role and transferring. One of the odd things about PM is that it's typically a fairly flat organization. A company with a hundred engineers might have eight or ten PMs, so while there are, say, three levels of managers on the engineering side, there may be only one or two on the PM side. What that means is that hierarchy doesn't matter so much; scope of ownership matters. So regardless of which company you join, try to find one where you can do some of that job, and then move over — whether within that company or outside it. That's typically the path I've found easiest. At Yammer, for example, we had ten PMs while I was there. Two came over from marketing — one of them interned in marketing and then joined full-time as a PM. Two came over from our own BD team. Three were engineers, and two or three were PMs before that. I don't think that adds up to exactly ten, but roughly it was a pretty broad cross-section of folks. [Audience question about what's great and not-so-great about the job.] So, happiness — let's talk about that. I like shipping stuff. The best thing about being a PM is shipping stuff: seeing your products in the wild is pretty awesome. Being able to look at something on someone's phone or laptop and say, "yeah, I made that" — that's really cool, and really exciting. The not-so-great part: the process can get tiring sometimes. And this is genuinely true whether the company is small or big — different companies have different pros and cons. At a smaller company you'll have a bigger scope of ownership, obviously, but you're typically in a perennial state of urgency where everything's on fire.
At a larger company that happens less, but you have more layers to work through — managing up and managing across. So, different trade-offs at different companies. [Audience question about business school.] Funny — I was actually trying to get out of PM when I went to business school. I was going to go into consulting; I worked at Bain for a summer doing strategy consulting, then decided I liked product way more, so I basically came back. Business school is one of those weird things where I think the soft skills it gives you are way more valuable than many of the hard skills. It's kind of like undergrad: all of you probably went to undergrad — how much of it do you use today? Maybe a little if you did CS and you're now in tech, but otherwise probably not much. Was it useful? Did you learn a lot generally? Yeah, probably. Business school was useful in that same sort of way. It was good for looking at the world and products holistically, and it is somewhat of a funnel into some PM jobs at large companies — people do tend to evaluate you as a business school graduate more than on exactly what you're doing right now. But if your only goal is to get into a PM role, I'd say it's useful mainly for learning how things work. I told you what my motivation was; it'll probably be different for different people. [Question: do you have a computer science background?] Good question — no. My undergrad was electrical engineering, not computer science, which means I did a bunch of stuff with wires and circuit boards. So a little programming then, and a little more on the side afterwards, where I picked up a bit more — but undergrad was not CS, if that was the question.
I think as a PM — and if any of you are thinking about doing coding — more useful than hackathons and the like is understanding the basics. It's more useful to understand the fundamentals of how things work, so you can piece together an architecture in your head systematically, than it is to be able to hack something together. As a PM you're mostly doing resource allocation and thinking at a high level about what makes sense; you may occasionally write code, but typically you won't. SQL is the exception — it's useful to know how to do basic SQL. At Yammer we had a very strong data science team, so I ended up not writing much myself. At Google there's a small data science team on the business side, at least on my team, but otherwise it's the PMs and engineers writing the SQL scripts themselves. So it's very useful to be able to get your data yourself. [Question about how technical Google expects PMs to be.] Good question. Let me put it to you this way: it's true that Google is one of the companies that looks for more of a technical background than perhaps other companies do. Exactly how much will end up depending on the engineer interviewing you that day. But it certainly looks for a bit more of a technical background, or at least a technical understanding. You need to be able to answer, in a structured fashion, whatever question comes up — whether it's how you'd rank videos on YouTube, or how you'd write an algorithm for solving Sudoku. You don't actually end up writing the code; it's things like, "all right, I would go through it row by row, I would check for this." Can you build that out? Do you have a structured way of answering that makes sense?
So yes, I would agree that at a high level Google probably looks for slightly more technical depth than, say, LinkedIn, Microsoft, or Facebook — I've been through interviews at those and some of the other big companies. [Question about program managers.] The program manager role at Google is much more amorphous than the product manager role. Product management is somewhat standardized at Google — there's still a huge amount of variability across teams, but more so than for program managers. The program managers on some teams — we have a few on ours — mostly help with project management. My job is to figure out what goes on the train; their job is to make sure each train leaves the station on time. Depending on the team, that can differ: I've seen program managers who also own actual growth metrics and things like that. So it really depends on the team; there's a bit more ambiguity. If you go through our jobs section and filter for program managers, you'll actually see that — the definitions and descriptions vary a lot. [Question about technical account managers.] I actually haven't worked with any technical account managers on my side. My understanding is that they work with partners to help integrate the technical solution. In many cases, they're more technical than product managers: if you have a solution with APIs and big partners on the other end trying to build against it, they help those partners interface with it and understand it. They're a very strong resource, and they tend to be even more technical, to be honest. [Question about B2B versus B2C.] Yeah, that's a good question. In B2B, the biggest difference is that you have a sales team — often a big one.
The sales team is close to the user, and your users come in lumps, so individual customers end up mattering way more. You'll often make product decisions because there's a $10 million deal on the line — that's something that happens in a B2B context and doesn't in B2C. So the sales team ends up being a huge influence on your roadmap. That's probably the biggest difference: for the voice of the user, you're sometimes working through the sales team. You always want to get on customer calls yourself and understand things firsthand, so you can tell what's a common need versus what's being proposed just because of one deal — but the sales team is probably the biggest voice. And then on the other side, you're translating that to engineering. One thing that does happen is you end up being an even stronger voice of the customer, or of the business, to engineers, because in B2B roles — unless you're building a dev tool — you'll often find there are fewer opinions about what to build. Everybody can look at a loan application flow and say, "oh yeah, that doesn't look good," but it's a little harder if you're building a B2B tool. [Question about what metrics a B2B PM actually owns.] Yeah, that's a good question. At Earnest, for example, our business metric at the end of the day was the dollars of loans we gave out, along with the average interest rate on those loans — essentially the delta between our cost of financing and the rates on the loans we made. But the way you end up driving that is through some change on the user side, and that's the metric you end up owning as a PM — which is why I was focusing on the user benefit. A marketing team will focus on cost of acquisition and things like that.
But a lot of the business metrics will come down to you as a PM in the form of a user behavior or metric you're trying to drive. Cool, let's take one more. [Question about gathering qualitative feedback from users.] For qualitative feedback, there are two main ways to get it: you either run a survey, where you're not in front of the user, or you sit down with the user and talk them through the product, or mocks, or whatever. With a survey you can typically get quicker responses, but you're more limited in what you can ask, and it's a little harder to get a feel for what the person is doing. In an in-person interview, you can follow up: if they react to a screen or a product, or say, "oh, this flow doesn't make sense," or "I don't understand what to do here," you can immediately ask: how does that make you feel? What's not right about it? If you had a magic wand, what would you do instead? You can have follow-up questions. Those sessions are obviously more expensive — not in money, but in your time. The rule of thumb I've seen is that you start getting diminishing returns after five or six in-person interviews; after that, take your learnings back, iterate on the mocks or the design, then put it in front of another set of people and do it again. [Partly inaudible audience question about how to act when the findings from those sessions conflict.] You'd have to give me a more concrete example.
[The audience member clarifies: say a feature increases transactions, but the rate of fraud increases too.] So the question was: if you build a feature that improves transactions but the rate of fraud goes up, what do you do? Good question. That's a trade-off where you have to make a call. I'll give you an example. Say you're a PM at LinkedIn and you put more ads in the home feed: you'll make more money, but engagement drops. How do you ultimately make that trade-off? I asked a friend who worked on the LinkedIn feed, and he said the head of product basically told them, "we're willing to take a 5% engagement cut this quarter — how much ad revenue can you make within that?" At some point it just becomes a call you have to make, and it comes down to the business at that point. [Question: what about when the CEO, or one of the bigger clients, has the great idea of the day?] That's certainly not a startup-only issue — it happens just as much at big companies; in fact, at big companies there are more people above you who can do the same thing. So the question was: what do you do at a smaller company if your CEO or a big client comes to you and wants a specific feature? As a PM, you need to step back, figure out where it lands in your roadmap, and then communicate that up. Part of your job is managing up. Will the roadmap be influenced by the people above you? Absolutely, it's going to happen — there's no question about it. But it's your job to make sure it fits the broader objective. Data helps a lot; user studies help a lot.
That's the sort of thing where you take the request back, put it on your roadmap, and then get some data and run user studies to validate it. Is it only this customer? Is it all customers? Do all our users feel this way? Then you can take that back, make the case, and decide whether or not it's worth it. Seeing whether you can break it down into a small feature you can build quickly and run as an A/B test also helps — that's not always possible if the idea is really out there — but that's how I would typically approach it.