So at XO Group, for those of you who aren't familiar with the brand, we have several brands. The one that we really focus on is The Knot. The Knot helps couples plan weddings — you may have heard of it. We'll talk a little bit more about it later. We also help connect couples to their wedding vendors and, more recently, to their guests. Tonight we're going to talk to you about what we believe to be the product manager's superpower: user science. We'll get into the basics of what we mean by user science, and we'll also talk about application — because what's the point of talking about the basics if you don't know how to apply it? I'm the lead researcher at XO Group. If you think about how a lot of organizations are structured these days, when you're working in a lean product environment there are typically smaller product teams made up of product, engineering, design, and sometimes product marketing — and there are many of those smaller teams. As the lead researcher, I basically work as a service to all of those teams. So it's really up to me to make sure that someone like Michael here is doing the right research at the right time and learning really valuable insights that can make a high impact. Michael is a senior PM. Here, I'll hand this over. Yeah, so — we'll figure out how to do this. Like I said, I'm a senior product manager at The Knot, where I focus on wedding guest lists for couples. I think you're all probably somewhat familiar with weddings — the guest list, who to invite — that's what I focus on, and I'll talk a little more about it when we get into the application section. Okay, so: what is user science? A couple of years ago we had a new head of product start, and I was the only researcher at the time.
And before he even walked in the door, the product leads were saying, "Oh, he really believes in user science" — kind of talking him up as a person. And I was like, what is that? That sounds fancier than I am. It sounded a little more academic than the way we usually did things. But it turns out it's really logical, easy to apply, and not that complicated — some of you may already be practicing it. In a nutshell, it's the craft of understanding user needs and user behaviors. And it's important to understand that user needs and user behaviors are very different. What users say they will do, or say they did, is often very different from the actions they actually take within your product, and Michael's going to get into some details about that in a little bit. Before we get into the specifics, though — we're used to doing this at work, so we also just want to bring that here and make sure everyone understands why it really matters. I'm not going to pedantically read through the bullet points. The biggest things I want the audience to take away are these. First, in order to build successful products you have to understand why your audience is doing what they're doing. You can look at data and understand the what, but until you understand the why, you don't really have the full story. Second — I recently got married, and Michael's actually engaged, and we serve engaged couples, and we both acknowledge that neither of us is the user. You have to really put your ego aside and know that you don't represent everyone in every demographic that's going to use your products. And I think, from the product manager's side, the reason that user science really is such a superpower is that a lot of our job as product managers is to get the most out of our resources, no matter how many we have. User science really allows you to narrow in on the right problem to solve.
There are a lot of problems that we could be solving, but user science allows you to narrow in on the right one to solve, and then to test your riskiest assumptions first. So it allows you to build confidence and de-risk the project as you invest more and more into it. Thought of that way, you can literally save months and months of time by incorporating user science correctly. I think that's really the biggest difference I've seen between mediocre teams and mediocre PMs and really great teams and really great PMs: the ability to apply user science to leverage the resources they have and deliver outsized returns. So, on that: our head of product started about two years ago and came in with this term, user science. He also started shopping around this two-by-two, which made my life really easy, because it essentially visualizes our philosophy when it comes to research at XO — and what I truly and strongly believe as the lead researcher there. Like I said earlier, it's super important to break apart and understand that user intent is very different from user behavior. We have different methods to apply to get at what user intent and user behavior are. Some of them are qualitative, which we have across the top; some are quantitative. The ones that we apply frequently and scale to all of the teams are interviewing, surveying, and usability testing, and then we're always looking at analytics and running A/B tests — we'll talk more about that. The biggest story to tell from this two-by-two is that you have to consider the signals you're getting from each of these feedback collection methods and tell the story of your users that way; together they tell the complete narrative. We would never run a survey or just look at data and say, oh, 70% of users said this, let's go build it. This really just paints a more holistic picture.
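One way to picture that two-by-two is as a simple lookup from the kind of signal you need (intent vs. behavior) and the kind of data (qualitative vs. quantitative) to the methods the talk names. The exact quadrant placement below is my reading of the slide, not something stated verbatim, so treat it as an illustrative sketch:

```python
# Toy encoding of the user-science 2x2. The methods are the ones named in
# the talk; their quadrant assignments here are an assumption for illustration.
METHODS = {
    ("intent", "qualitative"): ["user interviews"],
    ("intent", "quantitative"): ["surveys", "SEO / search-term analysis"],
    ("behavior", "qualitative"): ["usability testing"],
    ("behavior", "quantitative"): ["product analytics", "A/B tests"],
}

def pick_methods(signal, data_kind):
    """Look up which feedback-collection methods fit a given quadrant."""
    return METHODS[(signal, data_kind)]

print(pick_methods("behavior", "quantitative"))
# ['product analytics', 'A/B tests']
```

The point of the lookup framing is the same as the talk's: no single cell is "the answer"; a decision should draw signals from more than one quadrant.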
The only small thing I might add is to think about why user behavior is different from user intent. An example we talk a lot about at The Knot: you can ask people, "Hey, would you download this app?" People will say, "Yeah, I'll download the app," and a lot of times they'll just say that because they're in an interview and want to be nice. And even if you're doing some kind of super intense field study, there's always a bit of a different context between your research observations and the moment when people are actually doing the thing. So you could be sitting at home on the couch saying, "Oh man, I really need to download an app to help with my wedding planning." You start searching for wedding planning in the app store, you go through and land on an app — and then you get a text, you look at the text, you've lost your attention, then you look at Instagram, and thirty minutes later you've totally forgotten that you were about to download this app to help with your planning. That's why it's really important to separate those two things and really key in on the right user need and user behavior. If the need isn't really, really deep, you don't have a good chance of solving that problem. So you have to be able to separate the two and figure out the right problems and the right solutions for them. [Audience:] Are you going to elaborate on why SEO is labeled as intent? I didn't plan to, but I can talk a little bit about it. First I want to add one point to what Michael said. Part of getting at the intent and behavior distinction — of making sure you actually do that — is being able to articulate your objective up front. Often people just want to do research because someone told them they should, and they're trying to fill a gap, but they don't know how to define the gap. You really need to be able to say: I want to learn X because I believe Y.
Okay — SEO is here because at XO we really think about user science as user research, product analytics, and SEO. SEO allows us to understand where our users are coming in at the top of the funnel and what they're searching for, and we know that many of them are finding and discovering our product that way. So it informs a lot of how we think about the rest of the journey. Does that answer your question? [Audience:] Is that in some ways a behavior, or is it intent because they haven't actually taken any action yet? Yeah — that's why it goes in the intent column. It really is an illustration of what their intention is in that moment: "I want to find a mobile wedding planner," or whatever it is. You're welcome — actually, that was great timing, because I was going to open it up to any questions before we continue. Great. So, for application: we got to the point at XO where everyone was bought into this two-by-two framework and user science and understood it. But then they were kind of like, there are so many things we could do and we're not sure when — everyone had all these tools but didn't really know how to apply them. In practice it's not as simple as three little circles that are connected, but when I'm forced to put some kind of framework around how research powers the product development process, this is how I think about it. You start at the top: you're building something completely new, or you're doing some innovative iteration of something that currently exists. It's time to discover new things, and often the best way to do that is really just to talk to people. So have some type of interview session — but really the interview session is a conversation. It's talking to the people you've identified as your target audience so you can understand, at a deep level, what their problems are, so that then you can make an informed decision about how to solve them.
And Michael here has done this once or twice. I have done this once or twice. And someone here is getting married? Engaged, at least — yes, congratulations. You should talk to Crystal tonight; she'd love to hear about your experience. Has anybody ever been to a wedding? Yeah, probably most people, right? So we probably all have some sense that when there's a wedding, people make a list of the people they want to invite, they invite those people, and then those people have to reply. My job is to simplify that process as much as possible. When I started on it, I didn't actually know anything about the process — I just described it as a very simple process, but it turns out to be very, very complicated. We conducted, I don't know, a dozen interviews, and two things kept popping up with every single person who came in. One is that figuring out who to invite is really challenging. You're negotiating with your partner: how big of a wedding do we want? How much can we afford? You're negotiating with your parents and your partner's parents: do we really need this random third cousin I've never talked to to be there? This aunt I don't really want there? And it gets really tricky depending on who's paying for it. And then a lot of times collecting the list from your parents is hard. A really good example — and I know I'm not our user, but I'll illustrate this with my own story — my mom could not figure out how to use Google Sheets, which to me seems like such a simple product. Bless her heart, she just couldn't use Google Sheets. And I've heard that many, many, many times over: it's very hard to get the list from your parents. So that's one thing — figuring out who to invite is really challenging. The second thing is that once you invite people, collecting their RSVPs is really challenging.
Usually there's a date by which you have to have all of your RSVPs in, because the venue needs to know how many people are going to be there, your caterer needs to know how many fish versus chicken, whatever. But your guests — think about it — they get the invite in the mail, probably when they get home from work one day. It's been a long day; they open the invite, they're excited, they figure out if they can afford it, if they're free, they should figure out who else is going... "I'm just going to put this to the side right now" — and then they forget about it, right? And so suddenly you have to text a hundred of your friends, and you're in this awkward position of chasing people to send their RSVPs back. It's a huge pain. So those are the two really big problems that kept coming back. Just based on what I've said, does anybody have a sense of where you might start — which of these problems you might tackle first? [Audience:] Is there any other information that you would want to know about this product before you start? Right — you need the list before you can send the invites. And luckily we had a bunch of existing data and an existing funnel in the product that told us what's happening. You can see that collecting RSVPs happens so far down the funnel, and there's so much drop-off before you get there, that there's just not that much room for improvement — even if we make a perfect RSVP collection tool, there's just not a lot of room to grow. But if we make even a marginal improvement to the step where people are adding guests, those improvements cascade down the rest of the funnel, so we have a lot of room to make the product better. So this is where we zoned in — zeroed in — on this problem, and then we moved into the execute phase, which Crystal will intro.
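The funnel argument above can be put into toy numbers. These stage rates are invented for illustration — they are not The Knot's actual data — but they show why a marginal lift at the top can beat a perfect fix at the bottom:

```python
# Toy guest-list funnel: cumulative users reaching each stage, out of 10,000
# visitors. All conversion rates are invented for illustration.

def cascade(start, stage_rates):
    """Multiply through per-stage conversion rates; return cumulative counts."""
    counts, n = [], start
    for name, rate in stage_rates:
        n *= rate
        counts.append((name, round(n)))
    return counts

base = [("add guests", 0.20), ("send invites", 0.50), ("collect RSVPs", 0.80)]
print(cascade(10_000, base))
# [('add guests', 2000), ('send invites', 1000), ('collect RSVPs', 800)]

# RSVP collection already converts well, so even a PERFECT tool (0.80 -> 1.00)
# is capped by the small number of users who ever reach that stage:
print(cascade(10_000, base[:2] + [("collect RSVPs", 1.00)])[-1])
# ('collect RSVPs', 1000)

# A marginal lift at the top (0.20 -> 0.26) raises every downstream count:
print(cascade(10_000, [("add guests", 0.26)] + base[1:]))
# [('add guests', 2600), ('send invites', 1300), ('collect RSVPs', 1040)]
```

With these made-up rates, maxing out the final step gains 200 completions, while a modest top-of-funnel lift gains 240 — and improves every intermediate stage on the way down, which is the cascade the talk describes.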
And I just want to point out, like I said earlier, that in real life none of this happens as cleanly as it looks on the slide, and Michael's example is a good one. He got a lot of qualitative data that allowed him to understand how hard this actually is for couples, and then he went back and looked at our analytics to understand what would be the best step forward. If you picture the two-by-two we looked at earlier, he's moving around it, using it to make the right, most high-impact decisions. So let's say you've talked to a lot of your users and you have a better understanding of the problem. Then you start getting ideas about how to solve it; you come up with different concepts. It's time to talk to people again. You can do that in person or remotely, but you want to start getting a signal: are we moving in the right direction based on what we learned in the strategy phase? Are the solutions we're coming up with resonating with people? So we do a lot of concept and usability testing at this stage, and Michael will tell you about that. So once we had zeroed in on the right problem to solve, we needed to figure out the right solution to that problem, right? We brainstormed with the whole group, we narrowed all of those ideas down into a few different concepts that we really liked, and then we started to test those with users. The really important thing to stress here is that no matter what ideas or concepts you have at this stage, they're not going to be 100% right. So the important thing is to test opposite ends of the spectrum and get signals as to which way you might go. And over time you refine your concepts through that system.
So we had a couple of different concepts. In one, we help people fill out their list by relationship to the couple — I'll start with my family and my partner's family, then we'll go to our parents and see who they want to invite, then to our friends, and so on and so forth. The other idea was: what if we started right away by sending a link to your mom and having her fill out the list and send it back? People didn't like that idea as much — we got a lot of strong feedback that it was not a good direction — which is super useful, right? It's really good to know that we were wrong before we started building the thing. So we said okay, the groups idea is the stronger concept, and we started working on that, going into the assess phase and running a test, which Crystal will intro. (We wouldn't normally be passing it back and forth like this — we don't literally have to — but I think it helps to keep reintroducing each other.) So now you're at the point where you have your concepts, you feel strongly about one, and you want to release it to a portion of your audience. You can get feedback at scale at this point — how cool is that? You've learned a lot, you've gotten a lot of qualitative feedback up front, and now you're really moving into the behavioral, quantitative side of things, where you can understand what users are doing at scale, and you can A/B test — Michael will talk about the details of that. As a researcher I also have to say: there's no excuse at this phase not to keep collecting qualitative feedback. It's super easy to throw up even a one- or two-question survey, because that gets you really rich information — you know the what from the data, but you also understand the why if you get that qualitative signal as well. And so, like Crystal said, we start leaving the realm of user intent and
putting this in front of real, actual people who are in the product and seeing how they really behave. So usually you run an A/B test — which we did, and I'll show you here. Usually you run an A/B test, it doesn't work, and then you run another one, right? It almost never happens that it hits right away and you release the winner to your whole audience. Usually you run it, it doesn't work, but you learn something — you always learn something when you run an A/B test — and then you make changes and do it again. That's exactly what happened for us. In any good A/B test you have one metric that you're focused on changing, and we actually saw really positive results on that metric in our experiment: people were adding a lot more guests. But then we started to look at the proxy metrics in other parts of the product — metrics we didn't necessarily want to move, but also didn't want to hurt — and we noticed we were making them worse. So we were making the product better in some areas but worse in others, in such a way that the trade-off wasn't worth it. And this is another really good example of how those three circles we've been showing are all happening all the time. We ran into the problem and said, what's going on? We dug deeper into the data to figure out exactly where it might be happening, we ran usability tests with real users trying to use the product, and we identified a few different areas where people were getting stuck or where it wasn't making total sense. Then we again came up with some design ideas, and literally as we speak we're running our second version of the test. So hopefully, if Crystal and I are fortunate enough to get invited back, we'll have the results from that version next time — an example of trying again, I guess. Okay, you may talk now. Thank you. So, to take us home: we really wanted to explain to you what user science means to us, and how we really believe that, as PMs, you can use it as your
superpower, whether you have a researcher on staff or not. Just to be completely transparent, I don't even help with many parts anymore — you've just gotten really good at it. That's not true; Crystal helps me all the time, which is amazing. And we kept coming back to user intent versus user behavior — always keeping in mind where the differences between those are and making sure you're aware of them, so that you make high-impact decisions and you're not surprised when you launch an app and nobody uses it. And always apply what you're learning: always be testing, always be iterating, and always be getting a signal from the people who are really using your products — because if you're not, I'm not sure what you're doing. Any questions, other than where I got my dance moves? [Audience:] I wanted to ask a couple of questions. How do you decide what the control and test groups would be — what would that number be, can you give me a range — and how long would the test continue? Cool. So, to repeat for the video: the question was how we decide the size of the experiment groups, and then how we decide how long to run the test for. The size is basically just a math question. There's some existing conversion rate that we know — in this case we're testing a proportion, right, the percent of people who are doing something — so there's an existing conversion rate for the control product. Then, depending on whether we're looking for really big moves or really small moves — are we expecting a 20% difference or a 2% difference — there's a formula you can apply. You just plug in the numbers: this is the existing conversion rate, this is the minimum detectable effect we want to see, and this is how confident I want to be that it's a true result. So, long story short, it's a math question, and there are a whole bunch of resources that you
can find online to just do it — I used to know the formula, but now I just plug the numbers in online. There's one called EvanMiller.org; that's the best one I've seen. It helps you figure out how many people you need for the results you find to be statistically significant, and it has different calculators depending on whether you're testing a proportion or a value. So that's how we do it. As far as timing goes, I think that often depends on the product. For us, we've pretty reliably observed that people tend to get engaged on the weekends, and they often don't start planning until Monday or Tuesday when they're back at work. So it makes sense for us to run a test for at least a week or two, so that we can flatten out any effects from that timing. So I think the answer is: it depends on the product you have and whether there's any kind of cyclicality to it. Does that help? [Audience:] How do you decide who your first test group would be — internal stakeholders, or customers in a soft launch? Let me repeat the question... If we were going to ask internal stakeholders, I think you would probably want to do that earlier in the process — closer to when you're strategizing or testing concepts. I always stumble over that word — strategizing. If you're strategizing or testing concepts, that's where, at least at XO, we would bring in internal stakeholders and make sure that everybody is on the same page. Then once we start an actual A/B
test, we would run it in the real product with real customers. At that point you can decide what the audience is: is it existing users, is it new users, is it people who have been in the app but haven't touched the guest list? You figure out what problem you're really trying to solve and target the audience that way. I make all of this sound like it's really easy because I want everyone to apply it, but — and that was a really good question about testing with internal stakeholders — there are scenarios at XO where it's hard for us to talk to people, or hard to get people to talk to us. On the other side of our marketplace we have wedding pros who are super busy planning weddings, and it's not always easy to get them on the phone or onto a video chat. In that case we may test with our customer service team, who are in very close touch with those people all the time. You can come up with a proxy for your actual user — that's better than not testing at all. [Audience question about SEO, paraphrased:] How do you know whether your search traffic is reaching a certain cohort or demographic, and does that change the way you approach designing products? The one recent thing I can think of — and Michael may have something to add — is that we launched a new product recently where most people had been using Pinterest to solve the problem, and since then we've seen a major uptick in the search term that is our product. That's an indication to us that we've reached our audience in a meaningful way and we're changing behavior. Beyond that, though, I would consult our SEO expert to answer that question. [Audience:] Is there ever tension between the product manager and the user researcher — where you say "let's test some more" and he says, "no, I just have to build this now, so let's just go ahead and start building"? For me, because I'm one-to-many, it basically depends very much on
the PM I'm working with. With Michael, I don't actually think there's ever been a time we haven't seen eye to eye. Sometimes I'll have questions, like: do you really need to do more interviews, or can you just move forward? And there was a time, before everyone was bought into this user science thing, when it was harder — we were butting heads more often. Someone would launch something before doing all of that upfront learning, and we'd be like, why would you do that? We have all these practices in place. But for the most part we're having healthy debates — is this the right research to do now? — and as an organization we're actually at the point where it's okay to say "just ship it," because we know so much that in certain cases it's time to just move faster. Yeah — at The Knot we're really, really fortunate to have a user researcher like Crystal, because we have a lot of knowledge, but there are just some things you don't know. One of our company values is to make fast decisions, and it's based on the principle that an idea isn't worth anything until you actually put it into implementation — until then, you haven't changed anything for your users or for your business. So we use the threshold of: do I have 70% of the information I need to make this decision? As soon as you get to 70% of the information you need, you make the decision and you go. Of course it doesn't always work that well in practice, but it's one of the ways we try to design the organization and the culture to align toward solving those problems before they happen. So our way around it is to have a bias toward action: try to get 70% of the information you need and make the decision. [Audience:] I'm in the enterprise software space, and one of the reasons I'm looking for a new
opportunity is because my boss, who championed this kind of user science, just left and took all of it with him. So I'm trying to pivot — I'm in research technology, and all the companies I'm talking to are in the insurance or fintech space, and they're very focused on this. I've used some qualitative analysis in everything I've done, and I've kind of had to use my gut, and it's been successful, but I'm trying to see if there's anything I can do independently to educate myself and talk about it. — I thought at first you were going to ask how to change minds. Okay, there's a good book called Sense and Respond that's written for exactly that purpose: educating people at lower levels so they can manage up about this stuff and make bigger, broader changes. On the qualitative side, I'd say read books like Erika Hall's Just Enough Research — it really breaks down the different methods, when to use them, and why — and Steve Portigal's Interviewing Users, which is really about how to talk to people and do field research that gets you valuable insights. If you're able to talk about how getting signals from different methods informs your product direction, I think most people who follow a lean method and believe in research will find that resonates. Really it's just knowing how to talk a little about when and how to pull each of the levers we've been discussing — when to get quantitative data versus when to get qualitative feedback. Yeah, I thought you were going to ask the same question she did — some people just aren't going to change, though. Okay, so: one, you've probably already done a lot, because you're coming to meetups like this, so you're learning and you can bring that knowledge back into the building. Some other very concrete things I might recommend: you can read the book Statistics for Dummies — that's super helpful — and you can also read Naked Statistics by Charles
Wheelan. Along with the Statistics for Dummies one, those are two really good intro primers that help you understand why, say, the formula for picking the right number of people for a test is what it is — why it gets you there — and help you with some of those basic concepts. And then a really good book for understanding the product development cycle is by Marty Cagan, who also runs Silicon Valley Product Group — it's called Inspired. That's a really good book where he talks about the top mistakes that people make, and Marty has a demonstrated track record of success. That's actually another thing to do: with the internet now, anybody can write anything, and it can just be garbage, so look for people who have a demonstrated track record of success — that's super valuable. There's another guy, Gibson Biddle — he was the VP of product at Netflix as it transitioned into streaming — and he has a website with all these talks and articles that are super helpful; he's really into user insights. So there's a lot of good stuff out there. [Audience:] Are there any tools that you specifically use, or is it all internal at XO? At XO, on the product analytics side, we use Mixpanel, which is a bit like Google Analytics — you can go in and look at things. One tool I was going to suggest is usertesting.com — I think it's free up to an extent, is that correct?
I had no idea — I thought it was free up to a point, but it's not; it's like 50 bucks. It helps you go in and practice doing tests; they have a bunch of really good templates depending on what you want to test, which is actually pretty helpful for seeing what the right way to do this is. And usertesting.com is great if you're in a situation where you need to convince your leadership that users may not feel the same way they do — it's super easy to get valuable feedback and then make clips out of it. [Audience:] Do you have an example where your qualitative and quantitative data pointed you in a certain direction, and it turned out that wasn't the right direction? If I can — we once built an app that we recently sunset. I can't think of an example where we got the signal, ran with it, and it was wrong, but I can think of a situation where we didn't listen to the signal, ran with it anyway, and had to sunset the product a few weeks later. So the signal was there; looking back at the data, we just weren't listening. What happened was, basically, leadership wanted to build another app, this one for wedding guests, and a PM at the time did market research: other companies had apps like that with a lot of downloads. That was her signal — if they can do it, and we have a strong brand, it's going to be successful. I didn't agree with that, knowing what I know about weddings and people, but at that point my numbers were small — I do qualitative research, and I'm pretty loud, but they still didn't listen to me. So they built it, and Michael probably knows the specifics of why it failed, but it essentially didn't take off. Yeah — in this particular situation, I was trying to think, and I don't know of an example where we had data pointing in one direction and that direction was wrong. I think usually when you fall into that situation, it's because you either have the wrong solution or the wrong strategy, or you just looked at one data
point and not the full picture. Yeah, definitely. In this particular instance, the fact that there was demand for some app to help you manage your guests, that was true, that was right. The solution of building an app that was separate from the rest of your wedding planning, where your checklist is and where you're searching for vendors, separating those two, was probably not the right solution. I think that's actually where the human element came in: even with the right data, we still had to apply the right strategy and the right solution.

I work on educational products for kids, and a lot of it is measured with formal assessments. We can't play around and have kids see one version and another version and look for a difference; it's not possible. But it would be good to see, if we did something differently, whether we'd get gains out of it, and we can't do that with our actual user base.

So the argument, in any context, would just be that you can test prototypes with the kids and do a user-testing or interview session; it's going to be harder for you to get scale. Have you done that?
I would, but getting anything at a scale comparable to your quantitative research... We'll talk to you after this, I'm so interested in this now; I had no idea. That's a really difficult one, and I'm glad you said you can't just A/B test it, like teach one group with a different reading source. At a place like Facebook they can say, we can totally measure whether this works, because once you learn enough qualitatively, the ideal is to roll it out to a smaller percentage. You can't do that, and you can't prove it at scale. So you have that user base, but anything you ship first needs to be tested, and every school has to give up class time for you to even consider it.

Is there any way that you can always keep the original version and allow a portion of the kids to opt in to try another one, so that it's not necessarily a real A/B test, it's just a test environment, and then they go back? I don't know if that's feasible.

With several programs there are mandates at the state and district levels, and teachers don't have the discretion to do that; on the teacher side it's the same discussion. We'd need an education expert. And there's always that one teacher who has ideas about all these possibilities.

What about hypotheses? Let's say, do you hypothesize before you start interviewing and then hypothesize again based on the interviews? And when you start actually testing things, how do you zone in on testing specific things, and not waterfall everything onto everybody you're testing at the same time?

Before you start interviewing, the premise is that you have questions you want to ask, something you want to learn. So what we do a lot is make a list of the things that we want to learn, and
that's what you start with, and then you can break that down into what's the best way to learn these things. Sometimes it's an interview and sometimes it's some other method from the 2x2 we showed earlier. So you break down the things that you want to know, and then what we'll often do is order them by what we think is riskiest, up front, and try to learn that immediately. So if you do an interview and you only have five minutes, you'll start with the single riskiest question you have; if you have longer, you can spend time on the other things too. I think the important takeaway is to start with what you want to learn, then break down what's riskiest, and then build the questions to ask and pick the method that will help answer them.

Yeah, and a lot of the time your own ideas jump in, so rather than asking the person to solve the problem for you, you try to gather the information and connect it yourself, and make sure they line up.

Also, I'm kind of a stickler about stating your hypotheses up front, because otherwise you don't focus in and you're going to get a bunch of sloppy data. When you do qualitative research, surprises are what's awesome about it, but you can't forget the main points you set out to learn, and stating your hypotheses up front forces you to do that. You go into it knowing, okay, we're being really focused, but we're also going to talk to a human, in the case of an interview, and the human is going to tell you things you didn't anticipate, because we're all different and awesome.

Can you do some kind of test over the summer, when kids aren't in school?

I really want to. Possibly, if we have schools that run programs outside of the school year, but that's a different thing. Our
question is, how do you keep everyone anchored in the problem space when you're having these conversations, and not in the solution space?

That's something I've learned how to work through, and my answer is to be loud and to repeat myself. Whenever we're having conversations that don't tie back to the original problem, I will just say, hey, can we all take a minute to restate what the problem is here? Because it seems as though everyone's drifted. You can also call people out, or do it to yourself: "there's no solution yet, but it'd be cool if..." and then pull back. Sometimes it's hard not to jump ahead, because we know so much, but I really do think it comes down to calling it out, remember, here's the problem, or writing it on a whiteboard and saying, this is where we are. I don't know if you have anything to add.

No, I think that's exactly right: continuing to repeat yourself and ground yourself in the problem. What I've found successful working with different stakeholders is to explicitly state your first beliefs, the first principles you're working from. In our space, technology solutions change rapidly, so the really important thing is to ground yourself in things that don't often change. For my guest list, people are always going to want it faster and simpler; that's not going to change whether it's today or a hundred years from now, but the solutions will come and go. Beyond that, the other thing is that most ideas don't work. I think that's a fundamental, foundational element of product development: most ideas don't work, and the ones that do take many iterations to get right. So you have to repeat a lot of the same things; that's just the way the world works, and instead of fighting it we should embrace it and design our
product development processes to test a lot of ideas, so that we get a signal when something is going in the right direction. And it's not linear: you have to repeat yourself a bunch, make mistakes, sometimes get outvoted, and keep repeating and keep being outvoted.

Do you guys use personas?

Oh man, personas have a kind of shitty history for us; we've done them poorly. We got way too focused on different demographic groups, and then we realized planning a wedding isn't actually that different across demographics. So now we're doing segmentation, so we can understand at scale what the real levers are. Is it budget and style, or should you be looking at something different? Wedding planning is really about what level of personalization you want and how much money and time you have to put into it. On the wedding pro side, we actually have four really well-defined personas that have helped the team better understand who they're designing for.

How frequently do you guys review those four personas?

They're relatively new, a year or so old, and we've probably updated them at least twice, but more to iterate on the original versions than because we learned anything new. We'd just realize, oh, it would be helpful for product managers to have this data point, let's add it. And based on the feedback we're getting from PMs and from marketing, we're now collecting data points in a more standardized way, so that we have a better idea of who those four are.

On the couple side, I've never really used personas successfully, for whatever reason. But one thing we are getting better at is segmenting our actual users, not by persona type but by behavior. For instance, if you looked at the average number of guests added to a guest list, you'd see some number, but if you looked at the raw data, you'd see there's some percentage of people who are
adding hundreds of guests, then some people who go to the guest list but don't add anybody, and then some who add just a few. So then you can break it down by those user types and figure out, where are we going to have the biggest impact, and what's easiest to solve? So it's not only about personas; breaking things down into behavioral segments tells you a lot more.

And I just want to add one thing to that. Has anyone heard of the Jobs to be Done framework? That has been more actionable for us than thinking about personas: just thinking about what job a user wants to hire your product to do, and coming at things through that lens. That's from Clayton Christensen, who's a professor at Harvard Business School, the same guy who wrote The Innovator's Dilemma. I don't really know any of these people personally; I just think it's helpful. Just Google it.

There's also an interesting quote, I don't remember it exactly, about how personas can be dangerous in that they bake in a lot of assumptions, and then you think you understand who these users are when you don't. I'd have to look it up.

I think we have time for one more question. So, I work on a fitness app, and I'm trying to identify my users; one of them is a powerlifting athlete. I was wondering, in terms of active users, I'm not sure which event to use as the proper identifier. If a user opens the app, plays around with friends, and looks at an exercise, should that count as active, or should they have to finish doing an exercise before being counted
as active? So basically, the general question is: I have a metric and I want to figure out the answer, but I don't know which specific event to use as the right signal of real engagement.

Yeah, we have a similar issue with this on our guest list: at what point do you know it's been useful for them? I think that relates to the need to ask ourselves repeatedly, what's our goal, and what behavior do we want people to do such that we know they've achieved it? For us, it's really useful if you've added all the people you're going to invite to your guest list and then pressed some button that says you're done. So it's a question of repeatedly asking yourself what the goal is, mapping out all of the steps a user has to take to reach that goal, and going as deep as you possibly can into that funnel. You don't want to measure just the shallow behavior of opening the app; you go as deep as you can, as close as you can to the end behavior, and that's the number that should count. Oftentimes that's really hard, and that number is really low, which is why people always fall back on vanity metrics and stuff, because it's easier to look good.
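The "go as deep as you can into the funnel" advice can be made concrete with a small sketch. Assuming you have a raw event log of (user, event) pairs, the way Mixpanel or any analytics tool would export it, you can count distinct users at each funnel step and see how the honest "active" number shrinks toward the end behavior. The event names and data here are invented for illustration.

```python
from collections import defaultdict

# Hypothetical event log: (user_id, event_name) pairs
events = [
    (1, "open_app"), (1, "start_exercise"), (1, "finish_exercise"),
    (2, "open_app"), (2, "start_exercise"),
    (3, "open_app"),
]

# Funnel steps ordered from shallow to deep; the deepest step is the
# behavior closest to real value, and the strictest definition of "active".
funnel = ["open_app", "start_exercise", "finish_exercise"]

# Collect the set of distinct users who performed each event
users_by_event = defaultdict(set)
for user, event in events:
    users_by_event[event].add(user)

for step in funnel:
    print(step, len(users_by_event[step]))
# Prints 3, then 2, then 1: defining "active" as finish_exercise
# gives a smaller but far more meaningful number than open_app.
```

The design choice the answer argues for is simply picking the deepest step you can measure as the activity metric, and treating the shallower counts as funnel diagnostics rather than success numbers.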