Thank you so much. Good afternoon to everyone here and everyone online. My name is Dan Mascola. I'm a product advisor for AWS. I work with product teams that run on AWS to help them with product strategy, roadmap development, and even bringing resources to the table like designers and software engineers to help accelerate product velocity. Got a great panel here to talk about what's working and what doesn't work in product sometimes. Francesca, do you want to give an intro?

Yeah, hi everyone. I'm Francesca. I am the imposter here. I'm head of growth marketing at Amplitude. We are a product analytics platform. I work across marketing and product and spend a lot of time thinking about activation and monetization journeys for our customers.

Ryan Glasgow here, former product manager, now founder and CEO of Sprig. We're a research platform for conducting research across the product development lifecycle. So pre-launch, we have concept testing and usability testing. You can take your Figma prototypes, add them to Sprig, ask questions, and show them to your customers. And then post-launch, you can run in-product surveys to really hear what's working and not working for your customers, and build better product experiences.

Awesome. Hey everyone, I'm Jeff Chow, CEO and former Chief Product Officer at InVision. You've probably all crossed paths with InVision through your design teams, but we've also expanded our visual collaboration platform, Freehand, to empower product managers to work cross-functionally with your teams to plan, iterate, ideate, synthesize, review, and really deliver your best products. Prior, I led product teams at TripAdvisor and Google and co-founded a bunch of startups in the consumer and marketing automation space.

Awesome. So before we can actually start and think about what makes a product successful or what makes a product a failure, we have to define what that actually means. So Jeff, let's start with you.
How do you define product success in your organization?

Yeah, I mean, it's a great question. You know, there's a qual and a quant to all these things, right? So for success for product teams, we do an OKR process, really focusing on what are the key measures we want to push and really honing in on these big needle movers, right? So it's not 23 different data points. It's one or two that we really want to empower teams to work on. But more importantly, on the qual side, it's the objectives, right? What we really want product leaders to own is the mission, the problem to solve. How do you galvanize a team of cross-functional individuals to lead to that impact? And honestly, oftentimes coming up and inventing that next inflection point. So success to us is when they actually create that momentum, and, you know, my personal litmus test with product teams is the next OKR process. Usually they're the ones delivering the impact. They're the ones saying, we know exactly what we want to do, we know exactly what that impact is, and the OKR process is not painful, right? Because they're already driving it.

Yeah, you want them to come and actually have a say and come up with a new OKR. Yeah, and really be innovative, totally. Ryan, what about at Sprig? How do you define product success?

In my career as a product manager, and, you know, as a CEO, it's always been: tell the team results matter most. And so, are you really driving success with your product line and for the organization overall? I think we're all familiar with KPIs and metrics, and so I won't drill in too much there. But I think the big misconception that I see is when a team has 10 KPIs or 20 KPIs. It's not really a key performance indicator when you get more than three. And so a lot of the course correcting we might see or do at Sprig is really making sure that each team, each function, and the company overall has one to three KPIs.
And that really helps drive that North Star and define success.

Yeah, just to add to that, we do something very similar. But we generally have one North Star metric that we rarely change. And then we have a few outcome metrics that we are driving towards each quarter, and they can change, but there will be very few of them. So we're all focused, and then we'll change them over time or make them higher or lower depending on what's happening.

What is that North Star metric? And maybe just talk about how you came across it, because it's often not the first one you pick.

It depends on the unit. I can talk about our activation metric as an example. We have evolved that a fair bit as we've learned more about our customers. The journey involved looking a lot at our data and having actual data analysts just sitting and running analysis and figuring out what an actual retained user or customer looks like. Our metric is weekly consuming orgs. So we believe, to be a retained customer of Amplitude, you have to log in four out of seven days in a week. And that's what we see driving retention.

Yeah, KPIs often tend to be very company and user or market specific. So one question around data driven. We seem to be in a data-driven-everything kind of world. Can you give an example for the audience where data maybe has led you astray or maybe doesn't paint the full picture?

Yeah, I think it's not painting the full picture. I think many times we collect a lot of data, but we're not collecting the right data. Even at Amplitude, a data company, we're pretty good at event tracking. But sometimes you will run an experiment or do an analysis and you're like, we don't actually have the full picture, even with so much data. We need to dig a little deeper. We need to track something slightly differently.
And so not being forced into that box, I think, is important. Make sure that you consider event tracking as part of every experiment and part of every release, and then add qualitative research to that.

Yeah. Any other thoughts on being too data heavy?

Oh, man. You know, data is one of those things. Again, back to sharing the problem statement, the mission, right? We have to be cautious of the data rabbit hole, right? Where someone's like, well, that data, but what about this? What about that? And all of a sudden, everyone's trying to solve not the mission and the objective, but the data, and you wake up three weeks later and you have 53 more events firing, trying to figure out how to solve the ambiguity of it, right? And I think that's a little bit of where data could definitely lead you astray. We can't lose the plot, right? And I think sometimes when you start debating the data and not the actual problem to solve, that's when you lose the plot.

Yeah. Yeah. I often see a lot of teams focus on the behavioral data or the revenue data or sometimes, you know, just the research data. And I think looking at all three data sets together does give you that complete picture of how the business is performing, how your users are using the product, but also the user sentiment. And here at Sprig, our goal is really to help bring research data and sentiment data to the same level of respect that maybe the behavioral and revenue data is getting. And so I really encourage all of you to look at how research data, qualitative research, can also sit alongside those behavioral dashboards, those revenue dashboards that you might see, because that's really going to be the predictor of your long-term success. If your customers are happy with the product now, it's going to drive that word of mouth. And ultimately, you'll be more successful in the future.
The coming months and coming years. I'm a big fan of telling PMs on the team to close your computer, go out and talk to customers.

Yeah. One of the things that we have at Amazon, one of our leadership principles, is if an anecdote, a customer anecdote, and the data disagree, you've got some work to do, right? There's something you should dig into. And so if retention looks awesome and then a customer says, I haven't logged in in a month, well, you better go figure out why, because it's going to be important.

OK, so let's now turn to a product that failed. So maybe if you could each give an example of a product that failed, and then follow-up questions to that: how did you know it was a failure? At what point did you realize, yeah, this is not going well? And then what was the follow-up? Jeff, why don't you start?

Oh, I got a great one. This is like the greatest hits of failures. So I had a startup that was in the note-taking productivity space. It was Springpad. And it was back in the early iPhone days, right? I'm talking skeuomorphic, leather stitching, like everything looked exactly like that. And all of a sudden everything switched to the flat design. So we're like, OK, Springpad 3.0, we're going to redesign it. It's going to be this awesome redesign. And then while we're at it, my CTO was like, hey, there's this new open source technology that we should move to. Instead of MySQL, why don't we use this open source thing? It's an alpha release, but it says it's going to be really great to go after all these great things. So we're like, OK, we'll replatform. And then we're like, you know, there's this really interesting trend going on where productivity will be a little bit more social, so you can share your knowledge with each other. It's like, OK, we're also going to make it social. So you can imagine, I'm telling you guys, this is my greatest hits of mistakes. We're like, OK, great.
So we're going to do a redesign, we're going to do a replatform, and we're going to add social elements to a productivity tool. So we launch it and it crashes, because it's on an alpha open platform. Right. The people who could get through were like, where is my leather stitching? I like my notebook. And then the people who got through that were like, why are my notes public? What are you doing? These are my recipes. I don't want to share my recipes with anybody else. And it was one of those moments where, if you guys want to hear Twitter trolls, the note-taking Twitter troll is maybe on par with the Elon Musk Twitter troll. So they let it rip on me. Anyways, our lessons learned, right, my lessons learned. And I honestly have taken this throughout, you know, our time. One is obviously replatforming is one where you can't necessarily do major advancements and replatform at the same time. You have to be very cautious. And if you can componentize and structure it better, that's always better than big monolithic switches. The second one is, you know, you really have to test and learn with the customer, right? You have to show it. Any vision is a great vision, but if you don't roll it out to customers, see how they feel, understand the change aversion that happens, then, you know, even if it's a great idea, it'll fail.

It doesn't sound very agile. No, no, it's not very agile. Exactly. Yeah. Any other examples that you'd like to share of some product or feature failures?

I've certainly had my share of failures for product launches. Before Sprig, I was the first product manager at Weebly, a popular drag-and-drop website builder. And we were growing really fast. We had our aggressive growth goals. And, you know, we were looking at why people were signing up and who was unsuccessful, the ones who didn't create their website. They didn't hit that publish button.
They didn't get their first set of visitors to their website. And one of the two key reasons we saw was, you know, we were serving small business owners. They were yoga studio owners, maybe restaurant owners. And they said that they didn't have enough time or expertise to create a website. They didn't know how, and they were very busy people. And so we got into a room, and we came up with what we thought was the big idea of offering professional services. And so we spent over a year thinking through how we could offer and roll out professional services for the Weebly small business owners, which is really our core customer. We had brought designers under contract and we built a whole client review workflow, and they could share the designs that they were doing with the Weebly customers. It could be $800 or $1,000 or $1,200. And, you know, finally, the team was so excited. We put it on the homepage and we had that banner: we now offer professional services. And the first week, no one clicked the button. So we made the button bigger. We made the banner bigger. Classic move. Made the color brighter. And I think week two, we got one person who paid, and week three, I think we got one more. And, you know, after three or four weeks, we just said, we need to actually ask our customers: why are you not using our professional design services? And they said, we're small business owners. This is too expensive. We can't afford $1,000. And so we deeply understood the problem, but we missed the mark on the solution. And it was just really cold, cold water on the face, a wake-up call about taking too long to launch, spending too much time, investing too much in the launch, and just missing that customer feedback, really hearing from customers as soon as you can about whether that solution is the right solution, or whether there are maybe other ways to solve that.
So definitely a deep, deep, painful learning experience for me that I have always kept in my mind, how to avoid that going forward.

You didn't try the blinking button? That's the best. That's bottom of the barrel. The scrolling marquee was the marquee one. Yeah, changing the color always helps.

Yes, one of the things that I learned doing a startup a couple of years ago is: just go sell it. You know, it doesn't matter what it is. You can find someone on Upwork, do a $30 one-pager. It doesn't matter if it's a $10 product or a billion-dollar product. Go and talk to somebody and try to sell it. Or if you're at a bigger company, find that pioneering salesperson, have them go try and sell it. In a 30-minute conversation, you'll learn all the loopholes that you would have otherwise hit by planning for a year and then launching a website and increasing the button size.

OK, so part of our conversation a little bit before this was, you know, oftentimes CEOs or C-level people or people that aren't familiar with product are focused on features, right? Oh, it's just the right features, and it's going to be a successful feature or even a successful product. And that often isn't the right conversation. The bigger conversations are: is it the right team, right? Is it the right product culture that we're building? So I want to ask each of you: what makes a good, successful product team versus an unsuccessful one? And why don't we start with you?

Yeah, I think it all comes down to aligning around the same metric. Going after the same thing. You could be working on different parts of the product, working on different features, but you're all driving towards that same North Star or key metric to move everyone forward. So you are walking in the same direction. I think that's really powerful.

Totally. Yeah. And make it simple too. Super simple. Like you said. Yeah, exactly. Yeah, yeah, yeah. You want to go for it? You can go.
In terms of success, I would say for any product team, the rate of learning is really what we've seen as the highest correlation, and it could be learning through any data type. Every change you're making to your product, really understanding how it's impacting your customers, both from the user sentiment as well as the usage, as well as the revenue data. It's really those three types of data. And really the ability to document those learnings, and then share those learnings as wide as you can within your organization, will help prevent you from, you know, making mistakes again, but also help you build on those learnings. And so, you know, all of our features are not always perfect on day one, even though we might like to think so. And so it's always identifying those gaps, always continuously iterating and improving. And so really that rate of learning as well as that rate of velocity are really, for a successful team, the two dimensions to optimize on.

And identifying those gaps, can you give examples of how you do that at Sprig? Is it meetings? Is it, you know, a formal kind of function?

For the gaps that we identify, we do use our own product. So we use the in-product surveys, and we have those wired up throughout our core flows: launching studies, integrating Sprig, getting it set up. And so what we're often doing is running those in-product surveys when we launch a new feature. It might be in an A/B test; we're running an A/B test on our homepage right now. We also have in-product surveys running to see how the different versions are performing from the behavioral as well as the sentiment side. And looking at those both together has helped us close that gap: here's where the drop-off is, but here's why there's drop-off, here's how we can close that gap and have this product be more successful.

And ideally you can do that in the design phase, right?
You wanna do that before the product launches and get that feedback early.

Exactly. Yeah, I think that's where the concept testing and usability testing come in. The more you can do that, the more you can de-risk those decisions before writing a line of code. The engineers will always thank you. You can earn a lot more respect with your engineers by saying, hey, we already iterated quite a bit on this mock-up before you even got started. So the more that you can de-risk and identify and close those gaps before engineering is always optimal, but I know a lot of us are moving at light speed, and so it might have to happen post-launch as well.

Engineering rework. They'll give you the dagger eyes for that. They don't like that. Jeff, what about you?

Yeah, you know, I feel like especially now, I'm sure a lot of you have distributed teams and cross-functional teams. To me, the key indicator of success for product leaders and product teams is being able to pull your team together, cross-functionally. The engineers, the designers, user research, copy, anybody, stakeholders, right? And, you know, the three-legged stool that I always think about for product leadership is great problem solvers, pace-setters, and communicators, right? And I think that's really important, and they are all intertwined, right? You can't be the pace-setter without being a great communicator, because then everyone's gonna hate you, right? And things like that. But I feel like when you do that, and when you're really focused on galvanizing a team to figure out what the metric is or what the problem is, then you have those magic moments where, say, a junior engineer is offering an idea that's gonna solve the problem, or you have a designer and an engineer working better together saying, two ticks to the left, that's a six-month project. Two ticks to the right, that's a two-day project, right?
Getting those moments where you're actually truly collaborating is such a powerful tool. And I think that's really the key muscle for product leadership nowadays, because when you're distributed or you don't have those ad hoc moments, you really have to mine that communication.

All right. So you guys described product success attributes in a product team, but what are some red flags that you might see in a product team or product culture that would lead to being unsuccessful? Where you're like, oh, that's something that, as a CEO or a CPO, I would need to go address right away. Why are you guys looking at me?

No, I can go. I mean, product is a weird thing, right? Because you build this relationship with a team and you really want to protect the team, right? And then the team sometimes gets a little cranky and you're protecting them, but in product, you're a little bit fence-sitting, right? Your goal is the impact and the delivery and solving the customer problem. And what I find in teams is they kind of lose the plot a little bit, they get a little tribal, right? And they're like, okay, this is my team, I'm gonna protect them, and things like that. And I think that's just generally something to look out for, especially for people who are managing multiple squads or multiple other things. You're sort of entering this phase where you really wanna make sure that you're growing that leadership level to push pace, make sure that everyone's there, but everyone understands the whys and the whats of what's going on. So when you lose that communication muscle, being able to have the hard conversations, that's when that happens, you know?

You've said "lose the plot" a couple of times in the last few minutes.
And to me, that's the customer, right? For me, it's when I don't hear them talking about the customer. When they're talking about data, when they're talking about the business objectives, when they're talking about the next sprints, when I don't hear, the customer said this, or, when I was in a customer conversation the other day, just the data analytics on customer conversion rates, right? When you're not hearing that kind of stuff, those are red flags that you're not close enough to the customer.

Yeah. Definitely when I see a spec, you know, handed off to design, and it's clear that the team didn't really understand the customer problem, that's a big one. And, you know, just pause right there, don't go any further, send it back, talk to customers, see what the problem we're looking to solve really is here. I think the other one, when you think about the products with the cult followings, you know, Lume here is I think one that we can all kind of point to, who I know is a sponsor today, is really how often is a product manager saying no? And you think about Steve Jobs, you know, one of the philosophies at Apple is a thousand no's for every yes. And you think about the products that do have those cult followings, the product managers really said a thousand no's for every yes, because the simplicity, the core product is so well honed and perfected, they're able to really get all those rough edges out of the product so they can really hit that escape velocity. And that internally means you have to really play that role of saying no to a lot of people and really make sure your product remains as simple as possible. And so when you see a product manager perhaps saying yes too many times, or adding too many features, or too many requests are getting into the backlog, then it's usually a symptom of that lack of focus and clarity.

Yeah. I like to tell people, whenever you add a new feature, you need to take one away.
Yeah. And it reframes how people think about the user experience, and that may not happen exactly practically, but maybe it's combining features, or maybe it's, well, what is the priority of features that we should be focusing on? It just changes that conversation.

Okay. So let's talk about product priorities, right? Product teams have to say no a lot because they have a lot of stakeholders and deal with a lot of priorities. So how do you think about prioritizing that for your PMs and your product teams? You want to start on this one?

Yeah. What I think a lot about is how you prioritize learnings for the sake of learnings, like running an experiment. We run a lot of experiments as a growth team. Sometimes we fail, often we fail, but we learn so much. Which is good. But we didn't generate any value. So how do you balance continuously generating value as a team and moving the metrics, but also being comfortable with failing and learning and then redoing?

What percent? Do you have a percent? Like is it X percent?

No, I don't have a magic number, but I think you've got to structure your quarter or your year in a way that you have some bets that you are like, this is going to work, or we have very high confidence. And then you have some leeway to run some of these more uncertain experiments or things that you want to learn, and have a learning roadmap for that, which can sometimes be a separate thing from the core feature work or things that you have to do based on customer feedback and everything else.

Yeah, there's exploratory work, and then there's all-in, we're-doing-this work.

And make sure that you're okay with failing and that there's a forum to talk about the failure: why we failed, what we learned, and how we can then take that into the next experiment or the next feature work.

Yeah, totally. What about at Sprig, Ryan?
You know, as we're scaling, one thing that has started to really come up for us is being more open about the hits and the misses. And, you know, there's actually a study that Harvard Business Review did, and what was most motivating for teams was actually not the analytical performance of a team, but that the team was sharing not only the successes and the wins, really giving that acknowledgement and shout-out, but also acknowledging the misses of that team and really sharing those learnings. And so it's something that we're starting to incorporate into our weekly all-hands, a hits and misses section, and also a hits and misses channel. Because you do want to balance really acknowledging hard work and really acknowledging success with also being okay with, you know, failure, and knowing that a great product experience requires failure to a certain degree. Really socializing that and setting that psychological safety for the team to say that this A/B test failed, and we're actually going to share what we learned and how we're going to move forward. So I think at a team level, you know, hits and misses is something to, you know, certainly consider to really incorporate the wins but also the learnings along the way.

That's awesome. And building on that, I mean, I think one thing that a really great culture around, you know, celebrating failures does is watch for that sentiment of, what if it did nothing, right? Those are the ones where, why did we do it? You know, you either swung and it worked really well, or it epically flamed out. That's great. But if it did nothing, what did it do? That's a failure actually. Yeah, that's the failure, right? The failure is wasting a lot of calories on doing nothing, you know?

On our side, a lot of what we think about is, I think another trend is sometimes you'll wake up and one team is really focused on optimization only, right?
They launch a couple of features and they're just optimizing, running a bunch of A/B tests, and another team is now known as the Big Bet team, and they're taking massive swings and they're, like, visionary and all this other stuff. And I think that's a bad mode, right? It's very lopsided, and you're sort of branding one team one way. And so what we try to do is make sure that there's a healthy balance of bets and follow-throughs across all teams, because what I've found is that when the pendulum swings too far, then the next quarter they wanna do a bet, then the pendulum swings the other way, and then they're not following through on their features. So, you know, sometimes we do percentages, like, okay, this quarter 70% bets, 30% follow-through and optimizations. But we try not to make it all or nothing, because we really wanna build that full muscle for the teams.

Awesome. Okay. So in the final couple of minutes here, 30 seconds each, what advice would you give to the audience to think about their product and product teams for 2023?

Let me go first. Yeah. I wanna plug marketing in here. So for all the product people out there, work more collaboratively with your marketing teams next year. It works really well. Marketing can do wonders for you. They can help you design journeys. They can help you get more feature awareness. And yeah, it works really well when product and marketing come together. I see a lot of that at Amplitude, but I also see that it can be done better, so.

I agree. You know, to be frank, the situation next year, it's not gonna be easier in the economy. It's gonna be more difficult. At the end of the year, everyone's kind of buckling down for a tough winter. And so the stakes are higher than ever.
You know, whether your product is critical or a nice-to-have, you know, the line is significantly moving, and the boundaries are being redrawn for what success and failure look like for the future. And so I think the more that you can de-risk all the features, all the launches that you're working on, it's gonna be even more important than ever, because the excess of last year is no longer gonna be with us. And so that means we really have to buckle down.

Yeah, my one bit of advice is: try to have the hard conversations with your cross-functional partners, right? Oftentimes we find ourselves kind of glossing over those just by talking in process, but be direct, right? And you'll be shocked at how much better things turn out when you actually have direct conversations around getting to the problems to be solved. The hard conversations will happen whether you want them to or not. So you might as well own it.

Yep, yeah. Awesome. Well, thank you all in here and online, and have a great rest of the conference. Thank you.