Hello, everyone. Welcome to the session. I'll be your moderator today. I'm Seda, a senior product manager at N26, a digital bank. I'm responsible for the Insights team, where we build features that show users insights and statistics about their saving and spending behavior, so that they can feel in control of their financial lives. One example is showing a user how much they spend on groceries in a month-over-month view. Outside of work, I enjoy spending time with family and friends, I like working out, and I'm passionate about mentoring and coaching people. Now I'd like each of our panelists to introduce themselves, and maybe mention a bit about what you like to do outside of work.

I can get started. Hi, everyone. I'm Aishwarya, a product manager at YouTube. Outside of work, I really enjoy painting; I have a blog where I post some artwork anonymously. I'm also very passionate about education. Before YouTube, I worked at a startup focused on democratizing software education for everyone, which is why I've been looking for opportunities to mentor and coach people. I'm really glad to be here.

Great. I'm Travis, a product manager at Netflix, responsible for our experimentation platform. That means all of the tooling, infrastructure, and analysis methodology that folks at Netflix use to run experiments. Before working as a product person at various tech companies, I was a physicist, which was fun, and I got to work on experiments of a very different kind. Outside of work, I enjoy hanging out with family and playing soccer, and fortunately I get to coach the soccer team my daughter is on, so I get to combine both of my passions.

Hi, I'll go next. I'm Patricia, a product manager at Dropbox. I've been there about four and a half years and am based in Seattle. In my time at Dropbox, I've worked on everything from search to the desktop sync engine, and most recently I'm driving core AI strategy. Outside of work, I am definitely a stereotypical Pacific Northwesterner: I go hiking, I love being outside, I love the mountains and the water, and mostly just getting my family outside. We're also avid musicians, so we play a lot of music as a family.

Thank you all, and again, welcome. Without further ado, let me get started. My first question: what does a data culture mean within the context of a product team, and what are the benefits of creating one? Travis, do you want to go first?

Sure, yeah, I have some thoughts here. To me, data culture can mean many different things. You hear people say things like "data driven," but I often like "data informed" a little better. What it really comes down to is: are people on your product team curious? Are they skeptical? Are they trying to find answers to questions rather than trying to drive an agenda? As I mentioned, I come from a science background, so I often think of it as a scientific mindset, but that's just my own experience and perspective. I suspect there are other mindsets that similarly approach that asking of questions and seeking of knowledge, rather than coming to the table with a clear idea or agenda you want to drive. Once you have that, data naturally becomes the way you answer those questions. That's what I mean when I say there's a data culture somewhere.

To build on what Travis said, a data-driven culture can mean many different things. As a product manager I've worn many hats, both as a growth PM and as a platform PM, and the way you leverage data in your day-to-day work can look very different depending on what kind of role you're in.
In growth, data was our bread and butter. Everything we did started and ended with data: we would find opportunities in the user funnel, figure out where the most headroom was, optimize for it, and then measure the result. But now I'm more of a core feature or innovation PM, whatever you want to call it, where we're thinking about creating user value and making the product, YouTube as an app, more enjoyable and fun for the user. A lot of the time it's not a metric you're trying to move; the kind of impact you have can be qualitative and isn't always measurable through data. Say you're trying to make the app feel more modern: that might just be a shift in user perception, and it still adds value. So "data driven" can mean many different things. You bring some data, qualitative or quantitative, to the work, but you also keep an eye on the needle-moving, step-change things you could do, which you won't always find in your in-product behavioral data. It's a spectrum, and understanding where you fall on that spectrum, depending on the kind of work you're driving, is very important.

I really love both of these answers. Travis, I like your framing of being data informed rather than data driven. As I said, I've worn many different hats, and I've worked on products at various stages of data culture: some very intuition-based, where we didn't really have strong insights, all the way to my time on search, which was incredibly experiment-driven and metric-focused. The reason I like saying "data informed" is that data is just one input among many that you use to, as Travis said, be curious.
Rather than "data driven," I think about how important it is to always understand the impact of what you're building and shipping for your customers. In a lot of cases that impact is proven by data, and that's how you answer the questions you have. In other cases, just as Aishwarya said, it's more of a feeling or an emotion: how do you actually measure the impact of modernizing an experience? So data is one input of many that you should be considering as a product manager.

Patricia, I really like that, and something you said really resonated with me. Part of the question was also about the benefits, and I love the way you put it: what matters is the impact on users. To me, that's the real benefit. When we talk about data, we're almost always talking about data flowing from the people who use our products to the people who build them. So another way of describing "data driven" or "data informed" is "user centric," because that's what the data is: it's about how people are using, or not using, our products.

Yeah, thank you. I also really like Aishwarya's comment that data can be qualitative. Whenever we talk about data, most of us tend to think of a KPI or a dashboard full of numbers, but qualitative data counts too. One thing I forgot to mention at the beginning: if our audience has any questions, post them in the chat, and if we get time I'll try to pick them up. Thank you for being with us. Since Travis already touched on it, do either of you, Patricia or Aishwarya, want to add anything on the benefits of having a data-driven culture?

I can take a stab.
I think a data-driven culture gives a unified language to people working across teams. When you're talking about user-centric problems, about which problems you're solving and which KPIs or metrics you're trying to move, quantitative data gives people a shared language, a similar rubric or yardstick to measure impact against. It creates common ground for weighing which problems we should be solving and how we measure the impact. That's one very big benefit of a data-informed, data-driven, data-centric culture.

I really love the idea of data as a unified language. It gives people something to actually see and visualize, and I think that helps. Adding to something Travis mentioned: what we're really trying to answer is whether the product we're building, the problem we're solving, is actually delivering value to our customers. Data can give answers and insights into questions like, did I successfully deliver that value? Did the customer successfully realize it? That's how you gain confidence that you're doing the right things for your customers.

I love the language analogy, so I don't have anything to add beyond that.

That's really great, thank you. And what might be the potential challenges of creating or sustaining a data culture within a product team, and how do you overcome them?

I'll maybe start with this one. It's been really fun seeing the progression of the teams I've been on and the different ways that data is being used.
A really common gap between saying the words "we want to be a data-driven team" and actually living and breathing it is an easy pitfall: it's easy to think that data is truth and data is objective. But people's interpretations of data can give them skewed insights into what would more objectively be the truth. Even though the data itself is objective, the interpretations are incredibly subjective, so you can fall into the trap of cherry-picking data that looks right, that matches your hypotheses and the outcomes you want to drive. The challenge is breaking free from always having to prove that you're right, and instead keeping that curiosity mindset: what has this taught us? What have I learned? How have I grown? What is this data telling me that doesn't match what I thought going in? It takes humility to look beyond your personal biases.
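That point, that the data is objective while interpretations are subjective, has a classic concrete form in Simpson's paradox, where the same numbers support opposite conclusions depending on how you slice them. Here is a minimal sketch in Python; the segment names and counts are entirely hypothetical, chosen only to illustrate the effect:

```python
# Hypothetical A/B numbers illustrating Simpson's paradox: variant B
# "wins" on the overall conversion rate, yet variant A wins within
# every segment, because the two variants see a different mix of
# new vs. returning users.

segments = {
    # segment: (A conversions, A users, B conversions, B users)
    "new_users":       (50, 500,   8, 100),
    "returning_users": (30, 100, 100, 500),
}

def rate(conversions, users):
    return conversions / users

# Aggregate view: B looks better overall.
a_conv = sum(s[0] for s in segments.values())   # 80
a_users = sum(s[1] for s in segments.values())  # 600
b_conv = sum(s[2] for s in segments.values())   # 108
b_users = sum(s[3] for s in segments.values())  # 600
print(f"overall: A={rate(a_conv, a_users):.1%}  B={rate(b_conv, b_users):.1%}")
# overall: A=13.3%  B=18.0%

# Segmented view: A is better in every segment.
for name, (ac, au, bc, bu) in segments.items():
    print(f"{name}: A={rate(ac, au):.1%}  B={rate(bc, bu):.1%}")
# new_users: A=10.0%  B=8.0%
# returning_users: A=30.0%  B=20.0%
```

Cherry-picking either the aggregate or a single segment would "prove" opposite stories from the same objective data; the curious move is to ask why the segment mix differs between variants before concluding anything.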
Building on what Patricia said about data literacy: not everyone knows the data, and everyone interprets it differently. That's one of the top pain points I've encountered. Sometimes it's simply not knowing where to pull the data from, depending on the size of your organization, or how to pull it, or how to correctly interpret it. As the data grows, it can become very challenging to understand what a given field is trying to tell you; depending on how descriptive it is, your interpretation could be very different. The second challenge I've noticed is data silos. At a big company, that's a very common trap: some metrics aren't shared across organizations, and when you're working with cross-functional teams, those silos create dependencies. You don't know where to pull the data from, so you either become dependent on others to pull it for you, or you don't even know the data exists. The third challenge, which I've encountered both at a startup and at a big company, is lack of resources. Sometimes you literally don't have the resources to store a lot of data: at a small company, the more data you store, the more cost it incurs, so you have to be careful about which data you keep. At a bigger company, the resource challenge is the opposite: you have so much data that actually running a query requires a lot of machine power. So those have been my three biggest learnings about data challenges: data silos, data literacy, and lack of resources.

Yeah, those are all good ones. I made a note about this spectrum from hubris to humility, which resonates with what Patricia said: you've got to move yourself toward the humility side of that spectrum. There are a couple of challenges I've seen which are frankly less important than the ones already mentioned, but they do come up. Aishwarya talked a little about tooling, and maybe there are tooling solutions to some of these things; she mentioned silos as well. There's also the opposite problem of silos, which is having too many metrics: I have my definition of a metric, you have yours, they're not quite aligned, and that defeats the common-language goal of being data driven. There's maybe one other problem I see occasionally, which I think is an interesting one: if you become over-dependent on data, you may become reluctant to take a big leap or a big bet. You're at some local maximum, but there's a global maximum over here that you need to get to, and every test you run, every data point you have, tells you that the change is bad for you, simply because you have to go through the dip to reach the global maximum. So there is a role for the intuitive leaps that PMs like to make, and if you get over-dependent on data, you can quash that.

I'd love to add on to that, Travis, because it's such a good anecdote for some of the challenges we had working on search in my time at Dropbox. We were, of course, incredibly metrics driven, but sometimes over-dependence on a suite of metrics, and the fear of seeing metrics move one way versus another, can hinder you from taking big bets. One of the challenges of a data-driven culture is that you can be so afraid of a metric moving in an undesirable direction that you lose sight of having a really strong product opinion about where you need to take this product
or this experience over the long term, which sometimes, as Travis mentioned, means you're going to see a temporary dip in those metrics as you bring your customers on a journey to something big and new. That over-dependence on your metrics can sometimes be very paralyzing.

Sorry, I feel like I have to chime in quickly and add something to my bio: I was also a PM on search for four or five years at a previous company, so it's maybe not surprising that we have similar anecdotes. I think many people can relate; I can also relate to data silos, lack of resources, and over-dependency. The three of you have already touched on the most common challenges, so thanks for that. We've already heard partial answers to my next question, but I still want to ask it: what data-related skills should product team members acquire, or have, to contribute effectively to a data-driven culture?

I think one of the biggest skills should be interpreting metrics. Depending on where the data lives, if you have dashboards, just understanding and interpreting the metrics is very important. A lot of the time people report metrics as percentages, and no one knows what the denominator is or how it's growing; that's a very common mistake I've seen. So learning how to interpret data and how to present it is very important. There are some common statistical pitfalls, and if you take a quick course you'll learn about them. The second skill is being able to write simple scripts. At most organizations, regardless of size, being able to write a basic script to pull data is always helpful, because unless you're looking at top-line metrics and you have great tooling, the metrics you want are not going to be available off the shelf.

Maybe I can build on that a little. The point about tooling being important is a good one, and I think another thing you can bring to the table as a PM is empathy. Empathy is one of the go-to skills for PMs, and you usually think of it as empathy with users, but in a data culture you're also dealing with people within your organization: a data science team, a data infrastructure team, analytics professionals. Learning a bit about statistics helps you have empathy for those folks, and so does learning a little about writing a script. These questions often feel really simple: you're asking, say, how many people clicked on a search result yesterday. But that's not a simple question. It turns out addition at the scale of these platforms is really hard, knowing what happened yesterday is really hard, even defining "yesterday" is really hard. All of these are hard questions, so have empathy for the professionals doing that work, and recognize that just because you can frame a question tightly and well doesn't mean the answer is easy to reach in a confident way.

What I love about that answer is that, oh man, we were a hundred percent there; it's clear you worked on search, and I have similar anecdotes. Adding on, especially to what Aishwarya said: beyond a basic amount of data literacy, I think it's incredibly important, especially in the role of a PM, to understand what your metrics mean for your customers. Pulling from our time on search: why is a qualified click-through rate important? What does that
represent for our customer? What use case does it indicate that a customer has actually solved? Really understanding what your metrics mean, and what certain movements in those metrics actually mean for your customer experience and your business, is incredibly important. Another important skill set comes into play with A/B testing: when you're formulating what your A/B test looks like and the hypothesis you're testing, a really important skill, one that took practice even for me, is defining your ship criteria in advance of the experiment. Think about the outcome you're trying to drive and the question you're trying to answer, and frame it as "we will know we were successful when metrics X, Y, and Z move in these ways." Being diligent about that ahead of time sets the experiment, or at least the learning, up for success.

There's a related question from the audience on this topic: what can an organization do to make it easier for PMs to adopt data in their work? One of the key things I spent a lot of time on was driving an experimentation culture at the company, and one of the easy things we did, and this may seem basic, was creating a good experimentation template that any A/B test could use as a reference. It was such an important tool for increasing experimentation rigor and metric literacy. Just having a template with a section for ship criteria, success metrics, and audience was honestly so helpful that we still use it: we still send out examples, "this was a great A/B test, look at this template, talk to this person, look at these metric sets." Having those reusable templates, metric sets, and thought leaders was such an important tool, and it's a really basic thing any organization can do.

A big plus-one to that. I've seen it be very effective at previous companies, and I'd even say it's effective if you don't have an experimentation platform or tool at all. If you ask people to go through those types of questions before they even ship a feature, they start getting more data driven. Take audience, for example: if you ask people what the audience for their feature is, that's opportunity sizing, which is what PMs are supposed to do. But often they've done it in their head or in a meeting; they've never really written it down, done the math, and looked at it in the same units as somebody else's feature. Once you do that, you start recognizing, "oh wait, there actually aren't that many people here." You can improve the quality of your feature just by asking those questions, even if you're never going to run the experiment.

I could not agree more with Patricia and Travis, and that's exactly what I was going to say: creating an experiment template is exactly what I did with my team. Just writing those things down, the opportunity size, who you're targeting, what changes you expect to see in their behavior, and how that would translate into a metric, and having people go through those steps was such a big change in building the habit of going to the data. We also gave them resources: here are the top dashboards to look at, here is how you size your experiment. When you give people those tools alongside the template, they get into the habit of using them regularly. Another thing we do to build this culture is hold regular readouts of how the product metrics are performing, and I think inviting
everyone to those large metric readouts is also very helpful, because at a regular cadence you're aware of how your product is performing, and you'll hopefully become curious about those metrics, notice dips and spikes, and wonder why those changes are happening.

We've been talking about how each of you supports your product teams in cultivating a data culture, with examples like creating that A/B testing template, or any kind of well-written document. What about leadership? What role does leadership play in promoting and sustaining a data culture within a product team?

I can say, obviously, if there are tools like those templates, that's an easy place for leadership to put in a checkpoint: "do you have your template?" If not, that's the entry point to the meeting, or whatever the mechanism is. But I think there's another, maybe bigger role for leaders, which is encouraging curiosity. The flip side of that is not being afraid of failure. If you really want to learn things, you have to try things, and, to Patricia's earlier point about not being afraid of metrics going down every once in a while, that fear often comes from above. So leadership can set expectations: if every experiment you put out there is winning, you're doing it wrong. That expectation has to come from leadership. The way I often put it is: don't be afraid of failure; be afraid of not learning from failure.

That resonates really deeply: embracing the mindset that maybe even the majority of experiments are ones that don't ship, but you still need to learn from those quote-unquote failures, or "successful failures" as we like to call them, to make sure you're gaining value and insights you can act on later. Another thing leadership can do to establish a strong data culture is give clarity and operate with transparency about which metrics leaders care about. Often they're really big metrics, things like weekly active users or paid customer retention, and no single experiment is likely to ever show movement on those. Unless you're running a really great experiment, and maybe in growth there are examples of this, rarely does one experiment directly move something that big. But what's still really helpful is that, as a team, I can understand which goal I'm ultimately laddering up to, and why my experiment might impact a larger metric in the long run, even if we can't detect movement on it in the short term. That helps your team form strong opinions about what you need to accomplish and learn in order to move that greater metric, even one as big as weekly active users.

The only other thing I wanted to add is that leadership can also lead by example. All of those things, running experiments that fail, documenting your learnings, saying "don't be afraid of failure": if leadership is doing them in their day-to-day work, that sets a very good example for other people to adopt the same culture, because then it's not just words; you see your leaders doing the very thing they're trying to instill in the team.

Thanks a lot, all. Since we have about five minutes left, maybe I'll pick the last question.
I'm also quickly checking the audience questions. Okay, there's one that matches a question I had, so I'll pick that up: are there any courses or resources we could share with PMs in general that could help them embrace data culture in an organization?

I can share a few. Before my time as a PM, I took some courses with Springboard; they have a mentorship program, which really helped me, but that's a very involved course, and if you're already a PM you probably don't need something that involved. In general, the free online videos out there are a great resource for learning analytics. Most of the data and analytical skills I picked up came from watching free videos online, not from a dedicated course. Some of the free content is pretty solid and will help you work through the basic problems you need to know to brush up your foundational analytical skills as a PM.

Yeah, there are courses; it's been so long I can't remember which one it was. I also had the luxury of working at a big company like Microsoft that offered experimentation tools, and they have various courses. There are very similar resources on YouTube: experimentation 101, what a p-value is, how to spot a sample ratio mismatch, how to read results. Just watching experimentation 101 videos on YouTube was something I did when I joined the search team. So: basic data literacy, good experimentation literacy.

The only thing I'd add is that there are also a lot of blog posts from companies like ours, the ones represented here and plenty of others, talking about these topics. But really, there's no substitute for hands-on experience. Even if you're at a much smaller company, you likely have one analytics person or data science person: be friends with them, watch what they do, ask their advice. That's another good mechanism.

Thanks, all. Since we have a couple more minutes, I'll ask one last question, again from the audience. It's an interesting one: what about products where we don't collect user analytics, because the products are meant for completely offline scenarios?

I think that's a great question, and increasingly relevant, too, as people have different scenarios. It's not only about being completely offline; somebody else in the chat asked about privacy as well, and there are a lot of reasons why we may not want to collect all the user analytics we did ten years ago. There are other things you can do, like looking at downstream business outcomes and working back from those. But one thing I've found is that once you start approaching product with this metrics-based, data-informed mindset, you can go build the metric you want even if you can't observe it. Even the act of recognizing, "here's the metric I would love to have; I don't have access to that data, but that's the metric I'd love to have," makes you a better PM and makes your product better. Then you can think about what proxy metrics you can depend on and how deep you want to go; you can even try to ladder observable proxy metrics up toward the one you can't measure. Just understanding "if I could measure absolutely everything, this is what I would want to know" is huge.

One thing I'd add: Travis gave a good example, and in some cases you're simply not allowed to collect data. When I was working in education, a lot of our work was with kids, and we were not allowed to collect their data. So we relied a lot on user verbatims, interviews, one-on-one conversations, and having people on the ground talking to users to understand how they use our product and what they think about it. That was very qualitative, but even that really helped us consistently understand user pain points and get better at solving them. Of course, it's not as high-fidelity as tracking metrics by logging product usage, but it's a tool available to PMs when you're not in a position to collect data.

I agree, and I have an example of this. We had a feature we were desperately trying to test and get signal on, but we couldn't test it because the treatment would have actually caused customer harm, so it was an absolute no. We had to rely on other forms of data collection. One was qualitative feedback. For another, we were lucky enough to have a really experienced data scientist who did propensity score matching, so we were able to model the effect. Of course, not every team has that resource available; I was fortunate in the data scientist I got to work with. But we had to use alternate methods, because from a customer-ethics perspective we simply weren't allowed to run the experiment.

Yeah, thank you all for sharing, and with that we've come to the end of the session. I want to thank you all for participating and sharing your perspectives, and thank you to our audience for being here. Have a good one.