Hello, ladies and gentlemen, product managers and product enthusiasts all. My name's Will Perkins and I'm a product manager at Deliveroo. I'm delighted to be here with you today. Alas, I speak to you over the airwaves, but I hope one day to meet you in person at a Product School event or something similar. I'm truly humbled to be able to address you today and steal, I hope, just 20 minutes of your time to talk a little bit about how I think about effectively using data, particularly when it comes to you in a form you may not necessarily be familiar with, perhaps presented to you by stakeholders or a third party. So, without further ado, let me take you back to 1970s Britain, where the radical fringe of left-wing political thinking was dominated by the Labour firebrand MP and pipe-smoking aficionado, Tony Benn. Skeptical of those in power, he formulated his famous five questions to ask of those who hold it, in order to get to the truth about how they intended to wield that power. What power have you got? Where did you get it from? In whose interests do you use it? To whom are you accountable? And how do we get rid of you? This, he believed, would tear away the veils of obfuscation and get down to the truth of the matter. Now, as product managers, we are also interested in the truth. In order to make good, well-reasoned, rational decisions, we need to have a reliable understanding of what is going on in the real world and how our products interact with it. An excellent, and some might say the best, reflection of that real world is data. But data is merely a tool, and like all tools, it must be wielded skillfully and to a purpose. We must understand its abilities and limitations. It's also not a Swiss army knife. To achieve our goals, it needs to be used in conjunction with other tools in our toolbox.
So like an operatic conductor, product managers must pull together the various sections of the orchestra to create that rich soundscape and paint that picture. Data may be playing first violin in our orchestra, but it needs the support of the others to really hit those virtuoso high notes. So as Tony Benn interrogated political power to find truth, so can we with our data. Why don't we take Benn's five-question framework for finding truth and create our own, one we can use to make the most of our data and ensure we are making the right decisions for our products? I want to know what motivates the collection, processing and presentation of a dataset. I want to know whether the metrics I'm looking at are actually the ones I care about and are the right thing to be measuring. I want to know how the data's been generated. Is it high or low integrity? And once I'm happy with that, I want to know how I can validate that data and ask whether it's really telling me what I think it is. And finally, I must say I'm not quite as much of a revolutionary as Benn was, so I'll take his final question and turn it on its head. If somebody's interested in using data to inform their decision-making, then how do we get them on board as product managers? Now, a phrase you often hear is that the data doesn't lie. But think about that for a minute. Think about the data you're being asked to buy into on a daily basis outside of your work. Generally, I find it's in advertising, either trying to sell you a product or a political idea, perhaps. In this context, one could be forgiven for thinking it does nothing but lie. The data doesn't lie: it's a nice sentiment. And I get it, data is inanimate, and human interaction is required before it means anything. But frankly, to take this phrase at face value is a little naive. And the one thing that product managers cannot be is naive. We've got to be sharp. We've got to understand what we're talking about.
And so we've got to be skeptical about any information we use to make those decisions, because ultimately we and we alone are accountable for them. No one else, and we certainly cannot blame the data we're using. Data can be wrong, it can be misinterpreted, and yes, on occasion it can be used to lie. Now please don't see this as cynical or dismissive of data. I think skepticism is healthy. Questioning our assumptions will help us to better understand what is true and what isn't. It will help us avoid blunders, improve the quality of our decision-making and ultimately make us better product managers. So then, let's get stuck in with question one: what is the motivation? Let's pick up that thread again of where we see data in our day-to-day lives outside the world of product management, if such a thing does exist. We mentioned that people use data to try and sell us things, and often we won't even notice or recognize that data is being used. Now, apparently it's one of those rules of the internet that one must have cats in one's presentation. I don't usually go in for these sorts of things, but rules is rules. So here we are. I don't know if any of you remember this classic, I guess 80s or 90s, Whiskas advert campaign for cat food. Eight out of 10 owners who expressed a preference said their cat preferred Whiskas. It's a bit of a mouthful, isn't it? But this advert has some fairly compelling data in it, so it must be true, and therefore we should all go out and buy this product. Well, I don't know if you're anything like me, but when I see things like this I have to go and have a bit of a lie down in a dark room, because my product manager brain just goes off the wall with it a little bit. What does this mean? Did you ask 10 people and eight of them said so, or did 80% of 10,000 people say it? And who are these people? Do they often present their cat with a selection of dishes for their supper? And then, what do they prefer it to?
Do they prefer it to a succulent roast chicken, or perhaps to going hungry? What about the owners who didn't express a preference? What do they think? In this case, I'm skeptical. This is obviously about selling us a product, and we have no way of knowing whether it's true or not. And yet, and yet, there's a bit of me that still just nods along and says, okay, next time the cat's hungry, I'll go and buy the Whiskas. You know, it's effective. It taps into our psyche somehow, and advertisers wouldn't do it otherwise. We face this type of thing as product managers in our jobs. People will regularly approach us with ideas for things they want us to build, and a great way to sell an idea is to support it with data, which frankly is brilliant, far better than going without it, and we should wholeheartedly encourage it. But it's totally natural for them to want to make that data as compelling as possible. It's their job. And it's our job as product managers to understand why they are showing us this data. Unlike with an advert, we have the opportunity to put our skepticism into practice. We can question what their motivations are for presenting their data. Through this, we can understand why that data has been presented in the way it has. What is present in it? What is missing? What assumptions have been made? We could even do another whole video on this, a set of 20 questions to ask when you're thinking about the motivation for things. But that's for another time and another place. This is also our first opportunity to understand whether we're looking at the right data and the right metrics, both in order to understand the problem and also to measure the impact, which provides us with quite a nice segue into question two: are these the right metrics? So join me as we venture further back in time, from the 1970s to 1914. It's World War I, and Europe is aflame as the major European powers confront each other on the fields of northern France.
In the decades since the last major European war, military technology had taken great strides forward, but the tactics in use would have been recognized by a soldier fighting in those same fields a century earlier at the Battle of Waterloo. Officers wore white gloves and ostrich plumes in their hats whilst leading troops. The mounted cavalry charge was still a common sight on the battlefield. Soldiers of all nations went into battle wearing cloth headwear. Even the rather satisfyingly named German Pickelhaube, with its distinctive brass spike and crest on the front, was just made of rather thin leather and provided little ballistic protection. As the machine gun made surface warfare impossible, trenches were dug deep into the ground to provide cover. Artillery bombardment then became the modus operandi for all armies. Now, when an artillery shell burst, it would blow huge amounts of earth, rubble and rocks up into the air, which would then rain down on those sheltering in the trenches. Military commanders noticed that the number of soldiers being taken back to aid stations with head injuries was increasing rather dramatically. British command in particular had picked up on this as a good metric to try and address. Field Marshal the Earl Haig will play the role of our product manager here. He took measures to reduce the number of head injuries sustained by his troops by introducing the steel helmet, smartly modeled by the chap here on the left: the Mark I Brodie helmet, as it's called. This is a very good example of product design. It's protective, and has a nice rim to shield the eyes from the sun and the back of the neck from anything falling behind you. It's also got a rather jaunty aesthetic, which soldiers of any era rather appreciate. But interestingly, and rather alarmingly, once these helmets were introduced, the rate of head injuries actually increased, and increased dramatically. Now, why was this?
One theory was that the added feeling of safety the helmet provided led to soldiers acting more recklessly, taking more risks, sticking their heads above the parapet where before they might not have. Yes, I suppose this is a viable hypothesis, but it's based on an incomplete understanding of the data set. Head injuries per thousand is not the only metric that is important here. The most important metric, particularly for those actually having to wear the helmets, is the rate of death. Yes, head wounds increased, but only because soldiers who would otherwise have been killed by shrapnel were instead only sustaining injuries. So here our product manager had found a good metric that got him on to the target, but he needed to go one level deeper in his understanding to get to where he actually needed to be. I think about it in terms of prospecting for gold. You find traces of the precious metal on the surface and in streams, but one needs to dig deep underground to find the main gold seam. So, we think we have our metrics that we want to measure and move. Well, before we get too excited, we need to think about how reliable this data is. And one way to do that is to think about how it's been generated. We must remember that data doesn't emerge out of the ether, pristine and incorruptible, like the body of one of those preserved saints in a Venetian church, said to waft the scent of flowers even after centuries in the tomb. Instead, we must recognize that at some point, somewhere, somebody has written a bit of code that creates this data, based on a set of assumptions valid at the time and on the best knowledge available, knowledge which may since have developed and changed. I find it helpful to think of data in this way. It is the work of human hands, and as such is susceptible to all the failings of human endeavor.
Continuous iteration and development can improve the integrity of data, but it can also result in errors and misunderstandings becoming institutionalized. We should be mindful of this, and if we aren't familiar with the data, we should check it. Honestly, I cannot tell you the number of times I have been confused by naming conventions, thinking that one metric was measuring one thing when in fact it was measuring something completely different. Data is generally created by your users' interactions with a product. When someone presses a button on your app or on your website, it fires an event, and those events build up into a data set. So again, you need to understand the motivations of your users when they're engaging with your product in order to gauge the integrity of that data set. Let's look at two examples. Web traffic is fairly reliable. You know when someone has landed on your webpage, and there's generally little room for error or motivation for misuse. But even here, a deliberate denial-of-service attack on your website can make it look like your site is extremely popular when in fact it's a malign actor trying to bring it down. And at the other end of the spectrum, a common source of data for us PMs comes from customer support. Now, I think this is generally an invaluable way of understanding the pain points of our products. But just remember that often this data is generated through the manual tagging of issues by agents, who have to interpret the reasons for the contact. Generally this is fine, but sometimes it can create distortions, as agents try to squeeze issues that aren't catered for into the existing categories. My top tip here is that if your most common category is "other", then you need to be very, very careful with how you use that data. Also, if you have anything gamified in your product, you've got to be careful.
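As an aside, that "other"-category tip is easy to turn into a quick automated check. Here's a minimal sketch in Python; the tag names, the data and the warning logic are all made up for illustration, not taken from any real support system:

```python
from collections import Counter

# Hypothetical support-contact tags, as agents might have applied them.
contact_tags = [
    "late_delivery", "missing_item", "other", "other",
    "refund_request", "other", "missing_item", "other",
]

counts = Counter(contact_tags)
top_tag, top_count = counts.most_common(1)[0]
other_share = counts["other"] / len(contact_tags)

# The rule of thumb from the talk: if "other" is your most common
# category, the taxonomy is probably failing to capture real issues.
if top_tag == "other":
    print(f"Warning: 'other' is the top category ({other_share:.0%} of contacts)")
```

Run against your real contact data, a check like this is a cheap early-warning signal that your category list needs revisiting before you trust any analysis built on those tags.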
Somebody very worldly-wise once said to me: if you create a game, then players are gonna play it, and people play to win, even if that means cheating. Users will do what they need to do in order to achieve their goals. I think a great example of this, and a truly awesome way of addressing it, comes from the dating app Tinder, where one arranges romantic liaisons with other like-minded people in one's local area. Now, Tinder saw that people didn't necessarily want to be restricted to their local area. They wanted to search far and wide for their sweetheart. And so people began to mock their location data in order to check out potential dates in other areas. Here, users were purposefully creating incorrect and misleading data in order to achieve their goals. Now, if anyone's ever engaged with this sort of thing, location mocking is pretty difficult to stop. And so rather than trying to stop it outright, whoever the PM was at the time had the brilliant idea of not just embracing it but making it a premium feature, giving people the ability to search anywhere they wanted. It was by understanding how the data was generated, recognizing the anomalies in it and pairing that with knowledge of the motivations of the user that they found a really, really innovative way to monetize their product. So to our next question then: how can it be validated? For the non-plane-spotters and those listening on audio, we're looking at the cockpit of a 787 Dreamliner, one of the most advanced aircraft in the world. Now, it's very, very possible to fly this plane using the data from the instruments alone. And in fact, much of the time, the aircraft can fly itself on nothing but the data. But not everyone is flying a Dreamliner. Most of us product managers are working in startups or scale-ups, where the cockpit perhaps looks something more akin to this. Now, one could possibly fly this just by looking at the dials with a map and compass on your lap.
But I think it's a good idea here to actually take a look out the window and see what's going on around you. To perhaps push this analogy one step beyond its limit, a lot of the time in product development we don't even get the plane before we start flying. We have to throw ourselves off the cliff and build the plane on the way down. This includes conceptualizing and discovering the metrics that we're going to use before we even build the instrumentation to measure them. Sometimes we have no idea what we should be measuring, and there is an infinite number of things out there that we could measure. I think it's here that user research, or customer insights, or just your knowledge of the user really comes into play. Where you don't know what you should be measuring, user research can provide you with those insights and tell you what is important. If you find yourself lacking data, it can give you a good read on what you should do next. Even if you're flying that Dreamliner, it's a good idea to have a look out the window every now and again and just validate the data; make sure it's really telling you what you think it is. So always look out the window. Ask yourself: how can you validate data with user insights? And on the flip side, how can you quantify user insights with data? Now, if you'll perhaps indulge me for a moment longer on the subject, I might take it a bit further. I've spoken a bit about how data is a reflection of the real world, but that's really sloppy use of language, because data is not a reflection. It's not a true, if reversed, image of the world. It's more like a technical drawing, which can have an incredible amount of detail but lacks the color and life of the real world. Counter to this, user insight is more in the style of an impressionist painting, a Monet or a Cézanne, creating a vivid image of the world. But again, it's very difficult to measure anything on one of these paintings.
One wouldn't necessarily want to build a structure from them, but nor would anyone want to live in a schematic. The two complement each other very well because they're opposite sides of the same coin, and when you combine them both, you can create a thing that is beautiful but also practical. I see product, user research and data analysis as the holy trinity, or for the more ancient-Roman-minded of you, the triumvirate of product discovery. Everything you need is in that golden triangle. My top tip here is to try and sit between your user researcher and your product analyst and keep up a conversation between you. I think the insights you gain from that conversation will really drive your product to new heights. And don't just take my word for it. This guy used to work at Amazon, and he says a similar thing. He says: the thing I've noticed is when the anecdotes and the data disagree, the anecdotes are usually right. There's something wrong with the way you're measuring it. Now, big Jeff isn't saying dismiss the data for anecdotes. He's literally saying: be skeptical. If your gut is telling you something isn't right, and as a PM you'll develop a great instinct for these things, then it probably isn't, and you should check it. And so finally, where Benn wants to kick you out, in the product management world we're a bit more accommodating and friendly, and we're gonna ask: how do we get you on board? I just think if somebody has shown the interest and the spark to identify a problem, take a bit of data, put it around it, and maybe come up with some hypotheses, we want to think about how we can get them into the community and encourage them to become product managers themselves. I think organizations like Product School do a brilliant job of training up people who want to be PMs. But I know that when I was coming into the industry, I found the first hurdle was even knowing what a PM was and that it was a thing I could do.
So I would encourage you to take these people under your wing, teaching them about product management and product thinking. And if nothing else, you'll have a group of people around you in your business who you can rely on to provide you with excellent material for your product roadmap. So then, to finish: love your data, but be skeptical of it. Validate it, understand it, use it to its full potential, and you will build awesome products. And finally, get out there and spread the good news of product management. Thank you very much.