Tonight's introduction was supposed to be given by Matt Parker. He's a wonderful comedian, mathematician and friend. However, he had a last-minute conflict, so instead of giving his introduction live, he's going to give his introduction by video. And so I give you Matt Parker.

Thank you. Thank you very much, Cindy. It is now my absolute pleasure to give the introduction for tonight's Math Encounter. I always enjoy coming to MoMath and all the fantastic exhibits all around here. Of course, I'm filming this in the place where you're watching it, for a bit of an extra hilarious math recursion. So I was here doing a talk last night; if you were there for that as well, you're in for an absolute treat this evening. If not, you're still in for an absolutely brilliant night. My name is Matt Parker. I write about mathematics. I used to be a high school math teacher and I'm based in the UK, but I'm probably most familiar from doing things in this video format. In fact, those of you who didn't come last night made the right choice. I am terrifying in high resolution; confined to a pixelated screen is vastly superior. It was a real shame I wasn't able to make it along this evening, but I'm here to introduce your speaker, Jen Rogers, and you may know Jen from messing around in the background of my videos. Jen is going to go and get ready for her talk, which is in 24 hours from our point of view. That's fine, she'll get changed. So anyway, what can I say about Jen, my goodness?
She is a woman of statistics. She is a lecturer in statistics at the University of Oxford. She's also the director of their stats consultancy unit; if you want to consult on some stats, she is the person to talk to. She's also the vice president of the Royal Statistical Society, and she is the president of the British Science Association. Well, she is for one year; it's a very temporary position. And possibly most importantly, her number one credential is that she appears in some of my YouTube videos. If you've watched my YouTube videos... don't watch them now, during Jen's talk, but later on, if you haven't seen them, you can look them up. She talks about stats and it's absolutely fantastic. So it's my real pleasure to welcome to the stage Jen Rogers.

Thank you very much. Thank you very much to MoMath for the invitation, and to Math Encounters for having me here. So Matt and I were actually over from the UK, where we're both speakers in something called Maths Inspiration, where we perform in big theaters all around the UK. We put on these big productions for teenagers to try and inspire them about maths, and MoMath very kindly invited us out to give one of our Maths Inspiration shows out here. So that's what we were doing yesterday, and then Matt gave a presentation last night, and you have my company this evening. And as Matt said, I am a lecturer and researcher based at the University of Oxford. Have any of you been to Oxford before?
Okay, a couple of you. A lot of people think that Oxford is kind of like the set of Harry Potter, and it is. It's a very, very beautiful place to work; I am very, very lucky to be able to call this my workplace. I actually live about 10 miles out of Oxford as well, in a little place called Witney, which is just on the edge of the Cotswolds. So I am very lucky to come from a very, very beautiful part of the world. And as Matt said, I am a lecturer at the University of Oxford and I am the director of the statistical consultancy unit. I'm not actually president of the British Science Association; I'm president of their maths section. And yeah, vice president of the Royal Statistical Society. Matt actually came along to the Royal Statistical Society conference at the beginning of September, and we filmed a YouTube video together where we did a capture-recapture to try and estimate the number of people who had actually attended the conference. I think that's going to be coming out in the next couple of weeks, so do keep an eye out for that. It was a lot of fun. A lot of fun to do. And so what am I going to be talking to you about today?
So the title of my talk is Calculated Risk. I do an awful lot of work with various media outlets talking about risk and probability. An example of some of the work that I did: this picture here is me on a program called Mystery Map, where they were debunking different mysteries, and I was asked to work out the probability of dying from spontaneous human combustion. Now, for all of you skeptics out there, it does actually get recorded as a cause of death. The last reported case of spontaneous human combustion was in Galway in Ireland in 2010; this guy's coroner's report actually has spontaneous human combustion on it as his cause of death. And I was asked to work out the probability of this happening. You can kind of work out the probability of your cause of death being spontaneous human combustion, and the way I did that was to look at how many cases there had been, which was about 200 since 1650, and how many deaths there had been since 1650. From that I calculated that the probability of your cause of death being spontaneous human combustion is about one in 70 million.
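As a rough sketch of that back-of-the-envelope calculation: the talk gives the case count (about 200 since 1650) but not the total number of deaths, so the death count below is an assumption, inferred by working backwards from the quoted one in 70 million.

```python
# Re-creating the spontaneous-human-combustion calculation from the talk.
# The total death count is NOT quoted in the talk; roughly 14 billion deaths
# since 1650 is the figure implied by the stated answer of 1 in 70 million.
cases = 200
total_deaths_since_1650 = 14_000_000_000  # assumed, chosen to match the quoted odds

probability = cases / total_deaths_since_1650
print(f"about 1 in {round(1 / probability):,}")  # about 1 in 70,000,000
```

The point is only the shape of the estimate: recorded cases divided by the total pool of deaths in the same period.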
So a very, very small probability. I also talked about other ways that you might die; it was a really, really upbeat program. So yes, there's an example of some of the weird probabilities that I get asked to calculate. As I said, I do an awful lot of work talking about risk and communicating risk, the kind of risks that we see in the media all the time. So what kind of risks am I talking about? We see that night owls risk early death, excess drinking can take years off your life, just one concussion can increase your risk of Parkinson's disease, and even sitting too long can increase your risk of dementia. We have put some numbers on some of these: HRT nearly doubles the risk of ovarian cancer, painkillers can double the risk of heart attacks and strokes, and a child's risk of brain cancer triples after just two CT scans. These are all really shocking statistics, and we're expected to use these headlines to inform our day-to-day lives. So what I thought I would do in this talk is give you a kind of toolbox as to the sorts of questions you should be asking yourself when you see headlines like this. What are the things that you should be asking? I would start out, though, by just finding out how good you guys are at knowing what risky activities are. Let's figure out what I'm working with here tonight. And so what am I gonna do? A survey. I'm a statistician, of course I'm gonna do a survey. I'm gonna give you a couple of options, and I'll ask you which you think is the more risky activity. So the first one I'm gonna ask you is: which animal causes more deaths annually worldwide? Is it crocodiles, or hippos? So give me a cheer if you think it's crocodiles. Yeah, there's a little... okay, there's a couple of you.
Give me a cheer if you think it's a hippo. Overwhelmingly in favor of the hippo. But I can tell you that crocodiles actually cause a lot more deaths than hippos do: it's 500 deaths a year for hippos and a thousand deaths a year for crocodiles. So yeah, it's not looking good, is it? So, which is the more dangerous mode of transport? Which one, as you're doing it, are you more likely to die from: is it riding a bike or driving a car? Give me a cheer if you think it's riding a bike. And give me a cheer if you think it's driving a car. That's a little bit more even, but I think it's still in favor of the car over the bike. I can tell you that for every billion hours that you do each of these activities, there are 130 deaths if you are driving a car and there are 550 deaths if you are riding a bike. So actually, riding a bike is more dangerous than driving a car. Okay, the last one. We're going to American sports here. So which sport causes the most injuries per the number of people that take part in them: is it baseball, or is it cheerleading? Give me a cheer if you think it's baseball. And give me a cheer if you think it's cheerleading. I guess you guys know your American sports. I can tell you that for every hundred thousand people that take part in each of these, you've got 20 injuries for cheerleading and you've got 16 injuries for baseball. So yeah, you redeemed yourselves when it came to American sports. And I want to talk first of all about something that comes up in the UK press all the time. It's one of these things that you should never eat, you know; it comes up all the time. It causes cancer, it causes heart disease, it's really bad for you. And it's bacon. In the UK, we like our bacon sandwiches, we like our fry-ups. I know you guys like to put bacon on pancakes. It's a good staple food. However, apparently it's not very good for you.
You've got a higher cancer risk: eating two rashers of bacon a day increases your risk of stomach cancer; a daily fry-up boosts cancer risk by 20%. Now, when this statistic came out, when these headlines came out, bacon sales plummeted and people stopped buying bacon. It was pancreatic cancer: eating bacon increased your risk of pancreatic cancer by 20%. Sounds like a really shocking statistic. Now, I was asked to investigate this a little bit further. The crucial thing about this statistic is that it's what's called a relative risk. It only tells us the risk in one group relative to another; it doesn't actually tell us anything about what the risk is. All I know is that it's higher by 20% in one group compared to the other. So I thought, okay, let's delve a little bit and have a look at the actual numbers associated with this. I had a look on Cancer Research UK, and it says that your risk of pancreatic cancer is a one in 80 lifetime risk. Okay, what does that mean? That means if we were to take 400 individuals who didn't eat bacon, we would expect five of them to get pancreatic cancer anyway. So let's look back at that headline again: our daily fry-up boosts our cancer risk by 20%, or a fifth. And what's a fifth of five?
It's just one. Meaning that if we eat bacon every single day, our risk goes from five in every 400 people to six in every 400 people. It's only an extra one person in every 400, but that would have been a very different headline to "boosts cancer risk by 20%". There were also headlines that said that bacon, ham and sausages were now as big a cancer threat as smoking, the World Health Organization was to warn. Now, the reason is that the World Health Organization produces these lists every year of things that are definitely known to cause cancer, the known risk factors, and processed meat had been added to this list for the first time. Smoking was already on there as a known risk factor for lung cancer, and they were saying that because they were now both on this list, they were as big a cancer threat as each other. But these lists are only based on something called statistical significance. I'm going to talk about statistical significance in more detail later on in my talk, but briefly, what this does is it just says whether or not something definitely does or doesn't cause cancer. It doesn't quantify the risk in any way; it's just a yes or no, does it or doesn't it. So how do the stats for smoking and lung cancer compare to the stats for bacon and pancreatic cancer? Let's take our 400 individuals again. If they don't smoke, we would expect four of them to get lung cancer anyway. If you smoke 25 or more cigarettes every single day, that risk actually goes up 24 times, to 96 in every 400.
So it's an extra 92 individuals in every 400, compared to that one in 400 for the bacon and pancreatic cancer. So yes, they may both be statistically significant in increasing your risk of cancer, but to say that they are as big a cancer threat as each other is probably pushing it a little bit too far. Now, you'd think that these newspapers would learn their lesson. So when Matt was giving his talk... after his talk, when he was having some conversations with people, I was, you know, flicking through my phone and had a look on BBC News, let's catch up with what's going on at home. And this was the headline that I saw: processed meat linked to breast cancer. Apparently bacon now causes breast cancer as well. So there was a newspaper article yesterday saying that eating bacon increased your risk of breast cancer by 9%, and I had a look at the stats on that. If you don't eat bacon, and we've got our 400 individuals again, you'd expect 57 of them to get breast cancer anyway; 57 of the women, that is. A 9% increase of that takes it to 62. So it's only an extra five in every 400. But again, they keep running these headlines that make it seem like these risks are a lot worse than what they are. And what they also do is they tend to use the extremes as an excuse for you to never ever do that activity. So if you eat bacon every single day, compared to never eating it, that increases your risk of pancreatic cancer by one in every 400 people. But if you only eat it, say, once a week on a Saturday morning as a treat, that's gonna have an even smaller effect. And yet they use these headlines as reasons for you to never ever do these risky activities. What you've also got to think about is that if you're eating bacon for breakfast every day, you're not eating fruit for breakfast every day.
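The arithmetic behind all three of these comparisons, turning a headline relative risk into absolute extra cases per 400 people, can be sketched in a few lines (the baseline counts are the ones quoted in the talk):

```python
def extra_cases_per_400(baseline_per_400, relative_risk):
    """Turn a headline relative risk into extra cases per 400 people."""
    exposed = baseline_per_400 * relative_risk
    return exposed - baseline_per_400

# Bacon and pancreatic cancer: baseline 5 in 400, risk "up 20%" (RR = 1.2)
print(extra_cases_per_400(5, 1.2))           # 1.0 -> one extra case per 400
# Bacon and breast cancer: baseline 57 in 400, risk "up 9%" (RR = 1.09)
print(round(extra_cases_per_400(57, 1.09)))  # 5 extra cases per 400
# Smoking 25+ a day and lung cancer: baseline 4 in 400, RR = 24
print(extra_cases_per_400(4, 24))            # 92.0 -> 92 extra cases per 400
```

The same relative risk means very different things depending on the baseline, which is exactly the point the talk is making.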
You may also have a more unhealthy lifestyle in general. So how do we know that it's the bacon that's actually causing this increased risk of cancer, and that it isn't one of these other lifestyle factors instead? What we say in statistics is that correlation doesn't always mean causation. One of my favorite examples of this are these headlines here. So: one fizzy drink a day will make teenagers behave more aggressively; fizzy drinks make teenagers violent; children drinking fizzy drinks regularly are more likely to carry a gun. Now, it could be that drinking fizzy drinks makes teenagers go a bit crazy. Or it could be that there are some other social demographic factors which mean you are more likely to drink fizzy drinks and also more likely to be violent. Or it could be that being violent is thirsty work, and at the end of it you want a fizzy drink. You know, we don't know which way around that relationship goes. When you think about correlation and causation, one of the best ways to explain it is to think about ice cream sales. As ice cream sales go up, so do the number of drownings. So does that mean that eating ice cream causes people to drown? No. Both of these things are affected by something else: the temperature. As the weather gets better and the temperatures increase, we are more likely to eat ice cream and we are more likely to go into the ocean, meaning that there are naturally just more drownings. And once we account for this in our analysis, that direct relationship between ice cream and drownings disappears.
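The ice cream and drownings story can be replayed as a toy simulation: temperature drives both variables, so they correlate strongly with each other, and that correlation largely vanishes once temperature is accounted for. All the numbers here are invented for illustration.

```python
import math
import random

random.seed(42)

# Hypothetical data: temperature drives both ice cream sales and drownings,
# but neither causes the other.
n = 2000
temp = [random.gauss(20, 5) for _ in range(n)]
ice_cream = [2.0 * t + random.gauss(0, 5) for t in temp]
drownings = [0.5 * t + random.gauss(0, 2) for t in temp]

def corr(x, y):
    """Pearson correlation coefficient."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r_raw = corr(ice_cream, drownings)  # looks like a strong relationship
r_it, r_dt = corr(ice_cream, temp), corr(drownings, temp)
# Partial correlation: what's left once temperature is controlled for
r_partial = (r_raw - r_it * r_dt) / math.sqrt((1 - r_it**2) * (1 - r_dt**2))
print(f"raw correlation {r_raw:.2f}, controlling for temperature {r_partial:.2f}")
```

The raw correlation comes out strongly positive, while the partial correlation hovers near zero: the "direct" relationship was the confounder all along.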
We call this hot weather a confounder, and when we carry out our analysis in observational studies, we have to allow for these confounding effects to make sure that we're not going to get these causal relationships that aren't there. There's a really nice website that you can go on: if you google "spurious correlations", it's the first website that comes up, and you can look at all these weird and wonderful things that seem to be correlated with each other, like the number of Nicolas Cage movies. I've picked out some of my favorites. So there is a 95% correlation between the per capita consumption of corn syrup and the divorce rate in Florida. So does that mean that if you want to avoid divorce, you shouldn't eat sugar? I don't know. There's a 99% correlation between the revenue generated by arcades and computer science doctorates awarded. So does that mean that computer science doctorates are spending a lot of time playing arcade games? I mean, maybe. That one's probably... maybe. But my absolute favorite is that there is a 95% correlation between the per capita consumption of cheese and your chances of dying by becoming tangled in your bedsheets. So does this mean that we shouldn't eat cheese before we go to bed because we might die? You know, no, of course not. These are two things that just happen to be correlated with each other, and it doesn't mean that one is causing the other. Now, you might be thinking to yourself, okay, these are all really nice examples, they're all very funny, but we obviously know that eating cheese isn't gonna cause death by bedsheets. You know, when do I actually really need to think about this?
Well, there was a study reported on the BBC that I was asked to comment on at the beginning of last year, which said that living near a busy road increased your risk of dementia by 7%. Now, I actually went and had a look at this study. It was published in the Lancet, and if you looked for all of the things that they controlled for, they didn't actually control for family history in the study. Now, I argued that family history is a well-known risk factor for dementia, and it could be argued that family history also affects where you're likely to live. If you grow up in the countryside, you may be more likely to carry on living in the countryside as an adult; if you grow up in the middle of a big city, you may be more likely to carry on living in a big city as an adult. Meaning that you've got this family history element to where you live, and also this family history element as to whether or not you may or may not get dementia. And the fact that they ignored this in their study, I thought, was a major flaw. If you also look at the supplementary material of this study, you can see some other risk factors. So if we look at this top row here... oh, that disappeared... this dementia row here, we can see that if you smoke, that increases your risk of dementia by 30 percent; that 1.3 means you increase your risk of dementia by 30 percent. If you are obese, you increase your risk of dementia by 64 percent. If you have a low versus high education, you increase your risk of dementia by 59 percent. And yet it was this "living near a busy road increases your risk of dementia by 7 percent" that all the newspapers chose to report. And my argument was that before you, you know, picked up sticks and moved to the countryside, there were other things that you could be doing, other lifestyle choices that you could be making, that could have a much bigger effect on your risk of getting dementia. So these are things that you need to think about when you're doing observational studies. I do an awful lot of work in medical research; that would be my main area of expertise, and in particular I do an awful lot of research in clinical trials. Now, in clinical trials, what we would typically be doing is testing a new treatment. I work mainly in heart failure, and we would be looking to see whether or not, you know, it improved the number of times you have to go to hospital, or the number of heart attacks you had, things like that. Clinical trials are run very, very differently to observational studies. I often get asked, you know, why do we test these treatments? Why do we go through these formal clinical trial processes that take up so much time and cost so much money? And I always say that new doesn't always mean better, and in actual fact, for every hundred thousand chemicals that are ever researched, only four make it to be licensed for prescription. Now, I want to talk about the Ebola virus: the last epidemic of the Ebola virus, the big one, that was a few years ago now. This was when people changed the way they thought about treating Ebola, and they were trying to develop lots of drugs, lots of vaccines to prevent people getting it, but also treatments to help you if you got it. And I want to talk about some of the stats associated with this as a justification as to why testing of treatments is really needed.
So these stats are a little bit old now, but they're not too far away from what the final numbers ended up as. There were over twenty-eight and a half thousand reported cases of Ebola, and over eleven thousand deaths. Now, when this epidemic first started, everybody just went into panic and was trying to do anything they could to get a handle on this virus spreading, because it was just spreading uncontrollably. One of the treatments that was being developed at the time, but hadn't actually made it through to the licensing stage, was a drug called ZMapp. They started giving ZMapp to individuals just to see if it would help, and they gave it to three people to start off with: the first two were from the US, and William Pooley was from Britain. They gave ZMapp to these three individuals and they recovered, and everyone thought, great, brilliant, we're actually going to be able to control this, everything's going to be okay. So they carried on giving it to people, and they gave it to two more. Now, unfortunately for these two individuals, this treatment didn't work. If we look back at the numbers that I presented before, eleven thousand deaths from twenty-eight and a half thousand cases, we can see that forty percent of the people who got Ebola died from it, meaning that sixty percent of people recovered from it naturally. And if we look back at these five individuals, we can see that three of them recovered and two of them died, which is sixty percent recovering and forty percent dying. So how do we know that it was the ZMapp that actually made these individuals recover from Ebola, and that they weren't just in that sixty percent of people who would have recovered anyway? And this is why clinical trials are so important. So what do we do when we run a clinical trial?
We take a number of individuals, we take a sample, and we split them into two groups: we have a treatment group and we have a placebo group. Now, our placebo group could sometimes be a genuine placebo, where we don't give them anything, but it is usually standard practice. So if I'm doing a trial for heart failure, it would be unethical for me to take someone's standard medication away from them and put them on a true placebo, so what I would actually be comparing is some new intervention with the current standard practice. What we would do is follow these two groups up through time, and at some point we would take a look and see how many people recovered in each group. What we're crucially interested in isn't just "do people in the treatment group recover", but "do more people in the treatment group recover compared to the placebo group". As I said, clinical trials are controlled a little bit more than observational studies. The first thing that we do is carry out something called randomization, and what that does is it decides which group every patient ends up in. It could be something as simple as flipping a coin: heads in one group, tails in the other. Why do we do this? What randomization does is it ensures that the two groups are as similar as possible in every single way, apart from the treatment that they get. So it makes sure we've got the same number of males and females in each group, the same sort of age distributions in each group, the same sort of medical histories in each group. The only difference between those two groups is the treatment, meaning that if we see a difference in their recovery, it has to be because of the treatment and not because of something else, not because, you know, all the women were in one group and the men were in the other. We also don't tell people what they're taking; we have blinding, and the reason for that is that it prevents something called the placebo effect.
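A minimal sketch of that coin-flip randomization, with a single made-up patient characteristic (sex) to show the rough balance it tends to produce across the two arms:

```python
import random

random.seed(7)

# Hypothetical patient list; in a real trial there would be many more
# characteristics (age, medical history, ...) that randomization balances.
patients = [{"id": i, "sex": random.choice("MF")} for i in range(200)]

# Each patient's group is decided by a fair coin flip.
treatment, placebo = [], []
for p in patients:
    (treatment if random.random() < 0.5 else placebo).append(p)

for name, group in (("treatment", treatment), ("placebo", placebo)):
    share_female = sum(p["sex"] == "F" for p in group) / len(group)
    print(f"{name}: {len(group)} patients, {share_female:.0%} female")
```

With 200 patients the two arms won't be exactly 100/100 or exactly 50% female each, but both will be close, and any imbalance is down to chance rather than to how patients were selected.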
Has anybody heard of the placebo effect? Yeah, quite a few people. My favorite example of the placebo effect was a study that was done in 1996, where they looked at a new local anesthetic called Trivaricaine. It was just this brown liquid that you could paint on a bit of your skin and it would numb it. So what they did was they took a group of students, and they took their fingers and put Trivaricaine on one of the fingers and left the other one untreated. Then they put their fingers in a vise, and they turned the vise, and they asked them if it hurt. And everybody said that the one that didn't have this local anesthetic on it hurt a lot more. Trivaricaine is made of water, iodine and thyme oil. Basically water: it does nothing. The iodine was in there to turn it brown so that it looked like a medicine, the thyme oil was in there to make it smell medicinal, and then they had people, you know, in white coats and putting on gloves to administer it, and it all looked the part. But it was just this placebo effect: because these people thought they were getting something that was going to make them feel better, they automatically just assumed that they felt better. So blinding prevents this. What I'm going to do is give you a hypothetical clinical trial. Let's just say we've got this new treatment, paracetamol, and we want to see if it helps people recover from headaches, and we're going to compare this to a placebo group.
Let's say we've got a hundred people in each group, and we see 65 percent of people recovering in our paracetamol group compared to 60 percent recovering in our placebo group. Now, as a medical statistician working in clinical trials, this is the kind of data that I see on a day-to-day basis, and it would then be up to me to decide whether or not there was enough evidence here to say that our paracetamol group was doing better than our placebo group. There are a number of things that we need to think about when we see data like this, and the first thing is something called uncertainty. Now, down this row there should have been coins sitting on each of your chairs, so could you get those coins for me now? You've got some nice British two-pence pieces, and what I'm going to ask you to do is flip your coins ten times. I'm going to keep a count of how many times you're doing it, and I'm just going to ask you to count how many heads you get. If you don't feel confident flipping the coin, just give them a shake in your hand and put them on the floor. I don't care how you do it, but just count how many heads you get, and I'll count how many times you're doing it. Okay, so one, two... this is a great thing to do with teenagers on stage... four, seven, nine. So hands up if you had a coin, so I know who I'm asking. How many heads did you get?
Seven. Four. Two. Four. So we can see: everybody there had a fair coin with a 50-50 chance of a head and a tail. So if you were to flip that coin ten times, you would expect to get five heads and five tails. But that wasn't actually what we saw; I think the smallest was two, and the highest seven or eight, something like that. This is uncertainty. So in statistics we talk about the difference between probability theory and statistical inference. In probability theory, we know what the underlying probability is, but when we run experiments, we see noisy data. So I know that when I flip a coin I've got a 50-50 chance of getting a head or a tail, but when I actually run experiments with that coin, I don't always see five heads and five tails. Sometimes I see two heads, sometimes I see five heads, sometimes I see six, or seven. You know, we see this noisy data. In statistical inference, what we're trying to do is go the other way. We're taking samples of data that we know are noisy and subject to uncertainty, and we're trying to use that data to tell us something about the underlying probabilities. So when I look at my clinical trial, I'm not interested in whether or not 60 is different from 65, because it obviously is. Those 200 people are just a sample, and it's a realization, noisy data that I know is subject to uncertainty. I'm trying to use that to tell me something about my underlying chances of recovery from a headache in the two treatment groups. If I took a different 200 people, I wouldn't see exactly 60 and 65 percent recovering; I'd see something slightly different. A different 200 people would give me something slightly different again. So when I'm doing my analysis, I have to take this uncertainty into account, and I have to think about what that sample is saying about the underlying probabilities. Now, there is something that I can do to try and help myself in this situation. Now, there are some of you who've got dice.
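The coin-flipping exercise from a moment ago is easy to replay in code: twenty simulated people each flip a fair coin ten times, and the head counts scatter around five rather than all landing on it.

```python
import random

random.seed(1)

# Twenty "audience members" each flip a fair coin ten times.
# The underlying probability is exactly 50-50, but the observed
# head counts bounce around five: this is sampling noise.
head_counts = [sum(random.random() < 0.5 for _ in range(10)) for _ in range(20)]
print(head_counts)
print(f"average: {sum(head_counts) / len(head_counts):.1f}")  # close to 5, rarely exactly 5
```

Run it a few times with different seeds and the individual counts change, while the average stays near five: the same distinction between noisy data and the underlying probability.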
This is not the time to use them; hold fire, that time will come. I can tell you all get very excited. This is something else. So I've run some computer simulations of some dice. If you've got a fair dice, you've got a 1 in 6 chance of getting a 1, 2, 3, 4, 5 or 6, meaning that if I was to roll that dice lots of times, I would expect to see about the same numbers of each. If I saw lots of one number, or too few of another, then I might have reason to believe that I've got a biased dice. So I've run some computer simulations of dice, and I am going to show you what they look like after a certain number of rolls, and I'm going to ask you if you think it's a fair dice or a biased dice. It will become very obvious once I start. So here's the first one. Here's what it looks like after 10 rolls, 20, 50 and 100. So what I'm going to get you to do is all stand up for me. Everybody, could you just stand up. If you think that this is a fair dice, I would like you to sit back down; if you think it's a biased dice, I would like you to stay standing. So sit down for fair, stay standing for biased. Okay, I think you might be about half and half. Those of you who think it's a biased dice, do you want to just shout out the number that you think is biased? Three, you think it's three. Okay, you can sit back down, thank you very much. And I can tell you that this one is a biased dice, but not in the way that you think. It's not that three that's biased; it's this number six here that actually has a higher probability than all of the others. And if I take that up to a thousand rolls, we can now see that this number six is about twice as high as all the others, which have started to level off. Okay, we've got two more. I won't get you to stand up this time, we'll just do hands. So up for biased, down for fair. This is what it looks like after 10, 20, 50 and 100. So what do you think? Hands up for biased, hands down for fair. Okay, we've got a few hands starting to go up. It's the power of the crowd, isn't it?
But I think most of you think that it's a fair dice. I can tell you that that one is a fair dice So well done if we take that up to a thousand rolls We can see that they've all leveled off and we can see that a lot more clearly Okay, so last one. Here's what it looks like after 10 20 50 and a hundred. What do you think for our bias hands up for bias hands down for fair? All right, I think that's a pretty unanimous for bias. Do you want to shout out what number you think is biased? It is obviously yeah, of course Obviously It is a bias dice and it is biased in favor of number six But also it's biased in favor of one and three and this last example is a really nice example of the interplay between treatment effect sizes and sample sizes if I have a new treatment That is only slightly better than what I have currently. I'm only gonna have a very small treatment effect There's only gonna be a very small difference between the two and in general The smaller the sample size the more uncertainty you have on your data the more noisy it is So if you think you've got a treatment effect that's only very small You've got two probabilities underlying probabilities that are very close to each other as you take samples and have actual noisy data They're gonna be bouncing around quite a lot And you're actually gonna be able to see that there's very much difference between the two So what you need to do is you need to increase your sample size to decrease the uncertainty So that you can then actually see that there's a difference between the two If on the other hand you have a very big difference between your two underlying probabilities and your treatment effect size is really big Well, even if I've got a small sample size and there's a lot of uncertainty on that Kind of doesn't matter because I can still see that there's a difference between the two They're sufficiently far away from each other that I can see that there's something going on and That's what happens in clinical 
trials. So if we've got something that's only slightly better than what we currently have, we need these massive clinical trials with loads and loads of people to be able to see that there's anything going on. You can see in this example that one number is very different to all the others. Okay, so at 20 rolls you can't really see there's anything going on; by the time we get to 50 rolls, we can already see that there's something going on with that number six. But the one and the three? It's only actually when we take it up to a thousand rolls, when we increase that sample size, that we can then see what's going on, and they're closer to what the true underlying probabilities are. So I have to think about all these things when I'm running my clinical trial. So this is the clinical trial that we had: we've got 65 and 60 recovering, with a hundred people in each group. Now, there is a formal test that I can carry out that takes into account all of this uncertainty in the sample sizes and things like that, and it's called a hypothesis test. Has anybody heard of hypothesis testing? Okay, some of you. For those of you who haven't: in hypothesis testing we assume there are two hypotheses, the null and the alternative. The null hypothesis says that actually the underlying probabilities in each group are exactly the same; these two groups are exactly the same. Our alternative hypothesis then says that there is a difference between the two groups. What we do in our hypothesis testing is we assume that the null hypothesis is true. So we assume that there is actually no difference whatsoever between these two treatment groups; they're exactly the same; your underlying chances of recovery in the two are exactly the same. My hypothesis test then says, okay, if that's true, what is the probability of me seeing the data that I've got?
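The formal test being described can be sketched with the standard normal-approximation two-proportion z-test. This is an assumption: the talk doesn't name the exact test it used, but this textbook approximation reproduces the p-values it quotes:

```python
import math

def two_proportion_p_value(r1, r2, n):
    """Two-sided z-test (normal approximation) for the difference
    between two observed proportions r1/n and r2/n, assuming under
    the null that both groups share one underlying probability."""
    pooled = (r1 + r2) / (2 * n)
    se = math.sqrt(pooled * (1 - pooled) * 2 / n)
    z = abs(r1 - r2) / n / se
    # Two-sided tail probability of a standard normal.
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

print(round(two_proportion_p_value(65, 60, 100), 2))     # 0.47
print(round(two_proportion_p_value(650, 600, 1000), 2))  # 0.02
print(round(two_proportion_p_value(75, 60, 100), 2))     # 0.02
```

The same 5-point gap goes from unconvincing (p = 0.47) to significant (p = 0.02) purely by growing the sample, and a 15-point gap is significant even with the small sample: exactly the interplay of effect size and sample size described above.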
So what is the probability of me seeing 65% and 60% recovering if actually the two are exactly the same? And I call that a p-value. If my p-value is pretty big, then that means I've got a high probability of seeing this even if the two are exactly the same, so I don't really have any evidence to say that there's a difference between the two. If on the other hand that p-value is very small, I'm saying my chances of seeing this if the two are exactly the same is only really small, so maybe they're not the same, and maybe my paracetamol is doing better than my placebo. So what is my p-value for this data here? My p-value for this is 0.47, so 47 percent. It's pretty high: a 47 percent chance of seeing this even if the two are exactly the same. I don't really have enough evidence there to be saying that paracetamol is doing any better than the placebo group. But what if I took this up to a thousand people? Now I've got a little bit more confidence in those numbers that I see, a little bit more confidence that they're closer to what the true underlying, unobserved chances of recovery are in the two groups. My p-value for this turns out to be 2 percent, which is small. So my chances of seeing this if the two are exactly the same is only 2 percent, so I've got enough evidence here to be able to say, well, maybe the two aren't the same, maybe my paracetamol group is doing better than my placebo group. If we still have our 60 percent recovery in our placebo, but now we have 75 percent recovery in our paracetamol group, and we go back to our hundred people in each group, well, now we're back to our small sample size. But because we've got a bigger treatment effect and those two are further away from each other, chances are that there probably is a difference between the two and they're not the same, and my p-value for this is again 2 percent. So again, I would have enough evidence here to be able to say that there is a difference between the two. Now, you probably get used to seeing hypothesis tests and sample sizes all the time. So in the UK we have a big issue with
the adverts that we show. So whenever we have these adverts, you know, it will say this shampoo makes your hair amazing, this many women agreed with it, and it's always some ridiculous number. And so I wanted to show you an advert, and I wanted to show you just how ridiculous these numbers can be. So I'm going to play you an advert. "Pearl Drops cleans. Pearl Drops whitens. Pearl Drops protects. Pearl Drops shines. Pearl Drops 4D whitening system not only whitens but cleans, shines and protects too. Ultimate whitening: up to four shades whiter in just three weeks. Pearl Drops toothpolish: go beyond whitening." Okay, so that's pretty typical of some of the adverts that we have in the UK, and I wanted to just highlight this that was at the bottom. So there's 52 percent of 52 people agreed. With what? Who knows? I mean, that's a whole other question; I've no idea what it is that they're actually agreeing with. Let's just say that they agreed that it made your teeth white; for argument's sake, let's just say that that's what they're agreeing with. But it's this 52 percent of 52 people that I actually want to look at in a little bit more detail. So we can actually run a hypothesis test on this data, but we have to think: in this sort of situation, what would be our null hypothesis?
So you've got to think that our null hypothesis would be the situation where people are just randomly agreeing or disagreeing with some statement. If everybody said no to some statement, that would actually be informative, and if everyone said yes to it, that would actually be informative; the non-informative case is if someone just randomly said yes or no. So the null hypothesis, where it was just pure randomness, would correspond to 50 percent of people agreeing. So when we're doing our hypothesis test here, what we're actually doing is comparing 52 percent with 50 percent, with a sample size of 52. And if we work out a p-value for that, we can see the p-value is 77 percent, so really that is not a statistically significant result. And yes, in the UK they're still allowed to present results that are not statistically significant. So one of my roles as one of the vice presidents of the Royal Statistical Society is that we're taking on the Advertising Standards Authority to try and improve statistical standards in advertising, and we're going to be doing a parliamentary commission next year, and we're hoping to stop these ridiculous stats from actually appearing on our TV screens. Here's another one, because it's not just statistical significance that's the issue. So we've got another one where people agreed with something: after 28 days, 74 percent of 54 men agreed. Let's just say it made their skin nice. So again, we're comparing this with 50 percent, so our null is 50 percent. So we're comparing 50 to 74, and because they're further away from each other, even with our small sample size we get a pretty small p-value. So yes, it's statistically significant. But how reliable is this number, this 74 percent? How much confidence do I have in that number?
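Both advert claims, and a rough confidence interval for the second one, can be checked in a few lines. The normal approximation and the Wald interval are assumptions here; the talk doesn't specify which methods it used:

```python
import math

def agree_p_value(agreed_pct, n):
    """Two-sided z-test of an observed agreement percentage against
    the 'pure randomness' null of 50% (normal approximation)."""
    z = abs(agreed_pct / 100 - 0.5) / math.sqrt(0.25 / n)
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

def wald_interval(agreed_pct, n):
    """Rough (Wald) 95% confidence interval, in percent."""
    p = agreed_pct / 100
    half = 1.96 * math.sqrt(p * (1 - p) / n)
    return round(100 * (p - half)), round(100 * (p + half))

print(round(agree_p_value(52, 52), 2))   # 0.77 — not remotely significant
print(round(agree_p_value(74, 54), 4))   # 0.0004 — significant
print(wald_interval(74, 54))             # (62, 86)
```

The Wald interval lands at roughly 62 to 86 percent; the talk quotes roughly 60 to 85, presumably from a slightly different interval method, but the message is identical: 74% of 54 people is a very imprecise estimate.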
And we can actually put what's called a confidence interval on this. We can put a 95 percent confidence interval on it, which says: if I was to repeat this with 54 men a hundred times, and I asked them all the same thing, what sort of values would I expect to get 95 percent of the time? And it turns out that our 74 percent is our best estimate, our best guess at what it might be, but theoretically that value could take anything between 60 and 85. So, you know, 74 percent: now how reliable is that? What is it actually telling me? How informative is this? Now, we're used to in the UK seeing these surveys and knowing that they're rubbish, but uncertainty is something that we really need to think about. So the BBC just recently changed the way they present their weather forecasts, and I was curious as to what the weather's going to be like back in Oxford this weekend when I get home. So I took a look at the weather forecast, and this is the weather forecast for Saturday, and we can see that there are some pretty precise point estimates for our chance of rain here. So at one o'clock on Saturday my chance of rain is 71 percent, which goes up to 78 percent at two o'clock, and goes up to 83 percent at three o'clock. Now, those are very, very precise probabilities of rain, but I have no idea what the uncertainty is on them. The uncertainty could be really small, or it could be massive; I'm not really sure how that is informing whether or not I should be taking my umbrella. I don't know how much confidence I can have in these numbers. And I don't think uncertainty necessarily needs to be really difficult to communicate, because we can do all sorts of wonderful things with graphics. We can have these bar charts with our confidence intervals presented on there, and that could just instantly tell us how much confidence we should have in the numbers that we're seeing. So graphics: they have this amazing ability to communicate really complicated
statistics in nice easy ways. But they also have the ability to do great evil, and I want to show you some of my favorite worst graphics. The first one is from Fox, and here we have a pie chart from the 2012 presidential run where we're looking at the presidential candidates, and you can see that this pie chart doesn't add up to 100 percent. What I think they actually did was ask people, would you back this candidate, yes or no, and then thought a pie chart was the best way to present it, rather than asking which of these three candidates would you prefer. That would have produced a proper pie chart. Then there's this one looking at Americans who have tried marijuana. Again, they don't add up to 100. But if 51 percent of people have tried it today, then I don't get how fewer than that have tried it since 1997. I don't know what question they asked to get these numbers. I mean, they've put an uncertainty on there, we've got a plus or minus four percent randomly, but yeah, I'm baffled by this. I have no idea what it's trying to tell me. We're not much better in the UK. So here is a plot from the Office for National Statistics looking at our GDP growth. And if you look at the bottom, if you look at this axis, we can see that it's gone from 0.6 percent to 0.7 percent.
So, a 0.1 percent difference, and yet those bars look really far away from each other. You can tell whatever story you want by zooming in or zooming out of a plot and showing it on different scales, so that's something to always watch out for. And I think my favorite, though, is Ben & Jerry's, who somehow think that 61 is bigger than 62. This seems like a really bad example of where they had a story that they wanted to tell and they were going to tell it, and I think that they were just hoping that you would look at these bars and not actually pay attention to what the numbers are inside them. You know, they have an agenda, they want to tell you a particular story, and they are definitely going to tell it. I want to go back to this question of what we ask as well. As I said, in these adverts it brings up along the bottom however many people agreed, but we have no idea with what, because the question you ask in statistics is actually really crucial to the probabilities that come out. And I was asked to work out a weird and wonderful probability. I've got the radio clip from it; I'm just going to play the radio clip and then I'm going to talk about it. "A colleague of mine texted me on a Saturday morning to say that she'd just withdrawn from a cash machine in south London a banknote with my name on it. And she sent me a picture of it and said, is this yours? And astonishingly it was, and it was mine from some time ago." So this was a Newsnight producer. He'd written his name on a 10 pound note in 2006. If you're ever in the UK, don't write your name on a 10 pound note; it's actually illegal, and I'm not sure how he then managed to spend it. But anyway, that's another story.
But yeah, in 2006 he wrote his name on a 10 pound note, and then in 2016 a work colleague in London withdrew it from a cash machine, and they wanted to know what was the probability of that happening. So they called me up and I was like, okay, let me have a think about this. I want to take you through my thought processes when I tackle this problem. First of all, I thought, okay, what's the chance that I withdraw a 10 pound note that I wrote my name on 10 years ago? And to do that I went to the Bank of England stats, and I had a look at how many notes are in circulation and how many get taken out of circulation every year. So these are the stats from 2006 to 2016, and using those that are in circulation and those that are taken out (these are all millions, by the way; we don't have just 559 10 pound notes) I can work out the probability that my 10 pound note would still be in circulation for each particular year. And then using this column here I can say, well, what is the probability that my 10 pound note will even be in circulation 10 years later? And it turns out to be about one in a thousand. Given that it's still in circulation, I can then say, what's the probability of me withdrawing it from a cash machine? If I go to a cash machine now, what would be the chances of me withdrawing that 10 pound note? There are 777 million notes still in circulation, so my probability that my 10 pound note is still in circulation and then I withdraw it from a cash machine is this one over a thousand multiplied by one over 777 million, which is one in 777 billion. Now that was the stat that they all wanted me to say on the news, because that was the most shocking statistic, and I was like, no, no, no, that's not quite what you asked me. It wasn't that guy that went to a cash machine and withdrew his 10 pound note, it was a friend. So what are the chances that one of your friends withdraws a 10 pound note that you'd written your name on 10 years ago?
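Before moving on to friends, that first step of the calculation can be reproduced directly from the talk's rounded figures:

```python
# Rounded figures from the talk.
survives_ten_years = 1 / 1000        # note still in circulation after 10 years
notes_in_circulation = 777_000_000   # £10 notes in circulation

# Chance the note survived AND is the very next one you withdraw:
p_you = survives_ten_years / notes_in_circulation
print(f"1 in {1 / p_you:,.0f}")      # 1 in 777,000,000,000 (777 billion)
```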
Well, to do that we have to make some sort of assumption as to how many friends people have. The easiest way I thought to do this was to look at the average number of Facebook friends, and the average number of Facebook friends at that time was 200. So what is the probability that, if you took your 200 friends and lined them all up in front of a cash machine and they each withdrew a 10 pound note, one of them would be yours that you'd written your name on 10 years ago? It's this 200 multiplied by the one in 777 billion, which is one in 3.9 billion. But it happened at some point in 2016. You know, we don't just go to a cash point once a year, withdraw one 10 pound note, and that's it, we're done; we take out a number of 10 pound notes over the course of a year. So I had a little chat with some people and we thought, okay, we need to make an assumption as to how many 10 pound notes people take out over the course of a year, and we settled on about two a week. So we said, okay, we've got about 100-ish opportunities for people to withdraw this 10 pound note in any particular year. So we multiply that by our one in 3.9 billion, which turns into one in 39 million. And that was what I thought the stat should have been: one in 39 million, the chance that this happens to this guy, that he wrote his name on a 10 pound note in 2006 and in 2016 a friend withdrew it from a cash point. Now that might seem like a small probability, but that doesn't necessarily mean that it's a rare event. If you think about the lotto, your chances of winning are one in 45,057,474 (I did an interview when we changed the lottery in the UK, so I know the probability). It's a tiny probability, and yet you're not surprised when people win it, because of how many people play it. So you'd be surprised if you won it, because your probability of winning it is so small, but we're not surprised that someone somewhere wins it, because of the sheer amount of people playing
it. So just because this is one in 39 million, does that mean that it's a rare event? I explored this a little bit further. There are about 50 million adults in the UK, so what is then the probability that someone has a friend who withdraws their 10 pound note from 10 years ago? Now I'm going to talk you through how we calculate this; I'm going to do it in a number of steps. The way that we actually go about doing this is to calculate the probability that nobody has a friend who withdraws a 10 pound note from 10 years ago, and we use a statistical property that says the probability of something happening is one minus the probability of it not happening. So my probability that I have a friend who withdraws a 10 pound note from 10 years ago is one in 39 million, meaning the probability that this doesn't happen to me is one minus one in 39 million. The probability that this doesn't happen to anybody in the UK is then this to the power 50 million, so my probability that it happens to at least someone in the UK in a particular year is one minus that. So that's the calculation I carry out to work out the probability that someone somewhere has a friend who withdraws the 10 pound note that they wrote their name on 10 years ago. What does that actually turn out to be? Well, it's 72 percent. It's quite high, really; it's about three quarters of the time. In a year, we have people withdrawing 10 pound notes that someone touched 10 years ago; it's about one in 1.38. We can't track this, though, because we can't all go around writing our names on 10 pound notes and then spending them; it would just get out of control. But really, it's not actually that shocking that somewhere this happens. What we can see, though, is that four slightly different questions gave us four very different probabilities, and you've just really got to think, when you see these statistics, what exactly was it that they asked to be able to get this?
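The "probability that nobody has such a friend" calculation just described takes a couple of lines, using the figures quoted in the talk:

```python
p_friend_year = 1 / 39_000_000   # one person: a friend withdraws their note in a year
uk_adults = 50_000_000

# P(it happens to at least one person) = 1 - P(it happens to nobody).
p_someone = 1 - (1 - p_friend_year) ** uk_adults
print(round(p_someone, 2))       # 0.72 — about three quarters of the time
```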
I want to move on to talk about aeroplanes and airlines, and I've got a couple of stories. The first one I want to talk about is the fact that 2017 was the safest year for air travel, as there were no passenger jet crashes anywhere in the world; no fatalities at all in 2017. Compare this to the fact that there were 39 selfie deaths last year. What's a selfie death? It's all the weird and wonderful things that people are doing in order to take the ultimate selfie. This has actually become so much of an issue that the Russian government has a safe selfie campaign, and they have leaflets telling you all the things that you shouldn't be doing whilst taking selfies. Some of them may seem obvious, like this guy here, but some people have tried to take a selfie with a gun to the head and accidentally fired the gun and killed themselves. People are doing all these stupid things to take the ultimate selfies. So yeah, 39 selfie deaths last year, and yet none for air travel. So was 2017 actually the safest year for air travel? And, you know, we've had passenger jet crashes this year, so does that mean that that's it, our heyday is over, and it's just going to get worse from here on in? Is it all doom and gloom? So I want to do a little bit of an investigation that explores this a bit further, and I'm going to do a demonstration. And this is where, if there was a dice on your chair when you got in, you will need to take that out now. Could you please stick your hands up in the air if you have a dice, just so I can see where they are and how many we've got? Yep.
Okay, brill. Okay, so these are speed cameras in the UK. When speed cameras came in, the government had to decide where they were going to put them, because they couldn't put them everywhere; they needed some sort of sensible strategy. So what they decided to do was to identify all the accident hot spots, see where all the accidents were happening, and put the speed cameras in there. Kind of seems like a sensible strategy. And we're going to recreate that now using your dice. So in a second, what I'm going to ask you to do is to roll it for me twice, although you haven't really got much room to roll it, so can I just suggest you give them a really good shake in your hand and just drop them on the floor. I'm going to ask you to do it twice, and I'm going to ask you to count up what score you get. That's it, straightforward. Okay, so give them a good roll for me, twice. That was a four? It was a four. There you go. Okay, did anybody get 12? Did anybody get 11? Do we have any 11s? No? Any 10s? Okay, so what I'm going to do is I'm going to give you guys speed cameras. We're going to put speed cameras in where we've got the most number of accidents. So there's one for you, there's one for you; we had another 10 over here, yeah? No, was it this 10 here? Yeah, and we had another 10 over here. Okay, so we have identified our accident hot spots and our speed cameras are in place. Now what we need to do is figure out if they've actually worked. So what I want everybody to do is to just repeat that again. We're just going to do the exact same thing, so give your dice a good two rolls, even you with the speed cameras, everybody. Okay. Those with our speed cameras, what did you get this time? Eight. Five. Two. And what did you get this time? Seven. So what we can see is that those places with the speed cameras have seen a reduction in the number of accidents, meaning that our speed cameras have worked. Yeah? No, you're not convinced by that. Okay. No, obviously. You could put your house down.
Thank you very much. No, obviously you didn't have a higher probability of rolling a high number that then changed when I gave you a yellow hat. You just randomly rolled something high the first time round, and statistically the next time round you would have expected to get something lower, by something called regression to the mean. Now, regression to the mean says that we all perform according to some sort of average, but that we have these random fluctuations around it: we have these random highs and these random lows. Regression to the mean says that if something is a random high the first time round, we would expect it to be lower the next time round. Did anybody get a 12 the second time round, or an 11? Yeah, what did you get the first time round? Five. So, you know, you could easily say that those places where we haven't got the speed cameras have seen an increase in the number of accidents, and that only further justifies that speed cameras work. This was genuinely something that the government had to think about in the UK when they brought in speed cameras, because they did put them in the places that had all these high numbers of accidents, they saw a reduction, and then they said, oh, it must be the speed cameras working. And it took a bunch of statisticians to say this could all be explained by regression to the mean, and what you need to be looking at is the long-term trends. The long-term average: is that changing, is that decreasing? That's actually what we're interested in. Now, you see regression to the mean an awful lot in sports. If you think about your favorite sports team, they'll sometimes go on random winning streaks and they'll sometimes go on random losing streaks. And when they go on random losing streaks, sometimes their managers get fired, and then a new manager comes in, they see an increase in form, and everyone says, oh, that's it, it was because of the new manager. But a lot of the time that can be explained by regression to the mean. And a
lot of the time, you know, that new manager coming in isn't why all of a sudden the team is playing better; it's just a natural return to form after this random low has finished. You guys have the Sports Illustrated curse, where if you appear on the cover of Sports Illustrated, I think it's well known that you then go on to perform really badly. And here we have Matt Harvey at the front, who after appearing on the cover of Sports Illustrated went on to have the worst season of his career. If you think about what it takes to appear on the cover of Sports Illustrated, though, it's a combination of your natural ability, but you're also probably riding one of these random highs. And when you appear on the cover of Sports Illustrated, it's not a curse that's then seen your form decrease; you're just naturally going back to what you would have been anyway, your average ability. It's not actually a curse. We come back to our aeroplanes. We say that 2017 was the safest year ever because there were no passenger jet crashes. That could just be regression to the mean. It just so happened that 2017 was one of these random lows, and because we've now seen some accidents, that doesn't mean that we need to start investigating things.
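The speed-camera demonstration from a few minutes ago is easy to simulate. A sketch (the 500 "sites", the threshold of 10, and the seed are illustrative choices, not figures from the talk):

```python
import random

rng = random.Random(42)  # fixed seed so the sketch is repeatable

def two_dice(n_sites):
    """One 'year' of accidents: each site scores the sum of two dice."""
    return [rng.randint(1, 6) + rng.randint(1, 6) for _ in range(n_sites)]

round_one = two_dice(500)

# "Accident hot spots": give a speed camera to every site scoring 10+.
hot_spots = [i for i, score in enumerate(round_one) if score >= 10]

round_two = two_dice(500)
avg_before = sum(round_one[i] for i in hot_spots) / len(hot_spots)
avg_after = sum(round_two[i] for i in hot_spots) / len(hot_spots)

# The camera sites drop back towards the overall mean of 7 with no
# intervention at all: pure regression to the mean.
print(round(avg_before, 2), round(avg_after, 2))
```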
It doesn't mean that everything's gone horribly wrong. It can sometimes be explained just by regression to the mean. We had a spate of stabbings in the UK, in London, and our newspapers were saying that the London murder rate was now beating New York as stabbings surged, and all the newspapers were asking whether London was now a more dangerous place to live than New York. And we have something called the BBC Reality Check, and they did a full investigation into this. They said the claim was that London had overtaken New York for murders for the first time in modern history, but the verdict was that a selective use of statistics from the start of 2018 appeared to bear it out, when the reality was that New York still seemed to be more violent than London. It's the danger of looking at these short-term trends, where these random fluctuations are not taken into account. The long-term trends are what you really need to think about. Okay, so I have one final story on airlines, and then I'm going to finish with a little game. My final story on airlines: I have a little bit of a hobby, and that's taking on bad guys in British airlines.
Um, so we have an airline called Ryanair. They're kind of like your Spirit: whereas your Spirit have yellow planes, ours have yellow interiors. And they play a little jingle when they land, and they're well known for being the ultra-low-cost, horrible, horrible airline. And I like to cause them bother. So last summer they changed their seating allocation algorithm, and they said that if you booked your flights as a group, when you actually then checked in they weren't going to sit you together anymore. They were going to randomly scatter you throughout the plane and split you all up. The idea was that if you didn't want this to happen, you would have to pay for your seats; it was a way to make more money out of people. But all these people were saying that there seemed to be too many middle seats. Everybody seemed to be allocated middle seats, and it didn't seem like it was truly random. So we have a consumer rights program in the UK called BBC Watchdog, and I went on Watchdog. They did an investigation, and I ran some stats for them. What they did was send four researchers on four flights, and on every single one of the flights all four researchers were allocated a middle seat. And they said to me, okay, what's the probability of that happening if this seating allocation is truly random? And that's not really that hard to work out statistically. They had all the information at the time of checking in, and they sent all that information through to me. So for example, on one of the flights, when they checked in there were 23 window seats left, 15 middle, 27 aisle, 65 in total. Theoretically those four people should have just been randomly scattered throughout these 65 seats, but they were all given middle seats. So what was the probability of that happening? Well, you've got 15 middle seats, 65 in total, so the probability that the first person gets a middle seat is 15 over 65. Probability the next one gets a middle seat:
there's then 14 to choose from out of 64 in total. And then we carry on and carry on: 13 over 63, 12 over 62. That works out to be about 0.2 percent, which is 1 in 500. So that's the probability of all four researchers getting middle seats on that flight: 1 in 500 if it was purely random, which isn't really that striking a probability if you think about how many flights these airlines have every single day. If the probability is 1 in 500, we should expect that to happen quite a lot. But it wasn't just this one flight that this happened on; it happened on the other three as well. So I carried out the same calculations for the other three flights, I then combined all four flights together, and I worked out that the probability of all four researchers on all four flights all getting middle seats was 1 in 540 million. So a really tiny, tiny probability. You were more than 10 times more likely to win the lottery than for this scenario to happen. But as I said, small probabilities don't necessarily mean rare events. So, you know, how much of a rare event was this? I looked at Ryanair's figures that they publish, and they say that they carry 130 million passengers every year, and even so, this probability was tiny for this scenario to happen. And so I said I was pretty confident that they probably weren't allocating seats randomly, and there had to be something going on. Ryanair said, you know, I had my stats and they had theirs. I was like, my stats are right. But anyway. It all kind of died down a little bit, and then there were these 12 women who went on a bachelorette party, and they were all allocated middle seats on their Ryanair flight. And they got back and they got in touch with a newspaper, and they said, well, what are the chances of that happening?
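The seat calculation generalises to a small function; a sketch, using the seat counts from the flight quoted in the talk:

```python
from fractions import Fraction

def all_middle_probability(middle, total, group):
    """Chance that `group` passengers, seated at random into the
    remaining seats, all land in middle seats."""
    p = Fraction(1)
    for i in range(group):
        p *= Fraction(middle - i, total - i)
    return p

# The flight quoted in the talk: 15 middle seats of 65 left, 4 researchers.
p = all_middle_probability(15, 65, 4)
print(f"about 1 in {round(1 / p)}")   # about 1 in 496, i.e. roughly 1 in 500
```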
So I got the phone call and reeled out all these stats again, and they asked Ryanair to comment. And Ryanair at this point actually admitted that they tried to keep window and aisle seats free when randomly allocating seats. So they admitted that they were keeping back the window and aisle seats, because those were the most desirable seats and the ones that people were more likely to pay for. So this "random allocation" was actually a random rule from within the middle seats; it wasn't really a true random allocation. So I managed to get Ryanair to admit to their customers that they had lied, and I also managed to upset them in the process, because they said that they didn't believe that some of the negative media coverage, including a BBC Panorama investigation, was warranted. So we all feel sorry for them that I managed to get them to admit that they'd lied and upset them in the process. I think that was a big win for me. I upset them again earlier this year. They released the results of a customer satisfaction survey, and they said that 92% of their customers were satisfied with their flight experience. Oh really?
Ryanair? So I decided to take a further look at this, and I had a look at the survey. It was an opt-in survey, and I thought to myself, well, you're only going to opt into a survey if you're either extremely satisfied with the service you've received or you're dissatisfied with the service you've received. So then I actually went and looked at the survey. Here's this 92% that they reported. If we look at the options that they give people when they're filling out this survey, they range from excellent to okay. So if you're dissatisfied with your Ryanair experience and you want to fill out this survey to express that dissatisfaction, you have no way of telling them that you are dissatisfied with your service. Basically all they're doing is asking a group of people who are extremely satisfied with their service just how extremely satisfied they were with their service, and then they combined all of these three columns to make a 92%. It was absolutely rubbish. And so I was quoted in The Times saying that I thought that this was rubbish, and Ryanair had a really adult comeback. They said that 95% of Ryanair customers haven't heard of the Royal Statistical Society, 97% don't care what they say, and 100% say it sounds like their people need to book a low fare Ryanair holiday. So a very grown-up and adult response from them. But one of our members of the RSS pointed out that Ryanair carry a hundred and thirty million passengers every year, and if five percent of them have heard of the Royal Statistical Society, that's six and a half million Ryanair customers. And do you know what? We'd probably take that, to be honest. So there you go. That's what I like to do in my spare time:
annoy Ryanair. Okay, so I want to finish by playing a game with someone. As I said, I've talked a bit about risks and calculated risks, and I want to know if someone in here is ready to take a risk. So I'm going to take a bet with someone in this room — you don't even know what it is yet — and I'm going to give you advance warning: I'm going to put down real money on this bet, and I'm going to want the person who comes up to also put down real money on this bet. If you want to take my bet, you will need to put down a dollar. So that's prior warning: if you want to take part in this, you will need a dollar. Okay, so what is the bet? I bet that somewhere in this audience there are at least two people who share a birthday. If you know this one, then please keep your mouth shut. I bet that somewhere in the audience there are at least two people who share a birthday — this is just day and month of the year. And as I said, I'm going to put some money down on this. I am prepared to bet 50 dollars that there are at least two people in this room who share a birthday, and I'm asking if anybody would like to take this bet and challenge me — say that actually everyone in this room has got a different birthday — and put down a dollar. You want to do it? Yeah, come on then. What's your name? Valerie. Nice to meet you, Valerie. So I have your dollar, I've got my 50 dollars, and I'm going to put them over here in plain sight. There's the money. So I bet that there are two people in here who share a birthday; you say no, you don't think so. Do you want to shake on it? So we're going to shake on it. There we go — there's our bet. The way this is going to work is I'm going to point to people, and I want you to tell me your day and month of birth. I will repeat it. If you hear your birthday, can you please stand up for me, wave, make a lot of noise, so that I know that you're there and I don't lose 50 dollars.
Okay — it would be a bad way to end my trip. Okay, so what's your birthday? Oh, actually, first of all, are there any twins in here? Because I'll be kind, I'll be kind, and I will exclude them. No? No twins? Okay, we're all right. Sorry. June 6th — June the 6th? No. March the 15th. December the 31st — the cameraman stood up there and I got excited — December the 31st. April the 4th. April the 4th — we got a match! We got a match! Oh, no, I'm sorry. I'm so sorry. Yeah, I'm going to take your money, though, but I'm sorry about that. Let's give her a round of applause. Okay, I am sorry, but how fair was that bet? What are the chances of this actually happening? We can actually plot this probability against the number of people, and this is what the plot looks like. It goes up really quickly, and you only actually need 23 people before you've got a 50-50 chance of this happening. So if you're in a room full of 23 people, about half of the time at least two of them will share a birthday, and by 58 people it reaches 99 percent — and there's more than 58 people in here. So it was something like a 99.999% chance that I was going to win that bet. So, yeah, it wasn't really a fair bet; I should have put more money down. But there we go. And this is something that actually surprises quite a lot of people. In the last World Cup — the football World Cup — the squads are around 23 people, and quite nicely, about half of them had at least two people who shared a birthday. It was a really beautiful demonstration of the birthday problem. That was rather wonderful. Okay, so I would just like to conclude and say: whenever you see statistics in the press, think about the risks. When you see these relative risks, think about the numbers that are associated with them. Think about the sample size and the uncertainty. Is this correlation or causation?
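The 23-person and 58-person figures come straight from the standard product formula: the probability that n people all have distinct birthdays is (365/365) × (364/365) × … × ((365−n+1)/365), and the shared-birthday probability is one minus that. A minimal sketch, assuming 365 equally likely birthdays and ignoring leap days:

```python
# Birthday problem: chance that at least two of n people share a birthday,
# assuming 365 equally likely birthdays and no leap days (a simplification).
def p_shared(n: int) -> float:
    p_distinct = 1.0
    for i in range(n):
        p_distinct *= (365 - i) / 365  # the (i+1)-th person avoids i taken days
    return 1.0 - p_distinct

# Smallest group size whose shared-birthday probability crosses a threshold.
def threshold(p: float) -> int:
    n = 1
    while p_shared(n) < p:
        n += 1
    return n

print(threshold(0.5))   # → 23: a better-than-even chance
print(threshold(0.99))  # → 57: the probability first passes 99%
print(p_shared(120))    # an auditorium-sized crowd is all but certain
```

Under these assumptions the exact computation puts the 99% crossing at 57 people, so a room of 58 or more comfortably clears it, which is why a 50-dollars-to-one bet against an audience this size is very safe money.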
And then I guess my take-home messages are: eat bacon, don't eat cheese, you risk death by bed sheets, and never take a bet with a statistician unless you know the odds in advance. Thank you very much.
All right, we have time for a couple of questions. If you have a question, raise your hand and I'll bring the microphone to you.
Are the two pens complimentary, or are we expected to return them?
Are the what? Oh, do you know what, you can keep the two pens, it's fine — it's the end of the trip, I don't need them anymore. But I do need the dice back, please.
How does working in this field affect your decision-making process in your personal life?
That's an interesting one. When I see newspaper headlines, I definitely do think about the absolute numbers. But do you know what, I did an interview recently about the lottery and about playing the lottery and things like that. I do occasionally play the National Lottery, and people always say, you know, why? You know what the probability is. But I think sometimes you have to think beyond the stats. We all have our own personal relationship with risk, and I think that's also sometimes missed in the media. You know, I know that eating a bacon sandwich increases my risk of pancreatic cancer by one in every 400. For me, that's an acceptable risk, but for somebody else in this audience it wouldn't necessarily be an acceptable risk. We all have our own personal relationships with these risks. Occasionally I like to play the National Lottery just because, for that hour after I buy a ticket, I get to think about what I'd spend the money on. It's entertainment value, and I think that sometimes when we think about statistics and we see all these things in the newspapers, we forget that we have our own personal relationships with them. The current president of the RSS,
David Spiegelhalter, calculated that every bacon sandwich takes 30 minutes off your life — and personally, for a good bacon sandwich, I'd give 30 minutes of my life, you know.
All right, I'm a numerically oriented amateur. This is not really a numerically oriented question, but it is about risks of everyday life. I invited my wife here, who worries about a lot of things — she didn't make it. This was about driving in the dark. She'll often say, wherever we're driving, we should make sure to get there before it gets dark and then not come home while it's dark, and I disputed with her whether it's riskier. Then I went online, and I found the National Highway Traffic Safety Administration and various other reputable organizations saying yes, most traffic fatalities happen after dark. But I didn't see any reference in any of those stories to factoring out the alcohol factor, because obviously you're more likely to encounter people who are alcohol-impaired in the dark, since probably more people drink at night than in the daytime. I'm wondering how you would weigh in on that, and if you're aware of any statistics that do factor out the alcohol factor from the driving-in-the-dark risk.
So basically you wanted me to say to your wife that you're okay to drive in the dark? No — this is exactly one of those correlation-versus-causation questions; this is where those confounding issues come in. I mean, it may be that driving in the dark is more dangerous even after the alcohol effect is taken out, but you can carry out analyses that control for these confounding effects and then look for any residual effect that's left over once you've accounted for all of these things. It's really interesting, though, when you talk about how these risks can inform your decisions.
I was talking to someone before about riding a bike and whether or not you should wear a helmet. We know that if we come off our bike, wearing a helmet is going to decrease our risk of death or injury — but that's conditional on us having an accident. If we wear a helmet, does that mean we actually sometimes cycle more dangerously and take more risks? There are all those sorts of things that you need to think about: the difference between these conditional probabilities that then get used as sort of unconditional probabilities, because the decisions that we make then affect the way that we behave. You know, if I choose to cycle without a helmet on, I might be a lot more careful. So it is really interesting, and it's not necessarily straightforward to try and tease out all of these little intricacies and subtleties.
Let's have another round of applause for our speaker.