Well, it's that time of the week again. It's time for Chit Chat Across the Pond. This is episode number 766 for April 22nd, 2023, and I'm your host, Allison Sheridan. This week our guest is your favorite psychological scientist, Dr. Maryanne Garry of the University of Waikato in New Zealand. How are you doing today, Maryanne? I'm fine, Allison, but I want to know if I'm your favorite psychological scientist. Well, wait a minute, though. Isn't Mrs. Garry also a psychological scientist? Yeah, but so? You're going to make me choose between the two of you? Choose wisely. Who are you interviewing? There is that. Hey, so I wanted to get Maryanne on here because one of her grad students and some of her colleagues and she published a paper in a Royal Society journal about a very, very interesting topic. And I want to tease Maryanne first, because she sends me these incredibly long-form things. I mean, like 30 pages of reading to do, when my attention span is about as long as a Mastodon toot nowadays. It's just so hard to read so much, but this one was fantastic. I'm fascinated by this study. I think it's really, really interesting. Let me give the 30,000-foot view of it: a lot of people believe that, given the circumstance of a pilot becoming incapacitated while they're the only one left on the plane, they could land the plane. When I knew I was going to be talking about this study, I mentioned it to my brother Grant and he immediately went, oh yeah, I could do it. I just thought that was hilarious. But this concept has been talked about before, and in this paper you and this team of researchers took it a lot further. So maybe talk a little bit about what we already knew and where you went with this. Yeah, so here's what we knew before, and by "we," I mean the scientific community.
What we knew is that some people just think they can do all sorts of things they can't. We usually think of these as features of a person. So for instance, some people are just straight-up narcissists. Now, I know that word gets kicked around in everyday parlance, but it's reasonably accurate: some people are just blowhards, that kind of thing. They think they can do all kinds of things. So that's a kind of stable personality characteristic of somebody. There's a name for that, right? That's the Dunning-Kruger effect I've heard about? No, that's a different thing. And I have to say, ironically, Dunning-Kruger is typically misreported by people. Dunning-Kruger is very interesting because everybody knows about it, right? It shows up in everything. Except that what it means is that the people who have the least amount of skill tend to be the worst calibrated when you ask them about their actual skill. Because you're an expert in what you do, you have pretty dialed-in awareness of your own abilities, and also awareness of what you can't do in the area in which you work, right? Like, let's say, tech stuff or coding or something. But someone who's just starting tends to be the worst calibrated and thinks they can do more than they can, typically. And as you learn more — you know, expertise is sometimes described as the process of learning what you don't know — you start to become more and more dialed in. So Dunning-Kruger refers to this chunk of people down towards the bottom who are just least dialed in to their actual skills and abilities. And the same guy, Dave Dunning, identified that when you're first starting a skill, you have what is sometimes called the beginner's bubble, where you think you have things figured out, but that's only because you don't know a lot about it yet, right?
And so as you start to learn more, you start to become more dialed in and be like, oh, well, that's not true. That's not true. That's not true. So basically, what we knew before this paper was that there was a lot of evidence that some people are just — let's call them blowhards — overconfident for various reasons. And the thing is, we also know that most people in specific areas think they're unusually good at what they do. The majority of drivers, for instance, think they're a better-than-average driver. This is called the above-average effect, right? Oh, yeah. We've talked about that together on Chit Chat Across the Pond. I always thought that was fascinating. It's some number much larger than 50% of drivers — something like 90% of drivers think they're better than average. I must be a truly terrible driver, because I'm pretty sure I'm below average, so I must be in the bottom 3%. Yeah, yeah. 40% of software engineers think they're in the top 5% of the software engineers in their firm. Oh, really? I like that one, too. Most students think they're above average. And 70% of professors think they're above average, right? No — they think they're actually in the top chunk of professors. So it's just funny. That can't really be explained by rampant narcissism. It's just like, what is that about? So many of us, at least in specific — we'll call them domains, areas, right? — are just not dialed in to how good or bad we are. Okay. Now, we don't really do personality stuff in my lab, or stable clinical characteristics like narcissism. We don't do any of that in my lab.
But what we do know is that some characteristics of the task that you're doing — the environment you find yourself in right in that moment — can turn people into temporary narcissists, temporary blowhards. It can make them really confident about things, make them think that something they're reading is true or easier than it is. And this is a phenomenon that's related to the stuff I've talked to you about in previous Chit Chats, about people remembering things that never happened to them. And we thought, well, maybe there's a role here for creating situations like that, in which people think, oh, this is easy for me to imagine doing, for instance. Something you could insert that would make them think they could do something they can't. Yeah, just ordinary people, right? Just ordinary people — not specific blowhards, but ordinary people. So could we take people like, for instance, the 90% of drivers who think they're above average, and could we create a situation like that out of anybody? But we thought, well, we need a really preposterous skill, something that everyone should know they can't do. And we thought about a whole lot of things. One of the skills that we tinkered with was eye surgery. Yeah, exactly. And we just thought so many people don't think they can do that. Well, we didn't actually go down that road because it was kind of gross. I mean, unsurprisingly, the videos of people doing, like, corneal repair — I was like, ooh. Oh, I know the one you should have done. If you've watched MASH, you're pretty sure that you could open an airway by punching a hole in somebody's throat and sticking a big pen in it. Yeah, we did think of that one. Did you really? I'm positive I could do that, and I would know to do it in the right circumstance. I know it. Yeah.
Yeah, we did think of that one, but it didn't have enough steps. And also, when you see it on YouTube — we couldn't find a video of the real thing, so you see it on YouTube — it's like, here's what to do if, you know, and it's not a real person, obviously. Right, right. So we rejected those things. And then we thought, what about those people who sometimes say, I could land a plane in an emergency? Because you hear about this; it's apocryphal, right? Right. Mythbusters even did a show on this. So first we set out to say, well, is it really a myth? It turns out Mythbusters did this show — I forget the guys' names on Mythbusters because I don't watch it, but I watched this episode — where they went, not in an actual plane, but in a flight simulator like they train pilots on, and they tried to land a plane. And one of the guys, I think, landed in the woods 10 miles away from where he was supposed to go and, like, killed everybody on board. But they're testing it on themselves. They're having themselves try to do it. Okay. Yeah, yeah, in the flight simulator, right? And the other guy, I think, landed the plane, but it did a whole lot of damage. I mean, shocking, I know. And then they had another part where they did it again, but they wouldn't let them talk to the tower. And they just both crashed and burned, crashed and burned. So the Mythbusters conclusion was: no, random person, you can't land a plane. Okay. And most pilots will tell you — they'll just snort and say things like, I'm really tired of hearing people say they can do this. And, you know, I'm sure everyone's going to write to me, so let's just preempt this right now. Instead, you can write to Allison saying, well, what about that guy in Florida who landed the plane? And I'm just like, yeah, okay.
He, I think, had played flight simulator games — you know, that kind of game — a fair bit. And also he maybe knew the terminology. Yeah. And he was talked down by the tower. Okay. So, not to take away from what the guy did, right? But it's just a remarkably lucky, particularly unique set of circumstances. Right. So in general, no, you can't land a plane. Probably people listening to this right now are saying to themselves, I could land that plane — or maybe even saying it out loud to us in their ears. I could land that plane. And we call these people men, but we'll get back to that in a minute. I was wondering how long it would take us to get to the men bashing. Yeah. Stand by. Stand by. It's science. No woman listening to this is going to be shocked by that. So here's what we did. We conducted two experiments with almost 800 people. And we asked some of them, but not others, to watch a video of a pilot landing a plane. And they were commercial pilots in the video — I forget the name of the aircraft. Yeah, in the video. You can't really see them doing anything. There's a whole bank of controls, and their hands are covering the controls. So it's like from the back. Yeah, it's from the back, like you're just peering into the cockpit. So basically it looks like they're driving a fancy car. And then the plane — I think they take it over hills and mountains and over houses and then onto the tarmac, right? And they land the plane. So there's literally no instructional value to this, right? None at all. No. In fact, one of the co-authors, Rachel Zajac at the University of Otago — her dad was a pilot for Air New Zealand for years and years and years, and was involved in training pilots and stuff. And he watched this video. So Rachel said, Dad, look at this video.
Would it help someone learn how to land a plane? He said, this video is absolutely useless. Okay, good. Oh, I also want to jump in. The way you chose the people for this was using Mechanical Turk, which is a tool where you can pay people minuscule amounts of money to do little tiny tasks. And you eliminated anyone who — you asked them something like, what kind of pilot are you? So if they responded, well, I'm this kind of pilot, they were out; their responses didn't count. And you did ask their sex or gender — I don't know which, but whichever it was, you knew who was male and female in this test. We asked them how they identified gender-wise. Okay. What language they speak, their age. And we also asked them questions about whether they played flight simulators, and we got rid of those people too. So those people got to do it, but you ignored their results. Yeah. Yeah. Okay. And the mean age was 40, which is pretty good, right? So it's not like we're getting just, let's say, college students, who are not as calibrated about their own skills as, frankly, they should be, across a number of dimensions. All right. So these are actually legit grown-ups, a good chunk of people 40, you know, plus or minus. So that's good. So, yeah. So this video — you don't show the video to all of them. You show the video to half of them. Some of them, but not others — because we don't show the video to the controls, right? Okay. It's a little less than four minutes, this video. As you said, no sound. And you see the view of the flight deck from the back. So I assume you're going to put this article in the show notes, because it's open access.
So if people want to go and click on it — we point to the materials and the data in the article, in case you want to totally nerd out. So you can see exactly what this is. You can see their hands and what they're doing, but it's somewhat obstructed just because of the angle it's shot at. So it doesn't teach anybody to do anything, right? It's really just you watching someone sort of glide in and land the plane. Okay. So the people who see the video, they see the video, and the other people obviously don't. And then they're asked these two questions. One is, how confident are you that you could land the plane without dying, and the other is, how confident are you that you could land the plane as well as a pilot could? And they have to answer on a scale — sometimes we do the scale from zero to a hundred, from not at all confident to very confident, that kind of thing. And we asked them at the end, for instance, have you ever flown a plane before? Have you ever landed a plane before? And then — this is important — we always ask everybody, how much expertise do you think is involved in doing this task? So here, landing a plane: from no expertise to a great deal of expertise, on a scale, right? Do you ask them that before they answer the question of how confident they are? Afterwards, afterwards. Yeah, afterwards. And everybody, all the time, is at what we call in data analysis the ceiling. So they're up banging their heads on the ceiling of, it takes a lot of expertise to land a plane. And that's important for what I'm going to tell you about. So they're not delusional about what it takes; they're just delusional about their own abilities. Exactly, because our results would be far less interesting if people thought, man, it's just easy to land a plane — because then you could say, well, you've got a bunch of dumbasses in your experiment.
We just had actually ordinary people in our experiments. So anyway, we ask them these questions right afterwards. And we assume that what people are doing is operating on a kind of gut feel or hunch. And what we know about people making these kinds of gut-hunch decisions is that they make them on the basis of how easy it is to bring to mind thoughts and images and feelings of doing the task. So if you think this through: people have seen the video now, so it's easier for them to bring to mind thoughts and images and feelings of doing the task — whether they see themselves in the situation or it's just the task itself, which is landing the plane successfully. Then you would predict that people who saw this video, even though it was just four minutes and not instructional by any measure, would be more confident that they could land the plane. And that's what we saw. Now, remember there are two questions, right? Could you land a plane as well as a pilot could, and could you land the plane without dying? So let's call the without-dying question the low bar, and the as-well-as-a-pilot-could question the high bar. And, almost thankfully, we get different responses here. People are less confident they could do it as well as a pilot could. So at least they're throwing the pilots a bone, which is, you know, I guess nice. But they're still more confident that they could do it if they see the video than if they don't. And that's what I thought was pretty amazing, right? And, you know, since we foreshadowed this, and because it's fun: men are more confident than women, even if they don't see the video. By what margin, on average? Do you have any numbers on that? I looked for that in the paper and I didn't see it.
But as I recall from the news articles I heard about this study — or it might not have been about this study, but about other studies — it was a pretty significant margin. Yeah, I think it depends how you calculate it, and I don't remember it exactly. But if you're thinking maybe 20 to 30 percent more confident — is that what you saw? Yeah, I think that's what it was. Yeah. Right — and it's compared to women in the same condition. So yeah, it's very interesting, and it fits with some other work, and even just survey stuff, that men tend to be more confident about their abilities than women are, to the surprise of no woman anywhere. You know, when I got to that part, talking to you about it, it started to make me think about two angles to that. One is, as a woman in the workforce, if you are not portraying as much confidence as the men — when you know you're at least as good, possibly better — you're going to be perceived as not as good at it, because they're showing this confidence and you're not. And it's a pretty common thing to hear that women don't project themselves above what they think they can do, and men can, you know, blowhard it better than women can. So as a woman, learning that skill of pretending you know a little more than you do sounds really hard. Then there's the flip side: as a leader looking at your employees, looking at two people who say they can do the task, or trying to decide a promotion or whatever, take into account the fact that the women are probably not overstating what they can do as much as the men are, on average.
Yeah, there's a whole literature — and I mean, I don't work in this area, but I know there's a whole literature, particularly in social psychology applied to the workforce, on how women, for instance, are terrible, terrible negotiators. And I would urge every woman to read a book called Women Don't Ask, written by a proper social psychologist with expertise in the research about negotiating. And a related idea is that men are much more comfortable trying things on than women are. So what we don't know here is whether men are somehow responding to this question differently, for different reasons. Like, are they just trying it on, or kind of blowharding, while women are more calibrated? So you don't know who's more accurate — it's kind of interesting. I don't know what the calibration issue here is. But it's interesting because, of course, this finding fits with that YouGov survey we have in the paper, in which 12% of men — and 3% of women, who I want to kick out of the club — claimed they could win a point against Serena Williams. At least it's only 3% of women. And then there was another YouGov survey — I always show these whenever I present this work in a talk — where they asked men and women to identify which animals they could beat in a fight. Really? Like cobras, bears, and eagles, yeah. Yeah, and more men than women claimed they could beat every single animal. So, you know, there is another angle to this. Maybe they can, Maryanne. Maybe they can land the plane. Oh, yeah. Maybe that's it. That's right. Maybe the men are just more accurate and women are underestimating their abilities, or maybe men are just more capable. We've got to take that into account, right? That's probably right.
That's probably right. But it is interesting that we see this gap. And like I said, it's not my area, but it's always the thing that people want to talk about. Right, right. It is interesting. One of the things, back on the study, looking at video versus no video and the comparisons — I know this is nerdy to ask about, but I was really intrigued by the way this is graphed in the article. Oh, the violin plots, yeah. Yeah, they're called violin plots. I call them bowling-pin plots, but violin plots. So the vertical axis is the confidence from zero to a hundred, and the width of the bar — let's call it the violin — changes as you go up, to show you what the distribution looks like. So instead of just a single line on a horizontal axis, the distribution is shown in the width. And I thought that was a really neat way to graph it. If you look, for example, at video versus no video on the can-you-do-it-as-well-as-a-pilot-could question: without the video, the violin is really fat down at the bottom, and then there's a little slight bulge above 50%, where the crazy people live. But if you look at it after seeing the video, that bottom isn't super fat anymore — it's pretty narrow, like half the width — and it stays pretty high; the mean is about 25%. Right. So these are nice, these plots, because as you say, they show the distribution, compared to a bar graph, which is fairly useless and at worst conveys the idea that all the points are the same, that the distribution is the same throughout the bar, right? And it's only relatively recently in my career — I would say five or ten years — that these have become more and more easy to do.
So R, which is a free data-analytic language that's very popular now — it's just the letter R, just capital R. And quite ironically — although no one here likes when I say this, but I don't care, Allison, because I'm talking to you, and it's just us — R was developed at the University of Auckland. So it was developed in New Zealand. And I always say, who would pick the one letter that New Zealanders can't say, and name their software after it? Do you ask Mrs. Garry to say it just for the comedy? Yeah. Okay, okay. That's pretty funny. But anyway, it's great, because if you use R code and you can write in R, then of course ChatGPT can help you. You can do all kinds of things. And you can also go to an employer and say, why are you paying for SPSS, which costs a squillion dollars a year, when I can do this for free, right? So you can make these nice plots. And if listeners want to just look at these figures, like figure one and figure two, you can think of the no-video control condition as a balloon of a certain shape, and then you can imagine that the video condition is that balloon squeezed differently. So you're moving people around, and you can then visualize the effect of the video. Yeah, I got excited about these graphs and I tried to do it in Excel, and then I did some searching and it said, okay, there's this plugin called XLSTAT, you can try that. So I put the plugin in, but I couldn't find the thing that said how to do it. And it was going to use Python. And then I went online and, oh man, I went down about 42 rabbit holes. I probably spent two hours trying to do this. And Maryanne kept saying, use ChatGPT — but I was asking ChatGPT; I just didn't know what the language was, and apparently it was R. I never did succeed at it.
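For anyone who wants to attempt what Allison was trying, here's a minimal, dependency-free Python sketch of the computation behind a violin plot: a kernel density estimate which, mirrored left and right around a vertical axis, gives the violin its outline. The ratings below are invented purely for illustration; they are not data from the paper.

```python
import math
import random

def gaussian_kde(sample, xs, bandwidth):
    """Evaluate a Gaussian kernel density estimate at each point in xs.

    Mirroring this smoothed density profile around a vertical axis is
    what a violin plot draws for one group.
    """
    n = len(sample)
    norm = 1.0 / (n * bandwidth * math.sqrt(2 * math.pi))
    return [
        norm * sum(math.exp(-0.5 * ((x - v) / bandwidth) ** 2) for v in sample)
        for x in xs
    ]

random.seed(1)
# Invented 0-100 confidence ratings: most people near the bottom of the
# scale, plus a small bulge of responders higher up -- roughly the
# "fat bottom, little bulge" shape described above.
ratings = [random.gauss(15, 8) for _ in range(180)]
ratings += [random.gauss(60, 10) for _ in range(40)]
ratings = [min(100.0, max(0.0, r)) for r in ratings]

grid = list(range(0, 101, 5))  # the confidence axis, 0..100
density = gaussian_kde(ratings, grid, bandwidth=6.0)

# The widest point of this violin sits near the bottom of the scale.
peak = grid[density.index(max(density))]
print(peak)
```

In R itself, `ggplot2::geom_violin()` does this smoothing and mirroring for you; the sketch above only shows the idea behind the shape.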
Like she said, you can download the data from this paper — you can download the source data and then try to run R against it. I did not succeed, and I'm disappointed in myself. I may keep trying. That's okay. I wanted to make those graphs comparing men and women. That's what I wanted to see. Oh, right. Yeah. So you can start by getting an app called RStudio. It's just a nice front end for R. Oh, okay. So you can see the coding, you can do whatever, and it just runs R with a front end. And at some point you'll be asked to install different libraries that do certain kinds of things. When you run some code, if it doesn't have that library, it will say that it needs to go get it, and then you just allow it to go get it. So it's pretty good. I'm pretty sure I did install R using Homebrew, but I didn't know what to do next, and by that time I'd spent two hours. And that's why the article that I was writing yesterday and the day before and the day before isn't done yet. Well, this should be your next thing that you teach people how to do — and that way you can learn to do it too. My next obsession, right? Yeah, your next obsession. So there were actually two separate studies, and I didn't quite understand why this was done twice. You did it with a small sample size first — is it the same study twice, just with two different sample sizes? It was the same study with a larger sample. And that's good, because science needs to replicate, right? If we had just done experiment one, we couldn't be sure. Experiment two, with a much larger sample, made sure that we could replicate the basic pattern — and with a larger sample size you also get a more precise estimate of the size of the effect. And those error bars that you see in the plots are around the means.
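To make the point about error bars concrete, here's a tiny standard-library Python illustration — with invented numbers, not the paper's data — of how a 95% confidence interval around a mean narrows as the sample grows, which is the extra precision a bigger replication buys you.

```python
import math
import statistics

def mean_ci95(sample):
    """Approximate 95% confidence interval for the population mean,
    using the normal critical value 1.96 (fine for rough illustration)."""
    m = statistics.mean(sample)
    se = statistics.stdev(sample) / math.sqrt(len(sample))
    return (m - 1.96 * se, m + 1.96 * se)

# Invented confidence ratings on a 0-100 scale.
small = [20, 35, 10, 50, 25, 30, 15, 40]  # n = 8, like a small first study
large = small * 10                         # same spread of answers, n = 80

lo_s, hi_s = mean_ci95(small)
lo_l, hi_l = mean_ci95(large)

# Same mean either way, but the larger sample's interval is far
# narrower: a more precise estimate of the population value.
print(round(hi_s - lo_s, 1), round(hi_l - lo_l, 1))
```

The interval is the "plausible range of values out in the population" described next; shrinking it is one of the two reasons the team ran the experiment a second time.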
Those error bars, which are called confidence intervals, give you an idea of the plausible range of values out in the population, because what we're always trying to do is estimate the actual value in the population. So that's one reason to replicate something — science always needs to replicate. But also, what we had noticed in the first experiment is that we had this effect where, if you were asked the as-well-as-a-pilot-could question first, it kind of threw cold water on your confidence. And we're like, what is that about? So then we tried to manipulate it — we did it again to make sure that it really was a thing. And it turns out it was a thing. So it's almost like being asked, could you do this as well as a pilot could, before being asked, could you land a plane, gives people some kind of reality check: they calibrate better, and their confidence isn't boosted as much. Well, looking at the graphs, it looks to me like it's the other way around. If the question asked first was as-well-as-a-pilot-could, the confidence bars are higher. Look at the top of page eight, figure three. So the question asked first is without dying, right? If you're asked the without-dying question first — look at that left panel, the left side of the dashed line — what you see is that the confidence is higher than it is when the without-dying question comes second, which is on the right side of the panel. I don't know if I'm looking at — I think I'm looking at the same graph, but to me, if we just look at asking without dying first. Yeah. Oh, I see. Boy, this is hard.
What Maryanne and I are looking at — which you can't even see, which makes it even harder to follow — is eight of these violins. The four on the left are question-asked-first without dying; the four on the right are question-asked-first as well as a pilot could. So within the without-dying set, there are four different scenarios: video and no video, each for the without-dying question and the as-well-as-a-pilot-could question. Yeah. When you're asked first, could you land this plane as well as a pilot could — which is the bunch of plots on the right-hand side — you see that there's, I mean, there's confidence, but there's nothing really going on. So it's as though people thought, hang on, am I really as good as a pilot? As opposed to just, could you do this without dying, which we consider to be — and I say this with some irony — an easier task, right? So it's as though the as-well-as-a-pilot-could question, asked first, makes them go and recalibrate everything. It's the only thing that we've found that kind of arrests that overconfidence — and it doesn't totally, right? Because if you look, you see that on a hundred-point scale, people are still roughly at about 25, which is not great. Yeah, that's astonishing. So what do you do next with results like this? I assume it makes you ask other questions — there's something else you're going to want to know. Yeah. Well, first of all, I just want to say, I'm happy to crush this dream, because this is a dangerous dream, right? I don't want someone on my plane thinking that — and I'll just use this pronoun as the placeholder — he could land the plane, right? Well, see, I do. Because if there's a person on the plane who thinks they can do it, does that increase the probability that they'll be able to do it?
Oh, and you're all going to — yeah. Well, if there's no other option, yeah, I suppose, if there's no other option, I think you should try and land the plane. And look, just to circle back here: why are people overconfident? In general, if you think about it, overconfidence probably has some adaptive benefit, back when we were just cave people, you know? You had to be able to run with a pack that wanted you to run with it. And so it was probably beneficial to think that — or to present yourself as — someone who could do things that we might today call stretch goals. Yeah. Like, I can do this thing. Yeah. A stretch goal. It would have been beneficial. I wouldn't say that landing a plane is a stretch goal, but maybe, right? It's just the same kind of mechanism. So it's good to present yourself as more confident if you want to have a pack — the safety and security of a pack that's going to protect you in, you know, prehistoric times. Because back in those days, if you didn't have a pack you could run with, you didn't have access to resources like food and safety, and you could be picked off from the pack and killed. So it's good to appear confident. That's how the idea goes: people have a disposition to be confident about things that they can't yet do. And then also, if you think about the relationship between confidence and maybe optimism: you want to get out of bed every morning thinking that you can do some things that maybe you haven't done before. Right. So all this stuff is probably an adaptive characteristic. But we try to push on it and take it to its extreme. Because sometimes people say, why would people think this? And the fact is, it's like most of these weird phenomena — beliefs or tendencies or claims, or even the way memory works.
When we really push on them and take them to extremes, what we're doing is taking something that we probably have because it serves some adaptive function most of the time, and we're distorting it. So this is a distortion of something that is probably good most of the time. Right. So where are we going to take this next? Women need to get more of it. I think women need to get more of it, confidence. Yeah. Well, I'll tell you what. I don't know if you've read it, and now that you're retired I guess you don't need to, but there's this book, Women Don't Ask. A friend told me to read it once, because I was negotiating for something, and she said, just be careful, because you're going to get angry. And I read it on the plane on my iPad, and if it had been an actual book, I would have thrown it across the plane half a dozen times. Oh no, because you didn't agree with it, or because it made you angry because you knew it was true? It made me angry that I didn't know I could be behaving that way. Like, in a good way: here are things I should be doing that I'm not doing, things men tend to do that I don't do, that a lot of women tend not to do. And that made me angry, because I don't think of myself as a pushover, right? It made me angry that I didn't know these things. And that's why I now proselytize and tell all sorts of women to read that book. Women Don't Ask: read it, read it, read it. And there's a follow-up called Ask For It, but I think Women Don't Ask is just fantastic. So yeah, I think women could do with more confidence. But I'm not a social psychologist; I'm just a cognitive psychologist. I do this kind of stuff, like skills and memories and problem solving and whatever. And what we're really talking about here is women and men and the workplace and negotiations and whatnot.
This is social psychology, so not really my area. First and foremost, I don't have any social skills, but also I'm not a social psychologist. But I've read some of this work, and not being an expert, I will say this book is great, and generally it hits on a real problem. But now that that digression is over, where are you going next? Well, we've just submitted a manuscript to a different journal, with many of the same set of authors, led again by Kayla Jordan, who's now Dr. Kayla Jordan. I know, great. Well, at least this one is an overconfidence that is not fatal, which is good. I noticed this weird thing when we were all under our shelter-in-place orders, right? In the before times, well, or the after times. I was watching this Danish TV series on Netflix, a political drama called Borgen. And it's subtitled. It has to be, because Danish sounds like, I don't know, Dr. Seuss tripping on acid. It just does. If you're Danish, I'm sorry, and I tell all my Danish friends this too, but you can't really understand anything they're saying. But what I thought, after about the third episode, was: I'm learning Danish. This is a great side benefit. Because when I go back to Denmark, and I go there sometimes because they have a great autobiographical memory center, I'll be able to have better conversations with people, and they won't have to speak English. So then I turned off the subtitles to quiz myself. And within 10 seconds I thought, oh, I don't know any of this, what is happening? I don't understand any of this. I thought it was just a bad scene, so I turned the subtitles back on, and I went on and off like this for about 15 minutes, until I was finally convinced that I wasn't learning Danish.
There's some kind of illusion going on. So the subtitles are like watching a video of somebody landing a plane. Yes, maybe. Yeah, that's what I wondered. So we turned this into an experiment. We showed people short video clips from different shows. One was this political drama, and the other was a great show on Netflix called Rita, still Danish, about a school teacher. We used different clips so that the effect I'm going to tell you about isn't tied to a specific clip, and it's not. We showed you the clip either with subtitles or without subtitles. It's like a minute long, right? It's not long; it's shorter than the plane video, if I remember correctly. And then we asked you questions like, how confident are you that you could follow Danish instructions in an emergency, or read the weather forecast, or make friends with people who spoke only Danish. And if you saw the subtitles, it boosted your confidence that you could do these things, time and time again. It was fantastic. Oh, nice. And it was significant? Oh, yeah, totally. And then, because that's the obvious objection, well, maybe you are learning something from the subtitles, we gave you a Danish quiz. It had words that were in the scene, but also the most common words in Danish, and we asked people just to write down what they thought the words meant. The two groups were, first of all, A, the same, and B, on the floor. So that was pretty fun. By the same, you mean the people who didn't watch the subtitles and the people who did? Yeah, yeah. The subtitles weren't doing jack, except making you more confident that you could do this thing. Were the people angry afterwards? No, I don't think so.
No, I think people always think those kinds of illusions are interesting, right? I think people like to learn that about themselves, because they haven't really made an ass of themselves, they don't hurt anybody else, right? They don't embarrass themselves in front of anyone, and it's a private response. In fact, we don't talk to them personally; they get everything in writing, so they can have their own private reaction. Usually people just write, this is interesting. Oh, that's good. Yeah. But this is crushing of dreams, because I think we all thought we could fly the plane. Yeah, but like I said, that's a dangerous dream. You're okay to crush that one. I'm okay to crush that dream. Yeah. Be nice to the pilot. Don't hurt her. She needs to fly the plane, because trust me, nobody else on the plane can. Yeah. You know where I want to take this? The obvious application of this kind of work, although we'd need to move into that area, is to see what happens in education. Because there's this idea in education that you start off with things that are easier, to build confidence, and then you bring in more and more difficult or exceptional examples. And it's possible that by doing that kind of thing, what you're doing is creating this illusion of overconfidence, and then it miscalibrates you as a student thinking about how much time you have to put into something, or how well you know what you're talking about, and so on. So we want to repeat these kinds of studies, for instance with subtitles on a lecture. That's a very common thing, like when you watch a recorded lecture.
If you have subtitles there, and I mean an English lecture with English subtitles, same-language subtitles, would you think you were learning more about that topic than you were actually learning? I think that's a really interesting question, and I want to investigate it soon. Yeah, that is interesting. I know it changes the way I think about what I'm watching. Does it? Well, probably the biggest thing is I hate it on comedies, because it ruins the joke; the timing is lost. Oh, because the timing is off. Yeah. So I find myself a little angry. But there are certain shows where it's like, I can't understand them, I need the subtitles. And then there's closed captions, I forget which is which; I think we may be misusing the word there. Yeah, subtitles is a generic term. Okay. Right. And when they're in your language, it's a caption. Yeah. It'll be interesting to see what that does. That sort of circles back to work you've talked about before, of whether you retain information better if you take notes or not. Oh, yeah. And also whether you retain information better if it's a little more difficult, a little bit more challenging. Right. So one of the ideas is that when you're taking notes, you have to work in the moment to distill out the essence, because you can't transcribe everything while you're taking notes. So it injects a little bit of challenge into the task, a little bit of difficulty, effort, let's call it effort. There's this idea in cognitive psychology called desirable difficulties: injecting just a little bit of challenge or effort into a task, like with note taking, makes you remember that thing better. I think that's an interesting idea. I do like the idea of a little bit of effort.
Have you ever been in a class, and I'm sure everybody listening has, where you start to get behind, and you're really concentrating because it's difficult, and you're really working on it, but you reach a point where you realize you're never going to catch up? You're over the hump, you're on the other side, and you kind of throw up your hands. You may even giggle, like, I got nothing, never going to get this. I saw that happen to an entire class one time. It was a quantum mechanics class taught by Professor Van Hoeven, and the entire class just started laughing, like, what is this guy talking about? Nobody knew what he was talking about. And Steve was in the class, and he developed a theory. He said, I think the Russians, this was during the Cold War, have infiltrated our university system and they're teaching absolute gibberish to the engineering students, and our entire society is going to collapse as a result in 20 or 30 years. That's how bad it was. Did the guy ever know? Did he realize that he was losing the class? I don't know. I have no idea. Quantum mechanics is brutal. It's really interesting you say this. When I was a grad student, one of the things I wanted to study was that feeling in a student. It was inspired by this Far Side cartoon I had seen, you've probably seen it, where the kid in the class has his hand up and says, can I go now, Mr. whatever-his-name-is, my brain is full. And I wanted to study that feeling when you just think, that's it, I can't hold on anymore, it's just running out like a glass that's running over. Buffer overflow. Yeah, exactly. I just thought that would be interesting, because people have that feeling. Where does that feeling come from? How accurate is it? What are the consequences of it? Can you come back from it?
Yeah, my guess is that it depends on the instructor, the person trying to pour the information in. The example I give is Bart and me with Programming By Stealth. We started doing video for Programming By Stealth, and even though we don't record the video, he needs to see when I'm slamming my head on the desk, because I have no clue what he's talking about, when he's lost me. And it really, really helps. Now he'll say, you've got that look on your face right now, and I'll say, okay, you lost me back 15 minutes ago when you said blah, blah, blah. And he'll go back and explain it a different way. But if you don't have somebody who's willing to do that, or able to, because there's 600 people in the lecture hall or something, it can be highly dependent on that: whether it can be fixed, whether you can open it back up and scrape out the goop that didn't make any sense and start pouring it in again. I don't know. Yeah, it's interesting. It's fun research. There's a whole lot of things to be investigated here. One of the other things we're doing right now, and we're just in the early days of this, so I don't really know how it's going to shake out: I was fascinated on one of my trips to Japan when one of my colleagues explained to me that kanji often look like the thing they mean. What's a kanji? They're the Japanese characters; they come from Chinese. So the kanji that means tree in Japanese looks like a tree. If you Google it, you'll see what I mean. And the kanji for fire looks like a campfire. And somehow I took away from this that all the kanji look like the thing they mean, which is manifestly not true, because even Japanese people only learn some kanji, because the rest of them are preposterous and it's not like you can recognize them.
So once I realized that only some kanji look like the thing they mean, I thought, I wonder if you could get people to think, same kind of thing as the plane, right? Sure, I could learn kanji, or sure, I could understand Japanese, just by showing them easier kanji first and then harder kanji. That's what we're playing around with now. You've got more people to torture. Yeah, more people to torture. I mean, I am interested in these illusions. This idea where you step back from yourself and think about what you know and what you don't know, like we were talking about at the beginning of our conversation, is called metacognition. And it's one of the secrets of learning new things. When you work with Bart, you have to be able to monitor what you know and what you don't know, and at least say, I'm lost, or, I understand this, but that's where it stops. Or sometimes you relate something new to something you already know, and you do this a lot; that's another kind of metacognitive act. Metacognition is crucial for learning to do something new, and some people are better at it than others. So I'm interested in these situations in which these illusions pop up, how they arise, how you can maybe repair them, and what their consequences are. I was thinking about something I saw on TikTok recently, somebody saying the problem with stupidity is that you don't know that you're stupid. And I guess what you're poking at there a little bit is, is there a way to get people to step back and realize what they don't know? Yeah, well, that would be really important. It would be useful. I mean, sometimes you can, like in a classroom, which is easier to deal with than real life because it's a controlled environment. In a classroom, I can ask you questions and reveal what you know and what you don't know.
I can give you a quiz, and you'll take away from that what you know and what you don't know. You get feedback on an assignment. But in real life, you know, I'm sure you see people on TikTok who are making it clear that they don't know what they don't know, and they're just blustering. And then the question is, how do they learn? How do they come to learn that they're wrong? Yeah, you know, that's a terrible place to leave this conversation. Yeah, I know, I know. But it'll make me happy that I have crushed some dreams, then. That's your favorite thing. All right, well, I am going to leave us on that depressing ending note. If people want to follow you online, let's see, you've been doing some Mastodon tooting. I have, but not very much tweeting. I'm on both. I'm Dr. Lamchop everywhere, except at work, where I'm not. All right, well, if you have any complaints about what Dr. Gary said, you're supposed to send them to Allison. Send them to Allison. All right, well, thanks for coming on the show again, Marianne. This was great. Thank you. I hope you enjoyed this episode of Chit Chat Across the Pond Lite. Did you notice there weren't any ads in the show? That's because this show is not ad-supported; it's supported by you. If you learned something, or maybe you were just entertained, consider contributing to the Podfeet podcast. You can do that by going over to podfeet.com and looking for the big red button that says Support the Show. When you click that button, you're going to find different ways to contribute. If you'd like to do a one-time donation, you can click the PayPal button. If you want to make a recurring contribution, click the Patreon button. You're only charged when I publish an episode of the NosillaCast, which, let's face it, is every single week, so I don't charge Patreon for Chit Chat Across the Pond Lite or Programming By Stealth episodes. Another way to contribute is to record a listener contribution.
It's a great way to help the NosillaCastaways learn from you, and it takes a little bit of the load off of me doing all the work. If you want to contact me for any reason, you can email me at allison at podfeet.com, and I really encourage you to follow me on Mastodon at podfeet at chaos.social. Maybe you want to talk to the other NosillaCastaways; you can do that in our Slack group at podfeet.com slash slack. Thanks for listening and stay subscribed.