I work for a test cloud in Berlin. I live in Tokyo, and I work remotely from there. This is me, just in case you want to know, and here's a second image of myself. All right, to start out with, I have a question for you. Or actually, I have two questions, but I'll start with question number one. Who in here has ever had to work with a really weird old legacy system? Not even a complete rewrite, just had to deal with it somehow. Please raise your hand if you have. All right, second question. Who here in this room has a brain? Please raise your hand if you do. Not everyone has a brain, that's kind of surprising, but I assume you all have one. All right, so everyone who raised their hand for question number two should in fact also have raised their hand for question number one, because what our brain really is, is a big fat old legacy system, and it's been a while since the last hardware update. The good news: our brain is extremely powerful, and it can do a lot of amazing things. The bad news: the documentation is pretty crappy. The error handling isn't that great either, and we can't even debug it because we don't have access to the code base at all. That sounds like every programmer's nightmare, doesn't it? The problem is, we can't just walk up to our project manager and quit. We're kind of stuck with this obnoxious and brilliant heap of jelly in our skulls that helps us process and react to our environment, that helps us reason about abstract problems, that actually lets us create, communicate, even program. But on the other hand, it constantly keeps forgetting people's names. It randomly reminds us of awkward situations from three years ago, and it constantly makes decisions for us without even asking. So today I would like to talk about how we can understand our brains better: the way they work, these weird little flaws that are called cognitive biases, and what to do about them. 
You see, we as programmers really like to look at ourselves as a group of people that is somehow more rational than others, because after all, we earn our living by talking to machines every day, and machines aren't exactly known for being super emotional. So if we make technical decisions, or if we plan projects, or if we assess capabilities or competencies, it's fair to assume that we adhere to rational standards and base our decisions on them, right? Well, I have a surprise for you. Programmers are human, and they have human brains. To be fair, most of the time our brains do an amazing job. They have to process vast amounts of information, and they somehow have to come back to us with appropriate reactions to all this stuff that comes at us all the time. You see, the human brain is really old, and many parts of it developed when the social coherence of a group was really important for survival, along with things like accurate and quick assessment of threats. When it really mattered what your peers thought of you, because being ostracized might well mean that you were going to die, and race conditions were a completely different problem from what they are to us nowadays. So what is a cognitive bias? Cognitive biases are heuristics that the brain uses to process information very quickly and come up with appropriate reactions. They're pretty useful most of the time, but they are not perfect processes, so they can actually lead to some pretty suboptimal behavior or decisions. The thing is this: we are all biased. That's something that's very important to understand. It's natural, it's how our brains work, and it's not necessarily a bad thing. Our brain uses all these shortcuts so it can deal with all the information that's coming at us and give us back something reasonable, so we don't just get an information overflow because the brain gets stuck on all the details. 
In fact, our brain has different modes of operation, and I'm going to show you two really simple examples that illustrate this. When you look at the next slide, there are a couple of things that happen without you even noticing. You recognize right away that this is a person, a child in fact, and also probably someone you haven't seen before, or maybe you've seen them, but you don't know them personally. You can also tell right away that she's currently not happy at all, and if she were standing right in front of you at this moment, she might be very close to starting to cry or actually shouting at you. This process of recognition and perception is something your brain does really quickly and effortlessly, for evolutionary reasons. It's important to be able to understand, oh, this is another person like I am, and to understand what they feel like. And in this mode, the brain also does a lot of very quick and automated decision-making, where we often don't even realize it's happening, just because it's happening so fast. And oftentimes it sacrifices accuracy and correctness for speed and approximated results that are okay most of the time, a sufficiently large number of times. When you look at this, on the other hand, unless you're really good at mental arithmetic or you still remember your multiplication tables from elementary school, which I don't, to be fair, your brain probably drew a blank, because there's no evolutionary reason why our brain should be able to automatically process semi-difficult multiplication problems. So it can't, and there is no way for it to spontaneously come up with the results unless you've actually memorized them. You can probably tell that this is a multiplication problem by looking at it, because that's something you've learned. So you can recognize it. You can also tell that five, or five million, is probably not a very reasonable estimate for the result. 
But if you really want the result, you have to actively start thinking about it. You have to start calculating, and that's a lot slower, a lot more difficult, and a lot more demanding than the fast thinking mode. Put simply, when our brain is in fast thinking mode, it uses cognitive biases to approximate solutions. And like all approximation approaches, this doesn't prioritize optimal solutions; it prioritizes coming up with a feasible solution in a reasonable time. You can't really turn this off. It's hardwired into our brains. But in some situations, there are ways you can work around it. That's not always possible, simply because, firstly, you might not even recognize it's happening, because it's happening all the time. And secondly, it might not be viable to do something about it all the time, because if you were to question every single move your brain makes, it would slow us down so much that it would be difficult to even act as humans. There are situations, though, where it is definitely viable and also very good to try and work around these biases. And actively making decisions is one of them. We make decisions all the time, from really small ones like what to have for lunch, to major ones like how to make our lives meaningful. And most of the decisions we make at work fall somewhere in between. How to implement this new feature, what kind of tools or frameworks to use for this new project, which applicant to hire for the job we're looking to fill. These are important decisions, and our brain's biases affect each and every single one of them. So what can we do to make our decisions as good as possible? I will start by looking at cognitive biases that affect our personal decision-making, when we make decisions on our own. 
And since we as programmers usually don't work alone all the time, but in teams, I will also look at some cognitive biases that come into play when you're making decisions in a team of people. Confirmation bias is one of the first things we have to look at when we're trying to trick our brains into making better decisions. Confirmation bias means that when we search for information or interpret information, we tend to do it in a way that confirms the opinions we're already holding. As I said, this affects both searching for information and interpreting information. We take what we already think is right, or what seems to make sense, or what seems obvious to us, and then we try to confirm this idea, while at the same time ignoring possible alternatives. If you have a strong opinion on something, or a very emotionally attached opinion, you might even get angry if someone challenges it. For example, many people have rather strong or emotional opinions about topics such as abortion, or gun control, gun ownership, something like that. Whether those opinions are right or wrong doesn't matter here at all. But if you read something, or hear something, that challenges such an opinion, you're very prone to waving it off as nonsense or even getting upset, while you will happily take in every piece of information that seems to confirm what you already believe, because it's obviously right, right? Or something more related to actual technical decision-making. I would like to say that this is a less emotionally charged topic, but I don't really want to lie. So for example, if you're already convinced that Rails is the best thing this world has to offer for this new project you're starting, you're very prone to not listening to people who come to you and tell you that Rails is a shitty framework for reason X, Y, or Z. 
You know, you might listen to what they have to say, but you'll probably discard it pretty quickly in favor of the opinion you already have, and you won't really dwell on it for very long. You're also much more prone to looking for information that tells you why Rails would be a good choice for this project, and not why it would be a bad choice. The confirmation bias is a good example of the brain sacrificing accuracy and correctness for speed and less effort. In fact, we have many preconceptions about the world that are actually true, or close enough to true in most cases. So not constantly re-checking them enables us to act and think much faster, and therefore our brain doesn't really check all the time whether our opinions are true or false. It just assumes they're the basis we're acting on. So this is not necessarily a bad thing; the problem comes up when the opinion we're holding, and trying to confirm in this way, is actually not a very good solution for the problem we're trying to solve. So what can we do about the confirmation bias? A good approach to counter it is to challenge your own opinion. Try to prove yourself wrong. When you're making an important decision, try to put yourself in the shoes of someone whose job it is to show that your approach is not correct. This is not easy at all, but it will certainly help you take on a different viewpoint and maybe uncover some things you hadn't thought about before. And if you're not sure you can do this honestly enough, ask someone, a co-worker or just someone you trust, to play devil's advocate for you and actually challenge this opinion. Don't be defensive about it. Actually take into consideration what this person is saying. Change your opinion if required. 
I know this is not easy at all, but if we're not ready to change our opinions, we don't really need to start with all this working around cognitive biases, because there's no point to it. And if in the end it turns out that the original thought or idea you had still looks like the best one, then it's probably not such a bad choice. All right, another cognitive bias that strongly influences our decision-making is the mere exposure effect. The mere exposure effect means that we tend to like things more if we're familiar with them. We have a preference for something just because we know it. And this is a bias that's also strongly rooted in survival. Things we know, things we understand, things we're familiar with create something in us that is called cognitive ease. Cognitive ease makes us feel good. It makes us feel safe in a given situation. And our brain uses this effect as a kind of dial for constant situation assessment, to make sure we're safe. If there's nothing that challenges us, nothing that looks like a potential threat or that we have to direct a lot of attention to, it will assume the situation is okay. Cognitive ease feels good. It feels comfortable. But it also makes us think in a much more casual and superficial way. Cognitive strain, on the other hand, happens when we encounter something we don't have any experience with. Something we don't know, something we actively have to wrap our heads around. And our brain takes this as a clue that there might be a potential threat, a problem we have to solve. And therefore it gives us a little heads-up that says: attention, attention, there's something you should think about. So cognitive strain makes us put a lot more effort into our thinking. We make fewer errors in this state, but it also makes us a lot slower, less creative, and less intuitive. So again, preferring things we're familiar with is a natural thing. 
It just happens that way in our brains, and we can't really turn it off. Our brain wants us to be safe, so it seeks out situations that make it feel safe. But as with the confirmation bias, we can work around it, by asking ourselves why we like or dislike something or someone: a certain language, a certain framework, a certain applicant. So for example, if we're thinking about a framework we're going to use: do we like it just because we're familiar with it, or is it actually the best tool for the job? Do we doubt a certain applicant for actual reasons, or just because they're not the kind of person we're used to interacting with? At the same time, we shouldn't just toss familiarity out of the window. There's a point to this whole thing, because being familiar with something enables us to hit the ground running and get started really quickly, because we understand it. The thing is just that we should figure out beforehand whether this is actually the direction we want to run in, and not do it automatically because our brain throws it at us and says: take this, this is the easiest way. So what do we do about the mere exposure effect? A good way of dealing with it is, when you have a major decision to make, set up a list of objective criteria, or as objective as possible, and use these to evaluate your options. Do this before you actually start looking at options, and when you're evaluating, stick to these criteria. That should help you, at least to some degree, keep your personal impression out of the game. But once you're done with the criteria, write down a short note about your personal impression, what you feel about this option. That way you can combine the objective evaluation and your personal impression. And there have been studies showing that this is actually a very good way to make decisions that brings good results. 
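As a minimal sketch of that criteria-first evaluation, here's what it might look like in code. All the criteria names, weights, and ratings below are invented for illustration; they're not from the talk:

```python
# Hypothetical sketch of "define objective criteria first, then evaluate".
# Every criterion, weight, and rating here is made up for illustration.

# Step 1: fix the criteria and their weights BEFORE looking at any option.
CRITERIA = {
    "documentation_quality": 0.3,
    "community_support": 0.2,
    "team_experience": 0.2,
    "performance": 0.3,
}

def score(ratings: dict) -> float:
    """Weighted sum of per-criterion ratings on a 0-10 scale."""
    return sum(weight * ratings[name] for name, weight in CRITERIA.items())

# Step 2: rate each option against those criteria only.
options = {
    "FrameworkA": {"documentation_quality": 8, "community_support": 9,
                   "team_experience": 4, "performance": 7},
    "FrameworkB": {"documentation_quality": 6, "community_support": 5,
                   "team_experience": 9, "performance": 6},
}

# Step 3: record personal impressions separately; they support the
# decision but never feed into the objective score.
impressions = {
    "FrameworkA": "new to me, but the docs felt approachable",
    "FrameworkB": "comfortable, though maybe just familiar",
}

# Rank by the objective score first.
ranking = sorted(options, key=lambda name: score(options[name]), reverse=True)
```

Keeping the impression notes in a separate structure, rather than folding them into the score, mirrors the advice here: the objective score drives the ranking, and the personal impression is only consulted as supporting input afterwards.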
And when you decide, stick mostly to the objective criteria, and use the personal impression notes as support for the decision. If you have enough people, you can also separate the person who does the evaluation from the person who actually makes the final decision, because that's a way to keep the personal preferences of one person out of the game. And if you're not super comfortable with making this hard separation, you can also just take your evaluation results, give them to a person who was not involved in the evaluation process, and get their independent opinion. The confirmation bias and the mere exposure effect have strong influences on the way we personally make decisions. But a lot of the time we also make decisions in teams. Why do we make decisions in teams or groups? There are different benefits we can get from that. One of them is, for example, that we come in with a variety of perspectives on the problem and a lot more information about possible alternatives or solutions. It also makes for better decision reliability, which means that through being in a group, you can even out and dampen the personal biases of individuals. So let's look at a couple of cognitive biases that have an effect when we're making decisions in a group, and that stop us from getting these benefits. When discussing something in a team, it's really important that we're all on the same page about the topic we're talking about. That sounds like a no-brainer, really, but I've personally been in a lot of project meetings where we all thought we had an agreement on something, and then somebody asks a question and suddenly the whole thing descends into confusion, because three people find out that they were actually thinking about completely different things, each assuming that was the topic under discussion. 
So that happens a lot, and it's actually an example of the false consensus effect. The false consensus effect means that people tend to assume that what they think is normal, and so they overestimate how much other people actually think the same way they do, or how much other people agree with them. This is due to a couple of different factors. One of them is that even though our brains are actually pretty good at people things, we often have surprisingly poor social judgment. We just get people wrong a lot. Secondly, we tend to project our own assumptions, attitudes, and opinions onto other people. This might partly be wishful thinking, but another reason is that our brain doesn't really have any good way of looking into other people's heads and knowing what they actually think. So our brain sneakily and silently replaces the actual question we're asking, "What does this person think?", with another question that's easier to answer: "What would I think if I were this person?" And then it returns the answer to the second question, and we never notice that it wasn't actually the answer to the question we asked. We tend to do this a lot more with people we perceive as similar to us, or who are members of the same group, and mostly our co-workers fall into that category. That means that if we don't clearly communicate our thoughts and opinions on something, everybody will think that everybody else thinks the same way they do, and that's a very clear recipe for disaster and chaos. 
Something else that can happen with the false consensus effect is when you have a fairly dominant opinionator in your group, for example someone who has a fairly senior role, or someone who is just very good at verbally leading discussions or conversations. When they state their opinion, they'll usually do it in a very convincing way, which per se is obviously not a bad thing, but it might lead to a certain dynamic: other people, if they feel like they can't really go up against this opinion, or aren't qualified enough to do so, will just shut up and not say what they think. And this dominant opinionator (I'm not saying this because it's something bad, it's just a phrase to describe a certain communication style) will then assume that everybody else thinks the way they do, and because nobody's speaking up, decisions get made that don't actually reflect the opinion of the team. So what can we do about the false consensus effect? Be explicit, obviously. When we call a meeting or a discussion for a decision, be absolutely explicit about what it's about: what is the topic, what is the goal, what are we going to talk about, and what is the decision we're going to make. That way, everyone is on the same page and knows what we're talking about, so we can head in the right direction right away. For the second thing I described: encourage questions about the topic before you start discussing, which goes hand in hand with what I already said, and collect opinions first. 
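To make the collect-opinions-first idea concrete, here is a toy sketch. The team names and the way opinions are gathered are made up purely for illustration; in a real meeting this would just be sticky notes:

```python
# Toy sketch: gather everyone's opinion privately BEFORE any discussion,
# then reveal all of them at once, so nobody anchors on whoever speaks
# first or loudest. Names and the gathering mechanism are invented.

def collect_opinions(team, ask):
    """ask(member) returns that member's privately written opinion."""
    # Everyone answers before anything is shared with the group.
    return {member: ask(member) for member in team}

team = ["Aki", "Ben", "Chris"]
notes = {"Aki": "prefer option 1", "Ben": "prefer option 2", "Chris": "unsure"}

opinions = collect_opinions(team, ask=notes.get)

# Only now are the opinions put on the table, each one discussed in turn.
for member in team:
    print(f"{member}: {opinions[member]}")
```

The important property is the ordering: every opinion is recorded before any of them is revealed, so the group consensus can't overwrite an individual's view before it's even been heard.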
So before you start the discussion, before you get into the dynamic where one person's opinion overshadows everyone else's, let everyone write down their opinions on the topic at hand on little pieces of paper or sticky notes or something. That way you can collect everybody's ideas and opinions without having them run through the group consensus first, and then you can take each of the points that came up, discuss them, and make sure that everybody's opinion gets heard. There's something else that can happen in groups that can severely undermine decision-making, and it's called groupthink. Groupthink means that to preserve the harmony or conformity of the group, members of the group or team will try to minimize conflict and reach a consensus without critical evaluation of alternative viewpoints, or even by suppressing differing opinions from inside or outside the group. Inside the group, this can lead to things like people very quickly adapting their own opinions to what they perceive as the majority opinion of the group, or the opinion of a leading member of the group, a senior developer for example. It can also take the shape of people actively or unconsciously suppressing differing opinions and discarding them really quickly if somebody brings them up. So what is perceived as loyalty to the group actually makes people not bring up controversial issues or challenge opinions that have already been established. This generally leads to a very sharp decrease in individual creativity and in critical, independent thinking, and it has a very negative effect on the decision-making of the group. 
When it comes to the opinions of non-group members, groupthink can range from simply not getting any input or feedback from people outside the group, consciously or unconsciously, all the way to actively or semi-actively trying to bar outside influences on the group. An example of the semi-active approach is something you might have heard before: when people say things like, well, they don't really understand how we do things, or they don't know as much about this as we do, or any variation of that. Again, this has negative effects on the group's decision-making, because it creates an echo chamber where only the group's own consensus, or artificial consensus, is reflected back at them, and nothing else ever gets in. Ironically, even though it demonstrably worsens a group's decision-making, groupthink actually makes the group's members feel a lot more confident that their decisions are right and of high quality, because it creates a feeling of belonging together, of group cohesion and invincibility within the group. Groupthink is a dynamic with an evolutionary background as well, and its purpose is, in fact, to create cohesion within a social group and to avoid infighting, which is important for survival. There are three factors that play together to produce groupthink. First of all is high group cohesiveness. If your group doesn't feel like it belongs together, you won't have an issue with groupthink at all, because you don't have a cohesive group. The thing is, high cohesiveness alone does not necessarily lead to groupthink. There are two other factors, and at least one of them needs to be present to create this dynamic. One of them is structural faults, for example insulation of the group: it's really isolated and doesn't communicate with a lot of outside people. 
Another one is having a group with a very homogeneous set of members: if every single member is very similar in their background, where they're from, what they're like, their opinions, that very strongly encourages groupthink. And the third factor that can also have some influence is the situational context. Things like perceived outside threats that feel highly stressful to the group, or recent failures of the group, tend to encourage groupthink. So that's actually good news for us, because we really do want cohesive groups, but we want them without the groupthink effect. So what can we do to counter groupthink, to make our groups work together in a way that is cohesive, but not like sheep? First of all, a cohesive but diverse group starts you out with a really good bunch of different viewpoints and opinions. So try to form teams that are diverse and not homogeneous, where people come from different backgrounds, different demographics, with different viewpoints and different experiences. Also, oh yeah, I have a slide for that: encourage critical evaluation. You have to try and build an atmosphere that actually encourages people to voice their opinions and to evaluate ideas that come up in a critical way. Because, you see, if you have an environment where people feel that saying something that goes against the mainstream of the group will be frowned upon, or that they'll be punished for it in some way or other, they won't do it. Why would they? It would only have negative consequences for them. So try to build an environment that encourages critical thinking and the expression of personal opinions in a constructive way. 
If you're a leading or fairly senior member of the group, you might want to think about not starting a discussion on a decision by stating your own opinion on the matter, because that way you're very prone to priming your team to stick at least to the area of your opinion: they might just adapt to what you think and not bring in their own ideas. What you can do here as well is use the sticky-note opinion collection technique I've already described; it's very helpful for this too. This doesn't mean that you can't state your opinion if you're a group leader or a senior member, or that you shouldn't take part in the decision-making. It just means: don't state your opinion first. Let the other people talk first, let them bring in their opinions, and after that bring in yours. To avoid creating an echo chamber, actively invite outside experts or other outside people into the group. Let them state their view on things, and then actively have your group members discuss the topic at hand with these people. And in general, encourage your team members to actively discuss the group's ideas with trusted people outside the group, to get some feedback from outside this echo chamber of the in-group. And last but not least, think about appointing a devil's advocate if you're making an important decision. Make this a real role in the team: one person on the team will be responsible for taking a critical stance against every idea that comes up, and questioning it, in a constructive way, obviously. It's not the point of a devil's advocate to shoot down everything other people say, but this way you can institutionalize critical thinking in your group. 
Just make sure that this devil's advocate is a different person every time you're having a discussion, because otherwise you end up with a member of the group whom the others really resent over time, because they constantly keep shooting down everyone's ideas. And remember this: when we all think alike, then no one is thinking. If somebody says something in a decision-making process and everybody just agrees without any further discussion, you should become immediately suspicious and ask why this is happening. Could this be a case of groupthink? That's been a lot of information so far, so let's briefly recap the cognitive biases we've looked at. We've looked at the confirmation bias. This is a bias that influences our personal decision-making, and it causes us to search for or interpret information in a way that confirms the opinions we're already holding. The mere exposure effect also influences our personal decision-making processes, and it means that we tend to like things more just because we're familiar with them. Next is the false consensus effect. This works in team decision-making, and it says that we tend to overestimate the degree to which other people agree with us or think like us. And then groupthink, where people, in an attempt to preserve group harmony, try to minimize conflict and find a consensus without actively evaluating alternative viewpoints, or even by suppressing differing opinions. I have one more thing I would like to mention, and I think it's a very important one, because if we don't understand this, there's no point in starting to think about working around cognitive biases at all. It's this: it's okay to change your opinion based on new or updated information. 
I mean, it sounds very obvious, but if we're honest with ourselves, a lot of the time we hold on to opinions we've already formed as a matter of pride or ego, because we really, really don't like admitting that we might have been wrong about something. In fact, re-evaluating and updating opinions based on new information and new facts is not shameful at all, and it's not a sign that you can't make up your mind. It's actually one of the only ways we can get closer to thinking and acting rationally. Daniel Kahneman is a psychologist who has done a lot of great work in the field of cognitive bias. He wrote an awesome book, and I definitely recommend you read it. You can ask me for the title after the talk if you want to. And Daniel Kahneman says: "Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance." You see, we can't really stop cognitive biases. They are hardwired into our brains. They're just there. And it would probably not even be a good idea to try and stop them, because that would interfere quite a bit with the way our brain works. So there's no way for us to become completely rational creatures. But what we can do is try to chip away at ignoring our ignorance. We can learn about cognitive biases, learn about the situations in which they happen, and then recognize them and use techniques to counter or circumvent them when appropriate. That way we become better at understanding ourselves, and thereby also at understanding others. We become better team workers. We become better decision makers. And we generally become better and more successful at what we do. So in that spirit, let's all start working on being less ignorant about ourselves. I hope my talk has given you a good start on that. Thank you for listening. Thank you.