Thank you, friends, for your presence today. I'm delighted to present on cognitive biases. This is one area which affects all of us, and a lot of the time we are not even aware of why we take these intuitive decisions or how we arrive at them. I've done some research on this topic recently and I'll be sharing some of it with you; I'm sure you'll find it interesting because it relates to our everyday life. As I go along, if there are questions or observations, please let me know so that I can address them as we go. This is based on the behavioral sciences, and I'm sure most of us realize that a lot of what we do in media and communication studies is taken from the behavioral sciences, so this is a good opportunity to talk about all that and also to relate it to media and communication as I go along, because that is a very important part of these cognitive processes. So why 18? Before I start, I must emphasize that 18 is not an exhaustive number. It could have been anything, but given the time available and the things I wanted to talk about, this is the number I settled on. There are many other cognitive biases, some of them related to each other; there are books on 25 cognitive biases, sites which talk about 30-odd biases, and so on. So 18 is not an exhaustive list by any means, but I have chosen the ones I thought would be most relevant for a discussion like this. As I go along, I'll talk about the reasons why these biases arise and how they affect our decision making. Before I start, I must mention this gentleman, Daniel Kahneman. I'm sure we've heard about him when we talk about framing theory, and he is one person responsible for a lot of the heuristics we will be talking about.
In 1974, Kahneman and Amos Tversky came out with a paper, which I'll talk about. They were pioneers in the field, and they effectively began what is known as behavioral economics, where it is not assumed that people take rational decisions, as economists would believe; rather, decisions depend on how choices are framed, or on whether people see a particular thing as a risk or not. So how people take decisions may not always be consistent, and over the last four decades and more, Kahneman and others have been talking a lot about these things. This is how you define these heuristic biases, or intuitions: we normally allow ourselves to be guided by impressions and feelings, and the confidence we have in our intuitive beliefs and preferences is usually justified. So we have situations of expert intuition: somebody who has been doing something on a regular basis has the kind of intuition where, without getting into the details of a subject, just with a cursory glance or a limited amount of information, he or she can draw conclusions based on intuition alone, and very often those conclusions are justified. So when we talk about biases, we are not necessarily talking about something inherently bad; we just want to be aware of what these biases are. A lot of the time, the decisions we make are not guided by long thought, or by what we would call reasonable logic. Very often, impressions and feelings guide the intuition we go for, and this is what Kahneman calls heuristics.
So we make judgments, consciously or unconsciously, about various things, and often this violates the laws of logic and probability. I will not go into the probability and logic in detail, because, as I said at the beginning, this will be more about what we face in everyday life, without too many technical details. But often we make these judgments based on simple heuristic decision rules, or mental shortcuts, drawn from our own experiences of how we reach certain conclusions. This is a shortcut we use for a lot of decisions in everyday life, and it is often justified: we do not have time for slow thinking, and most of the time we do not have all the information required to reach a foolproof decision; at times there is no optimal solution in sight at all. So there are many occasions where these shortcuts are very necessary, and they are not something that has to be avoided at all costs. There are reasons why these biases exist, and we will talk about those reasons in a moment. This is a picture I got from the internet, just to set the base as we go along. So what are the common uses of these heuristics? First of all, they reduce the mental effort needed to make a decision, and that is a very important characteristic the system itself has evolved: at times we put very little mental effort into a decision. For example, when our attention is on the mobile phone, we answer a lot of questions without much thought; or think of expert drivers, and this can be very scary at times, who depend on their mental heuristics to keep driving while they are scrolling on their phone.
So heuristics reduce mental effort, and they can simplify complex and difficult questions, as we'll see in a moment. They are also a fast and often accurate way to arrive at a conclusion; it happens instantly. We will be talking about two different mental systems inside us which we rely on, often without knowing what we are doing. Heuristics also help in problem-solving processes. So that is a very simple account of the common uses of heuristics. There are three different reasons why these heuristics exist, and we will talk about each of them: why do we arrive at a particular decision based on these mental shortcuts? I'm giving the reasons first, and then I will come to the biases. One reason is that there is a very limited amount of information available, and even when a lot of information is available, we make use of only as much as is required to come to a decision; even if more is available, we often do not avail of all of it. So cognitive biases could arise, number one, due to limitations in the available data: there might be very little data available, and the human mind is simply not capable of processing all the information that is there. That is why we use these simple mental shortcuts, especially in complex, unfamiliar, or uncertain terrain, and also in time-constrained situations. As I keep repeating, the concept of bounded rationality suggests that we can only process a limited amount of information. That is one reason why we resort to these heuristics, or shortcuts, while making decisions without even knowing that we are going through them. The next reason comes from a different perspective.
This is the ecological perspective. There might be a mismatch between the heuristics and the context: your heuristics, or shortcuts, are possibly based on one environment, and you are trying to use them in a very different kind of environment, and that is where a lot of mismatch can occur. That is why many of the decisions we make might not always be right, or might not be the best possible decisions. As I said, heuristics are based on spatiotemporal regularity: our experiences come from a particular space and a particular time, and when we try to apply them to a very different space and time, they might not always work well. The context and environment in which a heuristic developed are what it is suited to; if we apply it within that ecology, it probably works better than when we are outside that kind of environment. The third reason why these biases exist is the evolutionary perspective, and there is a lot we can say about how, over the years, our mind has evolved into the situation we are in today. We talk a lot about the amygdala, for example: that is the part of the brain which prepares us for impending danger, or which tells us that there is danger. People who did not have a functioning amygdala were lost in the evolutionary process, so the people who have survived are the ones who have it. This goes back generations; it is not a new thing. There are biases, for example, and I'll refer to just one here and discuss it in detail later: the action bias.
So when we have a situation in which there is probably no rational justification to take an action, we still act, for evolutionary reasons: our forefathers took action whenever it was required, or whenever they thought there was a justification for action. So given a situation where you either have to take an action or not, we often end up taking the action, and I will talk about that in detail later; as I said, from the evolutionary perspective, as we evolved over the years we got used to taking action in certain situations, based on the dangers or the possible advantages of acting. So, as I said, these are the three reasons we generally talk about: the cognitive-psychological perspective, the ecological perspective, and the evolutionary perspective. There are many other reasons too, for instance the association principle, but in a 45-to-50-minute lecture we are trying to bring in the most important parts. The brain automatically searches for correlation, coherence, and connection; it is always looking for connections, and that is why good journalism is always about associating one thing with another, so that you can draw your meanings in a better way. This associative way of perceiving, understanding, and predicting arranges our observations into regular, orderly patterns or relationships. That is why, very often, when we see something, we think that it must fall into a particular block, and these blocks are the biases I have spoken of and will talk of as well. Now, all these things rest on two very important publications; there are many others, and we keep coming back to these two, but one is the book Thinking, Fast and Slow.
I will just explain what this fast and slow thinking is. As I said, there was a particular paper published in the journal Science in 1974, where they talk about the judgments we make under uncertainty: the heuristics we use and the biases we are exposed to. This was by Amos Tversky and Daniel Kahneman; unfortunately, Tversky died at the age of 59, and he was not there when Kahneman got the Nobel Prize, otherwise they would have received it jointly. So these are two publications that I think are very important, and there are many others I will mention as I go along, but my presentation has largely been guided by these two. As I said, Thinking, Fast and Slow is one we must understand before we get into the biases themselves. There are two systems that operate in our brain. System 1 operates automatically and quickly, with almost no effort and no sense of voluntary control; we make decisions straightaway. If I show you an angry face, you will immediately draw a lot of conclusions about that person; or if somebody is throwing trash around inside a train compartment, just by one look at that person you draw a lot of conclusions about them, and most of the time those conclusions might even be correct. That is System 1 of the brain, which operates automatically and quickly. System 2 requires deliberate mental activity, including complex computations. If I give you a large division or multiplication, where you can't produce a simple, straightforward answer, you will need a calculator, or some manual calculation, to get the answer.
The operations of System 2 are associated with the experience of agency: you are doing it yourself, by choice; you are choosing to expend more cognitive energy on that particular object or problem, and it also requires a lot of concentration. So, roughly speaking, these are the two systems in the brain: System 1 operates involuntarily, with no effort, or very little effort, on our part; it is based on our experiences and on the mental shortcuts we have acquired over the years; and then there is System 2. As you can probably tell without going into the details, one of these is the fast one and the other the slow one. Kahneman calls System 1 fast thinking and System 2 slow thinking; it is not as if they are two different people, but they are like two different systems inside our brain. There has been a lot of research on System 1 and System 2, although people describe them differently; these are the typical correlates they talk about for System 1 and System 2.
System 1 is fast. It has high capacity, meaning it can take a lot of decisions immediately. It is parallel, not one thing after another. It is non-conscious: you are often not even aware of what you are doing. Its responses can be biased, which is what we are talking about. It is contextualized, based on a particular context; it is automatic; it is associative; and it is experience-based. It is also independent of cognitive ability: we might have a lot of other cognitive abilities, but when we do this fast thinking we are not applying them. System 2, the slow thinking, is slow and of limited capacity, because at any one time you can do just one thing of this kind. It is serial, with a linear approach. It is conscious: you are aware of what you are doing. Its responses are normative, idealized responses; they may still not be correct, but they are normative. It is not contextualized but abstract, not tied to a particular context. It is controlled, not automatic; it is rule-based; it supports consequential decision-making; and it is correlated with cognitive ability. So if you are using System 2, performance depends on your cognitive ability, whereas System 1 may not; it is similar for everyone. Now, having established this base, we can go on to the 18 biases that are the subject of this discussion; I'm sure this background is enough for the details we are going to cover. These are the three heuristics Kahneman talks about: representativeness, how representative something is; availability, how much of the information is retrievable; and anchoring, which I will discuss when I come to the anchoring bias. If need be, I will come back to representativeness later. Availability is about whether you are able to retrieve the instances provided to you, how effective your search is, and what imagery you use when we talk about the information available to us; oftentimes the correlations we draw might not be correct. These are the things that fall under the availability paradigm. There is also anchoring, which, as I said, I will not discuss in detail here because it is one of the biases we will be covering. Another very important thing we must be aware of, which I also spoke of when I presented on surveillance economics, is the effect of emotions on our decisions: the affect heuristic. I'm sure you realize that when you are in a happy state of mind you are more willing to forgive people, or more willing to justify their actions, while if you are angry or outraged you will make very different kinds of choices. In most of the discussions ahead we might not return to the affect heuristic, but it is one of the techniques a lot of people employ, especially in surveillance: they find out your emotional state at a particular time, and based on that emotional state they push products you might not otherwise have bought. A lot of what you end up buying on these online platforms, you buy because you are in a good mood and willing to spend more; if you were angry, or depressed, or feeling down, you might not have made that decision at that point of time. Now we go on to the biases themselves. We will take them not in any particular order; it is a fairly random order. But since we are talking as media and communication students, it is important to start with the confirmation bias, a very important element of bias to be aware of, especially when we discuss how people consume, or do not consume, news and information, and why they find one kind of information believable and another kind not believable. Oftentimes there are fact-checkers, and other theoretical inputs people provide about a piece of information, suggesting it is not true; yet many people are willing to believe that information because it confirms the beliefs they already have. If I already believe that party A is good and I get information which suggests that party A is good, I will take in that information readily; if there is information telling me that party A is not good, I will disregard it, and perhaps not even read it. It happens with our choice of television programs as well: when India is losing an ODI, we will not watch the highlights; we will just switch off the channel and not talk about it. So whatever we like to read, or like to be informed about, we are, sometimes unconsciously, going to look out only for that kind of information. If there is a channel, and it could be any channel, many people will say it is not good, it is unfair, and so on; but at the same time there might be people who accept all of its information and feel it is credible, because it conforms to what they already believe. So that is
a very important bias to remember: confirmation bias exists, and while it might not apply equally to everybody, it varies across personality types and individuals, it is present in some form or other. That is why channels and media outlets have to understand the audience perspective: from the audience side, it is very important to realize whether people seek out particular information because of some inherent cognitive bias. As communication scholars, we also talk about the theory of cognitive dissonance: if we are provided with information that does not resonate with what we already believe, there is dissonance, a kind of mental disturbance, I won't call it trauma, that we want to avoid. So confirmation bias is a very important bias, and one reason it exists is that people are motivated by argumentative reasoning: because I believe something to be true, I look for arguments that defend my opinion and that can be used to persuade others, instead of looking for the truth. We are not looking for the correct information; we are looking for information that confirms the beliefs we already hold. Next, the anchoring bias is again very important, and there have been experiments on it; as I said, this is behavioral science, and it is all based on experiments. In one classic experiment, people were asked what percentage of African countries are members of the United Nations. Before the question, each group was shown a random number: 10 for one group and 65 for the other. The group anchored on 10 estimated about 25 percent on average, while the group anchored on 65 estimated about 45 percent. So very often our judgment of a situation is based on the information that is already known, or shown first. If we know a person to be a bad person, we start from that anchor; or if we believe the price of a thing is around 500 and we find something close to that, we go for it. The number that first comes to mind, or the information you associate with the question, predetermines your final decision. It is a kind of tunnel vision; it takes you into a particular tunnel. I'm sure you realize that when you hear certain numbers or figures and believe them to be true, you keep harping on them; that is the anchoring bias, based on the initial number or initial information provided to you. The next bias is at times related to the spiral of silence, though not directly: the bandwagon effect. It is about what you consider everybody else regards as the truth, or as popular. If you think, "oh, everybody is saying this is good, so it might be good," that is the bandwagon effect at work, and a lot of word-of-mouth effects exist because of it. It is a bias favoring ideas that are already adopted by others: everybody is using a smartphone, so a smartphone must be a good thing and I should go for one too; everybody is doing classes on Google Meet, so probably it is good. The rate at which something is adopted by others significantly influences the likelihood of it being selected and taken forward. This again exists in many situations; as social beings, we look out for what everybody else is doing. The next is another bias that exists without us even realizing it is a bias: the scarcity bias. Very often, if you go to a commercial online site, you will see them say "only two left, order quickly," or there are deals that last just one or two minutes, and either you get it or you don't. If we think a particular item is scarce, we place a higher value on it, and a lower value on what is available in abundance; as I said, this is the fast-thinking process at work. That is what happens whenever there is a lockdown and somebody says an item is going to be scarce: people make a beeline for it; or when people know shops will be closed and there will be scarcity, they acquire and hoard things they might not even need in the first place. So the scarcity bias is another very real cognitive bias. The fifth cognitive bias I am going to talk about is the projection bias. This assumes that other people share our patterns, attitudes, beliefs, and so on: you project your thoughts onto others; you assume that everybody else is thinking the same way, or that since I think this is the outcome, this is what others will think as well. There is a related effect, the false consensus bias, which I decided to drop because it is so closely related to the projection bias. We hold these assumptions even though it is impossible for everyone to use the same mental framework we do; we project our thought process onto others. It happens very often when, as an Indian,
you go out of the country so you're so used to getting that free glass of water when you walk into a restaurant that when you go into a western restaurant and you have to pay for water then you think that that could be a very you you project initially that they will provide you water for free because you think that their beliefs and their attitudes this is just a very commonplace example that I'm giving you so this bias makes us think a lot of things that we don't or for example when you go to an airport and you take one of those strollies in India you don't have to pay for that and in other places you're supposed to pay for that so you in our place you even you know grab it from somebody's hand if somebody is walking around and there if you grab it from somebody's hand you're actually you're doing the fraud you're doing some fraud or cheating the other person because he has paid three or four dollars for that trolley so you're projecting that that will that system also has the exact same kind of things that we do so this projection bias often exists it can be about many many other instances I'm just giving you examples of the bias and we can relate it to a lot of everyday things that we do the other action bias I spoke of when I spoke of the you know when I was talking about the evolutionary process or the evolutionary factors responsible for these biases so action bias puts pushes us to act when faced with ambiguity when there is an ambiguity and when there is a possibility of either doing something or doing nothing we often favor doing something without even any analysis we think that this needs an action so this leads to develop solutions when the problem itself hasn't been defined well the solution is not even required but this leads us to developing these solutions so we need to appear active even if it does not lead to anything so we need to keep on doing things so this is the action bias that we are faced with in everyday life this again is a very important 
bias about how we judge other people how we regard other people so if there if there is somebody else who is very good at singing for example then we assume that that person will be good at other things also so for example when we were younger and we would see some wonderful cricketers on the cricket field somebody who could bowl very fast or bat very well and so on and so forth and we expected that person to be good in every other sense of the term but then very often many of those people were not very good with studies and all but so that was a kind of a shock for many of us because if they are good at A we think that they will be good at B, C or D or whatever or if somebody is speaking very well we think that he will be very good in singing and acting and dancing and everything else so these are things which have been going on for very long and the reverse also happens because somebody is not good at something we assume he will be bad at everything else so that halo effect was discovered or first coined by Edward Thondike in 1920 about their soldiers so when the soldiers who were being good they were just good on all the factors and people who were seen as bad they were bad on all the factors so this is almost like the erstwhile Hindi film heroes they were good at everything that they did from horse riding to singing to sword fighting to dancing to studies to nuclear material everything so this is the halo effect that we assume every day so if there are people who are good at one thing we think that he or she will be good at everything else the availability bias this also I spoke of when talking about Kahneman's work often our information is based on the immediate information that comes to mind it's a shortcut that enables us to make sense of the world based on the information that we have lot of the times we do not have the right information or lot of the times we remember the rare happenings we don't remember the common place happening we don't remember 
everyday happenings so often these quick decisions are based on overestimations of the dramatic or the vivid incidents that are easier to recall so when we talk about a particular place and if there is something dramatic that we associate that place with then we will assume that place to be based on that dramatic information that is available to us so it's based on these overestimations so availability bias is a very very important bias that it that happens with a lot of us based on the information that we have and as I said it's also based on the information processing capacity of our brains as well this is another very important bias we are now at the midpoint and very soon we'll reach the 18th bias also so this favors outcome this favors the option so if there are more than one option then we will opt for the option whose outcome is knowable or whose outcome is more or less guaranteed than those whose outcome is not known so often when we were as as the newspaper submitters will tell you often when you're looking for a good headline whatever fits onto the block for first and it looks good we take it we may not even go beyond that because otherwise it might lead to more ambiguity so this this impacts a lot of innovation outcomes although there is a bias which is almost exact opposite of that we'll talk about that also but often we go for out for those options whose outcomes are more or less knowable or whose outcomes we know so that bias also exists the outcome bias is another bias where we assess the quality of a decision based on the quality of the outcome so it's slightly different from ambiguity where we are not we are avoiding ambiguity and here we are aware of the outcome we know about the outcome so if there is something which leads to a positive outcome then that will be viewed positively so if there are shortcuts which do not appear good but it leads to a positive outcome that you are able to perform a particular action properly or you're able to perform 
that particular action to everybody's satisfaction, then you will go for that process. If the outcome is positive, very often we are not very concerned with the process itself. Similarly, a decision that leads to a negative outcome will be viewed negatively, and we might avoid such processes.

The framing effect is another very important one for media and communication, and it comes straight from Kahneman's work; this was another of his articles, published in the journal American Psychologist in 1984, volume 39, issue 4, as "Choices, Values, and Frames." Framing theory is a very important theory that we study in our media and communication classes: the same exact situation can be framed in two different ways. Kahneman and Tversky discuss this under prospect theory, and there are many things in that work; I am giving you just one very simple example. They describe an unusual Asian disease which is expected to kill 600 people, and two alternative programmes: if you adopt Program A, 200 people will be saved; if you adopt Program B, there is a one-third probability that all 600 people will be saved and a two-thirds probability that nobody will be saved. This is a survival frame, and when you offer these options, people consistently favour one of them, although both options mean exactly the same thing. A rational human being is supposed to treat both of these options
as the same, because statistically they mean roughly the same thing: in the second option, too, there is effectively a chance that 200 people are saved and 400 are not. But because Program A is framed as something certain, people go for it: as shown in the brackets, 72 per cent of respondents opted for Program A and only 28 per cent for Program B, because Program B carries a risk element, that one-third probability, whereas Program A offers certainty that 200 people will be saved. The same thing can be framed very differently, and there is much more to say about framing; perhaps in a future lecture I will discuss it in much greater detail. The point is that how things are framed to you determines how you react to them: with the same information, when one option is framed as certainly saving 200 people, you take that option. Your mental shortcuts are steered by this framing effect.

Even in public relations there are different ways a situation can be framed: how the attributes of a person or a company are framed, how choices are framed, how actions are framed. Think of firing people: if you frame that action as streamlining, or as something the company is doing to make everybody else's job better, you might accept something you would otherwise not have supported. So it depends on how the choices, the actions, and the issues are framed, how responsibility is framed (at the end of the day, whom do you hold responsible?), and how the news is framed. These different framing steps determine how people react to events and how this bias shapes their decisions and judgments.
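The claim that the two programmes are statistically identical can be checked with one line of expected-value arithmetic. The minimal Python sketch below is my own illustration of that calculation, using exact fractions to avoid rounding:

```python
from fractions import Fraction

# Kahneman and Tversky's disease problem: 600 people at risk.

# Program A: 200 people are saved with certainty.
expected_a = Fraction(1) * 200

# Program B: 1/3 probability that all 600 are saved,
#            2/3 probability that nobody is saved.
expected_b = Fraction(1, 3) * 600 + Fraction(2, 3) * 0

# Both frames have the same expected outcome: 200 saved, 400 not saved.
# Only the wording differs.
print(expected_a, expected_b)  # -> 200 200
```

A rational decision-maker should therefore be indifferent between the two; the observed 72/28 split is driven entirely by the certainty of the first frame.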
The pro-innovation bias is another bias which, as I said, is almost the exact opposite of the ambiguity bias we spoke of earlier. The assumption is that new innovations should be adopted: whenever there is newness, a lot of people regard it as inherently good, and this is what happens with technology. A lot of fake news spreads very easily on WhatsApp because there are many people out there (of course, none of them in this online class) for whom the very fact that something arrives over a new technology means it has to be true. You might have heard people say, "I heard this on WhatsApp, so it has to be true." Whenever a new medium appears, public opinion expects it to have a very good impact: when television came, people thought it would cure all the ills of society; the same with the internet, and even the radio before it. Whenever there is a new technology, there is a lot of technological determinism, a belief that its impact will be inherently good regardless of the potential negative impacts, which we are even willing to forgo. When we talk of smart cities, we might not think about the environmental damage that is done. That is the pro-innovation bias at work.

Another bias that clouds our decision-making is the recency bias. Recent events are much easier to remember, and they weigh more heavily than past events. That is why people keep saying that public memory is short; what they mean is that people remember your
last performance or your recent pronouncements much more than the past. A batsman may have scored 49 centuries, but if he does not play well in the last match, you feel bad about him. Or if I ask you to draw up an all-time-favourite Indian team, it will depend on your age and on whom you regard as the all-time best: a younger person might pick more members from the present team, while a slightly older person would name the players who were recent in his time. So there are different ways in which we define recency, but most importantly, the recent events are the ones we remember, because whenever new information comes it replaces the old, unless the old information was dramatic to start with.

Then there is the false causality bias. Very often we attribute false causes, and we keep saying this in our research classes: correlation does not mean causation. For example, dry, hot, sunny summer weather (although last summer was one where we could hardly go out for ice creams) generally drives ice cream sales, and the same hot, sunny weather causes a lot of sunburn if you are out in the sun often. But if you conclude that ice cream causes sunburn, because "I have been having more ice cream and that is why I had more sunburn," that is false causality: it is mere correlation, not causation. That is what I meant by the false causality bias. What is causation and what is correlation has to be very clear, because, as I said, our minds are always looking for associations. When we attribute false causes, we make wrong judgments, perhaps addressing the wrong problems, or, in the design-thinking phase, attributing something to a cause that may not be the true cause at all.
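The ice cream and sunburn example is easy to simulate. In the Python sketch below, every number is invented for illustration: hot weather is the hidden common cause driving both quantities, and the two end up strongly correlated even though neither causes the other.

```python
import random
import statistics

random.seed(0)

# Invented data: daily temperature is the hidden common cause.
temps = [random.uniform(20, 40) for _ in range(200)]

# Ice cream sales and sunburn cases both rise with temperature
# (plus independent noise); neither influences the other.
ice_cream = [10 * t + random.gauss(0, 20) for t in temps]
sunburn = [2 * t + random.gauss(0, 5) for t in temps]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Strong positive correlation -- yet eating ice cream does not burn skin.
print(round(pearson(ice_cream, sunburn), 2))
```

Conditioning on the confounder (comparing only days of similar temperature) would make the apparent link largely disappear, which is one standard way to expose a false cause.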
The spotlight effect: again, this is a very important one, especially for young people, and I am sure many of us will find it relevant to our everyday lives. Very often we assume that the entire world is watching us: if I write, say, or do anything wrong, everybody else notices. That is the spotlight effect. It is probably true in a certain sense that our world revolves around us, so we are the centre of the universe for ourselves, but that does not mean we are under a constant spotlight. We overestimate the number of people who are consciously paying attention to our actions; egocentrism is suggested to be one of the reasons for this bias. Even if we make a small mistake, perhaps during a presentation, we believe everybody witnessed it and start feeling bad, but the truth is that people are not paying that much attention to us all the time. If you assume everybody is watching you, you might feel self-conscious about how you wear your glasses, how you talk, or how you react, and it may simply not be true. So it is again a bias which makes us feel, do, or judge things in ways that may not reflect reality.

The Dunning-Kruger effect is another very important bias; as we near the end, we are getting to the more technical ones. Very often people overestimate their ability. There are people who have never worked on something, yet if you ask them they say, "Oh yes, I will do that, don't worry, I can manage anything." So very often people with low ability overestimate their ability, either because of some illusory superiority or because their lack of ability prevents them from recognizing it. So we had
cases like Dhinchak Pooja, who probably did not realize how bad a singer she was. People overestimate their abilities, and often it is comical, though it may not be comical for the person concerned. People are often not aware of their shortcomings and tend to deny their failures; they even fail to acknowledge the gap between their actual performance and how they perceive themselves. There are all these memes about how you perceive yourself versus how the world perceives you, and the two can be very different: you might think your croaky voice sounds just like Kishore Kumar's, but if you put it out on YouTube and let others decide, the verdict will not match what you think. These cognitive biases are very difficult for us to recognize in ourselves and very easy to see in others. So the Dunning-Kruger effect is another effect that produces a cognitive bias of overestimating yourself.

The survivorship bias is again a very important one to know, and I will share a particular diagram and explain from there what it means. Look at this picture (forget about COVID; I will talk about COVID later). These are World War II planes as they would return: the dots are the places where they were hit, all the bullet holes on the plane. Whenever the planes came back to the hangar, the military experts would say, "These must be the weak areas, so let us armour these spots so that next time they do not get hit so badly." Now, there was a statistician named Abraham Wald who said, no, this is not correct, because you are missing something very important: these are the planes that returned even after being hit, so they survived. The very fact that they survived does not mean these are the weak spots; maybe the other planes were hit here and
they did not survive: they spiralled down and were destroyed. So very often our decisions are based on the people, or the cases, that survived a particular situation; we are not aware of the ones that did not survive. We talk of entrepreneurs, we discuss case studies, we talk about all the people who are doing well or who are visible, but there are a lot of people who are absolutely invisible, and that is where the survivorship bias comes in. It is a very important bias in the field of behavioural sciences. It was the Statistical Research Group at Columbia University that examined the damaged aircraft and found that the most-hit areas of the planes did not need additional armour. The group, with Wald, a very famous statistician, concluded that the areas showing the least damage were the ones requiring the most additional support, not, as the U.S. military had concluded, the most-hit areas.
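Wald's argument can be seen in a toy simulation. The sections, hit counts, and fatality rule below are all invented for illustration (this is not Wald's actual data or method): hits land uniformly, but a hit to the engine or cockpit downs the plane, so the damage we tally on returning planes systematically misses exactly those sections.

```python
import random
from collections import Counter

random.seed(42)

sections = ["engine", "cockpit", "fuselage", "wings", "tail"]
fatal = {"engine", "cockpit"}  # assumed: any hit here downs the plane

observed = Counter()  # damage visible in the hangar: survivors only
for _ in range(10_000):
    hits = [random.choice(sections) for _ in range(3)]  # three hits per sortie
    if not fatal & set(hits):  # the plane returns only if no fatal hit
        observed.update(hits)

# Survivors are riddled with fuselage, wing, and tail holes and show no
# engine or cockpit damage at all -- the holes mark the spots a plane
# can absorb, not the spots that need armour.
print(observed.most_common())
```

Naively armouring the most-hit sections of the survivors would reinforce precisely the parts that can already take a hit.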
The additional armour was needed in the areas that were least hit, because the planes hit in those areas could not return; they did not survive, while the planes in the diagram are the ones that did. That is one very important bias we must be aware of. It arises from the error of concentrating on the people or things that made it past some selection process. We know about the people who came through the college system; what about those who did not? We ought to know what was lacking for them, too. When we talk only about people who passed a selection process, we are talking only about the survivors, and we have to consider the people who are not visible as well. That, too, is a bias that creeps into a lot of our decisions.

And the last one is the Peltzman effect, also known as the risk compensation bias. Again I will give you a sporting metaphor to make it easier. Batsmen today have so much protective gear (the helmet, the arm guard, the chest guard, the thigh guard, the back guard) that they assume they can take a lot of risk because they are fully protected. A lot of players in rugby and ice hockey have had serious problems with their necks and elsewhere because they played as if there were no risk. The same happens to many of us using antivirus software: we assume we can go anywhere and the virus will never attack, and anybody who has done that knows how dangerous it can be. This risk compensation bias is explained through the process of risk homeostasis: according to this theory, risk is an inherent part of our nature, and since we assume that we will not be
affected, we keep taking chances we otherwise would not have taken.

So, to remind you of all the biases I spoke of: the confirmation bias, the anchoring bias, the bandwagon effect, the scarcity bias, the projection bias, the action bias, the halo effect (where we assume somebody to be similar across various attributes), the availability bias (the judgments you make on the information available to you), the ambiguity bias, the outcome bias, the framing effect, the pro-innovation bias, the recency bias, the false causality bias, the spotlight effect, the Dunning-Kruger effect, the survivorship bias, and finally the Peltzman effect, or risk compensation bias. With this I end my presentation. If you have any questions, please ask.

Q: Good evening, sir. My name is Rahul Tripathi, and I have a detailed question. You have told us about cognitive biases, which are based on various things like feelings, emotions, preconceived notions, and information. But a human being always has the capacity to decide between a cognitive bias and conscious, mindful thinking, to plan something and take decisions which can be more beneficial for him, for his future, his academics, and so on. So is conscious thinking also somewhere influenced by these biases?

A: As I said in one of the initial slides, there are two systems of cognitive processes that go on within us. One is the fast process, based on these mental shortcuts, on the information we already have, on availability, and so forth. In those kinds of cases we are expending very little cognitive energy, and that is why it is known as System 1, or fast thinking. In the processes where we show a greater
amount of control, where we use more of our cognitive abilities, which is based more on agency, as I suggested, that is where this bias can be avoided. When we talk of cognitive biases, it means these two processes go on within us simultaneously; we cannot simply do away with the System 1 way of thinking and leave out the mental shortcuts, because they are always there. First impressions, intuitive actions, and intuitive beliefs will always be there within us, and, as I said, it is much easier for us to recognize them in others. Of course there are ways of managing this; one of them is Edward de Bono's Six Thinking Hats, which asks us to slow down and think through all six modes. But intuitive thinking operates especially when time is short: when a person just flashes by, you are not going to ask that person to stop and answer questions; you make a judgment about that person immediately, as a reflex. As I said, it is not even conscious; it happens subconsciously. That is why I began with the explanation that much of this may not be within our control; we may not even be aware of it.

Q: So should human beings always try to control this intuitive thing?

A: No, I am not saying that. In fact, if you remember my first few slides, I said that very often it may be correct as well; often these estimates are correct or useful. I am just suggesting that we must be aware that these biases exist, because some of them can have far-reaching consequences. Take, for example, the
survivorship bias: it was only when the statisticians, not the military experts, sat down to think about it that the picture changed. The experts said, very naturally, "this is where the planes have been dented, this is where they have been hit, let us just reinforce that part," but it took a lot more insight into the case to see whether the reinforcement was needed on the other parts. So, as I said, these biases are not necessarily bad, but the knowledge that they exist matters. As a content provider, I must be aware of the confirmation bias among the audience; if I am not, a lot of my decisions might not be correct. So the realization is what is most important. Thank you, thank you all for the question. Anybody else?

Q: I just want to know, is there a relationship between these biases and experience?

A: Yes. These mental shortcuts, this judgment under uncertainty, one of the explanations I gave is the ecological perspective, which says that they are based in part on your experiences. So of course experience is one of the reasons for these biases. We spoke of two or three other reasons also, and there are some neural-network processes as well, but basically we gave three explanations: the cognitive-psychological, the ecological, and the evolutionary. Thank you so much, Professor Paul, for your question. Anybody else?

Q: One more question, sir. Was most of your presentation based on that book, Thinking, Fast and Slow, or was it from various other places?

A: The explanation was from Thinking, Fast and Slow; the rest was from the other works.

Q: Okay. I wanted to know some more details about it, because it was very
interesting.

A: Thank you so much; that is why we make the effort of doing this, and I am so happy you found it useful. Thinking, Fast and Slow is a good place to begin; it provides a lot of the explanation, and as I said there are many other books and many other pop-psychology attempts at explaining these things, but the knowledge itself is what is important. I have the hard copy of Thinking, Fast and Slow; when I buy hard copies, it means I find a book very interesting.

Q: Okay, thank you, sir.

A: Thank you very much, Rahul. Anybody else? Any questions? Then I take this opportunity to thank all my super colleagues in the Department of Journalism and Mass Communication, Surendranath College for Women, and Jyoti, for their wonderful work, because we are in the midst of a very important assessment exercise, and the fact that we could manage to get this going speaks very highly of your commitment, dedication, and efficiency. Thank you, students; thank you, colleagues; thank you, everybody, for joining today. It has been a pleasure preparing all this and sharing it with you. As I said, we will keep sharing this material with you through YouTube. Thank you to all my colleagues from other places as well: I can see Shoumik, I can see Shamina, I can see Shamita madam, and I can see many others. Thank you everybody for joining, and thank you for your good wishes. Hopefully we will come back with more such sessions in the future. Thank you very much; I take your leave now.

Q: Thank you, sir. And I hope you have all filled in the feedback form carefully.

A: Yes, okay, thanks. Thank you very much. Thanks a lot.