Hello, my name is Carolyn LeBou and I'm a perioperative echocardiography fellow and clinical visiting instructor at the University of Utah. Today I'll be discussing common cognitive errors encountered in perioperative decision-making, specifically pertaining to the field of echocardiography. I have no disclosures.

When talking about the field of decision-making and cognitive reasoning, we have to start with a landmark article, published in Science, that is cited by nearly every major paper in the field. In 1974, two psychologists, Amos Tversky and Daniel Kahneman, described a framework to help explain current theories about human decision-making and judgment. They theorized that individuals process information and make estimates and choices through the use of heuristics. Heuristics are our intuitive thought processes. They are mental tools that allow for reflexive thinking and cognitive processing when only limited information is available. Heuristics help us make rapid decisions, but with the benefit of expediency we can potentially sacrifice accuracy.

From the body of work of Tversky and Kahneman, the dual process theory has emerged as the predominant approach to explaining reasoning. This model describes two systems of decision-making. Heuristics exist in the first system of thinking, called type 1 or fast thinking: the brain's ability to think and act intuitively. This is the type of thinking that is prone to cognitive biases. System 2 thinking is our slow, analytical, deliberate approach to decision-making. These types of thinking translate directly into medicine. As physicians become more seasoned, their cognitive processes become more routine, more like type 1 thinking. Much of the data on cognition in medicine, specifically when it comes to cognitive processes and image interpretation, comes out of the field of radiology.
To give an example, an experienced radiologist, using seasoned type 1 thinking, may reach a diagnosis without much conscious deliberation in a matter of minutes. However, the heuristics used may fail to achieve the correct diagnosis because of inherent errors in our thinking called cognitive biases. Contrast that with a first-year radiology resident, who may carefully review an abdominal CT scan and consciously think about each structure on a checklist. Such a review, using predominantly deliberate type 2 thinking, may take an hour or so to reach a conclusion. However, this type of thinking is the most likely to facilitate the correct diagnosis.

Okay, so what does the dual process theory of reasoning look like? A leading author in the field is Pat Croskerry. He's an MD-PhD who originally trained as an experimental psychologist, then went on to become an emergency medicine physician, and is now a professor of emergency medicine at Dalhousie University in Halifax, Nova Scotia. He has worked on adapting the dual process theory of reasoning to medical decision-making. Remember that system 1 thinking refers to the intuitive, unconscious manner in which we make most common decisions, whereas system 2 thinking is our deliberate decision-making. The model begins with the initial presentation of the patient, and that presentation has a pattern that is either recognized or not by the observer. If it is recognized, the parallel, fast, automatic processes of system 1 thinking engage. These processes engage immediately and automatically; they are reflexive and unconscious, with no deliberate thinking effort involved. If the presentation is not recognized, or if it is ambiguous or uncertain, then the slower, analytical processes of system 2 thinking engage instead.
System 2 is an analytical system, attempting to make sense of the presentation through objective and systematic examination of the data. It is a linear processing system: slower than system 1 and more costly in terms of resources, but considerably less prone to error. The model has several mechanisms for modifying its output. First, system 1 and system 2 may interact with each other, so that the final output is a synthesis of the two. The monitoring capacity of system 2 over system 1 allows it to reject system 1 thinking by applying a rational override. So, at first look, a mobile echodensity on the aortic valve may trigger an endocarditis diagnosis, but if atypical features are present, system 2 can override and force a reassessment. It has been suggested that most common cognitive biases are due to the overuse of system 1, or to system 1 overriding system 2. Clearly, medical reasoning and decision-making do not fall neatly into either one of these systems, and there is obvious interplay between them, but this model aims to provide a basic framework for medical decision-making.

Okay, so why am I talking about this today? Our thought processes and our medical decision-making follow this model, and as echocardiographers we must acknowledge our susceptibility to cognitive biases. Let's go through a couple of cases from the University of Utah that exemplify just how susceptible we are.

Our first case begins with a 67-year-old who presented to the OR for a normal-EF CABG. Here are some images from his pre-bypass TEE. In the midesophageal long-axis view, we see good opening and closing of both the aortic and mitral valves and no evidence of turbulence in the left ventricular outflow tract. His post-bypass TEE was unchanged from prior and shows no gross abnormality.
The way our echocardiography service works at the University of Utah is that we have a designated anesthesiologist on call 24/7 who is certified in both advanced perioperative and comprehensive echocardiography. That same evening, our normal-EF CABG patient was no longer responding to fluids and had an increasing pressor requirement. The same anesthesia attending who had read this patient's echo earlier in the day, Dr. Josh Zimmerman, got called by our ICU team. So what would you do? As an echocardiographer, having performed a TEE hours earlier with no major findings, what would you think of this patient's current clinical status in the ICU?

Dr. Zimmerman heads to the ICU, drops a probe, and this is what he sees: evidence of left ventricular outflow tract obstruction from systolic anterior motion of the anterior mitral leaflet, with significant mitral regurgitation. The development of LVOT obstruction is not the main point here. What warrants discussion is the cognitive bias experienced by our echocardiography team. The patient had two prior imaging studies that showed no evidence of LVOT obstruction, so the echocardiographer thought: how could this be LVOT obstruction? We've already ruled that out. What he experienced was anchoring bias: the tendency to mentally lock on to salient features of the patient's presentation too early in the diagnostic process, and then fail to adjust that impression in light of more current clinical information. With the new information that the patient's clinical status was changing in the ICU, the diagnostic process must be able to change to account for it. So what can we do? The overarching strategy to combat anchoring bias is to actively seek to disprove the initial diagnosis. If something doesn't fit, ask for a second opinion. After making a diagnosis, ask yourself: what else could this be?
What have I forgotten to consider?

Our second case is an endocarditis patient: a 37-year-old IV drug user admitted to the ICU with MSSA bacteremia. At the Salt Lake City VA hospital, our echocardiography lab is run and staffed predominantly by anesthesiologists, and our responsibilities include not only performing and interpreting perioperative echos but also all the outpatient and inpatient echos at our VA hospital. So with this patient, our echocardiography team was consulted to interpret a TTE for endocarditis. Here are the images our team was asked to interpret: you see no evidence of valvular vegetation, regurgitation, or abscess. The medical team had a high level of clinical suspicion for endocarditis, so the next day our echocardiography team, in fact the same attending, performed a bedside TEE to further evaluate for endocarditis. And this is what he saw: 2D and 3D images showing a large vegetation on the mitral valve.

Naturally, the echo attending went back and reviewed the transthoracic images from the day prior to determine whether he had missed something. But even multiple echocardiographers in our lab were unable to find any evidence of a vegetation on the transthoracic images. The moral of this case is not necessarily what happened to this specific patient; it is what happened afterwards. In speaking with this echocardiographer, he stated that over the next couple of months, all he could think about when interpreting images, especially in the setting of endocarditis, was that he might be missing something. He said he felt his sensitivity for detecting a vegetation dramatically increased, while at the same time his specificity plummeted. He was experiencing availability bias: the tendency for easily recalled or recent experiences to influence decision-making.
It was clear that after this experience, availability bias was continuing to influence his clinical care. So what can we do to combat this bias? First, we need to remember the objective data on the probability of that specific finding. Second, we need to create a differential before anchoring on an otherwise recent or memorable diagnosis.

Our third case also involves endocarditis. This was a 66-year-old male with a past medical history of a prior mitral valve replacement and a recent stroke. He presented urgently to the operating room for a redo mitral valve replacement for endocarditis. Here are his pre-bypass images: we see dehiscence of his mechanical mitral valve with severe mitral regurgitation and paravalvular regurgitation. Further images also showed a very large vegetation on the mitral valve. In speaking with the intraoperative echocardiographer, the findings of the dehisced mechanical mitral valve and the large vegetation were so captivating that it was difficult to stop seeing the obvious when interrogating the other anatomic structures of the heart. All of the team members, including the surgery team, were so satisfied with what they saw as the reason for the patient's clinical decompensation, this dehisced mitral valve, that the valve went on to be replaced. But when the study report was being finalized later that day, the echocardiographer saw something on the aortic valve short-axis view that could be interpreted as a vegetation. Luckily, this was interrogated in multiple other views, and the potential vegetation turned out to be an artifact. But what if it wasn't? Intraoperatively, everybody looking at the images was so captivated by the mitral valve that a true vegetation on the aortic valve would have been missed. This endocarditis case exemplifies another commonly encountered cognitive bias called satisfaction of search.
In this case, the elephant in the room was the glaring mitral valve endocarditis, but it was quite possible that the aortic valve artifact could actually have been a vegetation, and it would have been missed, all because the echocardiographer's eyes were satisfied. Satisfaction of search occurs when the visual search pattern is discontinued once the first abnormality that can explain the patient's clinical presentation is found. This is a bias that plagues radiologists. In fact, a 2013 study by Kim et al. that classified the types and prevalence of radiologic diagnostic errors found that 22% of errors were related to satisfaction of search, second only to errors classified as underreading, or misses, making this the most common cognitive bias in diagnostic radiology. So what can we do about satisfaction of search? We can use a systematic approach to ensure all relevant findings are identified. After completing a primary search, we should initiate a secondary survey. Finally, we need to keep in mind related diagnoses and common diagnostic combinations to ensure we don't miss commonly associated findings.

For our last case, we have a 77-year-old male with a past medical history of coronary artery disease who had had a CABG and who was admitted to the hospital in decompensated heart failure. He had severe functional mitral regurgitation, and the cardiac surgery team proceeded with MitraClip placement. Placement went fairly smoothly, and the patient was admitted to the intensive care unit after the procedure for continued inotropic support. Shortly after the procedure, while the patient was in the ICU, our echocardiography team was consulted by the surgeon for clinical decompensation. The surgeon stated, "I think this patient's right ventricle might be failing. Could you take a look?" Here were the TEE images of that patient, and you can clearly see that there is no overt evidence of right heart failure.
With the clinical question answered, our echocardiography team finished capturing the rest of the exam images. When interrogating the mitral valve, there was severe mitral regurgitation that was new since the procedure and was likely evidence of a dehisced MitraClip. It was a good thing that the echo team imaged this finding rather than merely concluding that the patient's declining clinical status was "not RV failure." In this circumstance, if our team had stopped imaging after the RV looked okay, they could have experienced framing bias. Framing bias is the phenomenon in which diagnostic reasoning is influenced by how the clinical question is presented. If the question posed to the echo team had been "Hey, why do you think this patient isn't doing well?" instead of "Do you think the RV is failing?", it may have been even more likely that our echo team would have found the dehisced MitraClip. So what can we do to combat framing bias? We need to step outside the clinical framework and consider the whole patient picture. If possible, try to interpret the images first while blinded to the clinical concern. Finally, ask yourself: would I still have made this diagnosis if I had been given a different clinical history?

There is a great review article from the field of radiology by Busby et al., published in 2018 in RadioGraphics. It covers common cognitive biases experienced by radiologists and offers some solutions for when our thought process gets compromised. From this article, one systemic strategy we can employ is the reduction of interruptions while interpreting images. I know that intraoperatively this sounds impossible, but interruptions can contribute to all types of cognitive biases, particularly satisfaction of search, as our working memory is diverted to address the interruption.
The paper suggests designating an individual to manage noninterpretive tasks to decrease the disruptions experienced by the image interpreter. Intraoperatively, this might be someone taking care of the anesthetic while the echo is being performed and interpreted. The article then goes on to talk about the importance of participating in echo QA conferences and creating a peer review program with a positive culture at its forefront. We know that peer review programs that aim to establish an environment where errors are instructive rather than punitive support an atmosphere of cognitive debiasing. Finally, the article discusses how to employ cognitive debiasing strategies. It presents a table entitled "Questions to Guide a Strategic Approach for Unbiased Interpretation." We've covered some of these already. Questions like "What cases have I seen recently or often that might impact my image interpretation?" get after the availability bias. "What information or diagnoses have I forgotten to consider?" gets after the anchoring bias. "Would I have made this diagnosis if I had been given a different clinical history?" addresses the framing bias. And "Did I adhere to my primary and secondary search patterns?" tackles satisfaction of search.

A paper published by Ely et al. in 2011 in Academic Medicine highlights the importance of checklists in minimizing diagnostic error. It is widely known that the use of checklists reduces errors in a variety of fields and in multiple aspects of medicine. This paper reminds us that checklists help us resist the biases and heuristics that lead to diagnostic error. They decrease reliance on rote memory. They help us consider a comprehensive differential diagnosis. They allow us to step back from the immediate problem at hand to examine our thinking process.
And finally, checklists help us recognize the altered mood states that arise from fatigue and sleep deprivation. At the University of Utah, this is one of the checklist-type cognitive aids that we use when we perform our rescue echoes. This cognitive aid is extremely helpful when the stress of performing, interpreting, and guiding clinical care is at its highest. You can see that the checklist is broken down into the different shock states, and in the middle it provides a guide to the specific views to capture and the modalities to use to aid in diagnosis.

So what else can we do? We can practice metacognition. The concept of metacognition was introduced in 1979 by Flavell in American Psychologist. Metacognition is a multifactorial process in which we reflect on our own thought processes; we think about our thinking. It helps us acknowledge the limitations of our memory and seek perspective while making decisions. Metacognition has been suggested as a way to minimize errors in type 1 thinking, through the deliberate incorporation of type 2 thinking strategies.

What else can we do? We need to raise awareness and educate. We need to make physicians at all levels aware that our thinking and our diagnostic process can be compromised. All medical specialties, especially those with a heavy emphasis on image interpretation, need to implement an educational curriculum on cognitive bias for trainees and staff. In fact, many papers state that awareness that errors can be made is arguably the most effective strategy we have to reduce cognitive bias. If we familiarize ourselves with the major classes of cognitive bias, we can better predict the circumstances in which we are prone to error. There is some hope on the education front. In 2013, a study by Reilly et al. was published in BMJ Quality & Safety.
In this study, a year-long curriculum in diagnostic error and cognitive bias was implemented during an internal medicine residency, and the final evaluation at the end of the year showed a statistically significant improvement in residents' knowledge and recognition of cognitive bias. The paper suggests that introducing this educational content during the training period, in medical school and residency when decision-making habits are being formed, can enhance future decision-making through increased knowledge of reasoning.

If you take one thing away from my talk today, I hope it is that heuristics and type 1 thinking are not inherently bad; they are simply subject to cognitive bias, and we need to be aware of this, especially in the clinical setting. I hope it's obvious that we need type 1 thinking to function in our day-to-day lives. If you had to think in detail about every single clinical decision you made, you wouldn't make it through ten minutes of your day. Finally, I want to encourage everyone to start considering our own cognition and its inherent flaws, especially in the setting of image interpretation. I want to emphasize that these biases are not beginners' flaws that go away with experience; they are embedded in our cognition. Remember that we are not perfect and that our mental processes are subject to error no matter how much experience we have. Thank you for letting me talk about what I think is a fascinating subject, and I'd be happy to answer any questions.