Okay, welcome everybody to this webinar introducing human factors and the role of the science in primary healthcare. As the chairperson of the WONCA Working Party on Quality and Safety, I would like to welcome all of you to this initiative. Just for your own benefit, what is WONCA? WONCA's mission is to improve the quality of life of the peoples of the world through defining and promoting its values, including respect for universal human rights and gender equity, and to foster high standards of care in general practice and family medicine. WONCA is the world organization of family doctors, and we as the Working Party on Quality and Safety have the mission to support family doctors around the world in their efforts to improve the quality and safety of family medicine. We hope that through this and other related activities you may get to know the Working Party better, and we will be very happy to receive expressions of interest in joining us. Patient safety is already in our name, so it is very high on our agenda, in particular in collaboration with a range of WHO patient safety initiatives related to the WHO Global Patient Safety Action Plan 2021–2030, which is one of the most ambitious programs in patient safety so far. The Working Party has recently supported activities focusing on medication without harm, and we are working on patient and family engagement in patient safety with the aim of reviewing the evidence, developing a research agenda, developing training in this area, and a number of related activities and webinars. The WHO Global Patient Safety Action Plan has a number of strategies, and one of them is strategy 2.4, which requires a strong human factors perspective and input from primary healthcare in order to strengthen the resilience of primary care organizations and clinical practices. It is my pleasure to introduce the co-chair of this webinar, Dr. Maud Norte.
She is a family physician from Camden Health Improvement Practice, a practice which looks after the high-risk homeless population in London. Maud qualified from Rotterdam, the Netherlands, and completed her GP training in North London. She is a certified human factors trainer and has a particular interest in reducing errors at the interface between primary and secondary care and in the use of checklists in mitigating medical errors, including in prescribing. She is a founding member of a medical faculty in an innovative collaboration with pilots using flight simulators to facilitate human factors training for doctors and other healthcare personnel. She lectures regularly on human factors at South Bank University and Middlesex University and is the author of 22 peer-reviewed publications. Maud leads the human factors agenda within our Working Party, and it is a true privilege to have her in the Working Party leading these and our related activities. Before I hand over, I just realized I simply did not introduce myself. I am Jose Maria Valderas. I'm a GP and the Head of Family Medicine at the National University of Singapore, and I would now like to hand over to Maud, who will tell you more about the plan for the webinar and introduce our very distinguished speakers. Thank you, Maud.

So hello, all of you. This is the program for the afternoon. There are a total of three presentations with a five-minute break in between. I'll introduce human factors and primary care myself with a short presentation of about 20 minutes. This is followed by Professor John Beasley from the University of Wisconsin, who will talk about using a human factors approach to improve electronic health record use in primary care. This is then followed by a five-minute break.
And finally, you will hear Peter Nataraj, a pilot from a large airline, who has an interesting presentation on lessons learned from human factors integration in aviation. We close with a brief question and answer session and a brief panel discussion. Lastly, the aim throughout all this is to support the members of WONCA and other organizations in the direction that human factors can take in primary care. As I lead on this for the Working Party, I urge you all, if you or your organization feel like contributing to this process, to mention this in the chat, and we will use that chat as a kind of resource base. So feel free, and you can also contact us directly afterwards if you want to talk further. For now, I'll leave you to it. We have about 20 minutes for John's presentation; he will introduce himself on his first slide. I know his and the other two experts' presentations will be excellent, and I really wish you a happy afternoon with this webinar. Andrea, go ahead with the first presentation.

Once more, a warm welcome to this introduction: human factors in primary healthcare. What does it mean? My name is Maud Nataraj. In the next 20 minutes or so, I will give you a flavor of human factors: a taster of what this science is about and how it applies to primary care. There is one exercise and a video. Towards the end, we will touch upon some ideas for the future. For now, enjoy. I have no commercial disclosures to make. Let me tell you a bit about my world and my journey in human factors. I'm a GP in a practice for homeless patients and also a human factors trainer. This is different from a human factors specialist, and we will come to that later in the presentation. From a young age, I was fascinated by aviation and space. I took some flying lessons and enjoyed solo flying in a small airplane near the city of Rotterdam.
I thought of a career with the Air Force, visited a research center with a human centrifuge, and visited a naval air station. Then my medical studies took over and the opportunities to keep flying stopped. I stayed in contact with aviation, though, as I continued as a doctor organizing the repatriation of ill patients to their home countries. In 2014, I encountered aviation again through a course in my home country, the Netherlands, where pilots were bringing doctors in contact with the, to them, unfamiliar environment of a cockpit simulator. This was to elicit behavior under stress away from medicine and then to enable the doctors to reflect on its effect on their practice. I came out of that course thinking everyone should get the opportunity to know and apply this science in their day-to-day practice. Incorporating human factors principles in daily operations makes my work, and that of the team I work with, easier to do. Ever since that course, human factors science, and in particular its practical application in day-to-day general practice, has been my special interest. It led me to observe my thoughts and behaviors in decision making under pressure in primary care. In all this I had a steep learning curve that could not have taken place without collaboration with a group of friendly pilots who shared their human factors knowledge enthusiastically with me and a small group of other clinicians from 2016 onwards. Together we set up training for more health professionals around Heathrow Airport. Some of you may be very experienced in the science of human factors. For others it may be a first encounter. Some of you may not know very much about primary care but are more into human factors; in that case, enjoy the tour of our settings. This webinar aims to make a start with vision development by bringing primary care and human factors closer to each other. An exercise of awareness of each other, a taster.
In this presentation we will touch on what human factors science is, see how it relates to our operational environment by means of two cases, and see what situational awareness is, which is relevant for following the other two presentations. Lastly we will see some aspects of what human factors integration in primary care could involve. This is a short video clip. Just listen to the instructions and take part in the exercise. You have only one task: recall. Enjoy.

Wouldn't it be wonderful if we did not have to rely so much on our memory? So did you get the six questions? It was not easy, but why? What else did you notice apart from the questions? Was there any music? Did your mind wander away on a distraction? What was the influence of the sound on the accuracy of your answers? Some of you may have noticed it was rather difficult to answer all six questions correctly. And by the way, did you notice the word genius was spelled incorrectly? Our visual attention and memory have limits in what they can do in a certain amount of time and under certain circumstances. Perhaps you've noticed this in practice when you've just briefly seen a patient with some pathology in a high-pressured clinic and you notice you cannot recall whether the pathology was on the right or the left side, and the patient has gone in the meantime. What would this limitation mean for your work? And how would it influence the safety performance of the service you work in? Let us take a dive into our working environments. We work in very different work systems. Let us have a look at a few. In the upper left corner you see an image of a few of the 300,000 refugees in Tanzania who are patients of the local primary health care services. In the upper right corner you see Mongolia, where mobile health clinics bring primary health care to vulnerable communities. A patient, Norov, has just had his consultation with a mobile health team from the sub-provincial health centre, which visits herder communities in their homes.
Next, a primary health care environment in a metropolitan area, New York, with street health outreach teams and their mobile units taking care of wound care for homeless people. Lastly, the area we cover may be so vast that we need to fly: this image from the Royal Flying Doctor Service in Australia shows a PC-12 landing on an outback strip. And then there is of course also the classic office environment for primary care, in houses or purpose-built facilities, fully stocked. Different environments, each with their stakeholders, budgets, allied services and existing policies and procedures; different primary care models, some within a public health system, some with public health included; different staffing, different patient needs, all grown historically.

Let us now go back to our practice and see what happens to the memory of two experienced health professionals in a practice in a very poor, underserved community with a lot of infectious diseases. It is the middle of the second wave of COVID-19. A patient comes for phlebotomy, and at the entrance the nurse, as per national protocol, has just asked if he had a cough or loss of smell, which he did not. Whilst in the chair in her room, to her surprise, he had a coughing fit and could subsequently be heard through the door by the doctor opposite, talking non-stop about sick notes. As she did not know what to do with this patient, the nurse went out into the corridor and consulted the doctor about the patient with potential COVID-19. As per national guidance, the patient was put in an isolation room. The patient informed them that he had had no COVID test yet, had not seen any blood, had no fever, had noticed some weight loss, but was not short of breath. He insisted, talking non-stop, on a sick note for a past leg problem, and the doctor and nurse informed him that he likely had COVID-19. As per protocol, the patient was left in the isolation room to call the telephone number to organize the COVID test. Afterwards, he left.
This took no more than two or three minutes. Three weeks later, a hospital discharge letter arrived with the diagnosis of cavitating pulmonary tuberculosis. The patient had been admitted for two weeks and had lost 15 kilos. So what happened here? Why were two experienced clinicians sending away a patient with potential tuberculosis? On the right, you see some explanations. Under stressful, challenging conditions, some unwanted cognitive reactions can appear automatically, due to a common ancestry from thousands of years ago. Anchoring on COVID, a diagnosis pre-suggested via the nurse. Ignoring a fact, the weight loss; perhaps you did not notice me mentioning it. Availability bias: the isolation room for COVID-19 was about half a metre in front of the clinicians. And fear, for themselves or others not being vaccinated; it was a deadly disease at the time. Their performance was simply shaped by their limitations as humans, and this is a recognized phenomenon. In such a changing environment, a computer decision support system will not work. More training on tuberculosis would also be unlikely to help; they were already experienced. For this type of situation, there are recognized cognitive skill sets, such as de-biasing questions. These can be acquired through human factors training. The case we just saw showed staff keeping the national protocol on COVID in mind: an interaction between staff and a protocol. They also interacted with each other; one gave a diagnostic bias to the other. Now let us relate that to the definition of human factors. Let me give you the one from the International Ergonomics Association: it is the scientific discipline concerned with the understanding of interactions among humans and other elements of a system, and the profession that applies theory, principles, data and methods to design in order to optimize human well-being and overall system performance. You may have heard of the term ergonomics, or the science of work.
They all refer to the same thing. To find out more about human factors degrees, you can have a look at the educational programs of the International Ergonomics Association, which you can find online. Now let us have a closer look at the definition of human factors. It has the term system in it. So what is a system? There is not much magic about that word. As Pascale Carayon defined it at a conference about 10 years ago, it is a set of interrelated components working together towards some common goal. Think back to the work environments you saw on an earlier slide. You can take the refugee environment as your system, or the whole of healthcare in a district, or your own practice. Systems have a structure, function, boundaries and people, and for good functioning they need alignment. Peter Nataraj will show you three simple questions you can ask yourself when defining your system. Human factors science has goals: to enhance safety and comfort, to reduce human error and to increase productivity. It is a multidisciplinary science, based on psychology, physiology, sociology, anthropometry, interface design, engineering and biomechanics. You can say that people from the human factors community are by nature people who like to design, and they like to put people at the centre of work. What about primary care and human factors research? How does primary care compare in the literature? A general PubMed search on human factors shows some 800 articles for surgery and 128 for primary care. You can see in the timeline that surgery started earlier with research on the topic. It is worth noticing that, retrospectively, some primary care articles would have human factors elements in them but did not get the corresponding tag. An interesting question could be how much access primary care practitioners have to full-text human factors journals via their local medical libraries, and how much the national primary care organisations engage with the local human factors community.
We saw earlier that the multidisciplinary study of human factors science is a study of several years. Human factors training, on the other hand, is training for people working in the operational environment of primary care. It addresses hazards and risks identified in the organisation. On the left side you see a list of typical topics of such training. They are all interconnected, and it is important to keep this in mind when setting some of the operational preconditions for optimal and safe performance in the care environment. The training addresses adaptation to the changing environments during the day in primary care and helps staff get through sometimes unplanned or even rare types of patient encounters. On the right side you see a list of typical techniques for use on the operational side.

Let us now leave human factors training and take a look at our second case. Here you are the duty doctor and look at the haemoglobin results of a patient you do not know. The normal level is 130 to 180 grams per litre, and you conclude that the patient, with an Hb of 112, is anaemic. The health record shows a nice graph of the trend of Hb over time. To your surprise, there was a significant dip of Hb down to 58 a few months earlier. You wonder if the patient had a major bleed and keep searching for this in the records. In the end you wonder whether the data could simply be wrong, as you cannot find anything like a bleed. You open a letter from a diabetic clinic with the same date as that of the blood sample with the low Hb. The 58 had been an HbA1c, and not an Hb, in the letter. You know the coder, however, as a hard worker, and wonder what may have happened here. Let us take a step back and look at what the coder would have had to do to code an Hb, but now in the record of a test patient. On entering HEMO, a long drop-down menu appears on the lower right side of the screen. You can see that the first option that comes up is Hb, and not HbA1c, which is number 4.
Early pattern recognition by a hard-working coder may have played a role here, when number one was chosen instead. The coder was merely doing what seemed logical, unaware of certain hidden design aspects of the automation in use. Let us reflect on the two cases we saw: tuberculosis versus COVID, and Hb versus HbA1c. In both cases the health professional and the coder, respectively, were not in a position to understand the correct meaning of what they saw. There was no counterforce for the phenomena of the biases and pattern recognition that they experienced. Because of this they could also not project into the future what the status of these patients would be. One was going to deteriorate and have an admission for tuberculosis. The other was going to be missed in follow-up for his diabetes not being under control. Mica Endsley, an engineer and former Chief Scientist of the US Air Force, developed the term situation awareness, or SA, defined as the perception of environmental elements and events with respect to time or space, the comprehension of their meaning, and the projection of their future status. To understand how best to help the coder with the work, one would have to observe how they do the work in reality, not how we think they do it or how they are supposed to do it. In the next video you will see Frank Gilbreth do just that.

Early in the century, Frank Gilbreth used photographs to break a worker's every action into the smallest units, revealing every unnecessary movement. He studied the fastest workers in order to teach maximum efficiency to everyone. In Gilbreth's cyclegraph technique, he attached flashing lights to a worker's fingers to indicate the length of time a motion would take. Soon Gilbreth's photographs were changing the lives of everyone from golfers to oyster shuckers. Gilbreth observed that surgeons took more time looking for their instruments than they did in performing the operation itself. He suggested a new procedure.
Keeping an eye on the incision, the surgeon would extend an open palm to the nurse and utter the now famous words: scalpel, please.

So in summary, we have seen through some videos and an exercise that humans have limitations. When designing for primary care we will need to take them into account, as no matter how hard we try, we cannot alter those limitations that were set for us hundreds of thousands of years ago. So for safety around all those limitations, we will need leadership on board. Let us have a look at general structures around the safety of patients and staff. Here is a spiderweb I made in 2018 around who and what is involved when thinking about human factors and safety. I used it earlier for human factors training for regulators. At the time, the context was that some health professionals had had to go to prison for errors. Perhaps you see your own organization or role here already. You can make your own spiderweb for wherever you are. As long as we all play our part with leadership, slowly the ship of human factors science integration throughout primary care will turn. This will take time and may take decades. Leadership on human factors comes with vision development. On this slide you see some interesting examples of visions in other organizations: the Chartered Institute of Ergonomics and Human Factors; the Clinical Human Factors Group; a technical operations group within the FAA, air traffic control; the human factors advisory group of EASA; and finally the CAA, the Civil Aviation Authority in the UK. As you can see, they each have their own thoughts. Whether you are a single-handed practice, an individual doctor in a bigger organization, or a global organization, being aware of human factors and developing a vision helps to guide you and your organization. In conclusion, we defined what human factors science is, what a system is, and what human factors training is. We experienced first-hand how we have limitations in our performance simply because we are human.
We noted that human factors science helps with building systems around those limitations. This then enhances safety and well-being in primary care. A logical next step is to build on a vision. Hence our next human factors webinar will be in 2023.

Good evening, or good morning, or whatever it is in whatever time zone you are in. I'm John Beasley. I'm a family doctor and I've been practicing for somewhat over 40 years in the Madison, Wisconsin area. And for those of you who use Epic, my office was about a mile, two kilometers, from the Epic campus. I have been asked by the WONCA Working Party on Quality and Safety to discuss a few things about bringing human factors science to the way that we use electronic health records. I guess that we started with this with an article that was published a few years ago about how it's time to bring human factors to primary care policy and practice. And indeed we want to now do that, and I'm going to focus specifically on the issue of how we use electronic health records. It doesn't have to be this way. This picture breaks your heart. It was published in JAMA, the Journal of the American Medical Association. In the middle, you see a seven-year-old girl, who drew this picture. On the right are her parents. And on the left is the physician, totally engrossed in his use of the computer. Again, this is poor human factors design when we think of relationships, recording information, and the stress on the physician not relating to the patients. And I'm going to talk, of course, from the United States perspective, because that's where I am. But I can say that in the United States, EHR use has been a mess. Here's the Harvard Business Review a few years ago saying the EHR has been a disaster for clinicians. A disaster. Good heavens. And pointing out that the billing and compliance function has dominated.
So we need human factors science to understand what's going on in primary care: the cognition, the social interactions, the team functions, the communication, the technology. It's all important. And then we need it to help us design potentially useful interventions and improvements. And then we need to conduct some sort of laboratory test of these proposed interventions and improvements. And finally, very importantly, monitor systems in the real world when systems and technologies are deployed, with special attention to unintended consequences. I like sometimes to consider the analogy of developing a COVID vaccine, where we needed a lot of basic science, a lot of basic understanding. We had to understand the molecular and cellular biology of the disease and the host. Then we had to design the vaccine around that understanding. We had to test it in formal studies to make sure it really worked. And then, again very importantly, monitor it as it was used in practice to assess effectiveness and safety, monitoring for the unintended consequences. And human factors gives us the methods to do this for all of health care. So human factors helps us understand the cognitive needs: how we establish situation awareness; the communication needs, how we have effective communication, not only with our coworkers but even with ourselves, in a sense of sensemaking, and of course with patients as well. We need human factors to understand how to reduce workload, how to reduce distractions with data and order entry or other things that go on, and how we use the EHR to help us develop relationships rather than disrupting them, and more. And human factors looks at the entire system in which the technology is used. There's policy and purpose. And in the U.S., the purpose of the electronic health record has become documentation. That's a terrible purpose.
It has to help us support good care and communication. We have to have a good technical and user interface, with more functional designs and interoperability, which is woefully lacking in the U.S. and, I think, in some other countries. There has to be appropriate implementation by organizations, so that they no longer try to turn physicians into typists and clerical data entry workers. And then we have to use it appropriately. We have to keep the tool in its place, avoid focusing on the screen, and use the screen as a way to bring ourselves and our patients together. And then we have to help patients use it by developing portals for easy access. Now, again, I'm from the U.S., so I hope you'll pardon me if I speak from this particular perspective. There are some real differences in other countries, and if I've been incomplete or inaccurate, I hope you'll understand. But in the U.S., about half of all physicians are employed by large healthcare organizations, and many, many more are in group practices. Solo or one- or two-partner practices are really rather unusual here. There are longer visit times, averaging 15 to 30 minutes in family medicine and the other primary care specialties. Certainly this is much longer than the visit times many of you in many other countries have. But at the same time, we have a complex, inefficient and wasteful funding system, or systems, plural, that drive electronic health record use, and most of this is a fee-for-service system where you spend a lot of time documenting exactly what you did. And then as a culture and as a country, we unfortunately have a rather low emphasis on primary care, including, in this country, family medicine, primary care internal medicine and primary care pediatrics. And then we tend to have larger support teams: more nurses, more nursing assistants, more receptionists. So there are a lot more people around, and this, I think, has both good and bad impacts.
But for all of us in all countries, one of the things we do during our time with the patient is establish situational awareness so we can understand what it is our patients need. And situational awareness, and I credit my colleague Mica Endsley for this, came kind of out of the aviation and military worlds. But to summarize: What's going on? What's happening? What's the blood pressure? What does it mean? Is this abnormal or normal? Where is it headed? Oh, it's getting worse, it's dropping. And if we change something, what will happen? Well, if we give intravenous fluids to help resuscitation, that will be good. Situational awareness is critically important for taking good care of our patients. We need the system, including the electronic health record, and again, the electronic health record is only part of a larger system, to help us do this. We did some research which looked at barriers to developing situational awareness, and one of the things we came up with was what we called information chaos. This came out of a study of physicians and patients and staff rooms. Information chaos has a large number of elements to it. There is information overload in the room with the patient: there are other people in the room, there's a chart, there's a nurse, there's the electronic health record, there's our own prior knowledge. There is information underload: something we can't find; we can't find the electrocardiogram. Information scattered in multiple places. Information uncertainty: there are two different medication lists, and which one is correct, or maybe what the patient is telling us. And then there is information error: stuff that is just plain wrong. All of these conspire to create information chaos, and all of these need help from the electronic health record to reduce the chaos. Now, we looked at the context for electronic health record use.
We noted that on average, for all patients, whether they were babies or toddlers or adolescents or elderly, over three problems were addressed at any one visit. That's been replicated in a couple of other studies. I'm using older data here, but my colleague John Temte repeated it, and the numbers are very similar. And if we had patients who were over 65 years of age, or diabetics, they averaged over four problems addressed at every visit. And in the U.S., I mentioned the need to have human factors drive policy decisions. Well, we have what is called note bloat, in this study done by Downing and colleagues. I'm going to turn off the video here simply so you can see all the references, if I can get it to go off. Well, it won't. The length of notes in the U.S. is more than four times longer than the length of notes in most other countries. And you have to ask the question: does this really add more useful information? So if you want examples where human factors can inform what we should be doing: it can help inform how we should record information, how we should communicate information, how we should use the right tool. And by the way, paper is not evil. Paper is really useful stuff. And in the U.S., to put it in context, there's this great thing about "let's go paperless," as if that were a goal. Well, that shouldn't be a goal. Effective communication, effective understanding should be the goal. And should there be development of effective clinical decision support tools? Or maybe not at all; maybe sometimes we don't need these. And of course, we have to increase efficiency. So, recording information. Obviously I'm just scratching the surface of a very large field here, but I would postulate that using a pen, handwriting, leads to better understanding than using a keyboard, and I'll give you some data on that.
We need to record information in a way that builds relationships, that doesn't just interrogate people but helps us engage in a dialogue. And when we record information, we establish situation awareness. So, as is common in the U.S., just blowing in a bit of template text really doesn't help this at all. Using templates for clinical information does not help build situation awareness. But using narrative, telling stories to ourselves, to our colleagues, to our patients, that does. So here's the study on "the pen is mightier than the keyboard." And I realize many other countries really do note-taking during the visit to a greater extent than in the U.S. But this is a study done with Princeton students, and it noted that when the students recorded information with a keyboard rather than writing it down, there was shallower processing. They tended to transcribe verbatim rather than processing the information and reframing it, which is a detriment to learning. And you see that the groups in this randomized controlled trial that used laptop or longhand had about the same recollection of the factual information, but they were distinctly different in the ability to get at the concepts, with keyboard coming out much worse and handwriting doing much better. And I would suggest that any physician who keyboards the notes during the patient encounter is probably missing things that the patient might want them to know. Also, use the electronic health record as a bridge, not a barricade. You know, I see many times our learners, our residents, typing like in that picture on the upper left, where my colleague, just to demonstrate, is focusing on the keyboard and the screen and not on the patient, rather than using the computer as a way to link patient and physician. My own preference, and a strong preference it is, is to dictate right in the presence of the patient, when they can talk to us and let us know: did we get it right? Did we understand the history?
Are our suggestions reasonable? And there is a YouTube video that you can go to if you want to see the technique for that. And patients love it. The data are that patients really find it very good. They know they've been understood, and they know that if we made errors, they have a chance to correct them. So, communicating information. This is interesting too. Oftentimes it's hard to communicate information on screens rather than paper. The study by Jabr goes into how screens are more cognitively taxing than paper. People consistently report that when they want to focus, they read it on paper. Now if I want to look up a lab result from six months ago, sure, probably the screen is much better. But for a lot of information, we should not be relying totally on screens. We should also be encouraging verbal one-on-one communication versus electronics. This has been shown to lead to better outcomes: lower LDLs in patients with coronary disease, and even fewer ER visits when more interpersonal communication between physicians and support staff, or between physicians, is used. And then we want to reduce the use of templates and develop narrative, because we do communicate by narrative. We communicate by telling stories. Another fallacy, in the U.S. particularly, was to assume paperless is good. Well, is it? On order entry, my colleague Christine Sinsky wrote to me in an email: it takes six seconds to write "return one week with CBC and potassium" on a piece of paper, and 121 seconds, 121 seconds, to do this in our electronic health record. And I'll be getting shortly to the pajama time work, but typing the clinical notes accounts for most of the nearly one and a half hours a day of pajama time work after clinic hours. And the fact is, as noted above, oftentimes we read less well on screens than we do on paper. So the question is, what is the proper role of screens? What is the proper role of paper?
Well, clinical decision support. When the electronic health record came in in the U.S., there was great hope: this is really going to help doctors make better decisions. You know, they'll prescribe an antibiotic and an alert will pop up: well, you don't really need it here, we should use a different one, or whatever. Well, there have been a couple of systematic reviews, and the story is really very sad: these have not been shown to be so useful. And oftentimes they may not be relevant. The physicians are drowning in information, and little alerts and alarms are going off all the time. And the real question is, do we get enough benefit to patients out of all of this? Because it's been pointed out that clinical decision support can contribute to physician frustration and burnout and may not improve care, unfortunately. This study got a great deal of press; my colleague Brian Arndt was the lead author on it. He looked at our University of Wisconsin primary care physicians and found that they spend an average of one and a half hours of work on the electronic health record outside of a 10 hour clinic day. That's the blue line. And the blue line, by the way, is weekdays. And you see that big lump: they come into the office at 8am and then they're on the computer. And then somewhere around six or seven things taper down. But there's this long tail going on to the right of the screen of using the computer in what we call pajama time. This has clearly contributed to the burnout of physicians. And it's equally bad on weekends. The physicians are getting on it on the weekends. And you see they've got a midday lump during the weekends where they're doing a lot. And then there's that little blip in the evenings on the weekend. We call that date night with the electronic health record. Really kind of sad, actually. So there are take homes from this for our policies.
And I think we need to remember the analogy to a new medication, for our policymakers, vendors and healthcare organizations. We really should try to understand primary care, really what's going on cognitively and socially in terms of situation awareness, before proposing improvements. And then, before implementing improvements, first test them: do they really make a difference? And then finally, after we've implemented them, evaluate them for the impact on patients, clinicians and staff. And it's human factors science that gives us the tools that we need to accomplish those goals. We also need appropriate policy and purpose. You know, patient care, not documentation, should be the purpose. And does a policy requiring this or that have any impact in the real world of practice? It may not. And again, human factors will help us document that. We need better human-technology interfaces. The medication lists in Epic are dreadful; trying to figure out when a medication was changed is almost impossible. And then there's feature creep: people keep adding more and more features. And we need to reduce the numbers of clicks and scrolls, and human factors can help us do that. And finally, we want to have supportive implementation by organizations. You know, there's been this big call to go lean. Well, going lean means that we really don't waste the physician's time and attention, which is the most valuable resource in a clinic setting. So we don't ask our clinicians to be typists. We get rid of computer order entry, which, you know, takes 20 times as long on the computer as on a piece of paper. We get rid of useless clinical decision support and excessive words. And then one important thing I didn't mention: we design clinics so that if there are multiple team members, they're really all co-located. They can talk to each other, they can exchange a glance, and we encourage verbal communication.
And then for us as clinicians: the patient does come first, but we matter too. If we're burned out, we can't take care of our patients. We want to record narrative information with dictation. We want to assure patients of real confidentiality; over 80% of patients are concerned about the confidentiality of medical records. And we want to be careful that we're sharing the screen, so that the electronic health record becomes a bridge, not a barricade. So I'll finish with this slide, which I think is kind of cute, but it really shows the importance, in a sense, of being in command of the technology and how we use it. We can't just be riding in the front of the airplane. We've got to be designing the airplane, be the pilots, and use it appropriately. And I thank you. Thank you for the invitation, on behalf of myself and my colleagues, to talk to you about the way human factors have been implemented in aviation over the last 30 years, and hopefully also to see what lessons might be learned for implementing a program of human factors training, or human factors improvement, within the medical industry. A very brief introduction about myself. My name is Pete Nataraj. I've always wanted to be a pilot, and I have now been a pilot for over 30 years in various different companies. But I am very much the odd one out in my family: all the rest of my family are doctors or involved in medicine in some way. My dad was a general practitioner, my mum was a theatre nurse, and my uncles and aunties are all in various different branches of medicine. I feel a very strong connection with the medical industry, especially when it comes to safety and wanting to improve as much as possible. I've also been very involved in human factors training for the past 10 years as part of my job in one of the world's largest airlines.
And it's very important to us in particular, not just using the theories but making them practical and useful, particularly in aviation but also in other safety critical industries going forward. The World Health Organization Global Patient Safety Action Plan has some pretty stunning statistics, and for me there are a few that stood out. The idea that 134 million adverse events per year and 2.6 million deaths per year can be attributed to substandard care is pretty frightening. And if you put that into context, that's the equivalent of 96 Airbus A380 crashes each day. So how does that compare to aviation? 2019 was the busiest year pre-pandemic: aviation carried 4.5 billion passengers on approximately 38.3 million flights. It's an incredibly dynamic environment. Some things happen within our control, some things happen outside our control. Mistakes are made by pilots, engineers, ground crew and cabin crew every single day, just like any other human being. However, the 2019 statistics also showed there were only 115 accidents, six of which were fatal, with a total of 239 deaths. So despite the complexity, something is still going right. It hasn't always been like this. In the 1960s there was a huge advance in aviation, huge progress into the commercial jet age. However, it wasn't all good. These were images that you could see in newspapers regularly. Compared to the 4.5 billion passengers we carried in 2019, 1965 only had 200,000. Compared to the six fatal accidents in 2019, in 1965 there were 57, with 1,640 deaths. If the same accident rate had continued from 1965, we would have killed nearly 3 million people in 2019. We were in a very similar place to where medicine finds itself right now. And something had to change: for us, the way that humans were interacting with this new and rapidly changing technology clearly had some flaws.
And while commercial aviation was struggling with its accident rate, NASA was discovering that they were having some issues also. They'd selected their astronauts for having the right stuff as amazing individual test pilots. But they found that when they put these incredible individuals into the Gemini project and asked them to work closely as a team over long periods of time, the results were just not quite as good as they expected. The skills and behaviors needed to operate as a team are not necessarily the same as those of a brilliant test pilot. At the same time, accident investigators also realized that very often the reasons for problems and incidents on board aircraft were not quite as simple as they appeared. By the 1970s, cockpit voice recorders had been installed in almost every airplane, the main reason being that, unfortunately, with all the deaths, the people they needed to talk to were just not alive by the time the incident came to be investigated. So the recorders were all installed. And they discovered that the majority of the incidents that were occurring, over 80% of accidents, were failures of interpersonal skills: issues with communication, with decision making, with leadership. They were asking themselves questions like, why did really competent operators do something as silly as landing an airplane with the landing gear up? Did they not know what was going on? Did they understand the procedures? Were they too busy doing something else? Did they know they were close to the ground? Was their situation awareness somehow impaired? What they discovered is that it wasn't even as simple as that. There were two elements to this. One was the human factors element; the other was the environment that they were working within. This is a picture of a 1930s-designed airplane, a B-17 Flying Fortress, which has crashed with the gear up. I'm using it as an example because it shows what was going on with the ergonomics of airplanes.
Aircraft started off relatively simple, with relatively poor performance, but as they became more complex, engineers and designers just added more switches and more dials, without really thinking about the humans that were going to use them and how they would operate these machines as they got faster and more complicated and the workload started to increase. There was almost an expectation with the early aircraft and the pioneers that there would be failures, and there would be accidents, and there would be deaths associated with them. But as more and more passengers started to travel, that was completely unacceptable, and people wanted to know why these kinds of things were happening. They started to look at the positioning of switches, something as simple as a gear lever and a flap lever. It's pretty obvious that we need the gear to land an airplane, but we also need to allow the airplane to have the flaps out to fly slowly. If you raise the flaps shortly after takeoff rather than the gear, you will probably crash. These are things that every pilot knows, so why were highly trained people moving the wrong lever? If you look at the image that's on screen, everything looks the same. The gear lever and the flap lever, both of which are indicated, look pretty much identical and are located in very similar positions. No thought had actually been given to the human as part of the design process. Think about the machines and the objects that you use every day at work. What's the packaging like? Are the machines just more complicated now than they previously were? How much has the human been considered as the advancements have been made? Or are these things just making your life more complex? And is there room for error just in their normal day-to-day usage? Aviation looked for some of those big ticket items. There was standardisation across all of the design, which meant that gear levers look like they've got a wheel on the end of them.
Flap levers have got a flap shape on them. Designers and engineers decided to put them in different positions in a flight deck to avoid confusion. And almost every manufacturer, no matter what the aeroplane type, has adopted that same standardisation, so there's even less confusion between aeroplane types. Some of those really big items that improved safety have now been done, and the accident rates have reduced. But they've not stopped completely. It's from here that the subject of human factors and ergonomics became a subject of study in its own right. So what is human factors? Well, it's defined by the International Ergonomics Association like this, which is a definition that I think you may have seen before. The important part is this split between the human and the system that we work within. Human factors can be split into three categories: the physical, the cognitive and the organisational. I'll talk slightly more about the organisational later. The physical part is very much the remit of the designers and the engineers: to understand the environment that we work in and the equipment that we use, and to understand our physical limitations. The cognitive element, that's something that we make a point of making sure every single pilot understands from the moment they start their initial training. And that's not just because it's a good idea, but also because it forms part of our regulatory framework. There are definitely lessons that can be learned from the slow pace of change in the implementation of human factors training within aviation. It was absolutely clear in the 1960s that something was seriously wrong. However, it took until 1979, when NASA held an industry workshop where they invited flight surgeons, psychologists, engineers and designers all together.
The results of that suggested not only that there were improvements that could be made with the design, but also that the human limitations of the pilots were the cause of some problems. The pilots were fallible, and actually could benefit from the advice of some people outside aviation. There was the introduction of the cockpit voice recorders, and there was the introduction of flight data recorders that were monitoring every single thing that pilots did on board aeroplanes. But there was incredible skepticism amongst the pilot workforce, not only because of the advice coming from external agencies, but also because all of these recording elements were considered to be spies in the cockpit, gathering information to be used against pilots and to blame them for incidents and accidents. But there was still no requirement to actually implement this kind of training. So it fell to the airlines at this stage to do what they considered to be right. In 1981 United Airlines developed its own crew resource management (CRM) course, which is what we call human factors training within the airline industry. But behind the scenes something more important was actually going on: a change in culture, away from the idea that all of the monitoring systems were there to find blame, towards the idea that they were there to underpin safety and improve the overall safety of the airline. We waited until 1989 before the international governing body, ICAO, came up with the Human Factors Digest, which a few years later became a 1992 document that embodied a requirement for CRM, or human factors training, to be implemented for every single pilot when they start their commercial career. So what human factors training did pilots actually undergo?
Well, it begins at the very, very beginning. Even the most lowly, humble private pilot, learning to fly their single-engine aeroplane just for their own personal pleasure, has to undergo human performance and limitations training. They must pass an exam as part of their licensing that embodies human performance and limitations. The simple concept behind it is, in the words of the 18th century poet Alexander Pope in his Essay on Criticism, that to err is human and to forgive, divine. We will make errors. We will get things wrong. It's an acceptable part of being human, and we are not infallible. By accepting that we'll get things wrong, we need to surround ourselves with processes and training and systems that stop those errors from ever becoming serious impacts on safety. That concept, with more tools for how to manage it, is developed when we do our commercial theory exams. And then when a pilot joins a company, it is mandated that they must undergo more human factors training. Why at this stage? Because the way that human factors is implemented on a company-by-company basis, as part of the company culture, is really, really important. Although the core elements are the same universally, the language that's used and the way it's actually embodied by a workforce is really important to understand. So it has to happen every time somebody joins a new company. And from there on it's embedded within every single form of training and checking and proficiency check that a pilot undergoes, which every pilot must undergo every six months. Despite all of this being part of our regulatory framework, and having been so for decades, it's still up to national regulators and then individual operators to implement the regulations. And it's not always done in a way that everyone may think is appropriate, or even effective. "Too low, terrain. Too low, terrain."
The first reason is that it shows that, despite human factors training being a requirement on an international scale, it doesn't always prove effective. The second reason is to show you that it's fairly obvious, when you look at something, when CRM, human factors, is not being done well. However, it's not necessarily as easy to define why. The third reason for showing you that video is that despite what was very obviously not good practice, and despite the fact it was going on in a critical phase of flight, very close to the ground, there was still a safe outcome. And the reason why that's very important is something that I'm going to talk to you about in just a couple more minutes: it's not always very obvious how we can measure safety, and the impacts on safety, if all we look at are negative outcomes. In this case, there was no incident. There was no accident. However, clearly things were not going particularly well. So if what is wrong is fairly obvious, how do we measure good from a human factors perspective? Can we? Well, the answer is yes. One of the biggest improvements in human factors studies came when NASA defined good behaviors. They started off by calling them the no-techs, the non-technical skills. We've moved on from those initial elements that NASA came up with into what we now call the pilot competencies. This pilot competency framework is pretty much on its 19th iteration in the last 10 years; it's continuously being modified and redefined. However, if you look closely at the detail in there, you'll find that many of the pilot competencies are not really pilot-specific at all. They are just human competencies that have huge relevance in most fields, especially those that combine a social and a technical element, and especially within a complex and dynamic system. By looking at the good, we look to reinforce the good practices.
By implication, if we don't see good competencies displayed, then things in reality are rarely going well. It's this kind of framework that allows us to look at why things go well, and occasionally don't go very well, and to look for the root causes. And it's why, every single time we train, we assess against these core competencies. I think we can all agree that the principle of a universal requirement for human factors training within aviation has been a really positive thing. It's had a very positive effect on safety, even though it took decades to generate this universal requirement. Prior to delivering this presentation, I talked to quite a few of my colleagues within the aviation human factors training department and asked the question: if there was a blank sheet of paper, going back to the 1970s, what would we do fundamentally differently? And almost everyone said that this particular slide shows where the focus for us has changed, and maybe where the biggest improvements could be made. We have gone through various different eras as each decade has passed, thinking about the technical elements, the pure human, the organization. And now we're in an era where we are thinking about where we fit in, as human beings and as pilots and operators, within a much wider system: a systems-focused view. Having a systems thinking view of safety is something that in aviation we are really just at the beginning of the journey with, which is why I think it's so important. However, the thing that underpins all of that is the cultural change that has gone on in the background: a culture, set up within most of our forward-looking companies, that stops thinking about the individual as the source of blame and starts thinking about the wider reasons why events actually happen. What is a system? Simply, it is thinking about what I'm doing, what I'm using, and where I'm doing it.
It's the interconnectedness of all of the elements associated with what we do as a job. So why is having a systems view so important? When it comes to the errors that are made, it takes the focus away from the individual and thinks more about how they interact within the entire system that they are working within. Historically, we have always looked at errors and looked to attribute a reason, or blame, to them. We've looked for causal factors so as to eradicate them and eliminate that thing from happening again. That's something we call Safety-I. Taking a much broader view of what goes on within a system is something called Safety-II. Safety-I looks at an event. It reacts to negative outcomes. It assumes that when things go wrong, it's because there is an identifiable cause. And it assumes that the system we work within can be decomposed into its small internal components, which isn't necessarily always the case. The other issue with Safety-I is that we spend all our focus looking at the things that go wrong, which are actually only a very, very small number of the things that we do every day. The vast majority of the things that we do on a day-to-day basis within the system we work in have a positive outcome. There are very few of them that end up in an incident or an accident. So if we only focus on those things, we have a very, very small sample to work with. The problem with that goes back to the video that I showed you earlier on. That showed an example of a time when clearly things were not going right within the flight deck of the airplane, but there was no negative outcome. There was no incident. There was no accident. In theory, there's nothing to investigate and nothing to look at, because nothing went wrong. Professor Erik Hollnagel, who is the senior professor of patient safety at Jönköping University in Sweden, came up with the concepts of Safety-I and Safety-II.
What he is fundamentally saying is that instead of just looking at the things that go wrong, we should also be looking at why things go right. Almost every single day that we go to work, what we planned to do is not exactly what we end up doing. Something goes differently, which means that we need to make an adjustment, that we need to do things slightly differently. Yet still, somehow, the job manages to get done. By looking at all the things that go right despite the system that we work within, we are able to look for repeatable measures to improve the way that we do things. We're looking for how we can make the improvements without necessarily having the incidents and accidents, trying to stop them before they even occur. Now, that isn't to say that we shouldn't also be looking at the things that go wrong and investigating them. It's just that there is a much greater sample of things that go right. However, the only way that this can happen is if there is a culture of open reporting. People have to feel that they can talk about the things that they have almost done incorrectly, the mistakes that they have made that didn't have a negative outcome. They have to trust the reporting culture that exists, to be able to say the things that they think are going wrong without any kind of fear of blame. We need people to be able to report the pitfalls that they are avoiding every day, the cracks in the pavement that they are stepping over. It's a much more positive way of working because it focuses on why things go right. It's those near misses that help shape the system that we work within. What systems thinking is trying to do is look at how the technical failures, the unsafe acts and the contextual factors within the environment we're working in push through to get us closer and closer towards an incident or an accident.
We know that within the environment we work in there are resilient behaviors that are stopping those incidents and accidents from happening. Instead of just focusing on the things that break through into incidents and accidents, what we want to do is find out what's going on and try to improve the resilient behaviors that are stopping them from getting there. That is how we improve the overall system safety. What we need for Safety-II is a fundamentally strong safety culture, underpinned by open reporting from a system that has a very just culture. It's not a no-blame culture. It is a just culture. So what do we mean by that? A just culture is a culture in which frontline operators or other persons are not punished for actions, omissions or decisions taken by them that are commensurate with their experience and training, but in which gross negligence, wilful violations and destructive acts are not tolerated. Other things are considered to be learning experiences that the entire system can learn from. The intention is to support a culture of openness and to maximize the opportunities for learning from each other's mistakes. The really key thing to remember, though, is that a just culture is incredibly fragile. It takes years to build the kind of culture that allows people to feel that they can openly report, that they can say the mistakes that they are making, that they can tell people about the things that they think they may have done wrong. And it takes only minutes to destroy it. However, it is the foundation of a sensible, forward-looking safety culture. It's the direction that aviation is trying to go. We are by no means there yet, but it is the direction that we're trying to go. And if there is one big lesson that can come from what we do.
It's that all of the things that we have done in the past, all those eras that we have passed through, could be bypassed just by generating open reporting cultures: allowing people to say, without any kind of fear, what they think can be improved, and helping them generate a safer system to work within. Aviation has come a very long way in 30 years, but it was exactly like medicine: our incident rates and accident rates were entirely comparable. It took decades to start, but the cultural shift was influenced by international policy standards, even though the local implementation does have variations. Improving safety needs open reporting. People must feel that they can openly report the situations they see occurring day to day that they know need improvement. Good human factors skills have to be embedded, and it begins at the very, very start, from a recruitment process that recruits people who have good human factors skills, through their training and ongoing professional development. It is a continuous process that never stops throughout somebody's career. There is always room for improvement. Medicine does not need to be on the same timeline as aviation. It took us 30 years to get to where we are, but we have made significant changes. However, cultural change does take time. People must trust the governance, and it's really key to remember that that trust in the systems people work within, and in the reporting culture they have, is incredibly hard won and extremely easily lost. However, it is worth it. Thank you very much for all the presentations. You can see there's a bit of echo. I'm fully aware that, as Maud knows, I wasn't able to be part of the presentations because of a competing demand. So I'm not sure exactly whether you may have had a previous plan for a discussion section, based on timing.
Well, we unfortunately had a technical problem where one of the versions of the videos ended up in some kind of inbox at head office where it wasn't meant to be, and therefore there was difficulty in downloading it and checking whether it was working. So, if people want to stay, we can do a very brief panel discussion. Can I just have a hands-up of who would like to stay and have a discussion? Any kind of hands up or ideas, you can open your microphones and unmute yourselves. Just for your information, Maud, I haven't been able to see any questions asked in the chat. I'm also in a very difficult situation in which I'm not able to stay, which is extremely unfortunate. But I would encourage everybody else who doesn't have a competing meeting to do so, and I will most certainly watch the recorded section later on. In the interest of time I won't even spend time thanking all the speakers again, and I'll hand over to Maud to chair the panel. Thank you so much. Okay. Thank you, Chema. Hello and welcome then to our small discussion. I can see there are a few participants, Miguel, Raquel, Gloria, you're more than welcome to join in. You as well, Carmen, I can see you, you're more than welcome. If you want, you can open your videos or your microphones and ask us any questions you would like. Otherwise I have a few questions for the presenters. Let us know if you have any questions from the floor, just unmute your microphone, we're very informal. Then Pete, can I put a question to you? Yeah, sure. I'm a human factors trainer myself, and it triggers me into thinking about how we would integrate that into primary care. How did you ever get to a curriculum for such a topic, globally or nationally, and how did you approach the training of the trainers? So, we've been very lucky in aviation that the very basic core of it is dictated by ICAO.
The actual core requirements, and what is needed within a human factors program, are set out, so that makes it quite simple. However, that really is a bare minimum. The governance goes from that international level to a national level and then to each individual airline with its own training, and so that happens in house. What's quite nice then is that every airline is allowed to make that training very specific to them. It's also allowed to select its trainers, who are specific to them. And one of the very nice things about that is that the underpinning requirement is that trainers have got to be able to show that they use those skills on a day-to-day basis. If you can't prove that you can actually do this kind of stuff practically, you shouldn't teach it. And it's been really great to have this kind of conversation generally about this subject. My personal point of view is that these things have got to be really practical, and as a trainer you've got to show that you can actually do this out on the line when you're working. If you can't do it day to day, then you shouldn't be selected to go and do the job. That's kind of how it works from a selection point of view. Sorry, Maud, were you going to say something? Then, following on from that, the way we train our trainers follows along the lines of quite a lot of other aviation training: a lot of ground-based theoretical work, and then it becomes very much like a teaching course, where pure content is less important than whether you are actually able to get the message across to people, and that the information you give is fresh and the way you deliver it is varied every day.
You make sure that it actually has practical relevance to the people you're dealing with on a day-to-day basis, which is why it's so great to actually work with people in different industries. The principles are the same; the details are different. Thank you. And then, John, are you still there? I am. Unfortunately, I can't get my video to start; my camera has sort of quit for the day or something. No problem, we've seen you for some time, so that's not a problem. Can I ask you a question about the templates that I noticed in your nice presentation? You were not too enthusiastic about the use of templates, and here in the UK we are inundated with templates. Is it that they're not good for establishing situational awareness? Can you expand a little on that? Absolutely. I think they're really bad in a couple of ways. One is that people using templates can wind up kind of interrogating patients so they can fill in all the little blanks in the template, rather than having a dialogue with the patient, which may not fit the form of a template. Secondly, I don't think a template really supports cognition: it records data, but it doesn't create a narrative. I'd love to see a good study; there are one or two out there. I think Savoy and others have said that the electronic health records we are using do not help us with situational awareness, which is sort of a whole other area. I do think part of that is the use of templates. I don't know of good hard data on it, but I think it's true. The other thing, if I can add, is that it doesn't help communication. I had a colleague who regularly used templates, and we would share patients sometimes. I could kind of get the data points, that the dose of metoprolol was changed, but I'd get no sense of why. There was no narrative around why the dose was changed. Thank you very much for that answer.
Are there any other questions from Gloria, Wai, or Raquel at all? Just let us know in that case. Yes, I can see somebody in the chat. No, that's just the communication channel; sorry about that. John and Peter, do you have any questions for each other? Sorry, John, carry on. It's interesting: when I do the bigger lectures on this, I like to use the example of aviation, because it is an area where we've made huge progress in terms of the human-machine interface, and where the human-machine interface and the interface with the larger system, whether it be a healthcare organization or air traffic control, is critical. When I talk on this stuff in the US, people are sort of raving: oh my gosh, the EHR is killing us, it's all dreadful. And I keep saying, no, it's not the EHR; it's the way the system uses the EHR that's the problem. The EHR can be improved, but you need to really look at this tool within the entire system. I agree. Again, not being a medic: we're moving to much more electronic data gathering, so all of our flight plans are now done electronically; we fill them all in on an iPad and they just get sent off. What's fascinating is that there are no bits of paper hanging around the flight deck anymore. Those little notes that get passed between people when there's a crew handover, those little scraps of paper that had just a frequency change, just little bits of information, are no longer there. Which is exactly as you're pointing out: the data points that a designer has thought about are all recorded. But the paper wasn't just used for recording the data points for getting from A to B; it was also used to record all the extra little bits of information that were there as well. And that's what's missing.
And I think it's very important that you don't lose sight of the fact that situation awareness is built from so many different components. One of the things I talk about a lot when I do other work with medics is something we now call distributed situation awareness. It isn't just about the people involved: the machines you're dealing with have got their own level of situation awareness, because you've delegated responsibility for monitoring to something else, and unless you know what its fallibilities are, you're missing out on an element of the situation. Very modern aeroplanes don't tell us half the problems that are going on below the surface, because they don't deem them important enough to tell us about. We would never even have known about them on a very old-fashioned aeroplane, but these latest-generation ones are monitoring hundreds and hundreds of systems and just don't tell us. Understanding how we communicate with the machines we work with, and how data points are gathered or missed as a result, is, I think, really quite critical as technology moves on. Thank you. Any other questions? We're coming towards the close of the webinar; we've run a bit over time. I hope that you enjoyed it, and definitely stay in contact with us. If you have any interest in contributing, or you want to keep in touch, by all means contact one of us. It's an open culture here, and it's a learning curve for several decades to come. So, thank you very much for your attention and have a good day. Thank you. I certainly learned a lot. Thank you for setting this up, Maud. Okay, can the faculty stay behind a little bit longer? Thank you.