My name is Martin Bromiley. I'm an airline pilot; I've been flying since I was 17, and I fly for a major UK airline. As a pilot, 50% of what I'm trained and assessed on is my technical skill, and 50% is my non-technical skill. [inaudible]
[inaudible] In 2005, my wife Elaine went into hospital for what should have been a routine operation. During the anaesthetic, things went wrong, and Elaine died 13 days later. [inaudible] Why didn't they? But I think for me, that was the defining moment. There was an independent review in the end, and you know the story is now available online. The report, in an anonymised version, is available. But essentially, Elaine was being cared for by an experienced team. They found that things overtook them. [inaudible]
[inaudible] Three doctors were involved in Elaine's care, and they persisted with attempts to intubate, because it was the only solution they could think of. When in fact this had developed as a can't intubate, can't ventilate situation, and teaching would suggest there would be an alternative route, such as surgical access. The three doctors didn't have a shared picture of what was happening. What we term situational awareness was different amongst the three doctors: of what was happening, what it meant, and what needed to happen. The decision making became fixated, which is perfectly normal, we know that. And the communication amongst them dried up. What was difficult, though, I think for me, was that when we had this independent review, discovering that not only did those things happen, but that the team around them, two nurses and two ODPs, operating department practitioners or anaesthetic nurses, could see what was happening and could see what needed to be done. One of the nurses fetched the kit for a surgical airway; another phoned through to book a bed in intensive care; and they dropped hints. But the hints and tips weren't enough, and the hierarchy, frankly, was too challenging for the, quote, junior members in the team, who were actually the ones who could see what was happening.
So these failings, in all these kind of things I've talked about, in these non-technical skills, or what we often talk about as human factors, were the sorts of things that as a pilot I was familiar with. And we know that 75% of aviation accidents are caused by these things. And so to me, back in 2005, looking at this, I was astounded as I started to find out what was done about this in healthcare. And the answer was, well, not much at all actually. As you can imagine, I've reflected long and hard over the years about what happened. And I'll just take you through some perspectives. I've talked about these sorts of things about, you know, issues around, you know, team working and personal stress and their cognitive capacity. And we can focus on those things. And what it does is it drives a focus on the team, on the individuals involved. They must have been bad people. We can't help ourselves. And anaesthetists actually, and I know there are some anaesthetists in the room, are the worst at this, because they look at this and they say, I wouldn't have done that. But you know what? In the Scottish Clinical Simulation Centre in Stirling, they put teams through a similar scenario. It's with a mannequin, it's a knife attack victim. They don't know it's similar. But when they get involved, they often end up following a very similar painful path. And actually, when people have never been in a situation, they don't know what it's going to be like for real. And some good things have happened since then. I've been working and campaigning now for a number of years, but I must mention at this moment that there are some people in this room here who have been on this route a lot longer than I have. Liam Donaldson kicked a lot of this work off in the UK 18 years ago now, with An Organisation with a Memory. And we have people in the room such as Josephine Eclew, who's been at this far longer than I have. But I can only take us back to 2005, because that's when I got involved.
But now we start to understand safety. We start to use cognitive aids, although I think sometimes we think checklists are much better than they really are. We use simulation a lot more. We're talking about team resource management. We talk about these non-technical skills. We do some very specific care bundles. But these are all very focused on the immediate front line. When we look at a definition of human factors and ergonomics, actually it's a lot more than that. We see words on here around tasks and equipment and workspace and culture and organisation. And actually simplistically, it's about these things. It's about making it easy to do the right things for our front line and conversely making it hard to do the wrong things. So this diagram actually is incomplete because there's all these things. And I would argue in Elaine's case that these were not bad people. These were very good people who were doing their very best for Elaine. But they didn't have the benefit of the systems and the training and the processes and the protocols which other industries have. Some of the audience might recognise these two characters, Ayrton Senna, the racing driver who died in a Formula One crash about 25 years ago, and Professor Sid Watkins, who at the time was the chief medical officer of Formula One. I was very lucky to spend an evening with Sid many years ago. He has sadly now passed away. But we spent the evening talking about Senna and the impact it had on him. Up until the point of Senna's death, in the 25 years previous, about one driver a year died in a Formula One accident. In the 25 years after Senna's death, I believe only one driver has died as a result of a racing accident. That is a remarkable improvement in safety. When Senna died, what Sid Watkins didn't do was he didn't go to the racing drivers and say, hey guys, slow down, take it easy, don't take any risks. What he did was he campaigned for subtle changes to the rules. 
He campaigned for subtle changes to the track design. He campaigned with the manufacturers to make safety now a feature of their vehicles. He standardised medical facilities at Formula One tracks. That remarkable improvement in safety is a valuable lesson in system safety, not from a pilot, but from a doctor. This is a model we love in aviation. This is some work from NASA, from Bob Helmreich. At the front line, we often spend our time mitigating the consequences of error. But that's the wrong place. What we want to be doing is trapping error, or even better, we want to be avoiding the error-prone situation in the first place. Yet, as has already been mentioned today, take medication safety as an example. We go off and we buy drugs that do different things, but we make sure they're in packaging that's very similar. Then we give it to the front line and we say, hey guys, here's an error-prone situation, be careful, don't make a mistake, double check, you might get in trouble. We present in healthcare all the time error-prone situations and expect the front line to deal with it. I just use drugs as an example, and I really welcome the focus that we've had today from Jeremy Hunt on medication safety and medication errors. That's really, really important. But we see it in all sorts of equipment around healthcare. This is a genuine example, by the way, from healthcare. It's a kind of an obvious one though, isn't it? It's where a system has been developed that is error-prone. But let me tell you another story which isn't so obvious. About three years ago, in the south of England, a fairly healthy male presented at accident and emergency. He'd had a fit, and they didn't know why. Anyway, they did some tests and they basically said, okay, we're going to fix up an appointment for you to see a neurologist and we're going to fix up for you to have a scan, so we're going to send you home. Here's some paperwork. We'll be in touch.
It took nine months for him to sit in front of a neurologist with that scan, and at the end of that nine months it was discovered he had an inoperable brain tumour. And when a group of human factors experts worked with the hospital to look at what happened, what they found is, from that decision to give him that scan and to see a neurologist, it took 20 separate steps, of bookings in computer systems, of phone calls, of bits of paper passing around, to actually get to that meeting. And of course that's 20 opportunities to make a mistake. What they did at that hospital is they redesigned the system so it now only takes three steps. It doesn't guarantee this won't happen again, but it's a damn sight more efficient, it's cheaper, and it will probably give a much greater likelihood of preventing that. So these processes and systems aren't always as obvious as this. The challenge of course in system safety is that healthcare is complex. Here's a quick diagram that I took off the internet. That's the UK healthcare system. You've got it, great, thanks. Actually, just take a quick look again if you struggled that last time. Actually, it's really complex, isn't it? The thing is, healthcare systems everywhere are complex. It doesn't matter what part of the world you are in, they are all complex. It's the nature of the beast. And this is a very brave attempt by an organisation to put it on one slide, but there's still lots missing. But the important thing here, you don't need to see the detail, is that if we're to develop system safety, where we start to streamline processes and protocols and purchasing and procurement and all that sort of stuff, we have to be able to influence all these bodies, whoever they are, and that's a really big job. Very recently the Secretary of State for Health has talked about the idea that actually, instead of financing healthcare for five years, we should finance it for longer.
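The 20-steps-versus-three argument is, at heart, a piece of compound-probability arithmetic. A minimal sketch, assuming a hypothetical 99% per-step reliability (a figure chosen for illustration, not taken from the talk or the hospital review), shows how quickly the chance of a flawless referral chain falls as hand-offs are added:

```python
# Purely illustrative: the 0.99 per-step success rate is an assumed figure,
# not data from the talk or the hospital review.
def chance_of_flawless_run(steps: int, per_step_success: float = 0.99) -> float:
    """Probability that every hand-off in a chain completes without error."""
    return per_step_success ** steps

print(round(chance_of_flawless_run(20), 3))  # 20-step process: ~0.818
print(round(chance_of_flawless_run(3), 3))   # redesigned 3-step process: ~0.97
```

Under that assumption, cutting 20 hand-offs down to three takes the chance of at least one error somewhere in the chain from roughly one in five to about three in a hundred, which is the intuition behind redesigning the system rather than exhorting people to be more careful.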
One of the advantages that something like that might bring us is that when we start to look longer term, we can start to make some longer term plans. One of the things I would really welcome, I think, in healthcare everywhere is longer term coordinated safety plans, where we can talk generally about the big potential errors, the big potential threats, and some form of coordinated planning of small parts, so that all parts of the businesses can start to think about: what is it we can do to make a small difference here that makes a big difference when put together at the frontline. We need all parts of the system to do those small little things that will make the bigger difference. That's a big job, but with some longer term planning, the possibility of doing that might be there. The second thing I think is absolutely essential in whatever system you're in, and I'm talking about all systems around the world: we need to bring in this kind of human factors expertise to all those organisations, whoever they are, those national bodies, those royal colleges, whatever, so we can start to think in a more systems way. Because the goal of this is to make it easy at the frontline, to make it easy to do the right things. Just a thought on that, by the way. This is a nice diagram from a guy called René Amalberti, who does a lot of work in France. He worked with aviation for many years. He's now working with healthcare. He's worked as well in this country with Charles Vincent, and they've done some great work on safer healthcare. They produced a book which is available for free, by the way. You don't see that very often in healthcare. Excuse me, using UK driving limits here. The model works like this. If you go out on the UK roads, you will drive at the speed limit of 70 miles an hour. I know you will, because I can see you are all very safe people out there. None of you would break the speed limit. But you know what?
You're driving along and you're a bit late, and the people around you are going a little bit faster, and it's okay, isn't it? Isn't that alright, just to go a bit faster than the speed limit? Maybe 80 would be okay. And then we get really late and we push it harder, and we suddenly find ourselves, we have deviated, and we have ended up in a really dangerous place. My observation is that in aviation and nuclear and rail, and this is fairly true around the world by the way, we tend to operate in this green zone. We tend to operate at that 70 miles an hour, because the rules that are written for us make it easy to get it right and hard to get it wrong, and that's recognised. The problem we have had in healthcare for years is that people have got used to working in that 90 mile an hour zone all the time. And they kind of do it not because they're bad people, but they do it just to get the job done, despite the systems, the complex systems, that are around them. And our challenge over the coming 10, 20 years is to design systems that make it not only easy to do the right things, but easy to do them within the rules that we have. And that's a really big long-term challenge for all healthcare systems. And once we get to that point, it will be so much easier for the people at the front line. And a final thought. We've already had mention of the Dr Bawa-Garba case, so let's talk about just culture, because I think for me, having a just culture is fundamental. It's something we strive for in other safety critical industries.
I am going to say, and this is absolutely from the heart when I say this: we have had, in many respects, the best five years we have had in healthcare, because despite the challenges, despite the difficult times that we have, actually we have a Secretary of State for Health who is prepared to stand up and recognise the importance of learning to make a difference, and has been prepared to stick his neck out and keep coming again to safety and again to learning and again to safety and again to learning. And that fixation is a really good fixation, because it's allowed us to start talking about: hang on a minute, is the culture right in healthcare? And the answer, we know, is it's not. And we're now able, I think, to start recognising, in the reaction to the Dr Bawa-Garba case, whatever the rights and wrongs of the actual case, and I don't want to get into that, actually now people are saying, hang on a minute, this isn't right, whereas five, ten years ago people would have just gone, yeah, well, that's the way it is sometimes. And we really don't have a just culture in many respects. We need to have a situation where inadvertent human error is not found grossly negligent, but where people are supported and coached and understand that's how the system is. We must not tolerate genuine, appropriately apportioned gross negligence, and we must not tolerate deliberate acts. And a just culture recognises that, and working towards a just culture is the next challenge I think we have in healthcare. But it must be just for the clinicians, it must be just for the taxpayer, and it must be just for those most harmed. And within that, we have to listen to all the people who are most affected. And sometimes those people might be difficult to listen to. Sometimes they might be people who are very poor advocates of their own position.
Sometimes they might be the most disadvantaged, the most discriminated against; they might have mental health issues and may find it very difficult to express their concerns or issues. And sometimes within the healthcare system itself we may find that there are groups of people who are discriminated against, or groups of people who are assumed to be less important than others. And what's interesting is, in all the flurry about the GMC's verdict around Dr Bawa-Garba, we kind of forget about the nursing side as well. I'm happy in the last few minutes to take any questions. Thank you very much for your time, and just a thank you to everybody here who is working towards making healthcare safer. Thank you.