When I look back at my history of working in safety in particular, and in occupational health and environment, I realise that we've gone through a number of step changes. The first big step change organisations seem to make is when they're faced with a massive disaster of one form or another and they have to really start to take this seriously. That leads to the next stage of saying, well, we're doing all sorts of good things, but we don't know if we're doing the right things, whether we're doing too much or too little. So we get to the next stage, where we start getting organised. We work out what our biggest problems are, our greatest risks; we work out what problems we can put aside because they're very infrequent. The next step change is the one that everyone these days really wants to make, and the one they're finding quite difficult to make: moving from being organised to actually having the performance you really want to achieve. One of the things driving that is the realisation that you can actually do better than you are right now, and that means moving to being ahead of the game rather than being driven by events. The final step change we can identify is the one where you really want to integrate, to get everything together: you need to understand how you're doing it, what you're doing, who's doing it and whether you're being successful, and then get permanently into a state where you think, this can't be as good as it gets, we must do better, we're not doing as well as we should. The other way to make that original step change is when regulators basically step in and say, either you change or we're going to make your life a misery. But many organisations haven't needed that strong focus from the regulator, because they realise themselves that what they're doing is not very good.
The problem is that in that reactive stage you're always waiting until the next thing goes wrong, and if things are going okay you think you're doing quite well. The only thing that drives the next step change is that you don't get many of those relaxation moments when things do go well, and you get an awful lot of them when they don't. The next step you really have to consider then is: wait a minute, we're just reacting to events as they come along. What are the most frequent events? What are our biggest problems? But there are also the much less frequent major process disasters, and those are the ones we really don't quite know how to deal with, except that we just hope we've got a system robust enough that they don't happen very often. We actually do get a serious improvement when we start thinking about how we're going to manage our safety. And what we quite often hoped was that if we managed personal safety, process safety would come along with it. As it turned out, this isn't always the case. Big disasters like Longford, BP's Texas City and the Deepwater Horizon have shown that simple reliance on personal safety wasn't quite good enough. But once you've got that organised, and you've got your risk assessments done and your priorities set, you're very successful. It does work; it makes a big difference to performance. And the people who used to say, hey, we wish we could reach that level of performance, have now reached that level, and they say this isn't good enough, we've got to get better. And that's much, much harder: it's moving to a proactive situation, where what you're trying to do is get ahead of the game. Then the final step change is the one where you really want to make sure that all this is sustainable, so that even if everybody got up and went away, new people coming in and carrying on the operations would be just as safe and would carry on doing it the same way as they are today.
When I think about the critical elements of my model, I start, of course, with the pathological. The pathological is the only one which really is not a culture of safety. It's a culture of get the job done, however you like, and don't get caught. And if anything goes wrong, we know who to blame, and it's not me, it's somebody else, it's the victim. The remaining elements, the steps on the ladder, are all cultures, but they're distinct cultures, which I would call cultures of safety. The first one is the reactive, which is basically where we wait until things go wrong and then try and fix them, and then we wait for the next one and try and fix that. The problem there is that what happened last time, what happens this time and what will almost certainly happen next time, quite quickly, is that we have the same immediate and direct causes. Usually there are people doing things, so those are the people you originally try to blame, because they're the only thing the events have got in common. The next level we move to is now called the calculative. It used to be called bureaucratic, but I didn't like that, because I could imagine people saying we're being very calculative today, but they'd hate to say I'm being bureaucratic. That next stage is one where, in fact, you've got systems and processes: you get organized, you set your priorities, you get your resources, you make sure you've got your training, you do your risk assessments. The one after that, the next level up, is the proactive. The proactive is the one that everybody really wants to attain these days, and it's where you're dealing with problems before the problems come and attack you. The calculative, by collecting a lot of data, is still inherently reactive; it is still waiting. The proactive is really looking at what's the next thing coming down the line: rather than trying to fight yesterday's battle, it's trying to win tomorrow's skirmish.
And finally there's the generative, where everybody's doing their own job: you do your job and I'll do mine. A lot of power that is still held at the levels of upper management and line management in the calculative and proactive stages is now dispersed down to the level of the workforce, because the workforce are really the experts in how to do it safely, and the job of management is to make sure the workforce gets what the workforce needs. The ladder has got five levels on it, five treads, but they're not really discrete points. They represent clusters of attributes and behaviors within the organization. It's much more dynamic; there are lots of different points. We distinguish 18 dimensions for personal safety, and when we add in process safety, we add about another 10 dimensions. You can be at different points on each of those dimensions, but together they form a cluster. What I often find very useful is thinking about where you are on the ladder not as a point but as a footprint. The main weight of the foot may be carried in the middle; typically it's going to be somewhere in the calculative area. We have processes, but we nevertheless still manage to exhibit very, very clear reactive behaviors: when some sorts of things happen, we just react as if we've been stung or bitten. That reactive part is the heel of the footprint, the main weight is in the calculative, and then up at the front, in the proactive, there are a few bits and pieces, parts of the organization, which are really scrabbling to try and get ahead of the game. So when you want to understand how an organization's culture is operating, it's not a single point. It's a whole series of points, and they're dynamically moving: people are getting better, and sometimes people are getting worse.
People often think that the only way is up, but if people are left on their own and are not supported in the appropriate sorts of ways, they can also go down the ladder. If you stand back a bit, you immediately realize that those cultural characteristics of organizations are much, much broader than just safety. They refer to how we do the finance, how we go about dealing with our customers. What makes the ladder specifically relevant for safety is that the transition going up the ladder is a transition in the level of understanding of the risks and hazards being faced by the organization. This applies to safety; it also naturally applies to the environment, to occupational health, to security, and probably even to finance. The real realization is that down at the bottom of the ladder, you really don't understand your risks, you haven't a clue, and the best thing you can do is shut your eyes and hope it'll all go away. In a well-regulated world, where other partners and other players are doing it well, you can get away with it. It's rather like a bad driver in traffic: a pathological organization can be like someone who's doing terrible things on the highway and doesn't cause an accident only because everybody else is avoiding them and making up for their bad behavior. As we go up the ladder, we move to a basic, simple understanding of what the risks are, then to a slightly more nuanced idea of what the risks are and where they are coming from, up to a full understanding, not just by the people at the top, not just by the safety department, but by the people who are facing the hazards and the people who are managing them, of exactly what those hazards are, what makes them more likely to be a problem, what makes them less likely, what's the best way of controlling them, and what are the things that we actually don't need to do.
And so when you get to the top, you've got a lot of nuance, and you can quite often avoid having to do some of the things that you have to do lower down the ladder, because failing to understand what we're doing means that we really have few choices: we can't be nuanced. One of the ways I think about operating in a risky environment is that it's a bit like a bull's eye. There's a bit in the middle where there's basically no risk: it's inherently safe, and it doesn't matter what you do or how badly you behave, but the returns on investment at that point are pretty minimal, because everybody can do it and anybody can operate in that particular part of the space. As you move out a bit, you move into an outer ring where the risks are pretty normal, they're standard, we understand them; not everybody wishes to take them, so we can make more money, we can get better returns, we can do it and we do it well. And the better you're getting at doing it, the further out you move towards what I call the edge. The edge is where it gets very exciting, but if you fall over the edge, that's when you have an accident or a major incident. What is interesting about thinking about things like high reliability organizations, and proactive and generative cultures in general, is that they enable you to operate out close to the edge, for two reasons: one is that you can operate there at all, and the second is that you've got pretty good systems for telling you where the edge is, and your operating processes keep you away from it. So for organizations that, in a harsh commercial environment, have to sweat their assets, it's absolutely vital to do this kind of stuff well. When you're sweating the assets, you'd better be jolly good at what you're doing, rather than just doing what the bookkeeper's told you. One of the natural questions you can ask is, doesn't all this safety stuff just cost money? And the answer is no, it makes you money, but you've got to get your head round how it
does it. And I think that's very important, because typically what people do is complain: they have to do this, they have to do that, safety's just a cost, it gets in the way of doing the business. But the reality is that if you've got your safety right and you can do it safely, then you can go in and do interesting, exciting, dangerous and, dare I say, profitable things, because you're good at it, you know what you're doing and you know when to back off so you don't get hurt. Whereas if you're not very good at this, you don't know when to back off, and you may not have the nerve to do the really exciting stuff either. To make the argument, I often do this, especially with senior people like boards. I have a figure in my head, which is that roughly 10% of turnover is wasted on poor performance in areas like OHS, environment and process safety. So if you're turning over 20 billion a year, 2 billion is vanishing in smoke because you're not actually managing it very well. Now, people disagree with me, and I've had two types of disagreement. One was from a friend of mine from a very big company that makes an awful lot of profit, and he thought I was entirely wrong. I said, well, what figure should it be? And he said, 15%. So the trick I use is to say to people, well, I may be wrong, and if they object, to say, well, then you must have the figures, so you know what the figures are. They'll usually retreat in some confusion, but insist I still can't be right. And I say, okay, that's fine, it's only guesswork anyway, but the conversation has started, and we can use a spreadsheet where you look at the costs of different types of accidents and different levels of consequence. We say, you fill in your own data: you fill in the likelihood that these kinds of incidents are going to happen, from unlikely to very unlikely to almost impossible, and you fill in whether they're going to be really expensive or just a little expensive. And then, when you put it all together, lo and behold, you come up with something that
looks like 10% of turnover, but they're your figures, not mine. By the time you've got people to that level of understanding, the finance people's only complaint is, why didn't you tell me this earlier? And so, all of a sudden, the finance people can become the safety people's best friend, rather than what they thought was their natural enemy. It's quite interesting: the way we go about it is to create what looks basically like a risk assessment matrix, where the cells contain, for different types of incident, the costs from a level 5 total disaster down to a level 1, which is almost just a near miss, with level 0 hardly counting as an incident at all. In the case of the oil industry, we had a quick rule of thumb that a total platform loss was going to cost about $1.6 billion, whereas at level 4, where more than one person is killed and there's major asset damage, you're probably looking at something more like $160 million, an order of magnitude less. Now, losing the total platform is very unlikely, so the expected cost you're actually exposed to is the product of a very small probability with a very large amount, and it usually comes down to maybe a couple of hundred dollars on an annual basis. What we discovered, which was interesting, was that the place where all the money is vanishing is not the big headline events. We're actually quite good at managing those most of the time, although we could still be better. It turned out to be the level 2 and level 3 stuff, which is typically regarded as hardly worth reporting more than a little way up the line; it gets aggregated, lost, and isn't considered worth bothering about. It's what I call the death of a thousand cuts: that's where almost all of that 10% exposure actually comes from. When you see the figures looking at you, you say, ooh, now we know what we can do, we can actually do something about that, and if we're clever we'll manage it in a way that the bigger, more headline items get covered at the
same time as well. One of the big problems we face in today's world is that litigation scares people to the point where they think they shouldn't say what's going on, they shouldn't say what's happened to them, because they're afraid that if they get into court they're going to be in terrible trouble. Now, I think this is actually misunderstood. The really crucial discovery is that probably your best defence in court is the realisation that things will always go wrong; life is not fair. What counts in court, and what counts at least in the spirit of the law, although sometimes the letter of the law might need tidying up a bit, and I'm not just talking about Australian legislation, I'm talking about America and Europe as well, is: were you trying? If you were trying hard and doing your darnedest to avoid an accident, and nevertheless you just got caught by something which came completely out of left field, then you really should be able to get off. You might still be required to compensate the people. But the problem is that if people are terrified by litigation, then what can often happen is that they say, don't tell me, I don't want to know, I don't want to hear. One of the things you have to realise is that you've got to do certain things because that's what's expected, because it's the right thing to do. I'll give you a classic example, which we discovered in the Deepwater Horizon, a case I've been personally involved in as an expert witness. It turned out that BP had what I would argue was the best safety management system in the world at that time, called OMS, and they had developed OMS as a specific response to the Texas City disaster. We thought they hadn't rolled out OMS in the Gulf of Mexico because that was a difficult region and they'd done it elsewhere. It turned out BP had rolled it out in the Gulf of Mexico, but what they did was roll it out on their own assets first, and they left the
non-BP assets, like Transocean's Deepwater Horizon, to a later date. So what they did, and what demonstrated that they failed to exercise their true duty of care, was that they took the active decision not to implement their management system, OMS, on that particular well. What I was going to argue in court was quite clear: the system was so good that if they had implemented it, it would have prevented the disaster. Now, if they'd failed to do it because they just forgot, or because they were in a hurry and it was literally coming along the next week, that would be understandable. Where they went wrong was that they actually had a risk committee and took the active decision not to implement it. So don't do this at home, folks. One of the things people want to do is find out what their culture is like, and the standard way of doing this is to carry out a safety culture survey. The problem with that is, first of all, they're big, and everyone's supposed to fill them in; also, they're really attitude surveys, and the problem I have is that people know what answers to give, and they may well give the answers to achieve the results they want to achieve. I've seen surveys filled in by groups of people who had a very clear message they wanted to send to their management; it wasn't about safety, it was about their relationships, their industrial relations. But leaving that aside, we also have complications because you've got 150 questions, and you have to ask, what are we going to do once we've got the data? What does a 3.8 mean on a 5-point scale? What does a 4.2 mean? They are useful, but they're not that useful, I find, at least. What they do is fit with the requirements, for instance, of the UK Health and Safety Executive's definition of safety culture in terms of values, beliefs, attitudes and behaviors with respect to safety, which is a perfectly good definition, except it doesn't really capture values too well, it doesn't capture beliefs at all, it's very good on
attitudes and somewhat weak on behaviors. But there's a paradox I've discovered as well: if I know what your values, beliefs, attitudes and behaviors are from your questionnaire, from your survey, I may not necessarily be able to predict exactly what you're going to do when you're on your own at 3 o'clock in the morning. This, as I have discovered, is the classic situation of aircraft line maintenance: it all happens at 3 o'clock in the morning, with a single engineer working on their own, trying to make sure that everyone stays safe. If, on the other hand, I observe you behaving at 3 o'clock in the morning, I can work out pretty accurately what your values, beliefs and attitudes are, because I know your behavior. So one of the things you really need to concentrate on is working out what people actually do, rather than what they say they do. For a lot of senior management, their behavior is saying the right things rather than necessarily doing the right things; they're very good at talking, but they're not always quite so good at walking. So the way we try to assess safety culture is, rather than giving people very carefully crafted single-item questions like "my supervisor tells me when I'm not behaving correctly", we use what we call rich descriptions, where people can say, that's us, that feels like us, that's what we are like. These can be around questions such as: what is the status of the safety department? What are the rewards of good safety performance? How do we do audits? How do we communicate, and who communicates? With a sentence of three or four you can put a description together along one of those dimensions for each of the five steps on the ladder, and then people can pick one of those and say, that's us. That's how the original tool we used in the Hearts and Minds program, going right back to the earliest study in 2000, worked. But what we also discovered was
that, because of how we made the tool, people would say, well, we're a bit calculative and a bit proactive, it's somewhere in between the two. So we made a system and we scored those, and that's where we left it for a long time. But I became dissatisfied, and I realized there was something going on: people were not picking the description of where they were, they were picking a description that also reflected where they would like to be. Just for themselves and for their colleagues, they would like to feel that their workplace wasn't as bad as they were tempted to score, and so they edged it up a bit. What we found was that the scores on these tests were probably being heavily influenced by the effects of self-esteem and aspiration rather than the actuality. So I decided, single-handedly, a few years ago, to change the way we measured, and it turned out to be very useful and very insightful. What I did was say: I'm not interested in measuring where you are. Read the descriptions and choose those descriptions where you think your organization could reasonably be 24 months from this day. I say 24 months because 24 months is long enough to think you might actually be able to effect a change within an organization, and short enough that you haven't been moved away from your job; it's just a way of anchoring people to a point in time which is not tomorrow, but is not 10 years down the line either. So I said, just pick those, what I call aspiration scores; we're not interested in where you are. They do this, and all of a sudden they're not saying, well, we want to be a bit proactive but also somewhat generative: 99.9% pick one box out of the five and say, that's us, we reckon we could be that, that describes us really well. So we get a new score, and a profile, that footprint with the heel and the toes, usually calculative and proactive. And now what we've got is a gap. So then we pick the most impactful gaps, the ones where you think
you've got the best chance of success, and we say, let's work on those. What you're now doing is picking quite concrete steps that will be exhibited as an organisation, rather than simply going around saying, what we need is better values around here. These tools and approaches have typically been developed with large, resource-rich industries, like the oil and gas industry and aviation, and the question often arises: what about the little guy, where they're all working their tootsies off because they've got to stay in business, and they can't spend a lot of time going around filling in paperwork for people, because they've got a job to do? In fact, in some ways it's easier, because you've got fewer people to persuade, fewer people to work on, and they know each other. When people know each other, they know what other people are good at and what they're not so good at. If we can get everyone into a shed, or, if it's an aviation operation, into a hangar, then what we should be doing at the end of every day is asking: what went well, what went badly, why did it go wrong, and what are we going to do to make sure we never get into that problem again? And sometimes we say, well, we thought we'd fixed it and we haven't, so we'll have to try again. What you realise with small organisations is that they can do this if they are given enough time, and I think one of the problems, quite often, is that clients don't give their small contractors enough time to become better. One of the things you can do, if you're a bigger company, is actually make an investment in your contractors by saying, take some time at our charge, and we may be talking 10 or 20 minutes in a day, or even half an hour in a week, to ask: what are the things that we could do that would make us better next week than we have been this week? When I think about progression up the ladder, which is why people really approach me and ask if I can help, there are 5 steps on the ladder and there are 4
arrows: the arrow from pathological to reactive, from reactive to calculative, and so on. What we're trying to do when we're getting better is make a transition over one of those arrows. And I discovered that there's a very simple structure which helps me a lot when I'm trying to advise organizations on how to do it and trying to design plans for improvement. It comes from the first realization, when I was talking to organizations: they would say to me, we're pretty good, we're definitely heading up the ladder, we're heading towards the higher reaches. And I'd say, yeah, I'm impressed, it's pretty good stuff. I would say that, because they're paying me, but I'm tricky: why are you so good? And they'd say, well, we've got this in place and that in place, and this in place and that in place, and I'd hear "in place" coming in like mortar fire from the enemy trenches. And I'd say, yes, again I'm deeply impressed, just one question. They'd say yes; at this point they're beginning to know I'm tricky. This one question: are you using any of it yet? Is it in operation? Ah, they'd say, we're going to; we've got a plan, we've got an implementation plan, we've got a work group, and we're starting next week. I'd say, good, so you're not actually using it yet, but you're going to. Or sometimes: we're using some of it, but we're still planning on using some more. So the transition from reactive to calculative is taking the stuff that you put in place when you stopped being purely pathological and actually getting it to work. We have standards, but we actually use them, as opposed to having them sitting on a shelf looking bright and shiny but not actually influencing anything. Then there comes another problem. Okay, so we're using them; I say, are they any good? Well, they say, we've got a few processes that really don't work well, but we don't dare stop using them, because that would show a lack of commitment to safety. I say, well, wait a minute, why aren't they working well? They're not very good. So the
realization I came to was that there's another transition: the transition from calculative to proactive, which is a difficult one, of making what you've got effective. It's taking what you put in place and then making sure you actually achieve the performance, the results and the behaviors that you intended when you put the stuff in place. If we look into the future, there's one thing we know: things are going to change. What we've got to do when we change is recognize how we're moving, as we change, into the way in which we're going to operate with the world. We may slip back into a reactive mode because we don't actually understand how our new technology is working, but if you understand that you don't understand, then you're already beginning to get a head start. In my ideal world, people would move up because, as they start to design new approaches to work, they would design in how the organization is going to handle the changes, not simply in terms of classic change management, but much more at the cultural level as well. And one thing I can guarantee is that if there are major technological changes and these cultural aspects are not considered, you're going to get a few massive disasters along the way.
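The expected-cost arithmetic behind the risk-matrix discussion earlier, where each incident level's annual exposure is the product of its likelihood and its cost, and the aggregated level 2 and 3 events dwarf the headline disaster, can be sketched roughly as follows. Only the $1.6 billion and $160 million figures come from the discussion; all the frequencies and the lower-level costs are invented for illustration, not real industry data.

```python
# Rough sketch of the expected-cost ("risk matrix") arithmetic described above.
# Costs for levels 5 and 4 are the rule-of-thumb figures from the text; the
# annual frequencies and level 2/3 costs are illustrative assumptions only.

incident_levels = {
    # level: (cost per event in dollars, assumed events per year)
    5: (1.6e9, 1e-7),   # total platform loss: huge cost, very unlikely
    4: (1.6e8, 1e-5),   # multiple fatalities, major asset damage
    3: (5.0e5, 2.0),    # assumed: moderate incidents, a couple a year
    2: (2.5e4, 100.0),  # assumed: minor, "hardly worth reporting" events
}

# Expected annual cost per level = probability (frequency) x consequence (cost)
exposure = {lvl: cost * freq for lvl, (cost, freq) in incident_levels.items()}

for lvl in sorted(exposure, reverse=True):
    print(f"level {lvl}: expected annual cost = ${exposure[lvl]:,.0f}")
```

With these assumed numbers the level 5 catastrophe contributes only a few hundred dollars a year of expected cost, while the frequent low-level "death of a thousand cuts" events account for millions: the same pattern the spreadsheet exercise with a company's own figures is described as revealing.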