Hello everybody, and welcome again to another OpenShift Commons briefing on Transformation Friday. Today we have a wonderful speaker and guest, John Willis from the Office of Global Transformation here at Red Hat. You probably know him as Botchagalupe — I think that's how you pronounce it. There you go. And from his work in DevOps and The Phoenix Project. So today he's going to walk us down an alley, I think. I've watched him do qualitative data analysis from lots of notebooks and notes and Post-it stickers and all kinds of things — but today it's how we take that process and use some computer-assisted methods to do it as well, and how you can take that into your organization. So I'm going to let John introduce himself and introduce the topic. There'll be time at the end for live Q&A — just ask in the chat. And with that, John, take it away.

Great, thank you, Diane. Yeah, so, you know, fancy title: qualitative data analysis for digital transformation. So I'll go through why — why this title. It's something I've been doing for probably three or four years now, and just recently I've gotten more sort of academic and prescriptive about it. That's primarily because I now work with Jabe Bloom, who has a PhD in Transition Design. He's shown me a lot about how the stuff I've been doing over the years can actually be enhanced through some well-known techniques and processes. Anyway, if you've been paying attention on Fridays, you've probably seen the four of us. We're called the Global Transformation Office: that's Andrew Clay Shafer; Kevin Behr, who co-authored The Phoenix Project; that's me, the short guy; and then Jabe Bloom, who I just talked about. Andrew likes to say we wrote the books — or wrote some books, I guess. You know, Kevin, The Phoenix Project; I did a collaborative project with Gene called Beyond the Phoenix Project; and I co-authored The DevOps Handbook.
In fact, we're talking about the five-year anniversary of that book coming up next year — a revised edition — so pay attention. Andrew wrote chapters in Web Operations on site reliability, and Kevin was one of the original authors of Visible Ops. I was an advisor on The Unicorn Project; I'm not really credited as an author. But just quickly on myself, again, if you don't know who I am: I've spoken a couple of times on Fridays, I'm part of the GTO, and I've just listed a couple of the publications I've worked on. Back in the day I used to write IBM Redbooks — I probably have 10 or 11 books on my resume over many years. The two I think are really interesting I'm not even covering today: normally I talk about automated governance, and there are two papers out there from IT Revolution on that, and both of those are Creative Commons. And then I've only listed the last 12 years or so. I've worked for a lot of companies — I think I've done like 10 startups — but I sold a company to Docker, I actually started my career at Exxon, I'd say Red Hat now, sold a company to Dell. All good stuff.

So, you know, I've been doing this for a long time, right? My career spans — I wrote IBM mainframe assembler code back in the 80s, right? Then I went through the first wave of distributed computing, and I spent many years supporting a reasonably successful services company based on the Tivoli portfolio — the first crack at distributed computing — and then I moved into the open source world with Puppet and Chef and, you know, all the other great things, including then cloud. So this notion of how do you help people improve, right? Like, this is the golden ticket, if you will. And so about three or four years ago I left Docker to become independent, and I was really working as a one-person shop — one salesperson: myself.
And I went in with this whole notion that I was going to use all these prescriptive ideas that I had learned and used — learned from a lot of friends. Being in Gene Kim's tribe, if you will, you get to meet leaders across some of the biggest industries; they're all friends of mine. And one of the things I found is that I started evolving my practice pretty quickly. For example, I would come in with lean value stream mapping to do this, and I realized there was a lot of other stuff I really wanted to get to — think about trust — that I needed to understand about your organization first, and these frameworks got in the way. And, you know, I'm a big fan — I'm just calling them frameworks, I'm not sure what you call them — of things like lean value stream mapping. I think it's an incredible tool. I also think it's the worst tool to start with, and I'll explain why. And horizon-based stuff, or Zone to Win — I don't know, maybe eventually I'll agree with it; I think it can be used where appropriate, whatever.

The second thing that I think gets in the way of transformation is this sort of impersonal approach: some large organization comes in and says, these are the five things you need to do, or you can't do Agile unless you co-locate — I mean, just these ridiculous statements. They're based on industry doctrine. The reason I call it impersonal is that you're not asking or including the people who are going to be affected by the change in the change. And last is the notion of mental models, and I'll go through each one of these really quickly. Everybody has these different mental models. Think about criminology, right — the famous stories of three witnesses to a crime.
And when they go to describe what they saw, one says the person had red hair, another says blonde hair, you know. So, looking at these three things — this is something I've always said; in fact, I came to this conclusion when I started decoupling my practice from the idea that I was going to do all the frameworks — you can't Lean, Agile, SAFe, DevOps, SRE, or Zone-to-Win your way around a bad organizational culture. You literally have to get some understanding of the organization from the people — I say the people at the edge, the people who do the work.

Second, I did an interview with Christina Maslach, who is the foremost organizational burnout expert in the industry. She's written books; there's a canonical test named after her, the MBI, the Maslach Burnout Inventory. She made a comment in the podcast, so I took it as a quote: whenever we're talking about any kind of change or improvement, you're counting on a bunch of human beings to make the change happen, and if they haven't been part of figuring out how to do it, the change effort will be doomed. So you have to, at some point, include the input of the people who are going to be affected by the change. We don't do that a lot in our industry.

And third is this notion of mental models. We go back to Peter Senge, The Fifth Discipline — pretty much the father of systems thinking. Mental models are deeply held internal images of how the world works, images that limit us to familiar ways of thinking and acting. Very often we are not consciously aware of our mental models or the effects they have on the way we think. So the net-net is that these mental models actually get in our way when we're trying to transform, because different people hold different ones. I did an exercise once where I was talking to an organization.
It was a group of people, and I asked three people who worked together on the same service to go into three different rooms and whiteboard their service — how it looked. And you wouldn't be surprised that they were all different. I mean, they all had the same general idea, right? So this idea that people's mental models differ applies to concepts, even words — taxonomy in an organization. People go into a meeting, and the intent of the meeting is, say, compliance, right? And among everybody in the room there might be three different definitions of compliance, and nobody knows that. And so the meeting doesn't really go anywhere.

So then, qualitative data. I think it's important to quickly mention the difference between the quantitative approach, which has been taken so far in DevOps and transformation, and the qualitative approach that I'm suggesting — I've been using it, and I think it works very well. In a quantitative approach, you start with a generalized theory and use correlation toward specific conclusions. It's deductive, if you will, right? You derive specific conclusions from general principles or premises. It's typically numeric — statistics. And in our industry it's usually been surveys, so we'll talk about some of the surveys. It's impersonal — like, when you're taking a survey, you don't get to ask, well, what does that mean? Or, what if the answer is "it depends"? And it's closed-ended, right? There's always one answer: it's either strongly disagree or strongly agree or somewhere in between, or it's multiple choice — you know, "I deploy between six months and a year." I've got multiple services — what do I do? So let's take an example.
Probably the canonical quantitative approach that's been used in DevOps, which has been very successful, is DORA — now Accelerate — which has been a psychometric survey. There are a number of patterns it addresses, and I don't want to over-simplify, but in general the industry has accepted that there are basically four variables we look at, and we describe organizational performance based on them. The idea was that the authors of these surveys — and there are multiple ones, but the most prominent is by Nicole Forsgren, Jez Humble, and Gene Kim — had a theory: if you have shorter lead time, more frequent deploys, a lower change fail rate, and a quicker time to restore, you're more likely a high-performing organization, and vice versa for a low-performing one, right? And so for six — I think seven years now — the idea has been to collect a lot of data through surveys and prove that theory out. And then you'll see things like: high-performing organizations are 100 times faster in lead time than low-performing organizations. Right. And so, for example: how often do you deploy — more than six months between deploys, or between a month and six months? Again, I'll do the pros and cons in a minute, but the point is: if I have more than one answer, or if I don't know what you meant by "on demand," I can't ask you. Right? And I don't know the context of your specifics. Are you a one-person digital side project in a large 80,000-person company, where you're the only person who maintains your thing and you deploy all the time? Is that representative of that organization — you know, the large bank? So, the pros of a quantitative approach: it's definitely easy to administer. Right.
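The four-variable theory described above can be sketched in a few lines. This is a minimal illustration only: the thresholds and the bucketing logic here are my own assumptions, not the published Accelerate cluster cutoffs, which come from statistical analysis of survey data rather than fixed rules.

```python
# Illustrative sketch of bucketing an organization by the four
# DORA-style metrics. Thresholds are hypothetical assumptions,
# not the published Accelerate cutoffs.

def classify_performance(lead_time_days, deploys_per_month,
                         change_fail_rate, restore_hours):
    """Return a rough performance bucket from the four key metrics."""
    high = (lead_time_days <= 1 and deploys_per_month >= 30
            and change_fail_rate <= 0.15 and restore_hours <= 1)
    low = (lead_time_days >= 30 or change_fail_rate >= 0.45
           or restore_hours >= 24)
    if high:
        return "high"
    if low:
        return "low"
    return "medium"
```

The point of the sketch is the shape of the theory — four observable variables jointly predicting a performance label — not the specific numbers.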
I can administer it to thousands of people. I mean, it's harder to administer inside a single company — I've tried that, and in fact it's why I went to a qualitative approach; I found a quantitative approach inside an organization was very difficult, but I'll save that for another presentation. You get more data. Again, the industry surveys have been very effective for us in our industry: finding out general theories of things like how high performers work versus low performers. It's helped us gauge initial ways to create outcomes. There's more data, it's objective, and it's based on a scientific method. The cons: like I said, it's impersonal. I can't look at your face and see that you didn't understand my question, I can't drill down on the question, and you can't answer "it depends." It's closed-ended. I'd say it's theoretical as opposed to empirical, which I'll explain when I get to qualitative. And like I said earlier, it is context-specific.

Now, qualitative moves away from theory driving the data to an approach where the data drives the theory. Right? So it's abductive. Jabe says it's like a murder mystery. The quantitative approach, Jabe would say, is deductive — I think I got that right — it's really going to come out with one answer. Whereas abductive — or abductive-inductive, sorry — is basically more like a murder mystery, right? I can sift through a lot of different ways of you telling me stuff. And again, to me, that's the power, because these organizations are just too complex to create one single answer from — or 50 answers to 50 questions. It's categorical. So here, again, what we're trying to do is decouple the mental models. Right.
So what I'm saying is: if I can look at the answers to interview questions from multiple people, right, I can do an abductive process and get a better understanding of what this really means, and I can roll things up — I'll show you examples. It's interpersonal. Again, I can go back and forth; you can ask me, what the heck do you mean by that?, and I can ask, do you know what I mean by that? And it's open-ended. So, as opposed to an industry doctrine based on a quantitative approach — the DORA, Accelerate, lead-time sort of thing — I would say (and I actually have seven; I call them the seven deadly sins) that most organizations I have worked with, been in, and interviewed — or just based on our general industry experience, even collaborative experience with other people — most companies struggle with invisible work, visibility, consistency, capacity, and toil issues. Right. So in this qualitative approach — my approach now — I basically set these as the things I'm going to try to tease out (I actually have seven, and I'll show them a little later). And so here's an example: a question might be, what is the audit process like in your organization? One person might say, they're terrible, they're horrible. Another person might say, you know, they waste about 30 days a year. And then this is the one I love best: "we don't tell the auditors anything," you know? Because it allows me to — well, here are the pros and cons: it's empirical. I can use multiple observations to drive the process of understanding what they're saying, and I can link things together. They're verifiable observations. It's open-ended. I would say it's combinatorial. Right.
So, going back to mental models: I can use the objective example of trying to figure out, hey, this person said this — and even more importantly, for example, an overloaded term, sorry, something like compliance. I find that different organizations have different meanings for things like governance, risk, and compliance — individual meanings — and then even within the organization itself they might have their own names. One group might call everything risk; another might call everything compliance. So if I'm having these conversations, these interview processes, I can actually take notes and figure out what I thought you meant by compliance, what it really means from an industry perspective, versus persons three, four, five, and six. And then, through the rigor of the qualitative approach, which I'll talk about in a minute, I can come up with a theory: I believe company X's one word for this type of thing is risk, and here is how it's addressed. Right. Obviously it's harder to administer, there's less data, and it is subjective — but at the end of the day, all of these processes have some subjectivity to them. Again, there's more rigor in crunching numbers and doing statistical analysis and clustering and all those things, but at the end of the day there's still interpretation. What qualitative gives you is verifiable observations.

So here's an example, really quick. I'm going to show you a tool that I fell in love with. It's called MAXQDA — MAX Qualitative Data Analysis. There's a category of tools called computer-assisted qualitative data analysis software. And if you look really quickly: you load in all the interviews and the transcripts, and then you basically start tagging — what they call coding — different areas.
The approach I've been using here is called grounded theory. It's one of multiple methodologies — again, I'm not going to profess to being a PhD qualitative analyst; I've read a lot about it, and I fortunately have Jabe to help me understand it, but this is the approach I use. The implementation of the approach is really this: using the qualitative data — the interviews. I basically take all the interviews that I do, I take the transcripts and the notes, and I load them in as artifacts, including lots of other stuff like letters and notes from emails that people sent about the process. And then you do this thing called coding, where you highlight, like you saw on that last screen, and assign — in my case I'd call it a temporary definition — to it. And then codes roll up into what are called concepts, and then categories, which is where you get the output. So think of industry doctrine as where I start, and my industry roadmap of recommendations as how I ground it. What I'm saying is: I sort of know the things that are generally wrong with a company, and I know the things that generally should be done to fix a company. What I don't know is how to connect that to the data that's driven by the organization, and I do that through a qualitative data analysis process. So, what you see here: codes are key observations, concepts are groupings of similar codes, and then categories — there's this roll-up process. The interesting thing is, when I sit down with a CIO and they say, well, John, I don't know if I agree with this, I can say, okay, we can walk back through the category and the concept, and we can actually find the paragraph or the sentence of what was said about this. Now, I always delete names, and I always work in the aggregate.
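The roll-up and walk-back just described can be sketched as a small data structure: coded segments (raw quotes) roll up into concepts, concepts into categories, and a category maps to an industry-doctrine recommendation — while remaining traceable back down to the quotes. All the names, quotes, and mappings below are hypothetical, invented only to show the shape of the process.

```python
# Sketch of the grounded-theory roll-up: quote -> code -> concept ->
# category, with a category-to-recommendation mapping. All data here
# is hypothetical illustration.

coded_segments = [
    {"quote": "Audits take us weeks and prove nothing.",
     "code": "audit pain", "concept": "audits are inefficient",
     "category": "risk"},
    {"quote": "Nobody knows which controls we actually checked.",
     "code": "control opacity", "concept": "audits are inefficient",
     "category": "risk"},
]

recommendations = {"risk": "automated governance"}

def trace(category):
    """Walk a category back down to the raw quotes that support it."""
    quotes = [s["quote"] for s in coded_segments
              if s["category"] == category]
    return recommendations.get(category), quotes
```

This is what makes the CIO conversation possible: `trace("risk")` returns both the recommendation and the underlying evidence, so a disputed conclusion can always be walked back to what people actually said.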
It's a beautiful process when you get disagreement from a CIO. Like, my favorite is: your audits are theater — your audits are terrible, they don't really connect the dots. And: oh no, John, I'm not going to accept that. And I'm like, okay, let me show you, like, ten examples of why I came up with this. And this gets back to the data. It's not XYZ Corp coming in and saying, we're smart, and you do these five things and you'll be successful. It's: I have an idea of what's wrong with you; let me listen — let me follow the data, which is basically interviewing a bunch of people — and then I'll tie that to industry-doctrine solutions. So here's an example of grounded theory — you saw this earlier, but now I've got it attached to the methodology. The code might be the sentence somebody said in answer to a question. The concept is "audits are inefficient." The category is risk. And then in this case the recommendation might be automated governance. So I can walk in and say, you should do automated governance. Right. And I'm probably going to be right nine out of ten times — or maybe eight out of ten. But now I have absolute confirmation. And to go back to the impersonal point: now people feel like they were heard. So if the organization comes in and says, we're doing automated governance because we listened to you, and we heard that audits are terrible, really hard, and waste a lot of time, everybody involved is like: yeah, that was our input. Awesome. As opposed to a Big Four firm coming in and saying, you must do automated governance — and then all of a sudden they're doing all this stuff and it's like, yeah, I don't know, nobody asked me. So, I told you I had seven. I like the number seven; it works well for presentations.
It could have been six, it could have been eight. But basically these are the ones where, in my experience over the years, I can decouple — or go through an inductive process — by navigating around these ideas: invisible work; multiple missions; system toil, like you might have five or six different systems to manage tickets; knowledge alignment — you know, the Brent syndrome; organizational design complexity; and security and compliance. Some of you have seen my presentation on the seven deadly sins of DevOps. Basically, I see it as a funnel that typically drives to the worst and deadliest sin of them all: security and compliance theater. Again, I have longer presentations on this.

So the approach I've been taking with an engagement is: we have some original conversations, I do the assessment, analysis, and report, and then really try to help figure out how to do the transformation. So, meetings — pre-COVID; since then I've done them all virtual. Pre-COVID, the largest engagement I did was a top-five bank. I spent a month on site, I interviewed like 300 people, probably close to 50 meetings. I calculated the minutes of transcription — probably well north of 50 documents, right? But now, virtually, I've been doing this with organizations where we'll do, like, 90-minute meetings with a team — so maybe ten meetings overall over a three-week period, whatever. It all depends. And again, I didn't think it was going to work virtually, but it has actually worked really well. The only caveat is that it just works better when I can be in a room with a team for the whole day. Typically, the way it works is there's some executive letter — it has to have executive buy-in. The executive usually has to tell people: I really want you to buy into this idea. I want you to go into the room with this guy.
And his gang — and, if at all possible, don't bring your laptop. Right? Unless the place is on fire or whatever, I really don't want you going in and out. For a couple of weeks, or a week, or a 90-minute virtual session, I want you to just be focused. There's an expression-of-interest response, which is brilliant — one of my clients came up with this; a lot of these ideas, every time I do an engagement, the client gives me better ones. The idea is that the CEO says, I think these fifteen people should definitely go, and then let's open it up in a letter: hey, we're going to do this — who'd like to go? And then people have to write a response expressing interest in the engagement, and I get to use that to identify people beforehand. It really works well. And then usually there are electronic notes and audio transcripts. It always works better when I can record and then throw the recording away — I don't really use the audio, I just need the transcripts. And then there are people I identify during the process where I think, I really want to talk to that person, Bob or Sue, and so I usually do these post one-on-ones.

The analysis — you saw the process. This is an example: one engagement was about 80 people and about 20 documents. So you can see I have all the interview notes. I usually have two note-takers: myself taking specific notes, and somebody else who's actually a scribe. So I've got the transcript, I've got the scribe's notes, and then I've got my additional notes in this tool. And I'm going to come back maybe another time and do a whole presentation on this thing I've been doing — a post-mortem on the Equifax breach using this tool. I'm going to write it up. It's really cool. I mean, the power of this tool.
It just has everything you need. I'm just going to give you a little sense of it. One of the things I'll do at some point, after I've loaded a bunch of documents, is some quick word mapping — it has a lot of really powerful features for word mapping. I can whitelist terms and so on. And then there's another screen where I can see it ordered — a tabular list of which words occur, and how many times — so I can actually start identifying words that really have meaning, and that helps me with the taxonomy. This all helps me build that taxonomy discussion. Right. But then, like I said earlier, there's this coding exercise where you go in and read through — you've already done the interview, so you've got the context fresh in your head — and you basically start identifying these codes: maybe risk, design, different areas. And the idea is, even though I start off with these seven patterns, the seven deadly sins, I really don't know what is going to emerge from the data. So I'm not stuck with those. I literally try to free my head — and it's hard — to say, I'm not going to assume any solutions, and I'm not going to assume they have consistency problems. And even in the first couple of rounds, I really try not to do a whole lot of categorization. And then the tool becomes incredibly powerful in terms of the coding. Here's an example with risk, design, and consistency — these are the ones that just showed up, and you can see that consistency and design came up most frequently on the right. And here's another example of a population of the codes. There are all sorts of tools here that can be very powerful — I'm still learning a lot about the tool.
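The word-mapping pass described above — counting terms across interview documents against a whitelist of domain words to seed the taxonomy — can be sketched in a few lines. The whitelist here is a hypothetical stand-in for the term lists you'd curate in a tool like MAXQDA.

```python
# Rough sketch of the word-frequency pass used to seed a taxonomy:
# count terms across interview documents, keeping only a whitelist
# of domain words. The whitelist is an illustrative assumption.
from collections import Counter
import re

WHITELIST = {"compliance", "risk", "audit", "governance", "toil"}

def term_frequencies(documents):
    """Return (term, count) pairs for whitelisted words, most common first."""
    counts = Counter()
    for doc in documents:
        for word in re.findall(r"[a-z]+", doc.lower()):
            if word in WHITELIST:
                counts[word] += 1
    return counts.most_common()
```

The ordered output is the "tabular list" mentioned above: the terms that surface most often across interviews are the candidates for the organization's shared vocabulary.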
But there are these ways to do case models — incredible graphics. So here's an example: once I've got all the codes, I can look structurally or graphically at what's going on. And all these things are linked to each other, so I can go back and forth. Here's a really good example: I can compare the notes to the transcript. Right? Even the best transcription software will mangle certain words, so I can get a gap analysis. The other thing I look for there is whether the scribe was distracted or something — maybe something was missed, and I might be able to dive back into it. And I've learned — like I said, the more I do this, the more I learn. I used to wait, collect all the data, and load it in at the end; that was kind of stupid. Now I do it after each interview, and I do a post-mortem with the scribe, so I can quickly say: hey, wait a minute, wasn't there a story around — just pick something — projects being really bad, and a project manager being bad? And then: oh yeah, I forgot, I didn't capture that. Right. This is just another way to look at the data. And then, you know, there's only a certain extent to which the graphics really help, but they do help in presentation mode — like here is drilling in on risk. And then here, again, reducing codes — there's a thing called retrieved segments. So now I can say: show me nothing other than risk. And then in that bottom section I can go through all the documents — everything, the 20 documents, 100,000 words, a novel, basically, if you will. I mean, I'm crazy — I'm doing a review for the five-year anniversary DevOps Handbook, and I'm going to actually load the DevOps Handbook in here and do my review in here. You can attach notes.
I mean, it's just really cool. Just quickly — I'm not going to go through the gory details of this — as part of that process you start applying thematic observations. Again, like I said, you start with your generalized industry doctrine. I know that visibility and consistency and capacity and toil are typically issues; I know there's a set of patterns that I think you should apply, and I'll show you that list. But what I need to do is ground that. Right. And along the way I get to look at the things that pop up. You know, we typically say, well, the problem in any DevOps pattern is trust. Well, yes, that's true — but let me find it in the data. Or your lead time — again, it's difficult for me, having been involved in the DevOps movement since its inception, to be working with a large bank today and talking about getting a VM in four weeks, or getting storage in three weeks. And recently I've heard this one: it takes two minutes to spin up an Amazon instance, and another two weeks to be allowed to use it. Right. Like, what are we doing, folks? Or how many active projects — okay, everybody's telling me there are too many active projects, and nobody knows where they all are or what state they're in. And another kind of quote I got recently: dependencies — what I call dark dependencies, where you're coupled with all these dependencies on other services, but you don't know anything about their status. Right. So there's this notion of a dark dependency or a dark workflow. Somebody said to me: there are certain dependencies that are on the critical path for me.
I don't know if it's going to take two days or two weeks. At scale, how do you even manage flow with that? Right. I mean, if I knew it was two weeks, great. Funding is always an issue, too. And then, through that categorization process — okay, now we're actually borderline quantitative — one way I figure out what to tell a client is: which things did I probably hear the most? So I'll literally look at the categories and the roll-ups, from the codes to the concepts to the categories, and say: you know what, out of all of these, consistency was the number one thing we heard. And I can drill down on all the evidence, if you will, on funding and toil, and then look at how we address these. So at the end of the day, these are all the concepts, based on the elements of the category. Again, we don't have time to go through all of these.

And then, what I have come up with — again, liking sevens — is the seven DevOps opportunities. The first is taxonomy and models. Most organizations are just terrible at a common taxonomy. Sometimes it's as simple as saying: why don't we take these ten words that everybody's using and make sure we all understand exactly what they mean? One of the most interesting things I've seen in our industry over the last 25 years — for those of you who have read Eric Ries's The Lean Startup, right, it's a good book; Jez Humble wrote Lean Enterprise, which is an excellent book — is this: years ago, I saw Beth Comstock, who was the CMO, the chief marketing officer, in a fireside chat with Eric Ries at one of his conferences, and she was talking about how they had gone head to toe on Lean Startup.
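The ranking step described earlier — rolling code counts up into their categories to see which category ("consistency," "funding," "toil," ...) was heard most across the interviews — can be sketched like this. The codes, the code-to-category mapping, and the counts are all hypothetical illustrations.

```python
# Sketch of the ranking step: roll code counts up to categories to
# see which category was heard most. All names/counts hypothetical.
from collections import Counter

code_to_category = {
    "duplicate tooling": "consistency",
    "snowflake pipelines": "consistency",
    "budget churn": "funding",
    "ticket shuffling": "toil",
}

def rank_categories(code_counts):
    """Sum per-code counts into category totals, most common first."""
    totals = Counter()
    for code, n in code_counts.items():
        totals[code_to_category[code]] += n
    return totals.most_common()
```

This is the "borderline quantitative" moment: the categories stay grounded in the interview data, but simple counting tells you which one to lead with when reporting back to the client.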
They thought in terms of pivots, build-measure-learn, all the vocabulary out of that book. About a month later I was doing a cloud implementation with, I'd say, the grunts, the people at the edge who had to implement an on-prem private cloud, at one of my startups. And they were fluently using the exact same terms that Beth Comstock, the CMO, was using. They were talking about pivots. And I realized that organization had successfully created a common taxonomy around lean enterprise. The butterfly effect of the toil of mismatched taxonomy, somebody should do a study on it. Also, common models. I don't have time to go too deep into this, but I think Team Topologies is an amazing book. I use it almost as a set of suggestions for models: cognitive load, the team API, which isn't a programmatic API but a way of describing your team, its roles and responsibilities, to other teams. And if you've been following what we're doing, there's Jabe's work on the three economies and how we think about platforms and platform interfaces, outcome-based metrics, automation, skills liquidity. Skills liquidity is a real thing: it's one thing to think about skills updates, but how do I build them? We talk about I-shaped, T-shaped, and E-shaped individuals. I-shaped is an Oracle DBA. T-shaped is: I'm a real expert on Oracle, but I also know Python and MySQL. E-shaped is: I can bounce just about anywhere, within reason. A measure of your skills liquidity is a measure of your performance, which feeds directly into this whole topic, digital transformation. All right. So, taxonomies and models. Then I go through and start grounding
the opportunities. So, Team Topologies; or maybe SRE is right for you. I could do a whole presentation on the toil of discussing and thinking about SRE in the enterprise. For some organizations the answer is: just stop talking about SRE in 2020 and you will save a ton of time. For Team Topologies there are some workbooks, and we have an Open Practice Library at Red Hat with some really good material; we'll make this available. I've been going through it looking at areas like roles and responsibilities. You've probably seen, and Diane could probably point you to, Andrew's five elements that we've been talking about in the GTO group. Next: understand the difference between value stream analysis and value chain mapping. Again, I don't start with these tools, but once we understand how they fit into a roadmap that actually makes sense, these tools become incredibly effective. Frankly, value stream mapping is typically your lean-style value stream mapping, and chain mapping is things like Wardley Mapping, if you're not familiar with that. And then again, skills liquidity and the Open Practice Library. There are some great papers from IT Revolution; I mentioned a couple I worked on over the years. I think Gene said they've produced fifty or so, I'd guess thirty, but who knows. These are all Creative Commons, so you can just go to the IT Revolution site for the papers. There's one on the value stream architect as a transformational leader; these are great guides. And again, the Open Practice Library for platform transformation. Here's an interesting thing too, something important when we talk about platforms. There's a lot of discussion about project to product; Mik Kersten has an excellent book, Project to Product, I'll show you the reference.
And yes, of course we need to move from project to product, but then I'm like, okay, that's great, but not good enough. What about product to service? What about service to platform? And where do the CAB and change management fit? So I think there's an evolution: project to product, product to service, service to platform. Then there's platform as interfaces, which I discussed earlier, infrastructure scale, and operations. So how do you look at these things? Normally when I'm reviewing this with a customer we're doing a lot more education; in a lot of cases it turns into a workshop: the three economies, platform by design. And in change management: how do I scale out from centralized to local authority, a unified backlog, CAB cooperation, centralized debt? And here's something I find in some of the large institutional banks: they spend up to 40% in waste around non-functional requirements related to service management, which, by the way, I call a negative risk ROI. In other words, you're spending a ridiculous percentage of your time. I've had examples where someone says: it takes me two weeks, John, to code this application, and another eight weeks to go through all the spreadsheets and forms related to serviceability, reliability, and service management. And by the way, none of that actually made the service more reliable. Even worse, the audit, even though you pass it, is completely disconnected from how the service actually works. There are a number of books here from IT Revolution Press. Dominica DeGrandis: if you haven't read Making Work Visible, it's an incredibly good book. Again, it's one of those books where I always want to be clear: it's an amazing book, but I think you need to do the qualitative data analysis first.
Her five thieves of time are brilliant. So: Project to Product again, and the Open Practice Library; we'll make this available. Metrics: we talked about the magic four. I'll say this again: I think the work done by DORA has been incredible for our industry. It's let us take the he-said-she-said out of DevOps and put some science behind it, which is brilliant. If you're not doing any outcome-based measurement, you should at least be doing these four common metrics. But they are lagging indicators. One of the things I really like is the concept of flow metrics, because they're more leading indicators. Anyway, I'm going to run out of time, but for example, take lead time. Maybe I measure commit to prod; however you measure it, as long as it's consistent, I don't care. If I only look at it in the aggregate, I lose some efficacy. Say one deployment took eight hours and I don't have the explanation: I had a bunch where lead time was relatively short, under an hour, and then all of a sudden I get one that's eight hours. Or on Tuesdays everything averages six hours, but every other day it's 48 minutes, and I don't know what's going on there. Flow metrics, look them up, let me look at the wait time in between the stages, and that's what I analyze. Outcomes: there are a bunch of great publications, again from IT Revolution and the Open Practice Library from Red Hat. Automation: of course, you'd imagine we're pretty high on Ansible, and we have the trusted software supply chain. DevOps Automated Governance is the paper I worked on; I've been very heavily involved in this.
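(As an aside, the lead-time-versus-flow-metrics point can be sketched in Python. The stage names and timings below are invented for illustration: an aggregate average hides the eight-hour outlier's story, while splitting each item into active time and wait time shows where the hours actually went.)

```python
from statistics import mean

# Invented work items with timestamps in hours after commit;
# stage names and numbers are illustrative only.
items = [
    {"commit": 0.0, "build_done": 0.2, "deploy_done": 0.8},
    {"commit": 0.0, "build_done": 0.3, "deploy_done": 0.9},
    {"commit": 0.0, "build_done": 0.2, "deploy_done": 8.0},  # the 8-hour outlier
]

# Aggregate lead time (commit to prod): one number, no explanation.
lead_times = [i["deploy_done"] - i["commit"] for i in items]
print(f"average lead time: {mean(lead_times):.1f}h")

# Flow-metrics view: active time vs. wait time per item shows the
# outlier spent almost all of its life waiting, not being worked on.
for i in items:
    active = i["build_done"] - i["commit"]
    waiting = i["deploy_done"] - i["build_done"]
    print(f"active {active:.1f}h, waiting {waiting:.1f}h")
```

That's the leading-indicator argument in miniature: the average tells you something was slow; the wait times tell you which queue to go look at.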
I've done a lot of presentations on it; if you're interested, look it up, it's pretty cool stuff. I'll just say quickly: it's a model for shortening audit time from maybe 30 days to half a day, maybe zero days. It increases efficacy from maybe 25 percent, in other words, security and compliance theater, to maybe 90 percent actual efficacy, and it creates a real roadmap for reducing centralized CAB authority. There were some great papers that were precursors to DevOps Automated Governance: Dear Auditor, and An Unlikely Union: DevOps and Audit. There's also a great paper about getting cloud providers to create attestations for some of their infrastructure, for automated cloud governance. Skills liquidity: some of the things there are really important, like the DevOps dojo, I'm a big fan of that, internal hackathons, internal DevOps Days. Again, these are good recommendations, except that if I learn certain things about your organization, I might say: you know what, you don't want to do internal DevOps Days right now; maybe there's something you need to fix first. That's the other thing: transformation isn't linear. There are multiple services and orgs, teams within teams, teams of teams, and some are going to be on one cadence and some on another. So again, I do think that's where a qualitative approach helps you a lot. For skills liquidity there are a bunch of really good tools out there. Lean coffees: I love lean coffees in the enterprise. They're such a cheap way to create collaboration and move tribal knowledge horizontally. Just set them up on Wednesday afternoons.
People just start communicating across different groups and learning about other projects. Lunch-and-learns, show-and-tells, demo days. I think demo days work really well; everybody wins. Most organizations just start with demo days, and when they're successful, they spread. I know one bank where, instead of rigid board meetings with the executives every week, half the time is now demo day, and the board looks forward to weekly demo days about how they've improved, how they've applied DevOps to payments. Guest lecturers: just keep your eye out, and vendors love to send their people to speak. If you're based in Chicago, look at the conference agendas and see, oh look, Adrian Cockcroft is speaking on Tuesday; I bet if I reached out early enough he might stick around for a day and come in. I love doing that stuff. InnerSource. And dojos, I could go on and on about dojos; I talked about the recommendations, and there's a great book from last year about getting started with dojos. Continuous learning. Safe to fail: these are things like incident analysis. If you haven't followed John Allspaw and the work he's doing at Adaptive Capacity Labs, it's brilliant stuff. Understanding psychological safety, resilience engineering. There are a couple of vendors now that are moving chaos engineering into continuous verification, particularly with Kafka, which is really interesting; if you want to know more about that, ping me. Flattening incident management, continuous verification: it's really taking chaos engineering to a higher level.
There's actually a new book out by Casey Rosenthal and Nora Jones; Casey was involved in the chaos engineering work at Netflix. And again, some of this is on the Open Practice Library. And yeah, that's it. Bob's your uncle; well, Bob's not my uncle, but there you go. That was great. We have a couple of folks on, Prasanna and Steven, and if you have questions or thoughts, jump in; this was actually a really good talk following on the heels of the anticipatory awareness talk that Jabe did last week. I really appreciate you teasing out the ideas around mental models, and combining that with what Jabe was saying about being aware of them and making sure you have a common taxonomy for your vocabulary when you're doing this work. I think that's really important; you lose if you don't, if you just simply try to sit down. I mean, the best experiment I ever saw was what GE did, where literally everybody, whether you like the Lean Startup model or not, was communicating with the same acronyms and terms, and everybody knew exactly what they meant. From the CMO down to the lowest-level engineer who was literally editing configuration files for one of the private clouds, they were speaking exactly the same language. There was another thing we went through, and Steven, please unmute yourself and join the conversation if you like. Back in the day when I was a baby product manager at startups, I made everybody go through Pragmatic Marketing, which is another way of getting everybody on the same page around personas and how you build that out. It's been around for 10 or 15 years now, so I'm pretty sure they're still in existence. But it's the same idea: you have to have a common vocabulary in order to do things like qualitative data analysis.
And I think that's the work that goes in up front, before you get there. Well, but to be clear, the reason I favor qualitative analysis is to unravel exactly that. If I talk to enough people, I realize that certain teams, say the database team, are using a term differently; this is common everywhere. I've seen it even with Red Hat products, where some people call it the self-service platform, some call it OpenShift, some call it Kubernetes. And depending on how far the drift is, you might not be having the same conversation. So just get everybody on the same page. Remember when we did that Commons in London? That was public. It was the Deutsche Bank presentation where they came up with their own name for our product, and I thought that was brilliant, because they put their own internal marketing spin on it. Now it's one word; you don't scrap over the five or six terms that even Red Hat uses for Kubernetes, and they take more ownership of it. Totally. And to be perfectly honest, although it's not in our best interest as a vendor, the point I made to them is that it gives them the ability to decouple themselves from the vendor. That's a positive for the consumer: if they do need to shift to another product, it's much easier if they just call it the orange banana or whatever. You have to be careful with that, though, because there is a company called Orange; I don't think they called it banana. No, I know. The folks at Orange are doing really amazing things with OpenShift and pieces of the cloud native ecosystem projects. They're pretty amazing.
Stephen, is there anything you wanted to ask or add? I'm just a fan of John's; we've met a couple of times at DevOpsDays in Austin. Yeah, I struggle with these conversations with my customers. I'm a solution architect in the energy pod down here in Houston, and several of my customers are in heavily regulated environments: they're utilities, they're ISOs, they're running power grids and things like that. And they have that highly regulated mentality about adopting change. A lot of the time it's very difficult. There's one customer we have where OpenShift is on premise, on site, and unfortunately they're bastardizing it to the point where it's almost unusable, because they're basically lifting and shifting their old, proven, auditable, documented processes onto this new platform. They're getting some benefits out of it, but it's very difficult to cautiously tell them, without creating an adversarial relationship, to basically shake them and say: you're doing this wrong. It's great that you got this new platform, but you're moving all of these antiquated old systems over because you don't know anything better, or you don't want to change, because you have this built-in organizational knowledge of the products, this organizational guarantee that if you use these approved products you won't get in trouble. It's a really low-trust environment. So I'm always looking for ways to, A, start that conversation so it doesn't turn into a you're-doing-it-wrong conversation,
and B, move up the chain. I can talk to mid-level managers, I can even talk to director level, but they don't have the organizational power to effect change in those ways. So I'm always trying to figure out how to get the conversation going, and then how to make it grow, go up rather than down, because when I explain the concepts to them, they shake their heads and say: oh yeah, that sounds great, but it won't work here. Yeah, well, there's no magic bullet. But out of all the time I've spent over the years, it seems I've always been on this transformation journey with clients, I think we all have, across almost five generations of technology, I've come to the conclusion that the thing we're always missing is understanding those things up front. There are these mental models, and we can't go in early with prescriptive solutions, because we miss a lot. So many times I'll interview an organization that just spent two years going through a Big Four recommended plan, and I'll ask: did anybody from that Big Four firm talk to you? No. And if they had, I would have told them this wasn't going to work. People are just so tired of it; they see it as the CEO's latest poster in the airport, there we go again. These things don't pass the smell test. But when you sit down with them and you just listen, I mean, they will not shut up.
Typically, I find all I have to do is start the meeting with one question. I used to think I had to do timelines and boxes and be very specific. Now I just ask: what is your company not doing that it should be doing? And it literally doesn't stop from there. Then I drill in when I hear a trigger word, something related to risk or consistency: well, give me an example of that. But my point, to your point, is that at first this has to be mandated at the C level, because what you're doing at that point is going back to them with the data. And quite honestly, I've had clients, back when I wasn't with Red Hat, who didn't like the data. They wanted to ignore it, and that's their prerogative. I'd actually tell them up front: you're not going to like the output of this. Oh no, John, we want to know the truth about everything. And then you tell them, and the CEO actually gets mad at you. I don't say I told you so, but I'm like: hey, it's one of the things I told you in the beginning. You don't like the data, but you're going to pay me anyway; just because you don't like the data doesn't mean you're not going to get invoiced. And at the end I'm like: this is it, and by the way, it's not me, it's your people. One other thing I always like to say to them is: okay, do me a favor. The thing is, we know this, but we don't remind ourselves of it: the people we work with at our clients, most of them wake up in the morning wanting to do a good job.
Sometimes we lose sight of that; we think everybody's a bad actor. There are bad actors, yes, but the majority of people who go to work want to do a good job, and we put barriers in their way. So now, all of a sudden, they're told: you've got to go listen to this guy. And immediately, if they don't know who I am, they think: oh, here we go, another Bob from Office Space. Then at some point I win their trust, because I give them the speech: I'm here just to collect data. But John, are you going to fix anything? I don't know. I'm going to do some analysis and tell the executives what you tell me. The fact that they invested in me doing this is a good sign, because the other eight people I talked to said they didn't want to do this. And near the end, they get excited. You see it: let me tell you about this, let me tell you about that. It's not like they're trying to bash their organization. And when they know the on-prem engagement is coming to an end, there's almost this sadness: oh, he's leaving. And then they're like: but John, is it going to fix anything? So normally I go back to the CEO, and if the CEO isn't receptive, I've even been fired in the middle of the process, and the CEO is like: I don't know, I'm going to do this, this, and this before I do that. I'm like: all right, I beg you, do one thing. Pick two things out of this list of fifteen and just do them. Pick the two most harmless things you see.
Because if you don't do anything from this report, and all these people spent a couple of hundred hours with me, they're just going to throw me into that same old category: nothing changes; every time I talk to one of these people, nothing changes. I'm like: do something. Even if you think it's a throwaway idea, at least give them a bone, show them you are listening. Because again, I think people just get to their wits' end: there are things that are wrong here, and they know they're wrong. And, I know we're running out of time, but sometimes when I get somebody on a long rant, I'll say: hey, why do you work here? It throws them completely off guard. There'll be a five-second delay where they have to think, and normally the answer is: I like it here, I like the people, I just wish we could fix these things. Yeah, we can fix the things; we have to listen to each other too. I think that's really it: hearing people. That's it. My last question: the Global Transformation Office, you and Andrew and Kevin and Jabe, is there a set of engagements? How do I get your capabilities in front of my customers? Is there a set of documentation that says: here are all of our capabilities, are you interested? I'll say this: there are sort of two answers. One is that any one of us, in a pinch, can be pulled in for just about any conversation. And this brings me back to what we said before we started today's recording: creating the landing page on how to contact us. That's something that you and I and the team need to get stood up, along with all the presentations you guys have been doing on these Transformation Fridays. So here's my commitment to reach out and ping you guys again to do that. And Steven, thank you for the instigation.