Hey, welcome back everybody. Jeff Frick here with theCUBE. We're at PagerDuty Summit in San Francisco at Pier 27. It's a new facility; we've never been here before, and it's pretty neat, right between the Bay Bridge and Pier 39. Beautiful day out on the water. And it's all about DevOps here with PagerDuty, and I'm going to tease Jen later about whether people even know what a pager is at this show. But we're excited to have Nicole Forsgren. She's the founder, CEO, and chief scientist of DevOps Research and Assessment. I had to read it, it's a big mouthful, but it goes by Dora for short. Nicole, great to see you. Thanks so much, good to be here. All right, so you are the DevOps expert. You've got a really interesting past; I did some research on your LinkedIn profile. Industry, academia, industry, academia. And now you're out helping people. Yes, I've bounced around a bit. It's all about the pivot, right? We're here doing DevOps. Absolutely. So you do an annual report on the state of DevOps. So where are we? DevOps has been talked about for a long, long time. How much is reality? How far are we on this journey? What are you seeing? Right, so it's really interesting you point that out, because for years, everyone's been like, DevOps, what is it? Does it matter? And so with Dora, and by the way, Dora is myself, Jez Humble, Gene Kim, and we just brought in Tsui Choi, but those are the core founders, we've partnered with the team at Puppet. And for the last several years, we've put out this State of DevOps report, to kind of help define, at least from a research standpoint and from our standpoint, what it is, what the key contributors are that really drive value, and whether it drives value. Because for years, and I'll talk about this later this afternoon in my closing keynote, for years, and when I say years, I mean decades of rigorous, peer-reviewed academic research, technology didn't matter. Like, it didn't matter at all. It just never delivered value to organizations.
But then we started seeing patterns, really interesting patterns, in companies saying, no, we're seeing results, we're delivering value, we're delivering outcomes, core essential outcomes for end users and customers and the business. And so we got together to say, okay, let's really take a look at this in a really important way. Right, but how far we've come, right? Because now most companies are technology companies. They just happen to wrap their technology around a particular product or a particular service. And now most people are leading with technology as a vehicle to drive value and to drive transformation. So DevOps is also very wrapped up in this whole concept of digital transformation. That's all anybody wants to talk about; it's in every earnings call. So how closely are the two related? And, because DevOps has a little bit more history in terms of the buzz of transformation, how are people applying DevOps concepts beyond strictly development and operations? So there's a lot to unpack there. Like you said, it's really, really involved, although it's also kind of a buzzword, right? Some people love it, some people embrace it, some people never want to hear it. So it's really all about what's important to the company in delivering value. But at its core, it's really about taking important methodologies and practices to deliver value, and it's about using technology and automation in conjunction with the core values, practices, and processes that we've adopted from the lean and agile movements. Right, right. And having a really good, healthy culture that's about more than just DevOps, right?
Like you said, DevOps, QA, InfoSec, the business, marrying all of that, pulling all of it together, working in conjunction in the right kinds of ways to deliver value, to deliver key outcomes, to help us pivot, move fast, learn, have fast feedback, so that we can do what we need to do for the company, for the business. Because like you said, so many companies right now really are technology organizations that happen to be wrapped around some particular industry. Right, right. Capital One is a financial institution, but really they're a technology organization that happens to do finance, and deliver finance really, really well for their customers. So many other companies are doing retail, but it's driven by technology, right? Or they do insurance and it's driven by technology, or they're healthcare organizations that really can't do what they do unless they have technology to drive it. Right, right. Well, the financial institutions are interesting, because if you talk to, like, my kids, they've never been inside an actual bank, and how often do they even go to the ATM? Not often, so not even the ATM. So the way that people more and more interact with a company is through digital means. But I'm curious to get your input on the big question that we always ask people: how do I get started, right? What is the easy path to success? How do I get some early success so I can build on that success? What's interesting is you have a very unique approach to that question, as opposed to, well, I think, you know, or, based on what I'm really good at, I think we should start here. Yes, we really do. You take a different tack. And this is really why Dora exists, and this is what we do. So myself, Jez Humble, Gene Kim, that kind of explains the genesis of Dora. So we have a couple of different things. The mission of Dora is to help companies get better through science and proven methods.
And so we have a couple of different things we do. The first is that State of DevOps report that we put together with Puppet, and those are all open sourced. So if you want some ideas of what really statistically drives improvement, go find those. They're open source, they're totally free, right? We provide so many resources because we don't want companies to fail. You know, we've all lived through that awful dot-com bust. We've seen companies fail. Go find those resources. Now, your question though: where should I start? If I'm a company, what should I do? We've all gone to conferences, myself, Gene, Jez, and we've had companies come up and say, well, where should I start? And the answer is always, it depends, right? The answer is always it depends, because I can't tell you absent context, absent data, absent information. If I don't know someone's detailed information, I can't tell them. And so what we also have is we offer an assessment where I can collect data from the doers, right? There's this fantastic report from Forrester called The Dangerous Disconnect, and that's such a great title, because if you ask executives, they drastically overestimate technology and DevOps maturity in their organizations. So you shouldn't be, I mean, I love... They overestimate. Of course they do. I mean, because we need to be really, really optimistic about where our organizations are going. That is our role as executives. And so that's kind of appropriate; in certain conditions that's appropriate. But where it's not appropriate is when you're setting detailed strategy for your organizations. And so what we do is we offer an assessment where, using the strong, scientifically based measures that we have prepared and refined over now four years of rigorous academic research, we can go in with a 15-minute survey and collect data from everyone in your organization that, like I said, are the doers.
Dev, test, ops, QA, InfoSec, including vendors, contractors, consultants, the people that are in the weeds every single day. I can measure you. I can benchmark you against the industry. I've got over 23,000 data points from around the world, all industries, all company sizes. And then, where should I start? I can algorithmically tell you what your bottleneck is, what your constraint is, where you should start to accelerate your performance. Based on my data. Based on your data, based on our algorithms, and based on our population data from this huge data set. And the companies that we're working with right now are seeing amazing results. They're calling them outsized results. So a really great example we had was with Capital One. They did the assessment across over a dozen lines of business. And by focusing on two core capabilities out of over 20, we focused them on the right two capabilities, they saw a 20X improvement in deploy frequency in only two months, with zero increase in incidents. A 20% improvement? No, 20X. 20X. In two months. 20 times. Wow. So it's that ability to measure consistently, to see visibility throughout that software engineering life cycle. We also had feedback from a customer like Verizon that that visibility, that consistency of measurement, was also a really huge value add. Measurement's hard. Well, it's interesting, I saw some of your videos and some of your prior keynotes, and you talk about how everyone likes to say data is the new oil. But data without context and data without the right algorithms, and you talk about a bunch of dirty-data things and data problems, you know, data by itself is not the new oil. And so I want to dig into your report, because that's kind of your benchmark, right? That's kind of your big stake in the ground. So how long have you been doing it?
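The benchmark-and-constraint idea described above can be sketched in a few lines. Everything here is hypothetical and illustrative: the capability names, scores, and population data are made up, and this is not Dora's actual instrument or algorithm. It only shows the general shape of the approach: percentile-rank each capability against a population, then treat the lowest-ranked capability as the constraint.

```python
def percentile_rank(population, value):
    """Percent of population scores at or below `value`."""
    at_or_below = sum(1 for score in population if score <= value)
    return 100.0 * at_or_below / len(population)

def find_constraint(team_scores, population_scores):
    """Benchmark each capability against the population, return the weakest one."""
    ranks = {
        capability: percentile_rank(population_scores[capability], score)
        for capability, score in team_scores.items()
    }
    constraint = min(ranks, key=ranks.get)
    return constraint, ranks

# Hypothetical survey data on a 1-10 scale; not real Dora measures.
population_scores = {
    "continuous_integration": [2, 3, 3, 4, 4, 5, 5, 6, 7, 8],
    "test_automation":        [1, 2, 4, 5, 5, 6, 7, 8, 8, 9],
    "loosely_coupled_arch":   [3, 4, 4, 5, 6, 6, 7, 7, 8, 9],
}
team_scores = {
    "continuous_integration": 6,
    "test_automation": 3,
    "loosely_coupled_arch": 7,
}

constraint, ranks = find_constraint(team_scores, population_scores)
# This toy team lags the population most on test automation, so start there.
```

Taking the minimum over percentile ranks is of course a drastic simplification of constraint identification, but it captures the core idea: letting population data, rather than intuition, pick the starting point.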
What do you do different than the other things that are out there, besides the fact that it's open source, which I'll ask you about as a follow-up? What makes your research special? So why is our report different from any other report out there? I think there's a couple of things. The piece that makes me the proudest is that the State of DevOps report is so different because it's academically rigorous. It's a true research report. And I love that the team has been so loving and so patient with me, because when I started working with the rest of the group four years ago, I sort of stepped in and I said, this is what I want to do, these are my ideas. I was still a professor at the time. So as you mentioned, I was kind of industry and then academia, and now I'm in industry again. But I stepped in and I said, I think there's this really, really fantastic opportunity to take a look at what's going on, but we have to measure it in really rigorous ways. And doing that allows us to look at predictive relationships, which is interesting because it lets us say: if we focus on core capabilities, they will predict organizations' ability to develop and deliver quality software with both speed and stability, which will in turn drive improvements in organizational performance: profitability, productivity, market share, effectiveness, efficiency, delivering mission and organizational goals. Notice I'm saying predict and drive. I'm not saying correlate, which is really interesting. And so in these years of research, we've been able to identify core capabilities that drive improvement. So it allows organizations to understand what's important to invest in. It's not just, this worked for my team, that worked for that team, hey, I think this is what I'm gonna try. Because as I'm fond of joking, anecdote is nice, but the plural of anecdote isn't data. Right?
And that was my frustration when I was in tech before, and when I was in consulting: we want to try a thing and we want to apply it, but it's really hard if I only have one or two or three or five, maybe even 10 stories. We need so much data to really understand what will likely work for teams and for industries as a whole. And like I said, God bless the team, because I came in and I was really rigorous and I would say, that doesn't work, we can't measure that, that doesn't work here. And sometimes I come back and I say, that doesn't hold, the stats don't hold. And they say, but it has to. I know it worked here and I know it worked here. And I'm like, but we have no evidence to support that. The stats don't hold. This doesn't work. We can't say that. And we're like, okay, we'll have to try it again next year, and not just try it again next year, but we have to find a different way to measure. We have to have a different hypothesis to test. But then we also find really amazing things. Like I've said a couple of times, it predicts a team's ability to develop and deliver code with speed and stability. Speed and stability. We found, four years in a row, that speed and stability go together. For years, we didn't know that was the case, or we thought that in order to get stability you had to slow down. That tradeoff doesn't show up anywhere in the data. Nowhere. High performers get both. So do the executives realize the lever that better internal software development gives them, as an impact on their business, relative to saving a few bucks on parts or spending a few more bucks on marketing? That it's a real driver of value, as opposed to these internal apps that we have to build for whatever reason? They're starting to get there. And so what we're starting to do is we're really focusing heavily on delivering code with speed and stability. And then we're saying, okay, imagine if you could deliver with speed and stability here. What could you do with delivering features?
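The speed-and-stability outcomes discussed above are commonly operationalized with throughput measures like deploy frequency and stability measures like change failure rate. A minimal sketch of computing both from a deployment log follows; the log format and field names are assumptions for illustration, not anything Dora prescribes.

```python
from datetime import date

# Hypothetical deployment log: (deploy date, whether it caused an incident).
deploys = [
    (date(2018, 9, 3), False),
    (date(2018, 9, 4), False),
    (date(2018, 9, 4), True),
    (date(2018, 9, 7), False),
    (date(2018, 9, 10), False),
]

def deploy_frequency(deploys, window_days):
    """Speed: average number of deploys per day over the observation window."""
    return len(deploys) / window_days

def change_failure_rate(deploys):
    """Stability: fraction of deploys that caused an incident."""
    failures = sum(1 for _, caused_incident in deploys if caused_incident)
    return failures / len(deploys)

freq = deploy_frequency(deploys, window_days=7)  # 5 deploys over 7 days
cfr = change_failure_rate(deploys)               # 1 failure out of 5 deploys
```

Tracked together over time, a team improving in the way the report describes should see `freq` rise without `cfr` rising, which is exactly the "speed and stability go together" finding.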
How does that help you get to market faster? How does that help you beat your competitors? How does it allow you to respond to compliance and regulatory changes, right? And so that's really what helps us drive. And then another way that we're a little different from other reports that are out there: other industry reports are also very helpful, but they're very different. So what we don't say is, I don't say things like, 27% of the industry is using configuration management. Other reports say that, and that is interesting, but I don't report on the percentage of the industry that's doing something. Right, right. But those other reports cannot say what is predictive of improvement. So we go to prediction. Occasionally I'll report correlations if I don't have the statistics to go as strong as prediction. And what moves it from correlation to prediction is the strength of the algorithms? Well, no, it's the strength of the research design. The strength of the research design up front, before you feed the data in. Up front. So it's really academic research rigor. It's the underpinning of the whole thing. And much of our data has been published in academic, peer-reviewed venues. So we are still actively doing research. And I would imagine that the annual report is really kind of an ongoing longitudinal study across a whole lot of the same companies, over and over, year in and year out. So you're getting kind of a longitudinal look as well. Awesome. All right, Nicole, well, that is fascinating. Everyone should go to Dora and get the free research, and then if they want to bring you in, you offer custom services to help a particular company execute and do better. Yes, absolutely. So you can go to devops-research.com to find all of our research and anything else you want to find out about engaging with us. Nicole Forsgren, she's Dora the Explorer. She'll help you out with your DevOps. I'm Jeff Frick. You're watching theCUBE from PagerDuty Summit.
Thanks for watching.