We're back at the Red Hat Summit at the Seaport in Boston with theCUBE's coverage. This is day two, Dave Vellante and Paul Gillin, and Chris Wright is here, the Chief Technology Officer at Red Hat. Chris, welcome back to theCUBE, good to see you. Yeah, likewise. Thanks for having me. You're very welcome. So you were saying today in your keynote, we've got a lot of ground to cover here, Chris. You were saying that, you know, Andreessen's software is eating the world, software ate the world is what you said, and now we have to think about AI; AI is eating the world. What does that mean? What's the implication for customers and developers? Well, a lot of implications. I mean, to start with just acknowledging that software isn't this future dream, it is the reality of how businesses run today. It's an important part of understanding what you need to invest in to make yourself successful, essentially, as a software company, where all companies are building technology to differentiate themselves. Take all that discipline, everything we've learned in that context, and bring in AI. So we have a whole new set of skills to learn, tools to create, and disciplined processes to build around delivering data-driven value into the company, just the way we've built software value into companies. I'm going to cut right to the chase, because I would say data is eating software. Data and AI, to me, are like kissing cousins. And so, here's what I want to ask you as a technologist. So we have the application development stack, if you will. And it's separate from the data and analytics stack. Well, all we talk about is injecting AI into applications, making them data-driven. You just used that term. But they're totally separate stacks, organizationally and technically. Are those worlds coming together? Do they have to come together in order for the AI vision to be real? Absolutely. So, totally agree with you on the data piece.
It's inextricably linked to AI and analytics and all of the kind of machine learning that goes on in creating intelligence for applications. The application connection to a machine learning model is fundamental. So you've got to think about not just the software developer or the data scientist; there's also a line of business in there that's saying, here are the business outcomes I'm looking for, and it's that trifecta that has to come together to make advancements and really make change in the business. And so, some of the folks we had on stage today were talking about exactly that, which is how do you bring together those three different roles? And there's technology that can help bridge gaps. So, we look at what we call intelligent applications, embedding intelligence into the application. That means you surface a machine learning model with APIs to make it accessible to applications so that developers can query a machine learning model. You need to do that with some discipline and rigor around what it means to develop this thing and lifecycle it and integrate it into this bigger picture. So, the technology is capable of coming together. You know, Amanda Purnell is coming on next. She was talking about getting insights into the hands of nurses and care coordinators, but they need data. But I feel like it's, well, I feel very strongly that it's an organizational challenge more so. I think you're confirming it's not really a technical challenge; I can insert a column into the application development stack and bring TensorFlow in or AI or data, whatever it is, right? It's not a technical issue, is that fair? Well, there are some technical challenges. So, for example, data science is kind of a scarce skill set within any business. So, how do you scale data science out into the developer population, which will be a large population within an organization? So, there are tools that we can use to bring those worlds together.
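To make the "surface a machine learning model with APIs" idea concrete, here is a minimal sketch, assuming a toy scoring function in place of a real trained model; the endpoint and payload shape are illustrative, not any particular Red Hat product API:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def predict(features):
    # Stand-in for a real trained model: returns the mean of the inputs
    # as a "score". A production system would load a serialized model here.
    return {"score": sum(features) / max(len(features), 1)}

class ModelHandler(BaseHTTPRequestHandler):
    # Surfaces the model as a JSON-over-HTTP prediction endpoint.
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        body = json.dumps(predict(payload["features"])).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To serve: HTTPServer(("localhost", 8080), ModelHandler).serve_forever()
```

An application developer can then POST `{"features": [...]}` to the endpoint and consume predictions without knowing how the model was trained or what its internals are.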
So, it's not just TensorFlow, it's an entire workflow and platform: how you share the data, how you train models, and then how you deploy models into a runtime production environment. That looks similar to software development processes, but it's slightly different. So, that's where a common platform can help bridge the gaps between that developer world and the data science world. Where is Red Hat's position in this evolving AI stack? I mean, you're not into developing tool sets like TensorFlow, right? Yeah, that's right. I mean, if you think about a lot of what we do, it's aggregating content together, bringing a distribution of tools, giving flexibility to the user, whether that's a developer, a system administrator, or a data scientist. So, our role here is, one, make sure we work with our hardware partners to create accelerated environments for AI. So, that's sort of an enablement thing. The other is to bring together those disparate tools into a workflow and give a platform that enables data scientists to choose: is it PyTorch, is it TensorFlow, what's the best tool for you? Assemble that tool into your workflow, and then proceed with training, doing inference, and tuning, and lather, rinse, repeat. So, you want to make your platform as receptive as possible, right? You're not trying to pick winners in what languages to work with or what frameworks. Yeah, that's right. I mean, picking winners is difficult. The world changes so rapidly. So, we make big bets on key areas. And certainly, TensorFlow would be a great example. There's a lot of community traction there. But our goal isn't to say that's the one tool that everybody should use. It's just one of the many tools in your toolbox. There are risks of not pursuing this from an organization's perspective, a customer's. If they get complacent, they could get disrupted. But there's also industry risk.
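The training, inference, tuning, lather-rinse-repeat loop just described can be sketched with a toy one-parameter model standing in for a real framework like TensorFlow or PyTorch; everything here is illustrative:

```python
def train(data, w=0.0, lr=0.1, epochs=100):
    # Fit y ~ w * x by stochastic gradient descent on squared error.
    for _ in range(epochs):
        for x, y in data:
            w += lr * (y - w * x) * x
    return w

def infer(w, x):
    # Query the "deployed" model in a runtime production environment.
    return w * x

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # toy training set, y = 2x
w = train(data)        # training step
pred = infer(w, 1.5)   # inference step
# Tuning: as new data arrives, retrain and redeploy, then repeat the loop.
```

The point of a common platform is that this loop, unlike a one-off software build, runs continuously: models drift as data changes, so the retrain-and-redeploy cycle has to be as disciplined as a software release process.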
If the industry can't deliver this capability, what are the implications if the industry doesn't step up? I believe the industry will, as it always does. But what about customer complacency? We certainly saw that a lot with digital transformation; COVID sort of forced us to march to digital. What should we be thinking about in terms of the implications of not leaning in? Well, I think the disruption piece is key, because there's always that spectrum of businesses. Some are more leaning in, invested in the future. Some are more laggards taking a wait-and-see approach. And those leaning in tend to be separating the wheat from the chaff. So that's an important way to look at it. Also, many data science experiments fail within businesses, and I think part of that is not having the rigor and discipline around connecting not just the tools and data scientists together, but also looking at what business outcomes you're trying to drive. If you don't bring those things together, then it can be too academic and the business doesn't see the value. And then there's also the question of transparency. How do you understand why a model is predicting you should take a certain action or do a certain thing? So as an industry, I think we need to focus on bringing tools together, bringing data together, and building better transparency into how models work. There's also a lot of activity around governance right now. AI governance, particularly removing bias from ML models. Is that something that you are guiding your customers on? How important do you feel this is at this point of AI's development? I mean, it's really important. The challenge is finding it and understanding it. We bring data that may already carry a bias into a training process and build a model around that. How do you understand what the bias is in that model?
There are a lot of open questions there, and academic research trying to understand how you can ferret out bias, essentially taking biased data and making it less biased or unbiased. Our role is really just bringing the tool set together so that you have the ability to do that as a business. So we're not necessarily building the next machine learning algorithm, or models, or ways of building transparency into models, as much as building the platform and bringing together the tools that can give you that for your own organization. So that brings up the question of architectures, and I've been sort of a casual or even active observer of data architectures over the last, whatever, 15 years. And they've been really centralized. Our data teams are highly specialized. You mentioned data scientists, but there are data engineers and data analysts, very hyper-specialized roles that don't really scale that well. And so there seems to be a move; we're going to talk about edge, and the ultimate edge, which is space, very cool. But data is distributed by its very nature. We have this tendency to try to force it into this monolithic system, and I know that's a pejorative, but for good reason. And so I feel like there's this push in organizations to enable scale, to decentralize data architectures and put data in the hands of those business owners that you talked about earlier, the domain experts that have data context and business context. Yep. That brings up two problems: you need infrastructure that's self-service in that instance, and you need, to your point, automated and computational governance. Those are real challenges. What do you see in terms of the trends to decentralize data architectures? Is it even feasible? Everybody wants a single version of the truth, a centralized data team, right? Yeah. And they seem to be at odds. Yeah, well, I think we're coming from a history informed by centralization.
So that's what we understand; that's what we gravitate towards. But the reality, as you put it, is the world's just distributed. So what we can do is look at federation. It's not necessarily centralization, but creating connections between data sources, which requires some policy and governance, like who gets access to what. And also think about those domain experts maybe being the primary source of surfacing a model, where you don't necessarily have to know how it was trained or what the internals are. You're using it more to query it as the domain expert produces this model. You're in a different part of the organization, just leveraging some work that somebody else has done. Which is how we build software: reusable components. So I think building that mindset into data and the whole process of creating value from data is going to be a really critical part of how we roll forward. So there were two things in your keynote. One that I was kind of in awe of: you wanted to be an astronaut when you were a kid. I mean, I watched those moon landings and I was like, I'm never going up to space. So I'm in awe of that. I got the space helmet picture and all that. That's awesome. Really, hats off to you. The other one really pissed me off, which was that you're a better skier because you've got some device in your boot. And the reason it angered me is because I feel like it's the mathematicians taking over baseball, and now you're saying you're a better skier because of that. But those are two great edge examples, and there are a billion of them. So talk about your edge strategy, kind of your passion there, how you see that all evolving. Well, first of all, we see the edge as a fundamental part of the future of computing. So in that centralization, decentralization pendulum swing, we're definitely on the path towards distributed computing, and that is edge, and that's because of data and also because of the compute capabilities that we have in hardware.
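The federation idea described above, connections between independently owned data sources gated by policy and governance, might look something like this minimal sketch; the sources, roles, and policy table are all hypothetical:

```python
# Each data source stays with its owner; a policy layer decides who may see what.
SOURCES = {
    "sales": [{"region": "east", "revenue": 100}, {"region": "west", "revenue": 80}],
    "clinical": [{"patient": "a1", "score": 0.7}],
}

POLICY = {  # role -> the set of sources that role may query
    "analyst": {"sales"},
    "clinician": {"sales", "clinical"},
}

def federated_query(role, source, predicate):
    # Governance check: who gets access to what.
    if source not in POLICY.get(role, set()):
        raise PermissionError(f"role {role!r} may not query {source!r}")
    # Data stays where it lives; only matching rows cross the boundary.
    return [row for row in SOURCES[source] if predicate(row)]
```

The design choice mirrors the reusable-components point: a consumer in another part of the organization queries the source (or a model surfaced from it) through a governed interface rather than copying the data into a central store.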
So hardware gets more capable, lower power, and can bring certain types of accelerators into the mix, and you really create this world where what's happening in a virtual context and what's happening in a physical context can come together through this distributed computing system. Our view is that's hybrid. That's what we've been working on for years. The difference is maybe originally it was focused on data center, cloud, multi-cloud, and now we're just extending that view out to the edge, and you need the same kind of consistency for development and for operations at the edge that you do in that hybrid world. So that's really where we're placing our focus, and then it gets into all the different use cases, and that's the fun part. Let's shift gears a little bit, because another remarkable statistic you cited during your keynote was a Forrester study that said 99% of all applications now have open source in them. What are the implications of that for those who are building applications, in terms of license compliance and, more importantly I think, confidence in the code that they're borrowing from open source projects? Yeah, well, I think first and foremost it says open source has won. That was audited code bases, which means mission-critical code bases. We see that it's pervasive, it's absolutely everywhere, and that means developers are pulling dependencies into their applications based on all of the genius that's happening in open source communities, which I think we should celebrate. Right after we're finished celebrating, we've got to look at what the implications are, right? And that shows up as: are there security vulnerabilities that become ubiquitous because we're all using similar dependencies? What is your process for vetting code that you bring into your organization and push into production? You know that process for the code you author; what about your dependencies?
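As a sketch of what that dependency-vetting step might look like, here is a toy check of an application's declared dependencies against a list of known-vulnerable versions; the package names and advisory IDs are made up placeholders, not real advisories:

```python
# Toy advisory database: (package, version) -> advisory ID.
# Real tooling would pull this from a vulnerability feed, not hard-code it.
KNOWN_VULNERABLE = {
    ("libfoo", "1.2.0"): "CVE-XXXX-0001",
    ("libbar", "0.9.1"): "CVE-XXXX-0002",
}

def vet_dependencies(deps):
    """Return (name, version, advisory) for every dependency with a known advisory."""
    findings = []
    for name, version in deps:
        advisory = KNOWN_VULNERABLE.get((name, version))
        if advisory is not None:
            findings.append((name, version, advisory))
    return findings
```

Running a check like this in the build pipeline is one way to apply the same rigor to pulled-in dependencies that teams already apply to the code they author.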
And I think that's an important part of understanding, and certainly there are some license implications. What are you required to do when you use that code? You've been given that code under a license from the open source community; are you compliant with that license? Some of those questions are reasonably well understood; some are newer to the enterprise. So I think we have to look at this holistically and really help enterprises build safe application code that goes into production and runs their business. We saw Intel up in the keynotes today. We heard from NVIDIA; both companies are coming on. We know you've done a lot of work with Arm over the years. I think Graviton was one of the announcements this week, so I'd love to see that. I want to run something by you as a technologist. The premise is, you know, we used to live in this CPU-centric world. We marched to the cadence of Moore's Law, and now we're seeing the combinatorial factors of CPUs, GPUs, NPUs, accelerators, and other supporting components, IO and controllers and NICs, all adding up. It seems like we're shifting from a processor-centric world to a connectivity-centric world on the hardware side. First of all, do you buy that premise, and does hardware matter anymore with all the cloud? Hardware totally matters. I mean, the cloud tried to convince us that hardware doesn't matter, and it actually failed. The reason I say that is because if you go to a cloud, you'll find hundreds of different instance types that are all reflections of different assemblies of hardware. Faster IO, better storage, certain sizes of memory; all of that is a reflection of the fact that applications need certain types of environments for acceleration, for performance, to do their job. Now, I do think there's an element of decomposing compute into all of these different accelerators, and the only way to bring that back together is connectivity through the network.
But there are also SoCs when you get to the edge, where you can integrate the entire system onto a pretty small device. I think the important part here is we're leveraging hardware to do interesting work on behalf of applications. That makes hardware exciting, and as an operating system geek, I couldn't be more thrilled, because that's what we do. We enable hardware; we get down into the bits and bytes, poke registers, and bring things to life. There's a lot happening in the hardware world, and applications can't always follow it directly. They need that level of indirection through a software abstraction, and that's really what we're bringing to life here. We've now seen hardware-specific AI emerge, AI chips and AI SoCs. How do you make decisions about what you're going to support, or do you try to support all of them? Well, we definitely have a breadth view of support, and we're also just driven by customer demand, where customers are interested. We work closely with our partners. We understand what their roadmaps are. We plan together ahead of time, and we know where they're making investments. And we work with our customers: what are the best chips that support their business needs? We focus there first, but it ends up being a pretty broad list of hardware that we support. I could pick your brain for an hour. We didn't even get into supercloud. Chris, thanks so much for coming on theCUBE. It's great to have you. Absolutely, thanks for having me. All right, thank you for watching. Keep it right there. Paul Gillin and Dave Vellante, theCUBE's live coverage of Red Hat Summit 2022 from Boston. We'll be right back.