Welcome back to theCUBE's coverage of Red Hat Summit 2021. I'm John Furrier, host of theCUBE. It's virtual this year as we start preparing to come out of COVID, and a lot of great conversations are happening here around technology. This is the emerging technology with Red Hat segment. We've got three great guests: Steve Watt, manager, distinguished engineer at Red Hat; Parul Singh, senior software engineer at Red Hat; and Luke Hinds, who's a senior software engineer as well. We've got the engineering team. Steve, you're the team leader for emerging tech within Red Hat. Always something to talk about. You guys have great tech chops, it's well known in the industry, and obviously now part of IBM, you've got a deep bench. How do you view emerging tech? How do you apply it? How do you prioritize? Give us a quick overview of the emerging tech scene at Red Hat.

Yeah, sure. It's quite a conflated term. The way we define emerging technologies is that it's a technology that's typically 18 months plus out from commercialization, and this can sometimes go six months either way. Another thing about it is that it's typically not something on any of our product roadmaps within the portfolio. So in some sense, it's often a bit of a surprise that we have to react to.

So no real agenda. I mean, you have some business units, probably, and you have first principles within Red Hat, but for this you're looking at the moonshots, so to speak, the big game-changing shifts: quantum, and, you know, supply chain, everything from new economics to new technology. Am I getting that right?

Yeah, I think we definitely use a couple of different techniques to prioritize and filter what we're doing. The first is, something will pop up and it'll be like, is it in our addressable market? Our addressable market is that we're a platform software company that builds enterprise software. And so, you know, it's got to sort of fit into that.
As a great example, if somebody came to us with an idea for, like, a drone command center, which is a military application, it is an emerging technology, but it's something that we would pass on.

Yeah, I mean, obviously it makes sense, but what's also interesting is that you guys have an open source DNA, so you also have a huge commercial impact. And again, open source is in its fourth, fifth generation of awesomeness. So, you know, the goodness of open source is well proven, but as you start getting into this more disruptive territory, you've got the confluence of, you know, core cloud, cloud native, industrial and IoT edge, and data. All of this is interesting, right? This is where the action is. How do you guys bring in that open source community participation? You've got more stakeholders emerging than before. Steve, break down how you guys manage all that complexity.

Yeah, sure. So I think the way I would start is that, you know, we like to act on good ideas, but I don't think good ideas come from any one place. And so we typically organize our teams around horizontal technology sectors. So you've got, you know, Luke, who's heading up security, but I have an edge team, a cloud networking team, a cloud storage team, a cloud application platforms team. So we've got these different areas where we attack work and opportunities. But, you know, the good ideas can come from a variety of different places, so we try and leverage co-creation with our customers and our partners. A good example of something we had to react to a few years ago was Knative, right? That's a new way of doing serverless, and inventing on top of Kubernetes, and that originated from Google. Whereas if you look at quantum, right, IBM is the actual driver of quantum science, and that originated from IBM. Parul will talk about exactly how we chose to respond to that. Some things originate organically within the team.
Luke talking about Sigstore is a great example of that. But we do use the addressable market as a way to focus what we're doing, and then we try and land it within our different emerging technologies teams to go tackle it.

Now, you asked about open source communities, which are quite interesting. Typically, when you look at an open source project, it's there to tackle a particular problem or opportunity. Sometimes what you actually need commercial vendors to do, when there's a problem or opportunity that's not tackled by any one open source project, is put them together to create a solution to go tackle that thing. That's also what we do. And so we create this bridge between Red Hat, our customers, and multiple different open source projects. This is something we have to do because sometimes that one open source project just doesn't care that much about that particular problem; they're motivated elsewhere. And so we create that bridge.

We've got two great colleagues here: Parul on the quantum side and Luke on the security side. I'll start with you, Parul. Quantum is obviously huge; you mentioned IBM, great leadership there. Quantum on OpenShift. I mean, come on, that's not the first combination I think of, but it really sounds compelling. Take us through how this changes the computing landscape, because heterogeneous systems is what we want, and that's the world we live in, but now with distributed systems and all kinds of new computing models out there. How does this make sense? Take us through this.

Yeah, John. Before that, I want to explain something called quantum supremacy, because it plays a very important role in the roadmap we've been working on. Quantum computers are evolving, and they have been around, but right now you see that they are going to be the next thing.
We define quantum supremacy like this: say you have a program that you run, or a problem that you solve, on a classical computer, and a quantum computer would give you the results faster. That is quantum supremacy: when the same workloads do better on a quantum computer than they do on a classical computer. So the whole drive is that all the applications, all the companies, are trying to find avenues where quantum supremacy is going to change how they solve problems or how they run their applications. And even though quantum computers are out there, they are not easily accessible for everyone to consume, because it's a very new area that's still being formed.

So we were thinking about how we can provide a mechanism to connect these two worlds. You have a classical world and you have a quantum world, and that's where a lot of the thought process went. We thought, okay, with OpenShift we have the best of the classical components: you can take OpenShift and develop, deploy, and run your application on a containerized platform. What if you provide a mechanism so that the workloads running on OpenShift also consume quantum resources, or are able to run computations on quantum computers, take the results, and integrate them into their normal classical workloads? That was the whole inception, and that's what brought us here. So we took an operator-based approach, and what we are trying to do is establish the best practices so you can have these heterogeneous applications, with classical components talking to, interacting with, or exchanging data with the quantum components.

So I've got to ask, with the rise of containers, and now Kubernetes at the center of the cloud native value proposition, what workloads do you see benefiting from quantum systems the most? Do you guys have any visibility into some of those workloads?
So again, it's very early days, and when we talk with our customers, every customer is first trying to identify where quantum supremacy will play a role for them. What we are trying to do is make sure that when they get there, we have a solution so they can use the existing infrastructure they have on OpenShift to consume quantum computers that may or may not be inside their own cloud.

Well, I might want to come back and ask you about some of the impact on the landscape, but I want to get to Luke real quick, because I think security... quantum potentially breaks security, some people have been saying, but you guys are also looking at a bunch of projects around supply chain, which is a huge issue when it comes to the landscape, whether it's components on a machine in space or handling data in a corporate database. You guys have Sigstore. What's this about?

Sure, yeah. A good way to frame Sigstore is to think of Let's Encrypt: what Let's Encrypt did for website encryption is what we plan to do for software signing and transparency. Sigstore itself is an umbrella organization that contains various different open source projects developed by the Sigstore community. Now, Sigstore will be brought forth as a public-good, nonprofit service. So again, we're very much basing this on the successful model of Let's Encrypt. Sigstore will enable developers to sign software artifacts: bills of materials, containers, binaries, all of these different artifacts that are part of a software supply chain. These can be signed with Sigstore, and then these signing events are recorded into a technology that we call a transparency log, which means that anybody can monitor signing events. A transparency log has this nature of being read-only and immutable; it's very similar to a blockchain. It allows you to have cryptographic proof and auditing of a software supply chain.
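To make the signing-event idea concrete, here is a minimal sketch in Python of what signing an artifact and verifying the signature involves, using the widely available `pyca/cryptography` library with an Ed25519 key held in memory. This is only an illustration of the general mechanism, not Sigstore's actual implementation; managing (or eliminating) the private key is precisely the burden Sigstore aims to remove.

```python
# Hedged sketch: hash an artifact, sign the digest, verify it later.
# Not Sigstore's real flow; just the generic signing mechanism it builds on.
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

artifact = b"example release tarball bytes"          # stand-in artifact
digest = hashlib.sha256(artifact).digest()           # fingerprint the artifact

private_key = Ed25519PrivateKey.generate()           # illustration only
signature = private_key.sign(digest)                 # the "signing event"

# Anyone holding the public key can check the artifact was not tampered with.
public_key = private_key.public_key()
public_key.verify(signature, digest)                 # raises InvalidSignature on tamper
print("signature verified")
```

In the Sigstore model, the signature and certificate for an event like this would additionally be recorded in the public transparency log so third parties can monitor it.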
And we've made Sigstore easy to adopt, because traditional cryptographic signing tools are a challenge for a lot of developers to implement in their open source projects. They have to think about how to store the private keys, and whether they need specialist hardware. If they were to lose a key, then cleaning up afterwards, the blast radius of the key compromise, can be incredibly difficult. So Sigstore's role and purpose, essentially, is to make signing easy, easy for projects to adopt, and then they have the protection of there being a public transparency log that can be monitored.

So this is all about open; being more open makes it more secure, is the thing.

Very much, yes. It's that security principle of the more eyes on the code, the better.

So let me just back up. You said it's going to be a nonprofit organization?

That's correct, yes. All of the code is developed by the community. It's all open source; anybody can look at this code. And then we plan, alongside the Linux Foundation, to launch a public-good service. So this will be available for anybody to use: a nonprofit, free-to-use service.

So Luke, and maybe Steve, if you can weigh in on this too. This goes back; if you look at some of the early cloud days, people were really trashing cloud as having no security, and it turns out there's more security now with cloud, given the complexity and scale of it. Does the same apply here? Because I feel this is a similar kind of concept, where it's open, but yet the more open it is, the more secure it is. And then it might be a better fit for, say, an IT security solution, because right now everyone's scrambling on the IT side, whether it's zero trust or endpoint protection; everyone's trying everything in sight. And this is changing the paradigm a little bit on software security. Could you comment on how you see this playing out in traditional enterprises?
Because if this plays out like the cloud, open wins.

So Luke, why don't you take that, and then I'll follow up with another lens on it, which is the Operate First piece.

Sure, yes. So I think in a lot of ways this technology has to be open, okay? Because this way we have transparency: the code can be audited openly, our operational procedures can be audited openly, and the community can help develop not only our code but our operational mechanisms. So we look to use technologies such as Kubernetes, OpenShift, operators, and so forth. Sigstore itself runs completely in a cloud; it is cloud native. So it's very much in the paradigm of cloud. And yeah, essentially, security always operates better when it's open. I've found that from looking at all aspects of security over the years I've worked in this realm.

Okay, so just to add some other interesting context around Sigstore, which is the secure software supply chain. Sigstore is a solution to help build more secure software supply chains, and there's a growing community around that, an ecosystem of cloud native, Kubernetes-centric approaches for building more secure software. I think we all caught the SolarWinds attack; the enterprise software industry is responding as a whole to close out as many of those gaps as possible and reduce the attack surface. So that's one aspect of why Sigstore is so interesting.

Another thing is how we're going about it. So you mentioned some of the things that people like about open source. One is transparency; sunlight is the best disinfectant, right? Everybody can see the code, so we can make it more secure. And the other is agency, where basically, if you're waiting on a vendor to go do something, and it's proprietary software, you really don't have much agency to get that vendor to go do that thing.
Whereas with open source, if you're tired of waiting around, you can just submit the patch. Now, what we've seen with packaged software is that with open source we've had all this transparency and agency, but we've lost it with software as a service, right? Where vendors or cloud service providers take packaged software and make it available as a service, the operationalizing of that software is proprietary and doesn't get contributed back. And so what Luke is building here, along with our partners, Dan Lorenc from Google, a very active contributor, is the operational piece to actually run Sigstore as a public service as part of the open source project. So people can then go and take Sigstore, maybe run it as a smaller internal service, and maybe they discover a bug. They can fix that bug and contribute it back to the operationalizing piece, as well as to the traditional packaged software, to basically make it a much more robust and open service. So you bring that transparency and agency back to the SaaS model as well.

Luke, if you don't mind, before I end this portion of the segment: the importance of immutability is huge in the world of data. Can you share more on that? Because you're seeing that as a key part of the blockchain, for instance, having this ability to have immutability. People worry about how things progress in this distributed world, whether from a hacking standpoint or from tracking changes; immutability becomes super important, and how it's going to be preserved in this new Sigstore way.

Oh yeah. So immutability essentially means it cannot be changed: the structure of something is set, and if it is in any way tampered with or changed, then it breaks the cryptographic structure that we have for our public transparency service. This way, anybody can effectively recreate the cryptographic structure of this public transparency service. So this immutability provides trust that there is non-repudiation of the data that you're getting.
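The tamper-evidence Luke describes can be sketched with a toy append-only log in which every entry commits to the hash of the previous entry, so changing any record breaks everything after it. This is only an illustration of the principle; a production transparency log such as Sigstore's uses a Merkle tree rather than a simple chain, and the names here (`entry_hash`, `verify`) are invented for the example.

```python
# Toy hash-chained log: each entry's hash covers the previous hash plus its
# payload, so any mutation of an earlier entry is detectable on re-verification.
import hashlib

GENESIS = "0" * 64  # placeholder hash for the first entry

def entry_hash(prev_hash: str, payload: str) -> str:
    return hashlib.sha256((prev_hash + payload).encode()).hexdigest()

# Append three signing events to the log.
log = []
prev = GENESIS
for payload in ["sign event A", "sign event B", "sign event C"]:
    prev = entry_hash(prev, payload)
    log.append((payload, prev))

def verify(log) -> bool:
    """Recompute the whole chain; any tampered entry breaks it."""
    prev = GENESIS
    for payload, h in log:
        if entry_hash(prev, payload) != h:
            return False
        prev = h
    return True

assert verify(log)                                   # untouched log verifies
log[1] = ("sign event B (tampered)", log[1][1])      # mutate one entry
assert not verify(log)                               # chain no longer verifies
print("tampering detected")
```

This is the same property that makes a blockchain auditable: anyone can recompute the structure from the data and check that nothing was rewritten.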
This is data that you can trust, because it's built upon a cryptographic foundation. It has very similar parallels to blockchain: you can trust a blockchain because of its immutable nature, and there is some consensus as well. Anybody can effectively download the blockchain, run it themselves, and compute that the integrity of the system can be trusted because of this immutable nature. So that's why we made this an inherent part of Sigstore, so that anybody can publicly audit these events and data sets to establish that they're tamper-free.

That is a huge point. I think, beyond just the security aspect of being hacked and protecting assets, trust is a huge part of our society now, not just in data but in anything that's reputable, whether it's videos like this being deep-faked, or news, or any information. All of this ties back to security, fundamentally. Amazing concepts; we really want to keep an eye on this great work.

Parul, I've got to get back to you on quantum, because again, people love quantum. It feels so sci-fi, and it's almost right here, right? So close, and it's happening. And then people go, oh, what does that mean for security? We'll go back to Luke and ask him, will quantum crack crypto? But before we get there, I'm curious how this is going to play out as a project, because is it going to be more part of, like, the CNCF? How do you bring the open source vibe to quantum?

So that's a very good question, because that was the plan: the whole work we are doing related to operators to enable quantum is managed by the open source community, and that project lives in Qiskit. Qiskit has its own open source community, and all the modifications... oh, by the way, I should first tell you what Qiskit is. Qiskit is the SDK that you use to develop circuits that run on IBM or Honeywell backends.
So there are certain quantum computer backends that support circuits created using the Qiskit SDK, which is open source as well. So there is already a community around this, which is the Qiskit open source community, and we have pushed the code there, and all the maintenance is taken care of by that community. To answer your question about integrating with the CNCF: that is not in the picture right now. It has a place in its own community, and it is also very niche, for people who are working on quantum. So right now you have contributors from IBM, as well as other communities that are specifically working on quantum. So right now I don't think we have a roadmap to integrate it with the CNCF, but open source is the way to go, and we are on that trajectory.

You know, we joke here at theCUBE that a "cube-bit" is coming around the corner; we can't help but weave that in, with a C. But Luke, while you're here, you're a security guru, and I want to ask you about quantum, because a lot of people are scared that quantum is going to crack all the keys on encryption, with all this power, and more hacking. Can you comment on that? What's your reaction?

Sure, yes, that's an incredibly good question. This will occur, okay? And I think it's really about preparation more than anything. Now, there's a principle we have within the security world when it comes to the coding and design of software and this prospect of cryptography being broken in the future, as we've seen with the likes of MD5 and SHA-1 and so forth. We call it algorithm agility. This means that when you write your code and design your systems, you make them conducive to easily swapping and pivoting the algorithms that you use, so that you do not become too fixed to the encryption algorithms within your code.
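Luke's algorithm-agility principle can be sketched as code that treats the hash algorithm as configuration rather than hard-coding it. This is a hedged illustration using Python's standard `hashlib`; the `fingerprint` helper and the idea of storing the algorithm name alongside the digest are conventions invented for the example, not a prescription from the interview.

```python
# Algorithm agility sketch: the algorithm is a named configuration value,
# so migrating off a weakened algorithm is a config change, not a rewrite.
import hashlib

HASH_ALGORITHM = "sha256"   # later: "sha512" or "sha3_256", no code changes

def fingerprint(data: bytes, algorithm: str = HASH_ALGORITHM) -> str:
    h = hashlib.new(algorithm)          # resolve the algorithm by name
    h.update(data)
    # Record which algorithm produced the digest so old records remain
    # verifiable after a migration to a stronger algorithm.
    return f"{algorithm}:{h.hexdigest()}"

print(fingerprint(b"artifact bytes"))
print(fingerprint(b"artifact bytes", "sha512"))
```

Because each stored fingerprint names its algorithm, existing records can still be verified with the old algorithm while new records use the stronger one, which is exactly the low-disruption pivot Luke describes.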
So that if, as computing gets more powerful, the current set of algorithms is shown to have inherent security weaknesses, you can easily migrate and pivot to a stronger algorithm. The imperative, really, is that when you build code, you practice this principle of algorithm agility, so that when SHA-256 or SHA-512 becomes the new SHA-1, you can swap out your systems. You can change the code in the least disruptive way, to address that flaw within your code and your software projects.

You know, Luke, this is a mind-bender right there, because when you start thinking about algorithm agility, you start thinking, okay, software countermeasures, automation. You start thinking about these kinds of new trends where you need the kind of signature capability you mentioned with this project. So the question of who actually signs off on these things comes back down to the paradigm you guys are talking about here.

Yes, very much. So there's another analogy from the security world; they call it "turtles all the way down," which is, effectively, you always have to get to the point where a human or a computer establishes that first point of trust to sign something off. And so it's a world that is ever increasing in complexity, so the best that you can do is to be prepared, to be as open as you can, to make that pivot as and when you need to.

Pretty impressive, great insight. Steve, we could talk for hours on this panel, the emerging tech within Red Hat. Just give us a quick summary of what's going on. Obviously you've got a serious brain trust over there, with real-world impact. I mean, you're talking about the future of trust, the future of software, the future of computing, all going on in real time right now. This is not so much R&D as it is the front range of tech. Give us a quick overview.

So the first thing I would tell everyone is to go check out next.redhat.com.
That's got all of our different projects and who to contact if you're interested in learning more about the different areas we're working on. But just as an overview: we're working on software-defined storage and cloud storage, and Sage Weil, the creator of Ceph, is the person who leads that group. We've got a team focused on edge computing; they're doing some really cool projects around very lightweight operating systems and Kubernetes, you know, OpenShift-based deployments that can run on devices that you screw into the sheetrock. That's really interesting. We have a cloud networking team that's looking at OVN and the intersection of eBPF, networking, and Kubernetes. And then we've got an application platforms team that's looking at quantum, but also at how to advance Kubernetes itself; that's the team the persistent volume framework in Kubernetes came from, which added block storage and object storage to Kubernetes. So there are a lot of really exciting things going on.

Our charter is to inform Red Hat's long-term technology strategy. My personal philosophy about how we do that is that Red Hat has one branch in product engineering that focuses on the product roadmap, which is by nature, you know, the six-to-nine-month view, and then the longer-term strategy is set by both of us; it's just that they're not focused on it, and we are. We spend a lot of time doing disambiguation of the future. That's what we do, and we love doing it. I get to work with all these really super smart people. It's a fun job.

Well, great insights. It's super exciting, emerging tech within Red Hat. Obviously the industry... you guys are agile, you're open source, and now more than ever, the productization of open source is happening at such an accelerated rate. Steve, thanks for coming on. Parul, thanks for coming on.
Luke, great insights all around. Thanks for sharing the content here.

Thank you. Our pleasure. Thank you, John.

Okay, more Red Hat coverage after this video. Obviously emerging tech is huge; watch some of the game-changing action here at Red Hat Summit. I'm John Furrier. Thanks for watching.