From theCUBE Studios in Palo Alto and Boston, bringing you data-driven insights from theCUBE and ETR. This is Breaking Analysis with Dave Vellante. Confidential computing is a technology that aims to enhance data privacy and security by providing encrypted computation on sensitive data, isolating data from apps in a fenced-off enclave during processing. The concept of confidential computing is gaining popularity, especially in the cloud computing space where sensitive data is often stored and of course processed. However, there are some who view confidential computing as an unnecessary technology and a marketing ploy by cloud providers aimed at calming customers who are cloud phobic. Hello and welcome to this week's Wikibon Cube Insights powered by ETR. In this Breaking Analysis, we revisit the notion of confidential computing, and to do so, we'll invite two Google experts to the show. But before we get there, let's summarize briefly. There's not a ton of ETR data on the topic of confidential computing. I mean, it's a technology that's deeply embedded into silicon and computing architectures. But at the highest level, security remains the number one priority being addressed by IT decision makers in the coming year, as shown here. And this data is pretty much across the board, by industry, by region, by size of company. We dug into it, and the only slight deviation from the mean is in financial services. The second and third most cited priorities, cloud migration and analytics, are noticeably closer to cybersecurity in financial services than in other sectors, likely because financial services has always been hyper security conscious. But security is still a clear number one priority in that sector. The idea behind confidential computing is to better address threat models for data in execution.
Protecting data at rest and data in transit have long been a focus of security approaches, but more recently, silicon manufacturers have introduced architectures that separate data and applications from the host system. ARM, Intel, AMD, NVIDIA and other suppliers are all on board, as are the big cloud players. Now, the argument against confidential computing is that it narrowly focuses on memory encryption and it doesn't solve the biggest problems in security. Multiple system images, updates, different services and the entire code flow aren't directly addressed by memory encryption. Rather, to truly attack these problems, many believe that OSes need to be re-engineered with the attacker and hacker in mind. There are so many variables, and at the end of the day, critics say the emphasis on confidential computing made by cloud providers is overstated and largely hype. This tweet from security researcher Rodrigo Branco sums up the sentiment of many skeptics. He says, confidential computing is mostly a marketing campaign from memory encryption. It's not driving the industry towards the hard open problems. It is selling an illusion. Okay. Nonetheless, encrypting data in use and fencing off key components of the system isn't a bad thing, especially if it comes with the package essentially for free. There has been a lack of standardization and interoperability between different confidential computing approaches, but the Confidential Computing Consortium was established in 2019, ostensibly to accelerate the market and influence standards. Notably, AWS is not part of the consortium, likely because the politics of the consortium were probably a conundrum for AWS, because the base technology defined by the consortium is seen as limiting by AWS. This is my guess, not AWS's words. But one, I think joining the consortium would validate a definition which AWS isn't aligned with. And two, you know, it's got a lead with the Annapurna acquisition. It was way ahead with ARM integration.
And so it probably doesn't feel the need to validate its competitors. Anyway, one of the premier members of the Confidential Computing Consortium is Google, along with many high-profile names, including ARM, Intel, Meta, Red Hat, Microsoft, and others, and we're pleased to welcome two experts on confidential computing from Google to unpack the topic. Nelly Porter is head of product for GCP Confidential Computing and Encryption, and Dr. Patricia Florissi is the technical director for the Office of the CTO at Google Cloud. Welcome, Nelly and Patricia, great to have you. Great to be here. Thank you so much for having us. You're very welcome. Nelly, why don't you start, and then Patricia, you can weigh in. Just tell the audience a little bit about each of your roles at Google Cloud. So, I'll start. I own a lot of interesting activities at Google around security, the infrastructure security that I usually own. We're talking about encryption, end-to-end encryption, and confidential computing is part of that portfolio. An additional area that I contribute to in GCP, which is key for Google and our customers, is secure software supply chain, because you need to trust the software that operates in your confidential environment to have an end-to-end story, so you believe that your software and your environment are doing what you expect. That's my role. Got it, okay. Patricia? Well, I am a technical director in the Office of the CTO, OCTO for short, at Google Cloud. And we are a global team. We include former CTOs like myself and senior technologists from large corporations, institutions, and a lot of successful startups as well. And we have two main goals. First, we work side-by-side with some of our largest, most strategic customers, and we help them solve complex engineering technical problems.
And second, we advise Google and Google Cloud engineering and product management on emerging trends and technologies to guide the trajectory of our business. We are a unique group, I think, because we have created this collaborative culture with our customers. And within OCTO, I spend a lot of time collaborating with customers and the industry at large on technologies that can address privacy, security, and sovereignty of data in general. Excellent. Thank you for that, both of you. Let's get into it. Nelly, what is confidential computing from Google's perspective? How do you define it? Confidential computing is a tool, one of the tools in our toolbox. And confidential computing is a way we help our customers complete this very interesting end-to-end lifecycle of their data. When customers bring data to the cloud, they want to protect it as they ingest it to the cloud, and they protect it at rest when they store the data in the cloud. But what was missing for many, many years was the ability for us to continue protecting the data and workloads of our customers while running them. Because data is not brought to the cloud to sit in a huge graveyard; we need to ensure that this data is actually indexed, that there are insights drawn from this data. You have to process this data, and confidential computing is here to help. Now we have end-to-end protection of our customers' data when they bring their workloads and data to the cloud, thanks to confidential computing. Thank you for that. We're going to get into the architecture a bit, but before we do, Patricia, why do you think this topic of confidential computing is such an important technology? Can you explain? Do you think it's transformative for customers? And if so, why? Yeah, I would maybe like to offer one thought, one intuition behind why confidential computing matters.
Because at the end of the day, it reduces more and more the customer's trust boundaries and the attack surface. It's about reducing that periphery, the boundary within which the customer needs to worry about trust and safety. And in a way, it's a natural progression of using encryption to secure and protect data: in the same way that we are encrypting data in transit and at rest, now we are also encrypting data while in use. And among other benefits, I would say one of the most transformative ones is that organizations will be able to collaborate with each other and retain the confidentiality of the data. And that is across industries. Even though it's, I wouldn't say highly focused on, but very beneficial for highly regulated industries, it applies to all industries. Look at finance, for example, where banks are trying to detect fraud, and specifically double financing, where a customer is trying to get financing on an asset, let's say a boat or a house, and then goes to another bank and gets another loan on that same asset. Now banks would be able to collaborate and detect fraud while preserving confidentiality and privacy of the data. Interesting. I want to understand that a little bit more, but I'm going to push you a little bit on this, Nelly, if I can, because there's a narrative out there, I talked about this upfront, that says confidential computing is a marketing ploy by cloud providers that are just trying to placate people that are scared of the cloud. And I'm presuming you don't agree with that, but I'd like you to weigh in here. The argument is: confidential computing is just memory encryption. It doesn't address many other problems. It is over-hyped by cloud providers. What do you say to that line of thinking? I absolutely disagree with this statement, as you can imagine, but most importantly, it's mixing multiple concepts.
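Patricia's double-financing scenario above can be sketched conceptually in code. This is a hypothetical illustration, not a Google API: each bank submits hashed asset identifiers to a matching function that, in a real deployment, would run inside a confidential VM, so no bank ever sees another bank's raw records.

```python
import hashlib

def asset_fingerprint(asset_id: str) -> str:
    """Hash the asset identifier so raw records never leave each bank."""
    return hashlib.sha256(asset_id.encode()).hexdigest()

def detect_double_financing(*bank_submissions: list) -> set:
    """Return fingerprints financed by more than one bank.

    In a real deployment this function would execute inside a
    confidential VM, so no participant sees another's inputs."""
    seen = {}
    for submission in bank_submissions:
        for fp in set(submission):           # dedupe within a single bank
            seen[fp] = seen.get(fp, 0) + 1
    return {fp for fp, count in seen.items() if count > 1}

# Hypothetical submissions: the boat is financed at both banks.
bank_a = [asset_fingerprint(a) for a in ["boat-123", "house-456"]]
bank_b = [asset_fingerprint(a) for a in ["boat-123", "car-789"]]
flagged = detect_double_financing(bank_a, bank_b)
```

Hashing alone is not full privacy protection; the point of the confidential environment is that even this matching logic and its inputs stay encrypted in memory while running.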
And exactly as Patricia said, we need to look into the end-to-end story, not just the mechanism by which confidential computing executes and protects a customer's data. And why is it so critically important? Because what confidential computing is able to do, in addition to isolating tenants in the multi-tenant environments of the cloud, is offer additional, stronger isolation. We call that cryptographic isolation. And that's why customers will have more trust toward other customers, the tenants running on the same cores, but also toward us, because they don't need to worry about threats and malicious attempts to penetrate the environment. So what confidential computing is helping us offer our customers is stronger isolation between tenants in this multi-tenant environment, but also, incredibly important, stronger isolation of our customers, the tenants, from us. We also write code. We are also software providers. We also make mistakes or have some zero days, sometimes introduced by us, sometimes introduced by our adversaries. But what I'm trying to say is, by creating this cryptographic layer of isolation between us and our tenants, and among those tenants, we're really providing meaningful security to our customers and eliminating some of the worries they have running in multi-tenant spaces, or even collaborating together on very sensitive data, knowing that this particular protection is available to them. Okay, thank you, I appreciate that. And I think malicious code is often a threat model missed in these narratives, and operator access. Yeah, maybe I trust my cloud provider, but if I can fence off your access, even better, I'll sleep better at night separating the code from the data. Everybody, ARM, Intel, AMD, NVIDIA and others, they're all doing it. I wonder, Nelly, if we could stay with you, and bring up the slide on the architecture.
What's architecturally different with confidential computing versus how operating systems and VMs have worked traditionally? We're showing a slide here with some VMs. Maybe you could take us through that. Absolutely. And Dave, the core idea for Google, and now the industry's way of dealing with confidential computing, is to ensure that three main properties are actually preserved. Customers don't need to change their code. They can operate on those VMs exactly as they would with normal, non-confidential VMs, to give them this opportunity of lift and shift, of not changing the apps, while performing with very, very low latency and scaling as any cloud can, something that Google actually pioneered in confidential computing. I think we need to open up and explain how this magic was actually done. And as I said, the whole entire system had to change to be able to provide this magic. I would start with this concept of root of trust, where we ensure that this machine, the whole entire host, has an integrity guarantee. It means nobody has changed my code at the lowest level of the system. We introduced this in 2017; it's called Titan. It's our own ASIC, a small chip on every single motherboard that we have, that ensures that your low-level firmware, your system code, your kernel, the most powerful parts of the system, are properly configured and not changed, not tampered with. We do it for everybody, confidential computing included, but for confidential computing, what did we have to change? We bring in AMD, and in the future other silicon vendors, and we have to trust their firmware, their way of dealing with our confidential environments. And that's why we have an obligation to validate the integrity not only of our software and our firmware, but also the firmware and software of our silicon vendors. So when we're booting this machine, as you can see, we validate that the integrity of all of this system is in place.
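The boot-integrity chain Nelly describes, from low-level firmware up through the kernel, can be sketched as a measurement chain. This is a toy illustration of the idea, not Google's Titan implementation: each stage is hashed into a running register before it gets control, so any change anywhere in the chain changes the final value.

```python
import hashlib

def extend(register: bytes, component: bytes) -> bytes:
    """Extend a measurement register with the hash of the next boot stage,
    the way a hardware root of trust records the firmware -> bootloader ->
    kernel chain. Illustration only."""
    return hashlib.sha256(register + hashlib.sha256(component).digest()).digest()

def measure_boot(chain: list) -> bytes:
    register = bytes(32)                 # register starts at zero on power-on
    for stage in chain:
        register = extend(register, stage)   # measure before handing control over
    return register

BOOT_CHAIN = [b"low-level firmware v1", b"bootloader v1", b"kernel v1"]
golden = measure_boot(BOOT_CHAIN)        # known-good value to compare against
tampered = measure_boot([b"low-level firmware v1", b"evil bootloader", b"kernel v1"])
```

Because the chain is order- and content-sensitive, comparing the final register against a known-good value is enough to detect tampering at any stage.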
It means nobody has touched it, nobody has changed it, nobody has modified it. But then we have this concept of the AMD secure processor. It's a special ASIC that generates a key for every single VM that our customers will run, or every single node in Kubernetes, or every single worker thread in our Hadoop or Spark capability; we offer all of that. And those keys are not available to us. They're the best keys ever in the encryption space, because when we're talking about encryption, the first question that I receive all the time is: where's the key, and who has access to the key? Because if you have access to the key, then it doesn't matter how well you encrypted. That's the case in confidential computing, and why it's such revolutionary technology. As cloud providers, we don't have access to the keys. They sit in the hardware, and they're fed to the memory controller. And it means that when the hypervisor, which also knows about these wonderful things, says, I need to get access to the memory that this particular VM is trying to access, it cannot decrypt the data. It doesn't have access to the key, because those keys are random, ephemeral, and per-VM, but most importantly, not exportable from the hardware. And it means you now have this very interesting world in which other customers or cloud providers will not be able to get access to your memory. And as you can see, our customers don't need to change their applications. Their VMs run exactly as they should run. And from inside the VM, you actually see your memory in the clear; it's not encrypted. But God forbid somebody tries to read it from outside of my confidential box. No, no, no, no, no, no. You would not be able to do it. You will see ciphertext. And that's exactly what this combination of multiple hardware pieces and software pieces has to do. So the OS is also modified. And the OS is modified in such a way as to provide integrity.
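The plaintext-inside, ciphertext-outside property Nelly describes can be modeled with a toy memory controller. This is purely conceptual, assuming a made-up `ToyMemoryController` class with a simple XOR keystream; real hardware (e.g. AMD's memory encryption) uses AES in the memory controller with keys that never leave the secure processor.

```python
import hashlib
import secrets

class ToyMemoryController:
    """Toy model of per-VM memory encryption. Illustration only --
    not real AES and not how any actual memory controller is built."""

    def __init__(self):
        self._key = secrets.token_bytes(32)  # ephemeral, per-VM, never exported

    def _keystream(self, addr: int, length: int) -> bytes:
        stream = hashlib.sha256(self._key + addr.to_bytes(8, "big")).digest()
        while len(stream) < length:
            stream += hashlib.sha256(stream).digest()
        return stream[:length]

    def write(self, ram: bytearray, addr: int, data: bytes) -> None:
        ks = self._keystream(addr, len(data))
        ram[addr:addr + len(data)] = bytes(a ^ b for a, b in zip(data, ks))

    def read(self, ram: bytearray, addr: int, length: int) -> bytes:
        ks = self._keystream(addr, length)
        return bytes(a ^ b for a, b in zip(ram[addr:addr + length], ks))

ram = bytearray(64)                  # "physical" memory visible to the host
vm = ToyMemoryController()
vm.write(ram, 0, b"customer PII")
inside_view = vm.read(ram, 0, 12)    # the VM, going through the controller, sees plaintext
outside_view = bytes(ram[0:12])      # a hypervisor reading raw memory sees only ciphertext
```

The key lives only in the controller object; any reader that bypasses it gets bytes that are useless without the per-VM key.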
It means even the OS that you run in your VM box is not modifiable, and you as the customer can verify it. But the most interesting thing, I guess, is how to ensure the super performance of this environment, because you can imagine, Dave, that encryption adds overhead, additional time, additional latency. We were able to mitigate all of that by providing incredibly interesting capabilities in the OS itself. So our customers get no changes needed, fantastic performance, and scale as they would expect from a cloud provider like Google. Okay, thank you. Excellent, appreciate that explanation. So, again, the narrative on this as well: you've already given me guarantees as a cloud provider that you don't have access to my data, but this gives another level of assurance. Key management, as they say, is key. Now, humans aren't managing the keys; the machines are managing them. So, Patricia, my question to you is, compared to the pre-confidential computing days, what are the sort of new guarantees that these hardware-based technologies are going to provide to customers? So if I am a customer, I am saying, I now have full guarantee of confidentiality and integrity of the data and of the code. So if you look at code and data confidentiality, customers care; they want to know whether their systems are protected from outside or unauthorized access. And as we covered with Nelly, confidential computing actually ensures that the application and data internals remain secret, right? The code is actually looking at the data; only the memory controller is decrypting the data, with a key that is ephemeral and per-VM and generated on demand. Then you have the second point, where you have code and data integrity, and now customers want to know whether their data was corrupted, tampered with or impacted by outside actors. And what confidential computing ensures is that application internals are not tampered with.
So the application, the workload, as we call it, that is processing the data has not been tampered with and preserves integrity. I would also say that this is all verifiable. You have attestation, and this attestation actually generates a log trail, and the log trail provides a proof that integrity was preserved. And I think it also offers a guarantee of what we call sealing, this idea that the secrets have been preserved and not tampered with. Confidentiality and integrity of code and data. Got it, okay, thank you. Nelly, you mentioned, I think I heard you say that for the applications, it's transparent. You don't have to change the application. It just comes for free, essentially. And we showed some various parts of the stack before. I'm curious as to what's affected, but really more importantly, what is specifically Google's value add? How do partners participate in this, the ecosystem? Or maybe said another way, how does Google ensure the compatibility of confidential computing with existing systems and applications? A fantastic question, by the way. And it's very difficult and definitely a complicated world, because to be able to provide these guarantees, a lot of work was actually done by the community. Google very much operates in the open. So again, for our operating systems, we're working with OS vendors to ensure that all the capabilities we need are part of their kernels, part of their releases, and available for customers to understand and even explore, if they find it fun to explore a lot of code. We have also modified, together with our silicon vendors, the kernel, the host kernel, to support this capability. And that means working with the community to ensure that all of those patches are there. We also worked with every single silicon vendor. As you've seen, and that's where I probably feel that Google contributed quite a bit to this world.
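Stepping back to the attestation and sealing guarantees Patricia describes above: the customer-side verification flow can be sketched roughly as below. Every name and value here is hypothetical, and the HMAC stands in for a hardware-rooted signature; real platforms verify signed hardware attestation reports against a vendor certificate chain, not a shared key.

```python
import hashlib
import hmac

# Hypothetical golden measurement the customer expects (illustration only).
EXPECTED_MEASUREMENT = hashlib.sha256(
    b"firmware-v1" + b"kernel-v1" + b"workload-v1").hexdigest()
HW_ATTESTATION_KEY = b"hypothetical-hardware-rooted-key"

def make_report(firmware: bytes, kernel: bytes, workload: bytes) -> dict:
    """The platform measures each stage and signs the combined result."""
    measurement = hashlib.sha256(firmware + kernel + workload).hexdigest()
    signature = hmac.new(HW_ATTESTATION_KEY, measurement.encode(),
                         hashlib.sha256).hexdigest()
    return {"measurement": measurement, "signature": signature}

def verify_report(report: dict) -> bool:
    """Customer-side check: the signature is valid and the measured stack
    matches what the customer agreed to trust."""
    expected_sig = hmac.new(HW_ATTESTATION_KEY, report["measurement"].encode(),
                            hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected_sig, report["signature"])
            and report["measurement"] == EXPECTED_MEASUREMENT)

good = verify_report(make_report(b"firmware-v1", b"kernel-v1", b"workload-v1"))
bad = verify_report(make_report(b"firmware-v1", b"tampered-kernel", b"workload-v1"))
```

The point is the shape of the check: a verifiable log of measurements, compared against expected values, is what turns "trust me" into "verify me."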
We moved our industry, our community, our vendors to understand the value of easy-to-use confidential computing, of removing barriers. And now, I don't know if you noticed, Intel is following this lead and also announcing their Trust Domain Extensions, a very similar architecture, and no surprise, it reflects a lot of work done with our partners to, again, convince them, work with them, and make this capability available. The same with ARM: last year, ARM announced their future design for confidential computing, called the Confidential Computing Architecture, and it's also influenced very heavily by these same ideas from Google and the industry overall. So there's a lot of work in the Confidential Computing Consortium that we're doing. For example, simply to mention one, to ensure interop, as you mentioned, between the different confidential environments of cloud providers, we want to ensure that they can attest to each other. Because when you're communicating between different environments, you need to trust them. And if workloads are running on different cloud providers, you need to ensure that you can trust your receiver when you're sharing your sensitive data, workloads or secrets with them. So we're coming together as a community, and we have this attestation SIG, a community-based effort, where we want to build and influence and work with ARM and every other cloud provider to ensure that they can interoperate. And it means it doesn't matter where confidential workloads are hosted; they can exchange data in a secure, verifiable way that's controlled by customers. And to do it, we need to continue what we're doing, working in the open, and contributing our ideas and the ideas of our partners to this world, to let confidential computing become what we see it has to become. It has to become a utility. It doesn't need to be so special, but that's what we want it to become. Let's talk about, thank you for that explanation.
Let's talk about data sovereignty, because when you think about data sharing, you think about data sharing across the ecosystem and different regions, and then of course data sovereignty comes up. Typically, public policy lags the technology industry, and sometimes it's problematic. I know there's a lot of discussions about exceptions, but Patricia, we have a graphic on data sovereignty. I'm interested in how confidential computing ensures that data sovereignty and privacy edicts are adhered to, even if they're out of alignment, maybe, with the pace of technology. One of the frequent examples is when you delete data, can you actually prove that data is deleted with a hundred percent certainty? You've got to prove that, and a lot of other issues. So looking at this slide, maybe you could take us through your thinking on data sovereignty. Perfect. So for us, data sovereignty is only one of the three pillars of digital sovereignty, and I don't want to give the impression that confidential computing addresses it all. That's why I want to step back and say, hey, digital sovereignty includes data sovereignty, where we are giving you full control and ownership of the location, encryption and access to your data. Operational sovereignty, where the goal is to give our Google Cloud customers full visibility and control over the provider's operations. So if there are any updates on the hardware or software stack, any operations, there is full transparency, full visibility. And then the third pillar is around software sovereignty, where the customer wants to ensure that they can run their workloads without dependency on the provider's software. This is sometimes referred to as survivability: that you can actually survive if you are untethered from the cloud, and that you can use open source. Now let's take a deep dive on data sovereignty, which, by the way, is one of my favorite topics. We typically focus on saying, hey, we need to care about data residency.
We care where the data resides, because where the data is at rest or in processing, it typically abides by the regulations of the jurisdiction where the data resides. And others say, hey, let's focus on data protection. We want to ensure the confidentiality and integrity and availability of the data, and confidential computing is at the heart of that data protection. But there is yet another element that people typically don't talk about when talking about data sovereignty, which is the element of user control. And here, Dave, it's about what happens to the data when I give you access to my data. And this reminds me of security two decades ago, or even a decade ago, when we started the security movement by putting in firewall protections and login access controls. But once you were in, you were able to do everything you wanted with the data. An insider had access to all the infrastructure, the data and the code. And that's similar, because with data sovereignty we care about where the data resides and who is operating on the data, but the moment that the data is being processed, I need to trust that the processing of the data will abide by user control, by the policies that I put in place for how my data is going to be used. And if you look at a lot of the regulation today, and a lot of the initiatives around the International Data Spaces Association, IDSA, and Gaia-X, there is a movement of saying the two parties, the provider of the data and the receiver of the data, are going to agree on a contract that describes what my data can be used for. The challenge is to ensure that, once the data crosses boundaries, the data will be used for the purposes that were intended and specified in the contract. And this is the exciting part: if you actually bring together confidential computing with policy enforcement.
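That combination, policy enforcement gated on a verified confidential workload, might be sketched as below. Every name here is hypothetical (there is no such Google Cloud API); the point is only the shape of the check: the data-sharing contract is enforced in code before any data is released to the receiver's workload.

```python
import hashlib

# Hypothetical contract agreed between data provider and receiver.
CONTRACT = {
    "allowed_purpose": "fraud-detection",
    "required_measurement": hashlib.sha256(b"approved-fraud-model-v1").hexdigest(),
}

def release_data(purpose: str, workload_measurement: str, data: bytes) -> bytes:
    """Release data only for the agreed purpose and only to the
    cryptographically verified workload named in the contract.
    A conceptual sketch, not a real policy-enforcement API."""
    if purpose != CONTRACT["allowed_purpose"]:
        raise PermissionError("purpose not covered by the data-sharing contract")
    if workload_measurement != CONTRACT["required_measurement"]:
        raise PermissionError("workload is not the one agreed in the contract")
    # In a real system the data would be sent encrypted and be usable
    # only inside the attested confidential environment.
    return data

ok = release_data("fraud-detection",
                  hashlib.sha256(b"approved-fraud-model-v1").hexdigest(),
                  b"loan records")
```

A different purpose, or a workload whose attested measurement doesn't match the contract, gets nothing.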
Now the policy enforcement can guarantee that the data is only processed within the confines of a confidential computing environment, that the workload is cryptographically verified to be the workload that was meant to process the data, and that the data will only be used while abiding by the confidentiality and integrity safety of the confidential computing environment. And that's why we believe confidential computing is a necessary and essential technology that will allow us to ensure data sovereignty, especially when it comes to user control. Thank you for that. I mean, it was a deep dive, brief but really detailed, so I appreciate that, especially the verification of the enforcement. Last question. I met you two because, as part of my year-end predictions post, you guys sent in some predictions, and I wasn't able to get to them in the predictions post, so I'm thrilled that you were able to make the time to come on the program. How widespread do you think the adoption of confidential computing will be in '23, and what's the maturity curve look like, you know, this decade, in your opinion? Maybe each of you could give us a brief answer. So my prediction: in five to seven years, as I said, it will become a utility. It will become like TLS. Again, 10 years ago we couldn't believe that websites would have certificates and we would support encrypted traffic. Now we do, and it's become ubiquitous. That's exactly where confidential computing is heading. I don't know where we are yet; it will take a few years of maturity for us, but we will do that. Thank you. And Patricia, what's your prediction? I will double down on that and say, hey, in the future, in the very near future, you will not be able to afford not having it. I believe, as digital sovereignty becomes ever more top of mind with sovereign states, and also for multinational organizations and for organizations that want to collaborate with each other, confidential computing will become the norm.
It will become the default mode of operation, if I may say. I like to compare it: today, if we talk to young technologists, it's inconceivable to think that at some point in history, and I happened to be alive then, we had data at rest that was not encrypted and data in transit that was not encrypted. And I think it will be inconceivable at some point in the near future to have unencrypted data while in use. You know, and plus, I think the beauty of this industry is, because there's so much competition, this essentially comes for free. I want to thank you both for spending some time on Breaking Analysis. There's so much more we can cover. I hope you'll come back to share the progress that you're making in this area and we can double click on some of these topics. Really appreciate your time. Anytime. Thank you so much. Yeah. In summary, while confidential computing is being touted by the cloud players as a promising technology for enhancing data privacy and security, there are also those, as we said, who remain skeptical. The truth probably lies somewhere in between, and it will depend on the specific implementation and the use case as to how effective confidential computing will be. Look, as with any new tech, it's important to carefully evaluate the potential benefits and drawbacks, and make informed decisions based on the specific requirements, situation and constraints of each individual customer. But the bottom line is, silicon manufacturers are working with cloud providers and other system companies to include confidential computing in their architectures. Competition, in our view, will moderate price hikes, and at the end of the day, this is under-the-covers technology that essentially will come for free. So we'll take it. I want to thank our guests today, Nelly and Patricia from Google, and thanks to Alex Myerson, who's on production and manages the podcast. Ken Schiffman as well, out of our Boston studio.
Kristen Martin and Cheryl Knight helped get the word out on social media and in our newsletters, and Rob Hof is our editor in chief over at siliconangle.com and does some great editing for us. Thank you all. Remember, all these episodes are available as podcasts; wherever you listen, just search Breaking Analysis podcast. I publish each week on wikibon.com and siliconangle.com, where you can get all the news. You know, if you want to get in touch, you can email me at david.vellante at siliconangle.com, or DM me at dvellante, and you can also comment on my LinkedIn posts. Definitely check out etr.ai for the best survey data in the enterprise tech business. I know we didn't hit on a lot today, but there's some amazing data, and it's always being updated. So check that out. This is Dave Vellante for theCUBE Insights powered by ETR. Thanks for watching, and we'll see you next time on Breaking Analysis.