Welcome to The GET, a podcast for enterprise leaders delivering timely insights for today's global economy and tomorrow's competitive advantage. I'm your host, Chris Kane, president of the Center for Global Enterprise. We've broken this episode into two parts. In part two, the discussion is around data stewardship and responsibility, software security and liability, and cybersecurity market forces versus regulation. I'm here with Sam Palmisano, chairman of the Center for Global Enterprise and former vice chair of President Obama's Commission on Enhancing National Cybersecurity, and Karen Evans, managing director of the Cyber Readiness Institute and former CIO of the U.S. Department of Homeland Security, who has also held a number of other cybersecurity leadership roles in the U.S. government.

There are two other elements of pillar three that I think are worth talking about before we move on to some more general questions. One, Sam, is the section in here about holding the stewards of our data accountable. It talks about how, when organizations that have data on individuals fail to act as responsible stewards of this data, they externalize the cost onto everyday Americans. I know that you have been leading an organization of leading companies called the Data & Trust Alliance; we've actually had episodes on this before. But it seems to me that the president's cybersecurity plan has validated the mission and the motivation for starting the D&TA. Would you talk a little bit about what you see the companies in the D&TA doing in the marketplace along these lines of being more accountable stewards?

Chris, it's an excellent point. Since we started the initiative, they've addressed multiple areas around data issues, especially from the point of view of bias within the data. And there are 25 extremely large companies; if you add them up, there are over 7 million employees.
So these are very large companies whose CIOs participate. To build a little bit on your point, they came up with a whole set of processes that affect HR and procurement and how they operate, dealing with the transparency of the data and the bias in the data, so that you don't have bias reflected in hiring, promotion, pay, or any of the associated practices around human resources. They've also done more work around mergers and acquisitions, to have a data practice so that you analyze data as an asset component of the deal, to ensure you are avoiding a lot of the risk that could occur if you buy companies that have not had, I'll call it, appropriate protections of that data. The last one they're working on today, quite honestly, is the provenance of the data itself: where is the source of the data? The importance of that, Chris, is that the issue with sourcing is transparency, so that regardless of how you apply it and how it's used, you can talk about where it came from and how it's being used, and therefore how your business benefits as a result. It gets down to one word, which is trust. And more and more, if you look at these generative AI solutions and technologies that are out there, it's going to get down to trust at the end of the day. And I think business has a role to lead in that environment versus relying on government regulation to figure out how to actually do this.

Karen, as the former CIO of the Department of Homeland Security, do you have any thoughts on the accountability of data stewardship?

I have a lot of thoughts in this area, and I appreciate everything that Sam was saying. But I can tell you that in my former role, I had the opportunity to actually handle our incident response as it relates to SolarWinds. And one of the big questions from our oversight committees was, how come DHS couldn't detect it before private industry saw it, right?
And the challenge is that our data stores aren't as vast as private industry's data stores. We just don't have them. And if you follow a lot of this process, what ends up happening is all the companies (and I believe in market forces myself) are saying, sure, we'll give you access to that data, for a price. And everybody's arguing back, saying it's my data in the first place, so I should be able to have access to my data. And so that's part of the data stewardship here. If you're going to hold a federal agency accountable to really manage the risk, manage those services, and keep them running regardless of what is happening, then I have to have visibility into my own data. I'm not saying give me access to everybody else's data, but I am saying you have to give me visibility into the services that you're providing for me, because I can't do it all myself, right? So this is a partnership with industry, but I have to be able to analyze my own data.

The other part they're talking about with this, which gets all the way back to our original discussions, is when the credit bureaus were all hacked. They passed on the cost of remediation of everything they did to the users of that data, which is every company, anybody who does anything with credit, and the actual individuals themselves. And then I as a user am responsible for cleaning up my own data, which they end up reselling at a jacked-up cost because they had to do remediation on their infrastructure in the first place. So that's got to stop. And I don't know, without working directly with industry, what the best way is to move forward to make sure that, to Sam's point, you can trust in the data, but that it's also being protected appropriately.

Okay, so before we move on to more general questions, there is one last section of pillar three that I would like to go a little deeper on.
We touched on it very quickly, which is shifting the liability for insecure software products and services. For our listeners who happen to be in the IT services and software business, this is particularly important. Sam, let's start with you. How do you envision business models and business operations changing in the software and IT services sector if this liability is now shifted over to them, and if the administration, per its own statements, is successful at working with Congress and the private sector to develop legislation establishing liability for software products and services?

Well, Chris, I'll start with the first thing, which is: how do you define liability? The problem, as you know, is that it's a combination of the provider and the user. So is liability only on the provider of the technology, or the services company offering the technology, or is it on how people use the technology? I mean, for example, when people misuse technology in multiple different ways in society, you tend not to penalize the provider of that particular kind of technology, whatever it happens to be, because it's used in multiple different ways and you just really can't limit its usage. So my point is, it sounds like a very good thing to do, right? Because it would stimulate, quite honestly, this idea of designing in security from day one, which is really the goal. I mean, if you had security designed in from day one, it would just make it more difficult for people to hack into these systems. Look, that doesn't mean hacking is going to go away. It just makes it more difficult, and therefore more sophisticated and more expensive, so a lot of these small incidents that occur probably won't occur. Having said all that, with the definition of liability you could just create a whole series of litigation, which the trial lawyers would probably love, by the way, right?
Litigation where no one can conclude whether it was the product itself or the use of the product. So it's really a hard thing to do. I think, quite honestly, where I would start on this is not to worry about imposing liability. I would encourage a standard around what it means to design software securely from day one, and create the bill of materials for software so you can see where the vulnerabilities are. And then the companies that do that well should be rewarded, and the ones that don't should be penalized. I'd come at it that way versus creating a whole new segment of the legal industry where the only ones benefiting are the trial lawyers.

Yeah, hopefully we can use a model and create incentives for a race to the top as opposed to a race to the bottom, which is trying to mitigate and eliminate risk and liability exposure on behalf of a company. Karen, any thoughts on the software industry and what kind of models and operating changes we're going to see if this goes through?

Well, this is one of the ones where what is old is new again. At a high level, everybody agrees with everything Sam says. Everybody agrees: secure by design. A lot of these fit on a bumper sticker, like secure to market versus first to market. The challenge is in the definition. You get one level further down and talk about due diligence. If you can define the due diligence, you can separate out the issues Sam is describing: did the developer, the software provider, do due diligence, and then was it exploited by a criminal element because there was something in there that they hadn't thought about? That has always been the rub moving forward. One of the biggest things that came out when the strategy was actually released is that there is a part in it that talks about harmonization of regulations.
And so the industry folks are actually asking: are you going to harmonize regulations first and then issue new ones, or are you going to issue new ones and harmonize later? Because if you issue new ones and harmonize later, that's going to increase the cost. Now we're full circle back to the other questions you're asking: I'm going to have to pass that cost on to the consumer in order to be able to take it to market. And can I actually get these principles built into the culture of my organization, so that I can develop it in a way where, you know, I've done everything that's humanly possible to secure this, and a small business can implement it and not necessarily have to think about it?

Seems like it's a huge challenge, but one that's so structurally important that we have to do something around it.

It won't be easy, because if you think about all the discussions on data privacy and antitrust, and how long they've gone on, those will be short cycles relative to this.

Yeah, this is probably a little bit more pervasive. Everybody in the world, as Karen says, has a point of view relative to this. Trying to put that into legislation is going to be pretty hard to do.

Let's talk a little bit about some market-oriented or general questions. The U.S. actions in cybersecurity, like the president's new strategy, as well as actions from other nations around the world, seem to be telling us that there's little appetite for allowing companies to voluntarily opt in when it comes to cybersecurity, after years of increasing cyber incidents and the likelihood that they're only going to become more voluminous, not less. Do you think we've come to the point where market forces are no longer enough to encourage organizations to take the appropriate steps to protect their own businesses? And will we have more of a command-and-control approach to regulation than we've had thus far?
It seems, based upon our conversation, that you see that happening. But just relative to government actions, do you see this trend not only moving forward, but being locked in? Karen, how about we start with you?

So I'm going to go to the carrot-and-stick approach here. The whole idea of voluntary participation or voluntary opt-in, I just don't see that working, because we've been trying it for the last 25 years, and some people do it well and some people don't. I do think, as you continue to go forward, that some type of regulation is going to happen. So you can either be part of the solution and get ahead of it, or not. And I think one of the incentives the federal government in the United States can use, and other governments are looking at this to see how the United States implements it, is the incentive of the federal marketplace itself. It's a big chunk of change as it relates to acquisitions, over $70 billion in the federal market. And Congress has given the government several different tools to be able to implement secure products. So the piece then gets back to: okay, I have the money, because there's a modernization fund now that has billions of dollars in it, and I have different bodies that allow me to take intelligence information, mix it in with acquisition information, and then give guidance out. Now, how do you incentivize businesses to actually deliver the solutions that the federal government needs? That's through competition. Knowing you can win a contract, and it's not mandatory, it's optional; you don't have to participate in the federal market. But a lot of companies do, and they want to have that business. I think that's where you can get some of the incentives and drive some of the products, because companies are going to have to develop them for the federal government, and you're not going to maintain two different development cycles. You're going to try to gain as much efficiency as you can.
And I think the way the federal government is going to move forward is: okay, we'll put our money where our mouth is, and we're going to buy only these types of products.

Yeah, that will be an important milestone. The federal government and its purchasing power have always been a major factor in economic and societal change in many respects. And with its ability to act as an incentive or a catalyst for a marketplace response, it will be very interesting to see how quickly that can materialize. Sam, any thoughts about the marketplace and opt-in forces versus conscripted requirements?

No, I'm where Karen is. Let's take the civilian side of government, because you have to step back and separate out defense and intelligence. Dot-gov, not dot-mil, right? Actually, those procurement guidelines could also become a standard for corporations as well. Assuming that it works and it doesn't impede innovation, then I think you could see the adoption rate move from the government outward. I'll give you an example from the PC industry. When the government started purchasing directly, that actually lifted Dell; their first user base was the government, right? Now everybody just buys everything electronically. It started in PCs, but it's gone everywhere, needless to say. You can see the effect of a large consumer out there weighing in and driving a lot of that change. So that's the analogy I'd make here. I sort of agree with Karen. Now, you have to separate out intelligence and defense and all the rest of that; let's just take commercial environments. I think she's right, and I think that could lead to more commercial adoption. But that's the carrot again versus the stick, because you can decide you don't want to sell to governments, and that's fine. You can decide you don't want to sell to large companies. I mean, nobody's making you do that, right?
You could define your market as something else and be very happy there.

Now let's put ourselves in the seat of that CEO or business leader who has this opportunity ahead of them to follow either a carrot strategy or a stick strategy. What questions, Sam, if you were a CEO, would you be asking your board and your senior leadership team today about cybersecurity and how your company is going to be positioned, both in the marketplace as well as in the regulatory compliance environment? Karen, we'll come to you next.

Yeah, I'll start. I mean, basically, whether it's the CEO or the board, whatever perspective you want to take on this, Chris, fundamentally it's risk. It's no different than all the other factors of your enterprise risk. And all large companies and mid-sized companies have enterprise risk models today that are broader than just strictly financial risk. They're much broader today as a result of some of the other areas we've had problems with in the past. This is just part of that. So this part of the process in the corporation goes through the audits like the other areas of risk, and it rolls up into an enterprise-level model. A lot of that came out of the '08 financial crisis, because they lost control of the risk factors in the banking system. And we're seeing that again, by the way. But nonetheless, fundamentally, that's how you should think about it: enterprise risk. There's audit, there are controls, there's the board-level function, there are committees of the board. All that stuff already exists, so I would just include this as part of that. Now, the other thing I would do, quite honestly, is I think you need people on the board who can ask the right questions when they come into the board or onto the audit committee. And so, as far as the skill set goes, it could be a CEO with a technology background; it doesn't have to be a deep technologist per se.
But at least they understand the issues and can ask the right questions to make sure that there's a check and balance in the process.

Karen, some of your members in the Cyber Readiness Institute are the largest corporations in the world, and certainly some of the most revered brands. What questions should their CEOs be asking of their board or their senior leadership team to make sure they're positioned properly?

Yeah, and not just the CRI members, but of course any corporation. Well, I think it's the same question over and over again. And this goes back in my history; you alluded to this, and my government experience just never goes away. I had the opportunity, Chris, to handle the very first hacking incident in the federal government, in 1996. And it's what happens in the immediate aftermath. I love serving on audit committees, and people think I'm nuts. But the question I ask is not what's going to happen 30 days later. Everybody's really good at the 30 days later. It's: what happens? Do you even know that you've been compromised? What happens within the first 24 hours of an incident occurring? And most companies have a hard time answering that question. Who's in charge? What's the communications plan? How are we going to respond? Everybody kind of goes off and does their own thing. But the CEO is the one who has to answer; in my case, it was always the secretary of the department or agency who had to go out and talk publicly, to the president of the United States and to the American people. And it gets back to trust. So what happens within the first 24 hours? Nine times out of 10, companies can't answer that question of who's in charge and who's leading in the first 24 hours.

Yeah. And Chris, what I would do is, once you establish it, you have to rehearse it. You have to practice this. 9/11 was a terrible thing.
But at IBM, we had disaster recovery as part of the process. Who to convene? Who led? Who knew? Who called? So immediately after the plane hit the tower, the system just kicked in, and we got the people moving, with all the stuff that goes on, including getting all the people out of the city, right? It has to happen. So it's no different with this, to me, to Karen's point. All those processes need to be established, and then they have to kick in immediately, where the management team knows exactly what to do when that occurs.

So Karen, if a company wanted to establish that playbook and then also do those rehearsals, are those the kinds of things that the Cyber Readiness Institute can help with? Or do you know of other parties that could help?

Well, absolutely. We're very focused on small and mid-sized businesses, but I do know large companies that have talked to me about using our materials, because our materials are offered for free. We do put together a playbook. And the way I describe it is, when you finish the CRI program around these four core areas, which are automatic updating, phishing, multi-factor authentication, and securing removable media, you end up with a business continuity plan. A small business or a large business, and some of the large businesses I've talked to do this, can use it for their annual training, to Sam's point. And then they exercise it to see if people really know: okay, if it's a small incident, how does it continue to escalate? How do you handle this? And we have sample communications, emails, things like that, that you send out. The reason I'm really excited about it is that for small and mid-sized businesses it's not the dollar commitment, it's actually the time commitment to understand it, because then you're creating a culture. So as you continue to grow, because we want all small businesses to grow to be medium and then large, you've already established this culture.
So it's just, you've got the core pieces, and you keep building more and more pieces onto it, like Sam said, so that all the risks associated with the enterprise can then be addressed in your business continuity plan.

So you want to build cybersecurity muscle memory?

Yes, I do.

All right. Well, Sam and Karen, thank you for being with us today. But before we leave, we always like to ask our guests to give our listeners one strategic insight to consider in what we call our emerging critical issues moment. In one word or one phrase, tell us what issue you see on the horizon. It doesn't necessarily have to be about cybersecurity, but since we've been talking about it, feel free to go there. What issue do you see on the horizon that business leaders need to put on their radar that they may not have there today? Karen, why don't we start with you?

So mine isn't necessarily going to be about cybersecurity, but it is going to be about an issue that we covered, which is the reliance on, or the use of, machine learning and artificial intelligence.

I'll answer the question and then say why I answered it that way. I've been at a lot of these large CEO events, and they ask me about cyber. I say you should assume that somebody's already penetrated your systems. So what do you do about it when you leave here tonight? How do you build that muscle memory of rehearsal? I just believe some foreign entity of some kind is already in there, especially if you're doing anything in technology, biologics, pharma, finance, or critical industries. You just have to assume they're there, whether you know it or not.

Okay, great. Thank you very much. We'll come back to these insights in future shows, but I want to thank you very much for taking the time to be with us today.

You've been listening to The GET, sponsored by the Center for Global Enterprise, celebrating 10 years of convening global enterprise leaders around the most important business transformation issues.