Mary Ann is responsible for Oracle Software Security Assurance and represents Oracle on the board of directors of the Information Technology Information Sharing and Analysis Center and on the international board of the ISSA. Dr. Ross leads the Federal Information Security Management Act Implementation Project, which sounds like a big job to fulfill: developing the security standards and guidelines for the federal government. This session is going to look at the cybersecurity and supply chain landscape from a standards perspective. So, Ron and Mary Ann, thank you very much.

Thank you. I'll start, I guess. Good morning, everyone. I was listening to Dawn this morning and trying to reflect. She was talking from a technologist's perspective, and of course all of us are part of this technology explosion and revolution that we've been experiencing for the last couple of decades. I'd like to have you leave today with a couple of major points, at least from my presentation: things we've observed in cybersecurity over the last 25 years, where we are today, and where I think we might need to go in the future. And again, there is no right or wrong answer to this problem of cybersecurity. It's probably one of the most difficult and challenging sets of problems we could ever experience. In our great country, we work on what I call the essential partnership. It's a combination of government, industry, and academia all working together. And we have the greatest technology producers, not just in this country but around the world now. They're producing some fantastic things that we are all... I'll use the word addicted. I think we have an addiction to the technology. And some of the problems we're going to experience going forward in cybersecurity are not just going to be technology problems.
They're going to be cultural problems and organizational problems: how we organize ourselves, what our risk tolerance is, and how we are going to accomplish all of the critical missions and business operations that Dawn talked about this morning, and do so in a world that's fairly dangerous, where we have to protect ourselves. That is really the key issue. I think I can sum it up. I was at a movie. I don't go to movies very often anymore, but about a month ago I went to one, and I was sitting there waiting for the main feature to start while they went through all the coming attractions. Then they came on the PA and said, there's an app you can download. I'm not sure you've ever seen this before, but it tells you, for that particular movie, the optimal time to go to the restroom during the movie. Now, I bring this up because it's a metaphor for where we are today. We are consumed, and this is great: great companies out there producing great technologies, and we are buying it up faster than you can shake a stick at it. And we are developing the most complicated IT infrastructure ever. So when I look at this problem, I look at it from a scientist's point of view, an engineering point of view. And I'm saying to myself, knowing what I know about what it takes... I'm not even going to use the word secure anymore, because I don't think we can ever get there with the current complexity, but how do we build the most secure systems we can and manage risk in the world that we live in? In the Army we used to have a saying: you go to war with the Army that you have, not the Army that you want.
So when Dawn talked this morning about all the technology advances: we're going to be buying commercial stuff, and we're going to have to put it together into systems, and whether it's the Internet of Things or cyber-physical convergence, it all goes back to some fairly simple things. This slide really says it all. The Internet of Things and all this stuff that we're talking about today really gets back to computers. That's the common denominator. They're everywhere. We heard this morning about your automobile having more compute power than Apollo 11. Your toaster, your refrigerator, your building's temperature controls, industrial control systems in power plants, manufacturing plants, financial institutions: the common denominator is the computer, driven by firmware and software. And when you look at the complexity of the things we're building today, we have gone past the time when we can actually understand what we have and how to secure it. So here is one of the things we're going to do at NIST this year and beyond. You know we've been working in the FISMA world forever, it seems, and we've had a whole set of standards, and that's the theme of today: how can standards help you build a more secure enterprise? Well, the answer is we have tons of standards out there, whether it's on the federal side with SP 800-53 or the risk management framework, or all the great things going on in the standards world with The Open Group, or ISO, pick your favorite standard. The real question is how we use those standards effectively to change the current outlook and what we're experiencing today. And what we're experiencing today, because of this complexity, is that the adversary has a significant advantage.
They really can pick the time and the place and the type of attack, because the attack surface is so large. It's not just the individual products; we have many great companies, in this country and around the world, doing a lot to make those products more secure. But then we get into the engineering process and put them together into a system, and that really is an unsolved problem. We call it a composability problem. We can evaluate one product here and another one there, but what is the combination of those two when you put them together in a systems context? We haven't solved that problem yet, and it's getting more complicated every day. So, the hard problems. In the federal government we do a lot of work in continuous monitoring now: we're going around counting our boxes, patching stuff, and configuring our components. That's loosely called cyber hygiene. It's very important to be able to do all that, and to do it quickly and efficiently, to make your systems as secure as they need to be. But even the security controls in our control catalog, SP 800-53, when you get into the technical controls, I'm talking about access control mechanisms, identification, authentication, encryption, audit, those things are buried in the hardware, the software, the firmware, and the applications. Most of our federal customers can't even see them. So when I ask them, do you have all your access controls in place? they can nod their heads yes, but they can't really prove it in any meaningful way. So we have to rely on industry to make sure those mechanisms, those functions, are employed within the component products that we then put together using some engineering process.
So this is the below-the-waterline problem I talk about, and this is why I say we're in some kind of digital denial today. Below the waterline, most consumers are looking at their smartphones and their tablets and all their apps, that's why I used that movie example, and they're not really thinking about those vulnerabilities, because they can't see them. And until it affects you personally... I had to get three new credit cards last year. I shop at Home Depot, I shop at Target, and JPMorgan Chase issues our federal credit card. That's not a pain point for me, because I'm indemnified: even if there are fraudulent charges, I don't get hit for those. If your identity is stolen, that's a personal pain point. But we haven't reached that national pain point yet. We talk about all the security stuff we do, and we do a lot of it. But if you really want to effect change, there are words you're going to start to hear more of at this conference: assurance, trustworthiness, resiliency. That's the world we want to build, and we're not there today. And that really is the essence of where I'm hoping we're going to go, in these three areas: software assurance, systems security engineering, and supply chain risk management. My colleague Jon Boyens is here today, and he's the author, along with a very talented team of co-authors, of the NIST 800-161 document. That's the supply chain risk document. It's going to work hand in hand with another publication we're still working on, the 800-160 document. We're taking an ISO/IEC/IEEE standard, 15288, which is coming out with an update this year, and we're trying to infuse security into every step of the life cycle.
You see, today I think the reason we're not having a lot of success on the cybersecurity front is that security ends up appearing either too late or by the wrong people for the wrong reasons. I'll give you one example. In the federal government, we have a huge catalog of security controls. They're allocated into different baselines: low, moderate, and high. You pick a baseline, you tailor it, and you come to the system owner or the authorizing official and say, hey, these are all the controls that NIST says we have to do. The mission or business owner was never involved in that discussion. One of the things we're going to do with the new document is focus on the software and systems engineering process, from the stakeholders at the start, all the way through requirements analysis and definition, design, development, implementation, operation, and sustainment, all the way to disposal. There are critical things that happen at every one of those places in the life cycle. And the beauty of that process is that you involve the stakeholders early. So when those security controls are actually selected, they can be traced back to a specific security requirement, which is part of a larger set of requirements that support that mission or business operation. And now you have the stakeholders involved in the process. Up to this point in time, security has operated in its own vacuum. It's in the little office down the hall, and we go down there whenever there's a problem. But unless and until security gets integrated, and we disappear as our own discipline, we have to become part of the enterprise architecture, whether it's TOGAF or whatever architecture construct you're following, or the systems engineering process, or, the third one, the system development life cycle, and the last one is acquisition and procurement.
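The baseline-and-tailoring process Ron describes, pick a baseline, tailor it, and trace every selected control back to a stakeholder requirement, can be sketched in code. This is a hypothetical illustration, not NIST tooling: the control IDs follow the SP 800-53 naming style, but the baseline contents and the control-to-requirement mappings are invented for the example.

```python
# Hypothetical sketch of SP 800-53-style baseline selection and tailoring.
# Baseline contents here are illustrative, not the actual NIST allocations.
BASELINES = {
    "low":      {"AC-2", "IA-2", "AU-2"},
    "moderate": {"AC-2", "AC-6", "IA-2", "AU-2", "SC-7"},
    "high":     {"AC-2", "AC-6", "IA-2", "AU-2", "SC-7", "SC-28"},
}

def tailor(baseline, add=frozenset(), remove=frozenset()):
    """Start from a named baseline, then add or remove controls."""
    return (BASELINES[baseline] | set(add)) - set(remove)

# Traceability: each non-default control maps back to a mission or
# business requirement, so the system owner can see WHY it was selected.
TRACE = {
    "SC-28": "REQ-7: protect sensitive records at rest",
    "AC-6":  "REQ-3: least privilege for payment operators",
}

controls = tailor("moderate", add={"SC-28"})
for c in sorted(controls):
    print(c, "<-", TRACE.get(c, "baseline default"))
```

The point of the sketch is the last dictionary: once stakeholders are involved early, a control like SC-28 is in the system because REQ-7 put it there, not because a catalog said so.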
Unless we have our stakeholders at those tables to influence things, we're going to continue to deploy systems that are largely indefensible, not against all cyber attacks, but against the high-end attacks. So we have to do a better job getting to the C-suite, and I've tried to capture the five essential areas that I think this discussion has to revolve around. The acronym is TACIT, and it just happens to be a happy coincidence that it fits into an acronym. It's basically looking at the threat; how you categorize your assets with regard to criticality; how complex the system you're building is, and whether you're managing that complexity and trying to reduce it; integrating security across the entire set of business practices within the organization; and then the last component, which really ties into The Open Group and the things you're doing here with all the projects described in the first session: the trustworthiness piece. Are we building products and systems that are, number one, more penetration-resistant to cyber attacks? And number two, since we know we can't stop all attacks, because we can never reduce complexity to where we thought we could two or three decades ago, are we building the essential resiliency into the system? So even when the adversary comes through the boundary and the malware starts to work, how far does it spread, and what can it do? That's the key question. You try to limit the adversary's time on target, and that can be done very, very effectively with good architectural and good engineering solutions. That's my message for 2015 and beyond: in a lot of the things at NIST, we're going to start focusing on the architecture and the engineering, how to really affect things at the ground level.
Now, we will always have the people, the processes, and the technologies, this whole ecosystem we have to deal with, and you're always going to have to worry about your sysadmins who go bad and dump all the stuff you don't want dumped on the internet. But that's part of, again, a systems process. Processes are very important because they give us structure, they give us discipline, and they give us the ability to communicate with our partners. As I was saying to Rob Martin from MITRE, who works on a lot of important projects there, the CWEs and the CVEs, process gives you the ability to communicate a level of trustworthiness and assurance, so other people can have that dialogue. Without that, we're not going to be communicating with each other, we're not going to trust each other, and that's critical. Frameworks provide that common understanding: a common dialogue of security controls, a common process for how we build things, and what level of risk we are willing to accept in that whole process. So these slides, and they'll be available, go very briefly into the five areas. Understanding the modern threat today is critical, because even if you don't have access to classified threat data, there's a lot of great data out there, with Symantec and the Verizon report, and there's open-source threat information available. If you haven't had a chance to look at that, I know the folks who work on the high-assurance stuff in The Open Group task force look at it a lot, because they are building a capability that is intended to stop some of those types of threats. The other thing, about assets, is that we don't do a very good job of criticality analysis. In other words, most of our systems are processing, storing, and transmitting data, and we're not segregating the critical data into its own domain where necessary.
Now, I know that's hard to do sometimes, and people say, well, look, I've got to have all this stuff ready to go 24/7. But when you look at some of the really bad breaches we've had over the last several years, establishing a domain for critical data means that domain can be less complex, number one, which means you can better defend it, and then you can invest more resources into defending the things that are the most critical. I use the very simple example of a safe deposit box. I can't get all my stuff into the safe deposit box, so I have to make decisions. I put important papers in there, maybe a coin collection, whatever your things are that... I have locks on the front door of my house, but they're not strong enough to stop some of the bad guys out there. So I make those decisions: I put it in the bank, it goes in a vault. It's a pain in the butt to go down there and get the stuff out, but it gives me more assurance, greater trustworthiness. That's an example of the kind of decision we have to be able to make. Complexity is going to be very difficult to address because of our penchant for bringing in new technologies, and make no mistake about it, these are great technologies. They're compelling. They're making us more efficient, and they're allowing us to do things we never imagined, like finding out the optimal time to go to the restroom during a movie. I mean, who could have imagined we could do that a decade ago? But for every one of our customers out there, the kinds of things we're talking about fly below the radar. Because when you download 100 apps onto your smartphone, people in general, even the good folks in cybersecurity, have no idea where those apps are coming from, what their pedigree is, whether they've been tested at all, whether they've been evaluated, or whether they're running on a trusted operating system. And that's what this business is all about, ultimately. That's what 800-160 is all about.
It's about a life cycle that addresses the entire stack, from applications to middleware to operating systems to firmware to integrated circuits, to include the supply chain. The adversary is all over that stack. They have now figured out how to compromise our firmware, so we've had to come up with firmware integrity controls in our control catalog. That's the world we live in today. So, complexity. I was smiling this morning during the discussion of the DNI, the Director of National Intelligence, building their cloud, and whether that's going to go to the public cloud or not. I think Dawn's probably right: you probably won't see that going to the public cloud anytime soon. But cloud computing gives us an opportunity to manage complexity, because you can figure out what you want to send to the public cloud, and the providers do a good job, through the FedRAMP program, of deploying controls. They've got a business model that depends on protecting their customers' assets, so that's built in, and they do a lot of great things out there to protect that information. Then, for whatever stays behind in your enterprise, you can start to employ some of the architectural constructs you'll see here at this conference, some of the security engineering constructs we're going to talk about in 800-160, and you can better defend what stays behind within your organization. So cloud is a way to reduce that complexity. Enterprise architecture, TOGAF, all of those architectural approaches allow you to bring discipline and structure to thinking about what you're building, how to protect it, how much it's going to cost, and whether it's worth it. That is the essence of good security. It's not about running around with a barrel full of security controls or ISO 27000 saying, hey, you've got to do all this stuff or the sky's going to fall. Those days are over. Integration, which we talked about, is also hard.
We work in stovepipes today. Enterprise architects typically don't talk to security people. Acquisition folks, in most cases, don't talk to security people either; I see it every day. You see RFPs go out with a whole long list of requirements, and when it comes to security, they say the system or product they're buying must be FISMA compliant, because they know that's a law and they know they have to do that. But they really don't give the industry or their potential contractors any specificity about what they need to actually do to bring that product or system to the state where it needs to be. So it's all about expectations. I believe our industry, whether here or overseas, wherever these great companies operate, we can be sure of one thing: they want to please their customers. So maybe the message I'm going to send everybody is that we have to be more informed consumers. We have to ask for things that we know we need. Take the automobile. When I first started driving at 16, maybe 40 or 45 years ago, cars just had seat belts. There were no airbags, no steel-reinforced doors. Then, at some point, you could actually buy an airbag as an option. Fast-forward to today: every car has airbags, seat belts, and steel-reinforced doors. It comes as part of the basic product. We don't have to ask for it, but as consumers we know it's there, and it's important to us. I think we have to start looking at the IT business in the same way. Just as when we cross a bridge or fly in an airplane: all of you who flew here in airplanes and came across bridges had confidence in those structures. Why? Because they're built with good scientific and engineering practices. So least functionality and least privilege are foundational concepts in our world of cybersecurity.
You really can't look at a smartphone or a tablet and talk about least functionality anymore. At least if you're running that movie app, you want to have all of that capability. The last point, about trustworthiness: we have four decades of best practices in trusted systems development. Now, it failed 30 years ago, because we had the vision back then of trusted operating systems, but the technology and the development far outstripped our ability to actually achieve it. We talked about a kernel-based operating system having a few thousand lines of code and being highly trusted. Those concepts are still in place; it's just that now the operating systems are 50 million lines of code. And so it becomes increasingly difficult, and this is the key thing we as a society are going to have to figure out going forward with all this great technology: what kind of world do we want for ourselves and our grandchildren? Because all this technology, as good as it is, if we can't provide a basis of security and privacy that customers can feel comfortable with, then at some point this party is going to stop. I don't know when that time is going to come, but between what I call the national pain point and this digital denial, we'll come to a steady state. We just haven't had enough time yet to get to that balance point, but I'm sure we will. Well, I talked about the essential partnership, and my time is just about up, but I don't think we can solve any problem without a collaborative approach, and that's why I use the essential partnership: government, industry, and academia. Certainly all of the innovation, or most of it, comes from our great industry, and academia is critical, because companies like Oracle or Microsoft want to hire students who have been educated in, I call them, the STEM disciplines.
Science and engineering, whether it's double-E or computer science: they need those folks to be able to build products that have the functional capabilities and are also trusted. And government plays some role: maybe some leadership, a bully pulpit, cheerleading where we can, bringing things together. But the bottom line is that we have to work together, and I believe we will. When that happens, I think all of us will be able to sit in that movie, fire up that app about the restroom, and feel good that it's secure. So thank you very much.

I'll put it up here to make it easier for you. Folks, while Mary Ann's getting ready, just a reminder that we take our questions on cards, so please write them down. Staff will be around to collect them, and my colleague Jim will help ask them at the end of the session. Thank you.

Okay, well, I guess I'm preaching to the converted, if I can use a religious example without offending somebody. One of the questions to ask is why we even have standards in this area. Some of it is for technical reasons. Crypto, it turns out, is easy for even very smart people to get wrong, and that's an unfortunate way to find out. So there's technical correctness. Another reason is interoperability, getting things to work together. I've worked in this industry long enough to remember the first SSL implementation, woo-hoo. And it turns out 40 bits wasn't really 40 bits, because it wasn't random enough, shall we say. Trustworthiness: ISO has the Common Criteria; it's an ISO standard. When we talk about what it means to have secure software, what types of threats it addresses, how do you prove that it does what you say it does? There are standards for that, which helps everybody. It certainly helps buyers understand a little bit more about what they're getting.
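The "40 bits wasn't really 40 bits" aside is about seeding: if a key is derived from a pseudo-random generator seeded with a predictable value, the effective keyspace is the seed space, not the key length. This is a toy sketch of that failure mode, not the actual early-SSL implementation, whose details differed; the seed source and numbers are invented for illustration.

```python
import random

def make_40bit_key(seed):
    """Derive a '40-bit' key from a PRNG seeded with `seed`.
    If the seed is guessable, the key is guessable too,
    no matter how many bits long the key is."""
    rng = random.Random(seed)
    return rng.getrandbits(40)

# Suppose the implementation seeds from a low-resolution clock:
# there are only 86,400 seconds in a day, so at most 86,400
# possible keys, nowhere near 2**40.
secret_seed = 51_423  # e.g., seconds since midnight (hypothetical)
key = make_40bit_key(secret_seed)

# The attacker ignores the nominal 2**40 keyspace and simply
# brute-forces the much smaller seed space.
recovered = next(s for s in range(86_400) if make_40bit_key(s) == key)
assert recovered == secret_seed
```

Note that Python's `random` module is itself not cryptographically secure; it is used here only to demonstrate why standards insist on properly seeded, cryptographic random number generators for key material.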
And last but not least, and the reason it's in quotes, "best practices." There actually are no best practices. Why do I say that? I'm seeing furrowed brows back there. Well, first of all, lawyers don't like them in contracts, because then if you're not doing the exact thing, you get sued. There are good practices and there are worse practices. There typically isn't one thing that everyone can do exactly the same way that's going to be the best practice. That's why it's in quotation marks. And generally speaking, I do think standards can be a force for good in the universe, particularly in cybersecurity, but they're not always a force for good, depending on other factors. And what is the ecosystem? Well, we have a lot of players. We have standards makers, the people who work on them. Some are people who review things; NIST, for example, is very good, which I appreciate, about putting drafts out and saying, here it is, have at it. That's actually a very constructive dialogue, which I believe a lot of people appreciate. I know that I do. Sometimes there are mandators: you'll get an RFP that says, verily thou shalt comply with this, lest thou be an infidel in the security realm. And that can be positive. It can help push, be a leading edge of getting people to do something good that in many cases they should do anyway. There are implementers, who have to take this, decipher it, and figure out how they're going to do it. There are people who make sure that you actually did what you said you were going to do. And last but not least, there are weaponizers. What do I mean by that? We all know who they are. They're people who will try to develop a standard and then get it mandated. And it actually isn't a standard; it's something they came up with, which might be very good, but mandating it hands them regulatory capture. And we need to be aware of those people. It would be like this: I like the Oracle database, I have to say that, right?
There are a lot of other good databases out there. If I went in and said, purely objectively speaking, everybody should standardize on the Oracle database because it's the most secure: nice work if I can get it. Is that in everybody else's interest? Probably not. You get better products in something that is not a monopoly market. Competition is good. So, I have an MBA, or had one in a prior life, and in marketing class they used to talk about the four P's of marketing. I don't know what they are anymore; it's been a while. But here are the P's of a benevolent standard: problem statement, precise language, pragmatic solutions, and prescriptive minimization. And the reason I say this, and it's one of the discussions I have to have a lot of times, particularly with people in the government, and I'm not saying this in any pejorative way, so please don't take it that way, is the importance of economic analysis, because nobody can do everything. So being able to say: I can't boil the ocean, because you're going to boil everything else in it, but I can do these things. If I could do these things, where it's very clear what I'm trying to do, it's very clear what the benefit is, we've analyzed it, and it's probably something everybody can do, then we can get to better. Better is better than omnibus. Omnibus is something everybody gets thrown under if you make something too big. Sorry, I had to say that. So, problem statement. Why is this important? You'd think it was obvious, except that it isn't, because so often in the discussions I have with people I ask: tell me what problem you are worried about. What are you trying to accomplish? If you don't tell me that, then we're going to be all over the map, and you say potato and I say po-tah-to, and the chorus of that song is "let's call the whole thing off." I use supply chain as an example, because this one is all over the map. Bad quality? Well, buying a crappy product is a risk of doing business. It's not per se a supply chain risk.
I'm not saying it's not important, but it's certainly not a cyber-specific supply chain risk. Bad security? Well, that's important, but again, that's a business risk. The backdoor boogeyman? This is the popular one: how do I know you didn't put a backdoor in there? Well, you can't, actually. That's not a solvable problem. Assurance. Supply chain shutdown: I'd like to know that a critical parts supplier isn't going to go out of business. But these are all different problems. So you have to say what you are worried about, and it can't be all of the above, because almost every business has some supplier of some sort, even if it's just health care. If you're not careful how you define this, you'll end up trying to define 100% of an entity's business operations. And that's not appropriate. That is really not appropriate. Use cases are really important, because even if you have a problem statement, the next part of it should be a "for example." And I'll give you one, and this is not to ding it in any way, shape, or form; I just read it. It's the cryptographic key management system draft. The only reason I cite it as an example is that I couldn't actually find a use case in there. So, whatever its merits, I'm asking: are you trying to develop a super-secret key management system for government, for very sensitive cryptographic things you're building from scratch? Or are you trying to define a key management system that we have to use for things like TLS, or any encryption that any commercial product does? Because that's way out of scope. Without that "what are you worried about," and along with it the use cases, what's going to happen is somebody's going to cite this in an RFP. It's going to be, are you compliant with blah-de-blah, and you have no idea whether it even should apply. So that's the problem statement. It's really important, because without it, you can't have that dialogue in groups like this: well, what are we trying to accomplish?
What are we worried about? What are the worst problems to solve? Precise language is also very important. Why? Because it turns out everybody speaks a slightly different language, even if we all speak some dialect of geek. Take, for example, a vulnerability. If you say "vulnerability" to my vulnerability handling team, they think of a security vulnerability caused by a defect in software. But I have seen the term used to include, well, you didn't configure the product properly. I don't know what that is, but it's not a vulnerability, at least not to a vendor. A policy? You implemented a policy incorrectly. Well, it might lead to a vulnerability, but it isn't a vulnerability. You see where I'm going with this? If you don't define language very crisply, the same thing happens: you read something, you go off and do it, and you realize you solved the wrong problem. I'm very fortunate to have one of my colleagues here from Oracle who works on our hardware, and I also saw a presentation by people in that group at the cryptographic conference in November, where they talked about how much trouble we got into, because if you say "module" to a hardware person, it's a very different thing from what it means to somebody trying to certify it. And this was a huge problem, because, again, you say potato, I say po-tah-to. It's not the same thing to everybody. So terms need to be very precisely defined. Scope is also important. I don't know why I have to say this a lot, and I'm sure it gets kind of tiresome to the recipients: COTS isn't GOTS. Commercial software is not government software. It's actually globally developed; that's the only way you get commercial software that's feature-rich, revs frequently, and has access to global talent. It's not designed for all threat environments. I mean, it can certainly be better.
And I think most people are moving toward better software, mostly because we're getting beaten up by hackers and then by our customers, right? And it's good business. But there is no commercial market for high-assurance software or hardware. And that's really important, because there's only so much you can do to move the market. Even a standards developer, or a big customer, and the U.S. government is an important customer for a lot of people, isn't big enough to move the marketplace alone. So you're limited by the business dynamic. You can get to better, but you can't get to perfect. And here's the thing I tell people: okay, anybody here have a Volkswagen? Okay, is it an MRAP vehicle? No, it's not, is it? You bought a Volkswagen; you got a Volkswagen. You can't take a Volkswagen, drive it around the streets of Fallujah, and expect it to perform like an MRAP vehicle. And even a good system integrator cannot sprinkle pixie dust over that Volkswagen and turn it into an MRAP vehicle. Those are very different threat environments. And yet people ask, well, why do you think commercial software and hardware are any different? It's not different. It's exactly the same thing. You might have a really good Volkswagen. It's great for commuting. It is never going to perform in an IED environment. It wasn't designed for that, and there's nothing you can do to it that will make it designed to perform in that environment. Pragmatism. I really wish anybody working on any standard would do some economic analysis, because economics rules the world. Even if something is a really good idea: time, money, and people, particularly qualified security people, are constrained resources. So if you make people do something that looks good on paper but is really time-consuming, the opportunity cost is too high.
That means: what is the value of something you could do with those resources that would either cost less or deliver a higher benefit? And if you don't do that analysis, then you have people say, hey, that's a great idea. Well, that's great too, I'd like that. It's like asking your kid, do you want candy? Uh-huh. Do you want new toys? Uh-huh. Do you want more footballs? Uh-huh. Right? Instead of saying, hey, you've got 50 bucks, what are you going to do with it? And then there are unintended consequences, because if you make this too complex, you just have fewer suppliers. They'll just say, well, I'm not going to bid, because it's impossible. And I'm going to give you three examples. And again, I'm trying to be respectful here. This is not to diss anybody who worked on these. In some cases these things have been modified in subsequent revisions, which I really appreciate. But there are examples of, when you think about it, what were you asking for in the first place? So for example, and I think this has been excised from the early version of NIST IR 7622, thank you: there was a requirement that the purchaser wanted to be notified of personnel changes involving maintenance. Okay, what does that mean? I know what I think they wanted, which is, if you're outsourcing the human resources for the Defense Department and you move the whole thing to Pakistan, and I said Pakistan, obviously they would want to be notified, I get that. But that's not what it said. So I look at that and say, we have 5,000 products at least at Oracle. We have billions and billions of lines of code. Every day somebody checks out a transaction, meaning some code, and they do some work on it, and they didn't write it in the first place. So am I going to tweet all that to somebody? What's that going to do for you? Plus you have things like the German Workers' Council. Hey, we're going to tell the US government that you're going to work on this line of code? Oh, no, that's not going to happen.
So again, what was it you were worried about? Because that is not sustainable. Tweeting people 10,000 times a day with code changes is not going to solve a problem. It's just going to consume a lot of resource. Another one: DHS had this in an early version of something they were trying to do. They wanted to know, for each phase of development for each project, how many foreigners worked on it. Well, what's a foreigner? Is it a green card holder? Is it somebody who's got a dual passport? What about if they were born in... What is that going to do for you? Now again, if you had super-custom code for highly sensitive, you know, intelligence work, I can understand there might be cases in which that would matter. But general-purpose software is not one of them. And I said, I can give you that information. We're a big company, we've got lots of resource. A smaller company probably can't. And again, what will that do for you? Because I'm taking resources I could be using on something much more valuable and putting them on something really silly. Last but not least, and again, with respect, I think I know why this was in there. I think it might have been the secure engineering draft standard that you all came up with, which had many good parts to it. And I think vendors will probably understand this pretty quickly: root cause analysis. If you have a vulnerability, one of the first things you should do is root cause analysis. Okay, if you're a vendor and you have a CVSS 10 security vulnerability in a product and it's being exploited, what do you think the first thing you're going to do is? Get a patch into your customers' hands, or a workaround? Yeah, probably. That's probably going to be the number one priority. Also, yeah, root cause analysis, particularly for really nasty security bugs, is really important. CVSS 0, really, who cares? But for a 9 or 10, you should be doing that kind of analysis. I've got a better one. We have a technology called Java. Maybe you've heard of it.
We put a lot of work into fixing Java. One of the things we did is not only root cause analysis for CVSS 9 and higher, but also a briefing for my boss and every Java developer. Every Java developer had to sit through that briefing: how did this happen? And last, but not least, looking for other similar instances. Not just the root cause, how did that get in there and how do we avoid it, but where else does this problem exist? I'm not saying this to make us look good. I'm saying, for the analytics, what are you really trying to solve here? Root cause analysis is important, and so is context. Telling me I have to do it for everything is probably not the best use of a scarce resource. My last point is to minimize prescriptiveness, within limits. For example, probably some people in here know how to bake. Maybe you've made a pie. There is no one right way to bake a cherry pie. Some people go down to Ralphs and they get a frozen Marie Callender's out of the freezer and they stick it in the oven and they've got a pretty good cherry pie. Some people make everything from scratch. Some people use a prepared pie crust and, I don't know, do something special with the cherries they picked off their tree. But there is no one way to do that that is going to work for everybody. So, best practice: for some things, for example, I can say truthfully a best development practice would not be, number one, just start coding, and number two, if it compiles without too many errors on the base platform, ship it. That is not good development practice. So if you mandate too much, it will stifle innovation and it won't work for people. Plus, as I mentioned, you will have an opportunity cost of, I'm doing something that somebody says I have to do, but you know what, there is a more innovative way of doing that. We don't have a single development methodology in Oracle, mostly in part because of acquisitions. We buy a great company. We don't tell them, you know that agile thing you're doing? Forget it, you have to do waterfall.
That's not going to work very well. But there are good practices even within those different methodologies. Allowing for different hows is really important. Static analysis is one of them. I think use of static analysis is kind of industry practice now and people should be doing it. Third-party static analysis is really bad. I've been opining about this all morning. Let's just say I have a large customer, I won't name them, who used a third-party static analysis service. They broke their license agreement with us. They're getting a letter from us. They gave us a report that included vulnerabilities from one of our competitors. I don't want to know about those, right? I can't fix them. I did tell my competitor, hey, you should know this report exists, because I'm sure you want to analyze this. Here's the worst part. How many of those vulnerabilities the third party found do you think had any merit? None. Running the tool is nothing. Analyzing the results is everything. That customer and the vendor wasted the time of one of our best security leads trying to make sure there was no there there, and there wasn't. So, again, last but not least, government can use its purchasing power in a lot of very good ways, but it should realize that regulatory things are probably going to lag actual practice. You could be specifying buggy whip standards when the reality is that nobody uses buggy whips anymore. So lastly, again, it's not always about the standard, particularly if you're using resources in a less than optimal way. And actually, one of the things I like about The Open Group is that here we have actual practitioners. This is one of the best forums I've seen, because there are people who have actual subject matter expertise to bring to the table, which is so important, in saying what is going to work and be effective.
The last thing I'm going to say is a nice thank you to the people in the Trusted Technology Forum, because I appreciate the caliber of my colleagues, and also Sally Long. You know, they talk about this type of effort as herding cats, and at least for me, it's probably like herding a snarly cat. I can be very snarly, I'm sure you can't pick up on that. So I truly appreciate the professionalism and the focus, targeting a good slice of making the supply chain problem better, not boiling the ocean, but very focused and targeted and with very high caliber participation. So thank you to my colleagues and particularly thank you to Sally. And that's it. I'll turn it over to others. Thank you, Mary Ann and Ron. I won't join you on stage. I think I'll leave that for Alan. But Jim, do we have a couple of questions for our panel? We do. We have a few. So the first one, and both of you should feel free to chime in on this, is something you brought up, Dr. Ross: building security in, looking at software and systems engineering processes. How do you bring industry along in terms of commercial off-the-shelf products and services, especially when you look at things like the Internet of Things, where we've got IP interfaces grafted onto all sorts of devices? Well, I think, as Mary Ann was saying before, the strength of any standard is really its implementability out there. And when we talk about, in particular, the engineering standard, the 15288 extension, if we do that correctly, every organization out there who is already using, let's say, a security development lifecycle, might be 27034, you can pick your favorite standard, we should be able to reflect those activities in the different lanes of the 15288 processes.
So the idea is, and I think this is a very important point that I got from Mary Ann's discussion, that we have to bring people along, we have to win the hearts and minds, and be able to reflect things in a disciplined and structured process that doesn't take people off their current game. If they're doing good work, we should be able to reflect that good work and say, hey, I'm doing these activities, whether it's an SDL, and this is how it would map to those activities that we're trying to define in 15288. And that can apply to the Internet of Things. Again, it goes back to the computer, whether it's an Oracle database or a Microsoft operating system. It's all about the code and the discipline and structure of building that software and integrating it into a system. So I think this is where we can really bring together industry, academia, and government and actually do something that we all agree on. I want to offer a slightly different take on this. I know I am, what is it, a voice crying in the wilderness. My concern about the Internet of Things goes back to things I learned in business school in financial market theory, which unfortunately has been borne out, like in 2008. There are certain types of risk you can mitigate. If I cross a busy street and I'm worried about getting hit by a car, I can look both ways; I can mitigate that. You can't mitigate systemic risk. It means that you've created a fragile system. That is the problem with the Internet of Things, and that is a problem that no degree of engineering will solve. And if it's not a problem, why aren't we giving nuclear weapons IP addresses? I'm not making this up. The Air Force thought about that at one point. You're laughing. Armageddon: there's an app for that. That's the problem.
I know this is going to happen anyway, whether or not I approve of it, but I really wish that people could look at this not just in terms of how many of these devices there are and what a great opportunity, but what is the systemic risk that we're creating by doing this? My house is not connected to the Internet directly, because I do not want somebody to shut my appliances off, or shut down my refrigerator, or lock it so that I can't get into it, or use it for launching an attack. Those are the discussions we should be having, at least as much as the how do we make sure that people designing these things have a clue, as important as that is. I agree. I want to stay on track here, Jim, so I want to maybe take one more question. Okay. So there are a couple of questions related to... Well, the first one is, how do customers and practitioners value the cost of security? And then a kind of related question: what can global companies do to get C-suite attention and investment in cybersecurity? So the whole ROI value discussion. Well, I know they value it, because nobody calls me up and says, I'm bored this week, don't you have more security patches for me to apply? So, well, that's actually true. We know what it costs us to build these patches, and again, I know it's important. Frankly, for the amount of resources we spend on that, I would much rather be putting them on building something new and innovative that we could charge money for and provide more value to customers. So it is cost avoidance, number one. Number two, more people have an IT backbone, so they understand the value of having that be reliable. That's probably one of the reasons people are moving to cloud: it's hard to maintain all these systems and hard to find the right people to maintain them. But also, I do have more customers asking us now about our security practices, which is, be careful what you wish for.
I said this 10 years ago: people should be demanding to know what we're doing, and now they are, and now I spend a lot of time answering RFPs, but that's good. It means people are aware of this, and they want to know, I'm running my business on your stuff, what kind of care are you taking to make sure you're protecting it as if it were yours? Well, the ROI question is very difficult with regard to security, and I think this goes back to what I said earlier. The sooner we get security out of its stovepipe and integrate it as just part of the best practices that we do every day, whether it's in the development work at a company or whether it's in our enterprises, as part of our mainstream organizational management things like the SDLC, or if we're doing any engineering work within the organization, or if we have an internal enterprise architecture group, that integration makes security less of a hey-I'm-special thing and more just a part of the way we do business. So customers are looking for reliability and dependability. They rely on this great bed of IT products, systems, and services, and they're not always focused on the security aspects. They just want to make sure it works, and that if there is an attack and malware goes creeping through their system, they can be as protected as they need to be. And sometimes that flies way below their radar, so it's got to be a systemic process and an organizational transformation I think we have to go through, and we're not quite there just yet. Yeah, and you really do have to bake it in. I mean, I have a team of, what are we now, I've got three more headcount, so, you know, 45 people, whatever it is now. But we have about 1,600 people in development whose job it is to be security points of contact and security leads. They are the boots on the ground who implement our program.
Because I do not want to have an organization that peers over everybody's shoulder to make sure they're writing good code. Not cost-effective. Not a good way to do it. It is cultural. And one of the ways that you do that is seeding those people in the organization so they become, you know, the boots on the ground, and they have authority to do things. Because you're not going to succeed otherwise. If you can't fix the cultural problem, and going back to Java, that was on the executives, that this is a cultural thing: everybody needs to feel that he or she is personally responsible for security. Not those, you know, 10, 20, whatever, people who are there as the security weenies. It's got to be everybody. And when you can do that, you really have a sea change in how things happen. Everybody's not going to be a security expert, but everybody has some responsibility for security. I want to thank Ron and Mary Ann for setting the stage for us on this. A big round of applause for their presentation.