This session was pretty much intentionally not well publicized. My name and Dan's name and Moxie's name were not on it, partially because of Rich's name, partially because I wasn't sure who was going to be on the panel with me. Excuse me? Oh, he's on the phone. So I'll just give you a brief background of what we're going to do, and we want to involve you in it. That's why it's a call to action. And so before we get started, I just want to introduce everybody. Here's Rich Marshall. Say hello, Rich. Yeah, yeah, just say hi. All right. A couple of ground rules. I'm currently the Director of Global Cybersecurity Management for the Department of Homeland Security, but the views that you're going to hear expressed by me today are not necessarily those of the department. Because I'm big on information sharing. There's a price to pay: I'm giving you information, you're going to give us information, and we've got to trust each other. And part of that trust means protecting it. So I'm going to be speaking primarily as a private citizen, just as you're going to be speaking primarily as private citizens. This is not an advocacy piece. This is a socializing thing that we want to discuss today. And then many of you know Dan Kaminsky on my left. Hi, I'm Dan. And to his left is Moxie Marlinspike. I don't have any ground rules. I can walk up and punch him in the face and that's cool. Bring it. And then... Art. Like Rich, I've been a career civil servant for a long time, and I'm here just representing my own views. I think this is a very important topic that's being raised. I came up here because I want to encourage what the panel is going to propose. Okay. Great. So let's get started. Originally, the idea here is that I want to start off, tell you about my ideas, have a debate amongst the panelists, and then start involving you. 
And between all of us, we're going to come up with a call to action, a letter, a request for signatures, and then ultimately during October, which... how many people here knew that October is the seventh annual Cybersecurity Awareness Month? Wow. Yeah. So I figured, well, hey, October is coming up. May as well use it as a time to publicly launch this open letter. So what is this open letter? Well, at Black Hat I talked briefly about this. I had done 13 Black Hats, and I was asking the audience rhetorically, what have we actually solved as a computer security community? I mean, what problems can we just say, you know, we nailed it, it's gone? Did we ever solve dealing with war dialing and modem banks? Or did we just move on to something else? Did we ever really solve the problems with X.25, or did we just move on to TCP/IP? What did we really ever solve? So my contention was that the underpinnings of everything we do online are fundamentally email, web, name resolution, and routing. And each one pretty much has problems. So how can it be that we have a thousand vendors selling us a thousand security products, and we still can't email each other securely? We can't route securely. We can't resolve names, and we can't browse. There's something broken about that. And let's try to fix that, because I'm getting really tired of talking in circles about this stuff. What encouraged me was that after 11 years, DNSSEC has finally gotten the root signed, and for part of that I'd like to credit Dan's bug, because I think that accelerated the process by two or three years. I mean, that really moved things up quickly. So I'm thinking, okay, what else on this hit list can we move up quickly? What's within our grasp to change? And I don't know if it's in our grasp to fix SMTP. And I don't know if it's in our grasp to fix secure BGP. I mean, there are some really bright people. DHS S&T has been spending a lot of money working on this. 
They have vendor working groups. That one seems like a problem almost above the heads of average people, but it's being addressed, it's budgeted, and everybody recognizes it's a problem. It's the secure browsing portion I want to talk about today. What inspired me was, if you've noticed, browsers are sort of moving toward this HTTPS-always capability. Not necessarily by default, but the capability is starting to be baked in. You saw this in NoScript supporting STS, Strict Transport Security. You see EFF releasing a tool. The Chrome browser supports STS by default. Future browsers are going to support this HTTPS-always capability by default. It seems like the browsers are starting to move in that direction. A couple of years from now, every browser will support the capability of an always-encrypted experience online. And boy, wouldn't that be great. Now, there are some problems with this, and Moxie is probably the foremost expert on all the problems with SSL and TLS. But what I want to do is have a conversation with these guys: is this a worthy goal, to gather the energy of the community and say to companies at large, this is the direction the world is going? You encourage us to use your services and submit to you our private details, whether it's a social networking site or email or webmail or chat, yet we do not have the option to experience it all in a secure fashion. And I'm starting to say no, that's not right. That's not acceptable anymore. You used to be able to argue expense or complexity, but I'm saying over the next several years, I bet you can innovate. I bet you can find a way, just like Google is finding a way, to have an always-HTTPS experience. And so that's kind of what I want to talk about. 
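The "STS" mechanism mentioned above is, at its core, just an HTTP response header that tells a supporting browser to use HTTPS for every future request to that domain. A minimal sketch of building it (the helper function and its parameters are mine, purely for illustration; the header name and directives come from the Strict Transport Security specification):

```python
# Sketch: constructing the Strict-Transport-Security response header.
# A server sending this header instructs a supporting browser to rewrite all
# future http:// requests for the domain to https:// for max-age seconds.

def sts_header(max_age_seconds: int, include_subdomains: bool = True) -> tuple[str, str]:
    """Build the (name, value) pair for a Strict-Transport-Security header."""
    value = f"max-age={max_age_seconds}"
    if include_subdomains:
        value += "; includeSubDomains"
    return ("Strict-Transport-Security", value)

# A one-year policy that also covers subdomains:
name, value = sts_header(31536000)
print(f"{name}: {value}")
```

In practice, a site operator only has to add this one header to every HTTPS response; the browser does the rest of the enforcement.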
So, is that something you guys might be interested in listening to and helping out with? Yeah. Okay, well, let's start off the conversation. So you might be sort of surprised that somebody like Rich, somebody from DHS, or in his prior life at NSA, might have such strong privacy feelings. But, you know, it took hardly anything to convince Rich that this was something he would want to be involved in. So why don't we start with Rich? And anybody can speak up at any time. The goal is that at the end, we will have a consensus on whether this is worth pursuing. And if it is, come October, how do we present it? Initially, my idea is we'll come up with a problem statement, then we'll come up with our proposed call to action, what we would view as success. Would it be that two years from now the top five internet sites are all SSL? We'll come up with some metric. And then we'll take that to experts in the community, we'll try to gather as many signatures as we can, we'll go to the broad community, get you guys to do maybe an online petition, and then we'll either take out ads in newspapers or do something high profile: editorials in big newspapers, whatever. But we'll really try to swing public consensus toward the idea that there really aren't any more roadblocks. The browsers are moving in this direction, the technology exists, so now is the time to do it, and not wait around for another five years. You know, 11 years to get secure DNS... I don't want to wait 11 years. So I want to get this rolling. Right? Right. Let me share with you some observations at kind of a high level, and then we'll work down. Probably my greatest claim to fame is having been the legal architect for Eligible Receiver 97. If you're not familiar with Eligible Receiver 97, Google it, but I'll give you the capsule summary. 
It was an information warfare exercise that demonstrated to the United States leadership at the highest levels that computer security needed to be on the forefront, that information warfare was real. And that was under President Clinton. So we started getting high-level interest in this delicate subject way back in 1997. And as my colleague Jeff has said so many times, we just have not made the appropriate amount of progress. I think we're starting to see that: 72 bills in Congress associated with cybersecurity. That's a big focus. Whereas last year, there were zero. That's correct. Right. Zero to 72 in one year. So momentum is building very quickly, and if we don't get involved, it's going to zoom right past us. And I don't want to have any of this stuff mandated. I'd rather have it be voluntary adoption. The Senate Majority Leader, Harry Reid, has encouraged his colleagues to consolidate those 72 bills early and come up with one that the administration can support. So the point is, there is interest at the top, both in the administration and in Congress, to change the environment. Now, as taxpayers, as users of the system, we've got three choices: we can make things happen, we can watch things happen, or we can wonder what happened. We've been wondering too long. It's time to start shaping our environment. We need to shape our computer ecosystem. Part of the responsibilities that I have in my current job, which I've had for six months, one week, one day, and six hours, is secure computing. Now, there's been a tremendous amount of effort in that. SAFECode has been a very strong advocate of it. But in my personal opinion, there's been too much "let's have 10 meetings and talk to the converted." We need to start nailing some skulls to the wall. That's a crude expression for: let's get some results. Let's get people doing secure coding, not just the major vendors, but downstream as well. The second area of concern is supply chain risk management. 
That has been ignored far too long. Thomas Friedman, in his book The World Is Flat, talked about the producers and vendors in over a dozen countries that put together his laptop computer. Tremendous risk exposure for someone to put in an undocumented feature with a phone-home capability that we may or may not be interested in. Education is another big thing that I push. A big push on STEM: science, technology, engineering, and math. I'm going to put a C in front of that, for cyber, because we're just not doing enough there. The American public spends more money on astrology than the American government spends on basic research and R&D for computer security. We need to change that. Now, I'm not necessarily endorsing any of the techniques that my technical colleagues are going to talk about, but I'm happy to be up here on this panel because we need to start effecting change in a positive way, making the ecosystem and the computer world better. And I'm convinced we can do it. Any thoughts, Dan? So if you want to improve the nature of at least computer security, three things have to happen. First, you've got to find a problem. It has to be something that is actually wrong that needs to be remediated. Second, you have to actually get a fixed build, meaning the bits actually have to be written, the code has to exist. But you're not done yet. The third thing is deployment. Whatever it was that was wrong, wherever the potential fix could be applied, the fix actually needs to get out the door. This full life cycle, from bug to repair, is the challenge, because it's very easy to just find the bug. It's very easy to just get the fix and say, well, you know, we'll leave the rest as an exercise to the reader. Leaving the rest as an exercise has left a lot of work undone. We can do better. Part of doing better is figuring out what it actually means to get stuff deployed, what it actually means to make things available and accessible in the larger market. 
At the end of the day, there are a lot of people who do not care about security in the sense that they don't want to do any work for it, but they still want to be safe, they still want to be secure. And you know what? That's okay. I want to take my key and put it into my car's ignition and go. I don't want to have to care about anything happening under the hood. Some of you in the audience might be car people. I'm not one of you. I just want it to work. And how do we make stuff so... well, in this context, how is your browsing experience made fundamentally more secure than it is now? Or more private? Why do I have to type in HTTP versus HTTPS or whatever? Why doesn't it just default to the secure option? Precisely. And I think the ultimate reason why is because so few people have deployed entirely secure sites that if you're a browser manufacturer, you say, well, look, only a tiny fraction of the web is fully secure. Okay, I'll support Gmail just fine, but there's the rest of the web as well. So there are interesting things you can do when you work 0.1% of the time, when you work 1%, when you work 10%, when you work 50%, when you work 90%. We have to exert force so that the secure solution is available and functional and easy more and more of the time. And that's work that I think the next few years are going to involve. So we don't get too distracted: the focus of this call to action is an HTTPS experience for browsing, to try to solve one of the four things I talked about, browsing the web in a more secure fashion. And I believe we're going to get there. I would just like to get there three years faster, five years faster. And the way I would envision it is that at some point we'll look back on this, five years from now, ten years from now, and we'll say, oh yeah, remember back when everything was unsecured? And then this will sound really silly, but right now it doesn't. Remember when everyone used Telnet? Yeah, FTP. 
And then SSH happened. Yeah. I mean, it's not like things can't get better. Things have. Right. Yeah, and the concept of default settings is a concept that works. Several years ago, the Air Force mandated, when it purchased computers from major vendors, that there be a series of default settings before the computers even went into the boxes. That freed the system administrators from a lot of the extra duty of configuring each particular computer. So the concept of default settings works. The other point I want to make that's related is that we need to influence the outcome. It's not just the legislation that some people are concerned about. Banks, I read today, are starting to shift the burden of liability for online banking. All of us are aware that in the past they have absorbed the losses that were caused by inappropriate practices, either by the bank or by the end user. Now they are going to change that and shift the liability to you. So if money is lost because you were not using secure practices, because your computer hygiene was not correct, because you as part of the ecosystem did not act as a responsible entity, notwithstanding the fact that the bank may not be acting as a responsible corporate citizen, because they're trying to make money by pushing it your way. You've got to help influence the outcome so that we're not stuck. So that leaves us then, without going down a rat hole before we open it up to the public, with the technology of it. And we're pretty much banking on, I mean, you've got SSL, and when we say SSL we really mean TLS, because there aren't really any other alternatives. And that means it's kind of voting for the lesser evil, right? And this is where Moxie comes in, because there's not a whole lot of goodness out there in the protocol, right? Yeah, so I would say that putting SSL everywhere is certainly the low-hanging fruit. There's no reason not to do this now. You know, RC4 is on the order of 26 instructions per byte. That's the cost. 
I mean, that's nothing compared to rendering out your Rails web app or whatever it is that people are doing now. So certainly this should be technically easy, and it should be everywhere, and there's no reason not to have it everywhere. But my concern is that that's not enough. What I don't want to do is go down this road where we say, oh well, we want SSL everywhere, and then people actually finally implement these things, and then we think, oh well, now the war is won. Because it's not. SSL itself is a pretty shaky protocol, and a lot of it requires us to trust people that we should not be trusting. Again and again, the people who ask for our trust in the certificate system have proven that we should not be giving it to them. And right now we don't have any other options, but I think it's also worth trying to come up with technical solutions to cut those people out of the picture. So while it definitely should be easy to put this everywhere, we can't stop there. We need to continue to think about what we want out of privacy, right? And these concepts of privacy extend beyond the network layer. It's very easy for Google to say, oh well, now we have SSL search, and so now we're helping you with your privacy. And that's simply not true. The majority of the privacy concerns that I think we should justifiably continue to have with companies, concerns we will probably always be in continual tension with them over as we try to maintain the privacy of our data, are certainly not going to be solved simply by putting SSL everywhere. So I think, while we should definitely go down this road, that's not the end of the road, and this is a struggle that I think will continue far beyond that. 
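To put Moxie's "26 instructions per byte" figure for RC4 in perspective, here is a back-of-envelope sketch. Every number besides the 26 is an illustrative assumption of mine (a 2 GHz core retiring one instruction per cycle, a 100 KB page), not something from the discussion:

```python
# Rough CPU cost of RC4-encrypting one page, using Moxie's ~26 instructions/byte.
# The CPU model below is deliberately simplistic and assumed for illustration.

instructions_per_byte = 26                # Moxie's figure for RC4
page_bytes = 100 * 1024                   # a 100 KB page (assumed)
instructions_per_second = 2_000_000_000   # 2 GHz, 1 instruction/cycle (assumed)

total_instructions = instructions_per_byte * page_bytes
seconds = total_instructions / instructions_per_second
print(f"~{seconds * 1000:.2f} ms of CPU per 100 KB page")  # ~1.33 ms
```

Even with these crude assumptions, the bulk cipher costs on the order of a millisecond per page, which is the point being made: the symmetric encryption is not where the real cost of SSL lives.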
But you can't really skip too much. If this is the beginning of the path, you can't jump this step, I don't think. I think you have to progress through this phase first. Sure, but as you progress through the phase, you know, be conscious of exactly where it's taking us. Because I share a lot of your concerns about the certificate authorities. Like, how many secondary certs does, say, Network Solutions sign that nobody knows about? How many of these secondaries are there? So, yeah, I mean, recently there was an issue where there was a root certificate in the trusted root store of Firefox and no one knew where it came from. It said RSA on it, so people went and asked RSA, and they were like, no, this isn't ours. And people were like, oh wow, there's some rogue certificate in there. And then finally RSA came back and was like, oh, no, that one is ours. We just forgot. Whoops. So, you know, if people can't even keep track of which certificates are theirs, that doesn't necessarily inspire confidence in the long term. So, one of my thoughts is, as adoption rates increase, it will be an opportunity for companies to innovate and come up with solutions, whether easier ways to deploy this, lower CPU load, lower cost, whatever it happens to be. I mean, there will definitely be some challenges that need to be solved. I view those as opportunities. But yeah, I don't like being beholden to certificate authorities. I'm liking Perspectives or something, but there's got to be something, yeah. Okay. So, that's it. That's all I have to say. Yeah. That's my naysaying. Yeah. So, anybody in the audience? We've got open mics right over there. If you have anything to say, step on up. And, uh... 
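The rogue-root episode described above ultimately comes down to knowing exactly which certificate you were handed. A small sketch of retrieving a server's certificate and computing its SHA-256 fingerprint, the raw material for that kind of auditing (the function names and example hostname are mine, not from the discussion; the network fetch of course requires connectivity):

```python
# Sketch: fetch the certificate a server presents and fingerprint it, so it can
# be compared against what you, or a CA, believe that server should be using.
import hashlib
import socket
import ssl

def fingerprint(der: bytes) -> str:
    """SHA-256 fingerprint (hex) of a DER-encoded certificate."""
    return hashlib.sha256(der).hexdigest()

def fetch_cert_der(host: str, port: int = 443) -> bytes:
    """Connect over TLS and return the server's leaf certificate in DER form."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.getpeercert(binary_form=True)

# e.g. fingerprint(fetch_cert_der("www.example.com"))
```

Tools in this spirit are what let the community notice when a certificate in a trusted root store doesn't match anything its supposed owner remembers issuing.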
Well, Jeff, while we're waiting for the first question: one of the pushbacks that we had discussed earlier is that part of the reluctance on the part of industry might be the increased costs, which of course they would pass on to consumers, standard practice. But I always find it interesting that there's always enough money to correct a problem after it's happened, but people are reluctant to spend money to prevent or preclude a problem from happening. So I think long term it's going to be a smart process. Right. I don't want this to be a doomsday scenario. This has got to be about: the sooner you implement this, the sooner you can enable other types of behavior on the internet. You know, if this were already the state of the world when the Iranian green movement was happening, it would maybe be a lot harder for the government to monitor who their friends are, who they're tweeting at, who their social networking friends are, and to come back and potentially persecute people because they're holding a different political view. But that's not the state Iran, or the rest of the world, is in right now. The rest of the world is sitting captive to whoever wants to sniff us upstream, and that's not a world I like being in. So, isn't a lot of this about what's an acceptable norm, both for the vendors and for the users, and what are we going to accept and not accept as a community? Well, so, I guess. But the technical focus I want to work on is just that the capability for always-SSL is coming. Do we accelerate that, or do we just bumble through it? And I think I accept your premise. I would love to have an informed debate about, you know, as a society, what is the tension between privacy and security, how much SSL, so on and so forth. But that's a larger debate. Right. So let's just narrow it down to this: the technology is coming. Is it good enough, or how good is it? 
And what are our trade-offs when we use it or don't use it? Right. So, say we use it. I accept your premise. It's going to be there. People are going to use it. Is it really good enough? And if it isn't, does everybody understand what risk they're accepting? Right. Or not. And that's Moxie's point, right? It's not very good, but it can be improved, potentially. But it's not like it's a very rich ecosystem. Your option is SSL or TLS. Which one do you prefer? You know? So, let's move to the first question or observation. Hi. My name is Michael X-ray, and I actually work in the financial sector. And always-SSL is an awesome idea. However, that only protects the end-to-end communication. What I'm seeing is, I think maybe we're missing the boat. I think it has to start with awareness. Part of my job, when a business customer loses money to Moscow or wherever, is to actually look at their system and find out where the compromise is. Now, the compromise hasn't been at our bank as long as I've worked there. It's very easy to find a compromise on a small business system or a consumer system. And I'm wondering if the first point of attack shouldn't be awareness at the consumer level. I mean, we have TV commercials: don't drink and drive, click your seatbelt, don't throw your soda cups out the window. And we even had a little bit on identity theft that really didn't say much. So I'm wondering if the first step in this process is to educate the American consumer. And I'm going to agree with you that education pays off the best in the long run, right? A well-informed and educated populace is the best defense against a lot of things. 
But back to my original four points. You know, as technical people, if we're in a position to help accelerate the ability to compute, to browse securely, we should. Besides these conferences trying to raise awareness, I'm not really in a position to effect wide-scale education. Rich is in a much better place for that. But we are in a position to try to further secure browsing. So that's why I've narrowed the scope: small, bite-sized chunks where I can try to move the needle. I don't know how I'm going to move the needle on worldwide awareness of safe computing practices, because we've been trying that for 20 years, right? But maybe we can move the needle on this. Mike, I think you're in an excellent position to make some change, to help influence an outcome. I am aggressively an online banker, but because I'm so paranoid, I check my balance every single day. Not everyone does that, but it's a good practice. And when my home bank's web page comes up, it alerts me to a phishing attack but doesn't give the details. Now, that's alerting me, but that's not educating me. That's not even training me. So why don't you get the banking industry to be more aggressive in explaining what a phishing attack is on the web page, and also to start commercializing this by making commercials: television, radio, the whole bit. People are going to listen to banks when they talk about the metric that you love, which is called money. Especially when it's their money. And actually, that's something that we've started to do, because this is a passion of mine. But, yeah, I would like to see a coordinated approach. I think more of us need to get involved in that process. So, thank you very much. Thank you. So we've got about 15 minutes for questions. So, I've noticed that things get fixed when bad things happen. When Dan breaks DNS, things get fixed. 
When Google says, you know, geez, we just got hacked by a bunch of Chinese people. What do you know? The date they announced that is the date they enabled HTTPS by default on Gmail. Yeah, things happen. And so the naive approach says let's make more bad things happen. But we don't need to do that, because bad things are happening. So what we really need to be doing is making sure that people actually talk about the bad things that are happening. The big thing about the Google story wasn't that it happened; it was that they talked about it. Right. And that's a big deal. And I think their talking out loud, same as with Dan on the DNS bug, accelerated things a number of years. I think Google speaking out loud accelerated things and brought awareness to a much higher level. Because for the last decade, people have been getting hacked. Nothing changed. The only thing that was different was that it was Google. There are all these cases where things are getting hacked and no one hears about it. Such-and-such company has been compromised, but no one talks about it, and they can't, for obvious reasons. But I mean, there are plenty of corporations that could be talking, and they aren't. And that hinders our ability to make change. So that was the comment. I also have a question, if you don't mind me barreling through. Before you go into your question, I'd just like to say we are an industry that, for better or worse, operates an incredible amount on anecdote. On stories told over beers that we can't tell anyone. And it's great in that we sort of get an idea of what's going on out there. I'm sure half this audience is like, oh, such-and-such has been hacked, I heard it from this guy. Our data is terrible. We make awful scientists. And if what it takes to fix the data problem is mandatory disclosure of events, that might be something to consider. Well, that'll be next year. But that already happens. 
There are the breach notification laws, and actually Adam Shostack has done a lot of good work taking all of the breach notification data. Now, when credit card processors have data breaches, they're required by law to publish something that says: we were breached, this is how many records we lost. And Adam Shostack has done a pretty good job of becoming a scientist and looking at that data to figure out, well, what is actually happening? How are people being compromised? Okay, and your question? Yeah, I'll try to be brief. This is an idea that's been around for a while, but I haven't seen it implemented and I don't know why. The bad part about TLS that we all seem to dislike is the certificate model and the PKI. So if we dislike the certificate model and the PKI, but we like, say, the SSH key model, which seems to work fairly well... Basically, the fundamental thing is, if I give my data to someone, I trust them with the data. So I should be remembering their certificate. If someone else comes in with a different certificate signed by a different authority, I still don't trust them. And if we did it that way, then that would solve a lot of the problems. It would solve the problem of rogue CAs, to some extent. It wouldn't help you with the initial bootstrapping, but the initial bootstrapping would use the existing model, and then for continued interaction with the site you would use the SSH model, which gives you continued strength beyond what we have now. So the model we have now would continue to be used only for the initial acceptance. So why don't we do this? So, I'm a former SSH developer, and let me walk through this. Very quickly: every time there's a host-key mismatch in SSH, the user is asked to please type yes to trust this new key, or please go into your known_hosts file and delete that value. And every last time, they do it, because it's always the fault of a server misconfiguration. The SSH model is cool, and it doesn't scale. 
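The scheme the questioner describes can be sketched in a few lines. This is a toy illustration of the general trust-on-first-use idea, not anyone's actual implementation, and the hard part is exactly the mismatch case the former SSH developer points to:

```python
# Toy sketch of "trust on first use" (TOFU) applied to certificate fingerprints,
# the same model SSH uses for host keys. Store and fingerprints are illustrative.

known_hosts: dict[str, str] = {}  # host -> pinned certificate fingerprint

def check_tofu(host: str, fingerprint: str) -> str:
    """First sighting: pin it. Same fingerprint later: ok. Different: alarm."""
    pinned = known_hosts.get(host)
    if pinned is None:
        known_hosts[host] = fingerprint   # trust on first use
        return "pinned"
    if pinned == fingerprint:
        return "ok"
    # Could be an attack, or, as the panelist notes, just a reconfigured server.
    return "MISMATCH"

print(check_tofu("mail.example.com", "ab:cd"))  # pinned
print(check_tofu("mail.example.com", "ab:cd"))  # ok
print(check_tofu("mail.example.com", "ff:00"))  # MISMATCH
```

The whole debate hinges on the last branch: a real deployment has to decide whether a mismatch means an attacker or a legitimately re-keyed server, and users reliably click through the warning either way.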
And I would just add that what you're talking about is called trust on first use, or TOFU, and there's a project that I'm involved in called Perspectives that tries to leverage that in a way that's less confusing than the pure SSH model. I think it's a really great project, and you should check it out if you're interested in alternatives to the CA system. Perspectives. Is Perspectives the one that was taking a look at the host from a bunch of different vantage points? That's the one. That's the one. Okay. Hi, my name's John, and I've been doing security research for about 30 years, back to the ARPANET days when I was at Xerox PARC. So I have a long history with this stuff, and my perspective is that things are getting out of control and you guys are setting the bar too low. I really agree with the value of doing this. I think it's kind of a no-brainer. There's value to it. I support it completely, but I think you're setting the bar too low. If you're going to get attention, and I want to support Moxie's position here, you need to state how out of control things really are, and that's the education that's needed right now. This is important, but it's just the first step of a long road. And let me tell you, there's a lot of good computer science that's been done. Computer scientists know how to build secure programming environments. Okay? At PARC, they built great ones. Strong typing, garbage collection, you name it. I built bulletproof stuff when I wrote code at Xerox, and I'll tell you, we had bulletproof code. Nobody could hack it. We knew how to do this years ago. Where's the computer science? Where are the people going back to that science and supporting it and getting it rolled out? We know how to do this as computer scientists, at least that part. And look at web application threats. I mean, how is TLS going to solve that huge problem? It's just totally out of control. 
Okay, so the takeaway, though, is: don't aim too low with this. My name's Luke. I work at a non-profit attached to a university. I don't have any big ideas, but I just kind of wanted to bring up a small technical thing to keep in the back of our minds: so much is done on phones these days rather than PCs. I think that one of the big blind spots or bumps in the road that we'll encounter is the fact that most people never update the browser on their phone. They just get a new browser when they get a new phone, and if they're the type of user that keeps their phone for three years, they might have a browser with poor capabilities that might break on websites. Well, all the mobile browsers I've experienced recently support not only SSL but compression, and they have pretty complete experiences. There's a larger point, which is absolutely accurate, which is: why are we ever asking users about patching? They don't care. Give them a system that works. Every user interface that has ever been designed around patching, from Xbox to Windows Update to Adobe Update to Winamp Update, they're all bugs. They shouldn't exist. It should just work. And if we're going to get HTTPS everywhere, there are going to need to be some updates to some browsers, and the browsers should just be updating. Just like they update their safe browsing lists to find out the new sites they shouldn't go to, they should be getting the latest code, and it should just happen. Right. And we're not proposing that this be turned on tomorrow. And this is coming to mobile browsers just as it's coming to Chrome and Firefox and everything else. So when it gets there, we want everything to be ready. We don't want any more delay. Yeah. Thanks. I'm going to go back to work and brag to people that Kaminsky took my side when I was arguing with Dark Tangent. What's up now? I'm too short for this mic. Hi. My name is Megan. 
This isn't exactly a technical question. It's more of an opinion question. So you on the left obviously have understood the power that a good legal background or legal system can have in regards to computer security; sometimes it just has to go to court. What if computer security made the change not starting from the code, but by holding programmers liable for holes in their code so big that they cause people to be victimized? Like, when a contractor builds a house and something is dangerous and it kills someone, they go to jail. What if the same concept were applied to programmers?

A bunch of code wouldn't ship till 2023. Well, yeah, and we've been building houses for hundreds or thousands of years, so that model is fairly well understood. Yeah, and I know just because I'm a lawyer you're going to think I'm going to agree with you, and I am in favor of the Lawyers Full Employment Act, but that's exactly what it would turn out to be: the lawsuits would grind down and halt technological advances. See, okay. And then the software you buy would come from China. Sure.

Okay. Well, what about masks? Like, I'm not talking about banks being held responsible for phishing attacks. I'm talking about irresponsible coding, like insecure code that causes people to be actively victimized. But 2023. But see, the problem is that we don't know of a vulnerability today, and tomorrow I'm going to go into a talk and I'm going to learn about it. And then the poor programmer who developed it yesterday is now liable. It's hard to get my head around that. Well, and a lot of the poor coding that you're talking about is also known as malware, and it's doing what it's designed to do, which is, you know, to create chaos.

Okay. Maybe a better way would be to identify it as it comes across the line. I'm going to use the phrase deep packet inspection without fully understanding every aspect of it.
But maybe the internet service providers can, and I know I'm going to upset two of my very good friends in that particular business, but maybe they could be a little more aggressive in making sure that there are good computer hygiene practices. I think we've got the technology readily available so that when my computer talks to your computer, even over the internet, there should be an exchange of data to confirm that your system is updated just as my system is updated, and we can have safe correspondence, you know, safe practices. The technology is there. It just takes the public will to implement it. Okay. Thank you.

And one comment I have is that I feel like people have tried this project, right? Most notably the OpenBSD and OpenSSH projects, where, you know, people sat down and said, you know what? We're going to do it right. We're going to write secure software. And they tried really hard, and there were still bugs. They'd been working on it for a long time, for years. And so my takeaway from that is that there are always going to be bugs. Software is always going to be insecure, and we need to just start with that acknowledgement and then deal with it from there. That to me is a fundamental problem. How can you tell deliberate from a bug, in a legal sense? I mean, I'm not trying to say, like, every little bug, but, like, massively. If you get my friend Rich down there, who's the lawyer, he's going to want to know that. Yeah. But let's move on. Let's move on. We've got only five or six minutes left, so.

Hi, I'm Brian. I work in higher ed. I think your scope of what you're trying to do is a little small because, you know, in a lot of products that we deploy, I think the problem is more what we accept from commercial vendors today. I think there need to be some mandates that the software people develop and sell be secure from the ground up. It's really just not.
You ask a vendor today, when you're buying a product, if they have any guide to harden the SSL server or the IIS behind their product, whatever they've come up with, and they just look at you. I think the task at hand has to be more about creating a set of mandates for commercial providers and developers. I wouldn't disagree with you, but that's kind of outside of our capabilities right now. But I mean, we have somebody here from, I believe it was the NSA, right? But what I mean is, I don't want to, yes, we could do that. We could have another call to action for that, and better education, and STEM, I mean, STEM education. Why not? Fix that too. It's not just an education thing. I mean, the SSL is good. The authentication is weak. We have all sorts of tools to exploit poor authentication. So, I mean, isn't it that somebody has to get after these people who are giving all these things to us? Shouldn't that be the task at hand rather than just worrying about SSL?

Well, so you're saying, why don't we have, like, a housekeeping seal of approval and code quality audit? No, but I mean, we can look at all sorts of the different products that people use. All right, why don't you try this approach? Take one that's good, other than SSL. Take this approach. It's called contracting. Build into your contract that whoever sells you a particular product, software or hardware or whatever, builds it to your specifications, that it reaches a certain security standard that you judge. You're gonna pay more for that. Are you willing to pay more for that security? Yes. All right, then do it. But how about a general mandate that it all has to be encrypted? If you come out with a product today, it has to be encrypted as a foundation.
Not just... Right, but see, that's a more general statement, and we're in a situation where there's a more specific solution, something we can actually act on, as opposed to another broad "everything must be encrypted" that we talk about for the next three years. You know, this is actually something that's within reach. And I just want us as an industry to grasp something and solve it and then move on. What would it take to get the government to mandate that if you come out with a commercial protocol that you sell, that protocol be encrypted? Then everybody runs free Linux and they never have to mandate that and they never have to deal with it. But the people who use free Linux are more likely to do it than anybody else.

Look, this is the bottom line. Yeah, yeah, go ahead. Look, we have a lot of major websites on the internet today. People use them every day. They transmit lots of personal information on them, lots of private information. And you know what? Unless you're a nerd, you think when you go to that site, that's the site, and it's totally okay to give it all the information. The only people seeing it is just the site, and everything's fine. Well, you know what? Everything's not fine. When you go to most sites on the internet today, you hope you're speaking with them. You think you're speaking with them. The technology exists to know that you're speaking with them. And the browsers are really meeting us halfway to make it so you can know. But there's a gap. And the gap is deployment on the server side. If I want a site that I can access SSL-only, I've got Gmail, I've got PayPal. Okay, let's start talking about the other 99.9% of sites that are out there. And that's kind of what we're talking about here. Why aren't we seeing SSL adoption on a much wider scale? I got you, but my SSL works today. It's the piece behind it that I'm concerned about.
After I provide that information through the secure transport, what happens to it on the back end? Okay, well, we're almost out of time. I'd like some other people's questions, but I'd be happy to talk with you afterward. Great, thank you.

Hello, I'm Sam, I work with a healthcare provider. And my fundamental question is, from a consumer standpoint, what if certain companies that made steps in the right direction were called out, like, this company has just implemented this, et cetera, et cetera? Exactly. And I mean, I tell the people I know, I tell my wife, my family, my family's friends that, hey, this company just has this, you need to use this provider. And that's the example of using society to apply pressure. Maybe not through legislation, but through social pressure to do the right thing. I mean, if people in the know, people with the skill set that can identify when companies do good, when they invest their resources to give something to the consumer from a security standpoint, if that's highlighted, if that's brought out by people in the know: hey, this company just did this, and their competitors have not. So it's a double-edged sword, rewarding the companies that are moving and contrasting that with the companies that aren't. And how we spread that, well, it's our responsibility to spread that, because we're the only ones who can do it. But that's basically my overall comment, because a company's gonna do something when it's profitable. And if it isn't profitable, the government can force them, and then there's a risk associated with it. The incentive model, initially I would propose, would be peer pressure, and if that fails, there are 79,000 bills in front of Congress that you might try to leverage, you know. But what if there's, like, a voluntary organization, a Consumer Reports or something, where we as a community get together and start doing that? Thank you. Thank you.
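Moxie's point earlier about the gap being server-side deployment is, at its core, a configuration problem: a site that already has a certificate just needs to redirect plain HTTP to HTTPS and tell browsers to stay there. A minimal sketch, assuming a WSGI-style environ dict; the function name and responses are illustrative, not any particular site's setup:

```python
def enforce_https(environ: dict):
    """Redirect plain-HTTP requests to HTTPS, and tell browsers
    (via a Strict-Transport-Security header) to use HTTPS for the
    next year. Returns (status, headers, body)."""
    host = environ.get("HTTP_HOST", "")
    path = environ.get("PATH_INFO", "/")
    if environ.get("wsgi.url_scheme") != "https":
        # Permanent redirect to the same resource over TLS
        return ("301 Moved Permanently",
                [("Location", "https://" + host + path)], b"")
    # Already on TLS: pin future visits to HTTPS as well
    return ("200 OK",
            [("Strict-Transport-Security", "max-age=31536000")],
            b"hello over TLS")
```

The same two steps (a blanket redirect plus an HSTS-style header) can be expressed in a couple of lines of most web server configurations, which is why the panel treats server-side adoption as within reach.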
My name's Mike. I wanna piggyback off of what he just said, but I wanna take it a step further. There is no incentive for anybody to do any of this. We have things like TJX, one of the largest data breaches in the world. What happened to TJX? They're still just as profitable today as they were yesterday. Why? Because the consumer is not gonna be held accountable for their mistakes. You look at the Electronic Fund Transfer Act. If I write my PIN on the back of my card and I lose my card, I'm not responsible for it, the bank is. So part of it has to be education. We have to get out to the communities and we have to work with them.

Well, I think part of what Dan, well, I'll let Dan say this, but ideally the consumers would almost be oblivious. The server operators would implement this, the browsers would support it, and you would just go off and have a safe browsing experience. But you'd have to have the companies back that. So, for instance, let's just say a financial institution. They have to support that SSL. Why are they gonna go out? I mean, I see companies every day. You guys mentioned FTP, Telnet. I see companies every single day that are still using those protocols. And you talk to them and ask them, well, why are you still doing that? Well, because the consumers have not upgraded their systems to support the new technology. Right, right.

Okay, I'm sorry, I'm gonna have to cut you off, we're out of time, but we'll be around. I'd love to talk to you, and we're gonna be posting more about this. And I'd really like your feedback, because if we as a group think this is worth doing, we're doing this. And from the feedback I've gotten so far, we should think larger, don't aim too low, as the gentleman said, and see what kind of response we get from the larger community. So I wanna thank the panelists, Moxie and Dan and Rich and Art. That's right, I keep thinking, I'm like, that's not Art Bell. Not yet. Yeah, all right, so thank you very much.
Thanks for listening to this.