Good morning, good afternoon, good evening, everybody. It is a civic duty that if you see something, you say something. That is the underlying principle behind vulnerability disclosure, which I will talk about today. Give me a second here as I turn on my slide deck. As I said, I'm here to talk about vulnerability disclosure for digital systems in vital use by society, such as voting machines.

Vulnerability disclosure is the neighborhood watch of cybersecurity. If you see something, say something. When we surface the bad news to everyone, we can take corrective action. Vulnerability disclosure is the bad news that turns into good news. Weaknesses you don't know of, you cannot fix. When you know of them, you fix them and turn them into good news. There are two critical legal aspects of vulnerability disclosure: number one, the right for anyone to look for vulnerabilities in systems; and number two, the right to share the findings, not just with the owner of the system, but with the public.

My name is Mårten Mickos. I'm the CEO of HackerOne, representing three-quarters of a million ethical hackers all over the world. We're here today to talk about voting and voting machines. If you look at the word vote, which comes to us from Latin, it is the decision-making atom of nations and societies and has been so for thousands of years. We've built our countries and our nations on voting systems. Voting is a fundamental piece of democracy. And throughout history, we've made good use of the latest innovations, such as paper, to build mechanisms and systems for voting. But tragically, today we have a world with digital capabilities beyond our wildest dreams, yet we have been unable to construct reliable and trustworthy digital voting systems. What's wrong here? How could we allow this to happen? And what is the fix? There's a logical conclusion that has been missing.
If you believe in democracy with one person, one vote, and if you believe in freedom of speech, as the US does by means of its First Amendment, there's just one logical conclusion for voting machines and voting technology: you must believe in vulnerability disclosure, meaning the practice of inviting external, unbiased people to test the security and validity of the system. If you don't believe in this form of external and public scrutiny, you do not believe in democracy or freedom of speech.

Let us think about how computer systems come to be. Somebody designs a system, the design is approved, the product is developed and built, and it is tested. To make something truly secure and worthy of everyone's trust, we must design security in from the start. Security must be there from the get-go, from the first blueprint. If you bolt it on afterwards as an afterthought, it will never work. Then, secondly, once the system is ostensibly ready for public use, you must turn around and let it undergo external, unbiased and unlimited vetting. Now an outside-in view is needed. Journalists don't see their own typos. Bookkeepers can't see their own accounting mistakes. So we need proofreaders and we need auditors. These people come in with a fresh, objective mind and no bias of ownership. They spot the flaws quickly, and the flaws can be fixed.

It's the same with software. We must subject software to external scrutiny by people we do not personally know. Ethical hackers, white hats, security researchers: they go by many names, but they are the same thing, masters at detecting flaws in systems built by others. Hackers are the best mechanism for finding vulnerabilities so they can be fixed. In essence, hackers are the immune system of the internet. They are a vaccine. They can think like an adversary, but they act for your benefit.
Too often, however, this simple principle of security designed from the inside out and tested from the outside in is not followed. We have voting machine bugs that have been known for several election cycles and are still there, unfixed, out in the wild, ready to be exploited by people who do not believe in democracy. We must fix this simple but fatal problem. Governments should mandate vulnerability disclosure for every manufacturer and vendor of technology used for voting or any other vital societal function. The decision is difficult, and yet it's easy, and we should make it.

The principles I'm talking about here are not new. Over a hundred years ago, the Dutch cryptographer Auguste Kerckhoffs postulated that when you're building a system intended to be safe and secure, and perhaps to protect secrets, the mechanism of the machine need not be kept secret. On the contrary, if you make the blueprint open for anybody to test, study and vet, you have a better chance of making a secure system than if you don't. It's only the keys that need to be secret, not the mechanism of the system. This is an important principle: openness of design leads to better security. Much later, Claude Shannon formulated a corollary of Kerckhoffs's principle. He said you must assume that your enemy will learn how your system was built. Share the design with everybody, and in that way you share it also with the good people who will help fix it and improve it. And by the way, there are many, many more good people than bad people in the world, maybe at a ratio of 1,000 to one.

So to repeat and summarize: if we have a commitment to govern our nation so that every person has one vote, it logically follows that we must let each such person conduct their own validation of the system of voting being used. It's a very simple principle, and it is very powerful. Public scrutiny makes every system better.
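Kerckhoffs's principle is visible in every modern standard cryptographic design. Here is a minimal sketch in Python, using only the standard library; the ballot record and key names are my own hypothetical illustration, not anything from an actual voting system. The HMAC-SHA256 mechanism is completely public and open to anyone's scrutiny, yet the integrity of the message rests entirely on the secret key:

```python
import hashlib
import hmac
import secrets

# Kerckhoffs's principle: the mechanism (HMAC-SHA256) is public;
# only the key needs to be secret.
key = secrets.token_bytes(32)            # the one and only secret
ballot = b"precinct=12;candidate=A"      # hypothetical record to protect

# Anyone can read exactly how this tag is computed -- open design.
tag = hmac.new(key, ballot, hashlib.sha256).digest()

# Verification re-runs the same public algorithm with the same key.
assert hmac.compare_digest(tag, hmac.new(key, ballot, hashlib.sha256).digest())

# An adversary who has learned the full mechanism, per Shannon's
# corollary, but lacks the key still cannot forge a valid tag.
forged = hmac.new(b"guessed key", b"precinct=12;candidate=B", hashlib.sha256).digest()
assert not hmac.compare_digest(tag, forged)
```

Publishing the blueprint costs nothing here; everything an attacker could gain from studying the code, the defenders and external researchers gain too.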
The only reason to object to this principle would be extreme greed or disdain for democracy. If you look at what's going on in government and industry at large, things are in much better shape. The National Institute of Standards and Technology (NIST) has long published the Cybersecurity Framework, with very good advice and recommendations for large organizations. And they define what vulnerability disclosure means. They say you need to have processes established to receive, analyze and respond to vulnerabilities disclosed to the organization from internal and external sources, such as internal testing, security bulletins, or security researchers. When you follow the Cybersecurity Framework from NIST, your organization will be in much better shape.

The Department of Defense fully embraces this. The Pentagon, by definition, has some of the most sensitive, secure, secretive, vital systems in the world. And they have realized that to be fully secure, they need the help of outside hackers, whom they don't even individually know, to report vulnerabilities so that the DoD can fix them. They have been running a vulnerability disclosure program for over four years now, with amazing, fantastic results.

The Department of Homeland Security, DHS, finds vulnerability disclosure so important that they are preparing a binding operational directive about it. Under this directive, civilian federal agencies will be required to invite good hackers to hack them. It's the only logical thing to do in a democracy, and it will improve the cybersecurity of all systems and increase trust among citizens. And hot off the press, just in the last few days, the CISA agency within DHS delivered a terrific guide for election administrators, called the Guide to Vulnerability Reporting for America's Election Administrators.
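NIST's wording about vulnerability disclosure describes a simple loop: receive, analyze, respond. As a thought experiment, that loop can be sketched in a few lines of Python. This is my own illustrative skeleton, with hypothetical names throughout, not code from NIST, CISA, or any real program:

```python
from dataclasses import dataclass
from enum import Enum

class Status(Enum):
    RECEIVED = "received"
    TRIAGED = "triaged"
    FIXED = "fixed"

@dataclass
class Report:
    # sources per NIST: internal testing, security bulletins, researchers
    reporter: str
    summary: str
    severity: str = "unknown"
    status: Status = Status.RECEIVED

class DisclosureProgram:
    """Hypothetical sketch of NIST's receive/analyze/respond loop."""

    def __init__(self) -> None:
        self.reports: list[Report] = []

    def receive(self, reporter: str, summary: str) -> Report:
        report = Report(reporter, summary)
        self.reports.append(report)
        return report

    def analyze(self, report: Report, severity: str) -> None:
        report.severity = severity
        report.status = Status.TRIAGED

    def respond(self, report: Report) -> None:
        # in practice: fix the flaw, notify the reporter, publish an advisory
        report.status = Status.FIXED

program = DisclosureProgram()
r = program.receive("external researcher", "ballot upload lacks authentication")
program.analyze(r, "critical")
program.respond(r)
print(r.status)  # Status.FIXED
```

The point of the sketch is that the hard part is not the machinery; it is the organizational commitment to run every externally reported flaw through all three steps.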
And they say in their foreword that election administrators should know that the cybersecurity research community can help ensure these systems are safe, so that the choices of the voting public can be clearly heard. The guide offers step-by-step instructions for election administrators who seek to establish a successful vulnerability disclosure program.

Looking more broadly and into the future, I see three principles that will enable us to establish a true digital civilization, one that actually works: a society that is well governed by digital mechanisms and that citizens can trust. First, it will be seen as negligence to ignore the useful input of external security researchers. Yes, negligence, meaning you would be foolish not to listen to the amazing advice and input of the vulnerability reports you can get from the outside. Second, cybersecurity will be a collaborative effort, with organizations pooling their defenses to present formidable obstacles to adversaries. We have learned over thousands of years that whenever we face an asymmetric threat, the best defense is pooled defense. So we must do it. Third, the only way for any organization to earn the trust of customers, consumers, or citizens is through transparency and openness. By sharing vulnerability information and cybersecurity policies and practices publicly and with each other, trust will grow.

In summary, we have a system here for vulnerability disclosure where if you see something, you say something. It has been rejected and avoided by many vendors, but it is the only practical thing to do to increase the security of any computer system. Two legal rights are important here: the right of ethical hackers to do the testing, and their right to disclose their findings. We know what to do. We should have done it a long time ago. We're tragically delayed. We just have to make the decision and get going. We must listen to hackers.
We must work collaboratively on the defenses, and we must build openness into cybersecurity, sharing what we find, sharing the vulnerabilities and our fixes, because that is the only way to build trust with our citizens and with our constituent groups. And when we do that, we will truly build a society that can function well in the digital world. Thank you for listening to me today.