All right. Good evening, everyone. Thanks for coming in today. I realize it's a weekday, in the middle of the Dasara holidays. So you're not on holiday, you're here, and I think that's fantastic. I want to start off by introducing this event, introducing Mozilla and our speakers. How many of you in the room know what Mozilla is? What do they do? They make Firefox, the browser. But if you think of Mozilla as just being the Firefox company, that would be underplaying Mozilla quite a bit. And I mean quite a bit.

So I'm going to take the liberty of introducing where Mozilla comes from. It started off as an idea in the late 90s, in 1998, as a grand experiment in what we now know today as open source, which was a completely new idea back then; the term did not exist before 1998. There used to be free software and commercial software. Open source came in as a new way of thinking about software: the idea that a company could open up its most important intellectual property to the public and somehow become better as a business, not worse. So Netscape took the first big gamble, opening up their web browser, the Netscape browser, back in 1998. They started what was then called the Mozilla organization, which was just a subsidiary inside Netscape Corporation. Netscape was acquired by AOL, and in 2003 it became an independent organization called the Mozilla Foundation. Eventually, the Mozilla Corporation was established: a for-profit subsidiary of a non-profit organization, which is extremely unusual. Most of the time you hear of it the other way around; there are for-profit companies with a non-profit subsidiary, which is what they use for their corporate social responsibility. But here it's the other way around: the for-profit is a subsidiary of the non-profit, and the non-profit is the CSR. Which essentially means that the entire point of the corporation is to support the public interest represented by the foundation.
That's extremely unusual. You're going to see very few cases of this happening anywhere in the world, and I think we need to listen to Mozilla a lot more just because of this bold experiment in how they've structured themselves. For the past two decades, Mozilla has been a champion of an open internet in the public interest, whether that means open standards, an open browser, or, as is currently becoming more and more important, the idea of privacy on the internet. For too long, we have dismissed privacy as a fringe idea. But several events around the world have boiled over to the point where we have to say: no, privacy has to be first and foremost in how you think about working with your data, how you build your organization, and how you build your data models.

This also ties in with what I've been working on at Hasgeek, which is my organization. I assume some of you are familiar with it; some of you may not be. To give you a brief background: about a decade ago, I used to work at a company that did government work. One of the projects we took on back then was biometric identity for ration cards in Karnataka. I worked partly on that project, and partly on delivery of government services in rural areas. One of the things I kept encountering while working in this space was that the programmers I worked with, whether they were inside a corporation or inside the government, had a fairly poor grounding in the humanities. You're writing code that determines someone's right to life, because you're controlling whether they're going to be given food or not today. And if you're doing that, do you understand how important it is that your code works? And if it doesn't work, who are you going to pass the blame to? So that was one aspect: this missing sense of the humanities in understanding the importance of your own work.
And second, there was also a missing understanding of the power of your work. If you're going to write code that determines whether somebody gets food or not, then are you not playing God? And if you're playing God, do you realize that you're playing God, and that you need to be responsible for what you're doing? I found this severely lacking. What I realized was that a lot of it came down to the fact that many of us technologists simply did not think we were important, because that's what we were told. We were told that we were doing a job: you get paid to write code, so you write the code you've been instructed to write, and then you go home. What happens afterwards is not your problem. This thinking had to change, and it bothered me a lot.

So one of the things I started doing back then, along with my partner, Zainab, was getting programmers to simply talk about what they do and get feedback from outside their own organizations. Because inside your organization, you're stuck in a corporate hierarchy that determines what you're allowed to do and what you're not, and what kind of feedback you're allowed to take and incorporate into your work. We figured that if you break organizational boundaries and get people across organizations to discuss their work, it would help them build a little more confidence in what they're doing. Over a period of time, you first become a little more assured about what you're doing, and you learn what is OK for you to do and what is not. Then you can start crossing domains and looking at things in the humanities, including issues like privacy. Because most of us in tech don't really understand privacy. We think of privacy as saying: I will not share your data, even though I have it. But that's not privacy.
Because you're playing God once again: when you say "I will not do it," you assume a position of importance, and that is a position you can fall from. One of my pet peeves about this concerns a fairly large project in India, which is Aadhaar. If you read the Aadhaar Act, it is extremely clear that they collect as little data as possible. You're allowed to collect biometrics, plus extremely little demographic data: the person's name, their gender, and their address, and that's it. You're not allowed to collect anything else. You look at this and say, this sounds like lean data; you're collecting as little as possible. But then you made a mistake. When you link something with Aadhaar, you're putting the Aadhaar number in some other database, and now that database has a lot more information about the individual. You assume that whoever is responsible for that database is going to be responsible with it, and is not going to share it. But that's not how society works. There are multiple ways in which data can be extracted from an organization, including coercion, the power of a court, and malpractice. And then you get a data leak that you believed was not possible, because you're not collecting the data in the first place, but which you in fact made extremely possible through a mistaken assumption about what privacy is.

So this is something we started working on: to say, look, it's not enough to just be a good technologist. We need to go further, to understanding what role our work plays in society, what external forces act on the way we operate as an organization, and therefore what the appropriate way to run an organization is. For this, I'd like to introduce three speakers from Mozilla today: Stan Leung, Dr. Rebecca Weiss, and Mika Shah. Stan will have his computer back up in a moment.