Hi. As I told him, nobody knows what it's supposed to be, so there's no ground truth for him to worry about on how to pronounce my last name. I'm from Turkey; that's why I have that hard-to-pronounce last name. Turkish is not related to the Indo-European languages. I started programming pretty much as a kid. What happened was, through sheer luck, I bought myself a computer before there were a lot of personal computers around, and I got to experience that early, almost magical stage when things were a lot less complicated than they are now. The way I learned programming: with the computer I had, which was a ZX Spectrum, there were no ready-made programs available to me. What we had were magazines, and I would type up the programs from them, because that's all I knew to do. And all the programs had bugs; none of them worked. They were supposed to be games, but none of them would run. So I would figure out what the bugs were. And I soon realized that finding the bug was a lot more fun than whatever the game was supposed to be, and I was hooked. So I just started teaching myself what I could while I was still a teenager.

Then, because of my family circumstances, I had to start working pretty much as soon as I started college, in the very first semester. And I'd been thinking: what do I do? What kind of a job will I have? I had thought about that for a long time in high school too, and I was interested in a lot of things. I was interested in physics; I was interested in a lot of math-related things. But I thought, you know what, physics has all these ethical problems — nuclear weapons and all of that. So I said, why don't I pick a profession that doesn't have a lot of ethical issues to deal with, where I'll just be able to program and not have anything to worry about?
So that's how I got into computers, which tells you that I probably don't have the best foresight on where the ethical issues are going to be, because I don't think there's a profession that is more at the center of so many thorny ethical issues, in terms of where things are and where things are going. And it's our generation — by which I mean the people designing and thinking about computing as an infrastructure at this moment — that is going through it and making the decisions.

I got to live through that early stage, and I think it really exemplifies some of what we mean when we say open and free software. It was the era when you could just email someone and tell them, "I don't know you, I have some questions," and there was a good likelihood they would reply, even if they were some big-name professor somewhere and you were nobody, a kid from Istanbul. It was really amazing. And just within my lifetime, I have seen computing go from that kind of freewheeling early stage to what it is now, with an increasingly enormous amount of centralization and asymmetric power transfer. If anything, computing and computers are becoming one of the tools for concentrating power and disempowering people in many ways. That's the part I think about a lot: from those very early days, when it was just like magic to me, to nowadays, when I'm worrying about all the things we see.

So here, at a Linux conference, at the Open Source Summit, I wanted to really think about the philosophy of open. You know how, when you said free and open, you would try to explain to people: free, but not as in beer — as in free speech, right? That kind of free. I think sometimes it's really good to think about what we mean by open, what the philosophical underpinnings of that idea are, and, at this historical juncture, how we keep the philosophy and how we think about what it could be.
Because there are a lot of technologies that start out decentralized and open, and then over time they become closed; they become instruments of power, instruments of war, things that are not what they started as. One very obvious example is radio. What we think of as ham radio is what radio originally was: an interactive, global, interesting, freewheeling thing. And within the space of a generation it became first an instrument of war, then massively centralized, and finally this one-way product that just spoke at people and advertised to them. So how do we think about that kind of historical moment?

When I think about what makes open source products work — you're supposed to have triads, but I have four: open, reliable, secure, and usable. I know there's been a lot of discussion about the reliable part, and I know security is a huge topic that's come up; I heard it in the keynotes yesterday, so I'm not going to talk about it. Usability has been discussed forever in this community, so that's not the one I'm going to talk about either. I want to go back to this: what do I mean by open, and open as in what kind of open? With open source, the first thing you think is, well, it's what it says, right? The source is open; you can read it, you can see it. But increasingly, where computing is going, access to the source code — even when it is possible — is not sufficient to provide what open was supposed to provide philosophically, which was access, and a way not to concentrate power but to distribute it and decentralize it.
And a lot of the cutting-edge stuff — we're talking in San Francisco — is where the money is. When you look at where the venture capitalists are going, where the businesses are investing, and which companies they're buying, it's very much focused on artificial intelligence and machine learning, and on various classes of machine learning systems like deep learning. And these are systems that are by nature, almost by definition, closed, even if you have access to the source code.

So I want to throw this out as a challenge — I know this is not something you usually say at a Linux event — a challenge to think about how we keep computing open, the way it was originally envisioned in the open source movement, in a world in which access to the source code, even if you have it right in front of you, does not give you the transparency, the decentralization, and the understanding that open was supposed to involve.

A lot of you know how this works, but in general, what's increasingly happening is that we have programs that learn: programs that function not the way I used to type in programs, and not the way most of us have programmed, but by chewing through large amounts of data, building layers and layers of neural networks, and then churning out classifications. Just as you can't take a cross-section of my brain right now — which I hope nobody would, but if you did, you couldn't say exactly what I was thinking; consciousness emanates from my brain and my mind, but it's not something you can read off the organization of my neurons — computing is increasingly becoming something similar. How do we deal with that challenge, and how do we think about open in a world like this?
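To make that point concrete, here is a minimal sketch (my illustration, not something from the talk) of why open source alone no longer guarantees understanding. The code below is fully open and only a few lines long, yet its behavior lives entirely in the numeric weights; a human can read every value and still not say *why* an input is classified one way rather than another. The weights here are made up, standing in for what training would produce — a real system would have millions of them, none individually chosen by a person.

```python
import math

# Hypothetical weights, as if produced by training on data.
# No individual value was chosen by a human, and none is meaningful alone.
W1 = [[0.9, -1.2], [-0.7, 1.1]]
B1 = [0.1, -0.3]
W2 = [1.5, -1.4]
B2 = 0.05

def classify(x):
    """Tiny two-layer neural network: maps a 2-feature input to a score in (0, 1)."""
    # Hidden layer: weighted sums passed through ReLU.
    hidden = [max(0.0, sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(W1, B1)]
    # Output layer: weighted sum passed through a sigmoid.
    score = sum(w * h for w, h in zip(W2, hidden)) + B2
    return 1.0 / (1.0 + math.exp(-score))

# Every line of source is visible, but the decision boundary is encoded
# only in W1/B1/W2/B2 — "open" source, opaque behavior.
print(classify([1.0, 0.0]))  # ≈ 0.82
print(classify([0.0, 1.0]))  # ≈ 0.26
```

Scale this up by six or seven orders of magnitude and you have the situation described above: publishing the source, and even the weights, of such a system does not by itself deliver the interpretability that "open" was meant to provide.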
And the other thing is that, because of this, the other parts of my four — reliable, secure, and usable — have increasingly become vectors that promote centralization. I deal with a lot of people who are not programmers, ordinary people, and for them things that are pretty basic usability questions are big, insurmountable challenges. I know this has been discussed a lot in this community: how do we make our products more usable? But what has increasingly happened is that users gravitate toward the programs where large numbers of people are working on the usability side, and which are therefore easier to use. Security is also such a big issue: without large security teams, it's increasingly hard to keep software secure. And unfortunately, as we keep saying again and again, a huge amount of the infrastructure of the internet is open source software maintained by a few volunteers here and there, or small groups, or sometimes a few foundations, and we've seen this lead to security mishaps again and again — Heartbleed was one example, and we've recently seen some of the Apache issues come up with the Equifax security breach. So usability, reliability, and security have tipped the balance toward centralized platforms, and users are going to them. That's one way in which the philosophy of open is being challenged too.

So let me state the challenge. I think computing is closing to interpretation, in the sense in which our generation exercised and understood it. That didn't mean you could just look at some large code base and understand exactly what was going on, but it was theoretically possible: if you worked on a program and dove into it, you could understand what was going on, and if there was a problem, you could understand what it was doing and what it was not doing. The new wave that I think is coming everywhere is not this model. It is
a data-surveillance model that churns through that data and comes up with opaque, non-transparent, uninterpretable machine learning systems that make decisions. It's closing off decentralization. It's becoming, I fear, a little bit like the radio situation. Yes, there's open source software running large parts of the internet — it runs large amounts of the internet's infrastructure — but for ordinary people, the usability and security issues have made them move away from it, so it's becoming more and more centralized. And computing is becoming increasingly asymmetric with regard to how it distributes power. When I started in this world, it was a way in which I felt empowered: I got to talk to people around the world, I got to build stuff, which was amazingly fun, I got to build stuff that worked; I could do things. And it has empowered a lot of us, even in our current generation. But right now, if you look at the way computing is exercised and the way it's functioning in the world, it is increasingly a way in which people are surveilled, controlled, and replaced; their jobs are threatened; their operating systems spy on them; they don't understand what's going on; there's great control. So how do we deal with this, and what are the resources, and what is the philosophy, to try to make this not inevitable?
I want to give two examples, just from this week — the kind of research I'm seeing; I pulled two things from literally this week. One was a paper that was using machine learning to identify protesters whose faces were covered by scarves or caps or glasses, just churning through the machine learning. It was an early paper, and it may not work that well yet. But today was Apple day, and there were a lot of announcements about how good Apple's Face ID is — and Apple is pretty good with its phones and security — so we now have computing technologies that can identify your face. Apple claims the error rate is as low as one in a million; those are very low rates. I don't know if it will hold up, but that's the direction. So what does it mean when computing, instead of being the thing that empowered me in my youth, is now the thing that's going to pick up who I am and everything I'm doing, even if I have my face covered with a scarf?

There was another paper this week in which researchers were using facial recognition tools to predict sexual orientation. I don't even know where to begin with this — we've gone back to some form of phrenology. These things are being done by academics who are probably just demonstrating what's possible, but it almost doesn't matter, because even if academics didn't do this, these are the technologies that authoritarians are going to be using at mass scale. And these are just from this week.

So when we think about a future for the open — open as in human-controlled, open as in human-interpretable, and more importantly open as in human-centered — where is that movement? Just as in the early days of the internet there was a movement to say, you know what, we're not going to turn this into yet another television — and although I don't think we've completely succeeded, it's really not just another television; it's a very complicated technology with still a lot of
openness and still a lot of dynamism. How do we create a movement that tries to reclaim that kind of open, where we don't surrender to these overlords, which is how most people are increasingly experiencing computers? How do we keep interpretability and understanding in the way we do our programming and the way we build our computing machines? And more importantly, what are the tools we need to develop to empower people, so that computing is not this asymmetric thing that just surveils and controls them? I think the source code being open is obviously not going to be enough for this. We're going to need platforms, mechanisms, and ways of interacting with this new technology that provide the other parts of the triad — the reliability, usability, and security we've struggled with for so long, in the open source community at least — but that also put forward a philosophically different way of interacting with computing power. By that I mean interacting with the large platforms, with nation-states that are using these technologies to surveil, and with corporations that are using these technologies to understand and manipulate.

At the moment, a lot of this is coming in the form of ad technology — ad technology, for example, that recognizes the emotions in your face to figure out how to sell you things better. Now, that doesn't sound great, but it doesn't sound as bad as it can get; people kind of roll their eyes — we've always had ads — and that's kind of true. But the infrastructure we're building here, this surveillance infrastructure we're building mostly for ads but also for data collection, is the infrastructure of authoritarianism. There are no two ways about it: the infrastructure we're building is the infrastructure of authoritarianism. And ironically, it's coming from a technology whose originators, and a lot of its early practitioners, and to this day this enormous community,
are philosophically not just not on that side of things but absolutely and completely opposed to it. And yet, step by step, for the past ten years, every year I see it move further in this direction: less controlled by us, less interpretable by us, serving those in power more and serving us less.

So this is the challenge. I obviously don't have answers; if I did, I would say, here, let's do this and let's do that, and we'd be done. But we need new and different things that reach a mass base. And obviously, doing something like that is going to require resources; it's going to require a lot of volunteer energy; it's going to require a lot of refusing to work in some of the ways in which our computing infrastructure is becoming an infrastructure of authoritarianism and centralized control. Maybe we can also think about what technologies people need in their everyday lives, and whether there is a way to give them reliable, secure, usable versions of those technologies that do not also rob them of power, that do not also surveil them, that do not also create this world in which they're objects to be manipulated. And can we figure out how to deal with the onslaught of artificial intelligence, especially in the form of machine learning — this black box that is being imposed on people? I think there's great room for that, because when I spoke about this stuff maybe five or six years ago, people in the technical world would understand, but ordinary people would say: what are you talking about?
This is great, I've got my phone, I can talk to everyone. And it was true — it really is great; they loved it, and that's the paradox. I would have a hard time talking about these fears even then, when you could already see the centralization, you could see these trends. Nowadays, I think there are a large number of people who have felt what we're talking about directly, in the ordinary world outside the tech community.

So my plea is this. I think this is a historical turning point — and I don't want to sound like I'm just hyping something, but I really think we have maybe five, maybe ten years, something on that order, before we close this window. Just as radio went from something interactive to a one-way corporate broadcast mechanism, it's quite plausible that the digital world and information technology will become something like that, in a way that is very hard to come back from in our generation, and we'll be telling stories of how it used to be. But we're not there yet. There's time to do things; there's time to really change things; and we're right at that moment. There's sufficient awareness of these issues, and there's a large community of people in the technology world who are bothered by them and could take things another way. So that's my plea, and that's where I'm going to end. Any questions or comments? Again, obviously I don't have an answer to this, but I think it's a big challenge facing everyone concerned about the technologies we're building and their effects in the world. Thank you.