It is both a pleasure and an honour for me to call Oskar to the stage. He is a web developer, typographer, and of course also an antifascist. So, a warm welcome for The Technical is Political: Society and Resistance. Mic? Hello? Ah, better. Okay, cool. Welcome, my name is Oskar. My pronouns are he, him, his. If you want to address me after the talk, please do so. I will give you a talk called The Technical is Political. It revolves around three topics, which are not strictly separated from each other. It's obviously about technology, but not about code. I will not show a single line of code; I will really talk more about technology, about its use and its misuse in the society we live in. Now, this society has evolved and changed very much through technology, and I want to show some of the ways it did. And lastly, I will talk about resistance, because I personally think that some of the things technology did to our society weren't necessarily positive. Okay, some last remarks before I start, and for these I put the PSA dolphin on the screen. First: everything I will say has been said before, and I'm not the one who said it first. It was mostly queer people, or women, or people of color, sometimes all of the above, who said it. And I don't ask you to believe me, but I would ask you to believe them. Secondly, this talk is very much an introduction. I will cover a broad range of topics, but I will not go into depth on all of them, so there will be questions left unanswered. Please keep that in mind. And lastly, if you want to see the slides, you can find them at r.ovl.design/cw-ccc-19, or by scanning the QR code in the top right corner. Or, after my talk, I will tweet them at @_ovlb. Yeah, these are, I think, the opening remarks. So, let's go. I want to tell you a story, a story that has some facts, some opinions, and some swearing, because I'm kind of angry about some of the things I will talk about.
It's a story about connections and connectivity. And as we live in the 21st century, I would be remiss not to at least briefly talk about the internet. I would like to start with a bit of an overview of the internet. Spoiler: there is no cloud, there are just other people's computers. So, like most great things, the internet started from very humble beginnings. In December 1969, about 50 years ago, the predecessor of the internet, the ARPANET, which stands for Advanced Research Projects Agency Network, consisted of four universities connected to each other. In October 1969, the first message was sent on the ARPANET, and it was "lo". The connection dropped in the middle of the message, which was sent from the University of California in Los Angeles to Stanford. It probably was supposed to be "login", but time can't tell us anymore. For us as technologists, this is kind of a relief, because even the ARPANET started with a bug, and every bug we write has a historical predecessor. Now, from then on, things evolved quite a lot. This is a schema of the internet from around 1997, from the Federal Office for Information Security. It's a bit oldish, and a bit funny maybe, but it still roughly holds up, even though we have no ISDN anymore. Roughly, the structure is the same. We have internet service providers, and we have content providers, people who host websites. We have the domain name system on the right side there. And then we have clients. And they had the foresight to draw the internet as a cloud, even though I think in 1997 this wasn't really a word yet. But it's only a word; it's no infrastructural reality. The infrastructural reality looks very much different. Today we have managed to connect more than 50%, almost 60%, of the world's population to the internet, which is kind of amazing, given the size of the Earth. But this also means that around 40% are not connected to the internet.
And how they will be connected to the internet determines very much how they will experience the internet. We probably have this promise in mind, of the internet that was going to "give voice to the voiceless, visibility to the invisible, and power to the powerless". This is a quote from an article by Mike Monteiro, a designer and writer who writes very much about technology, power dynamics, and how we have to fight. I will quote him more often as we go on. So this is kind of the promise that we maybe grew up with, that we experienced when we got connected to the internet. Now, a quick reality check. This is the actual central infrastructure of the internet: the deep-sea cables running across the oceans. This is a projection of how it will look by 2021. You can see basically every continent is connected by many strands. This is a graphic from The New York Times; they ran an article explaining the technology behind it. And I forgot one thing: on the page with the slides, there's also an exhaustive list of resources. Basically everything I say has a kind of going-deeper article. I will also show the link at the end again. So, if you're interested in something I mentioned, this page will help you out. This article, for example, is linked there. But this graphic also shows yellow strands. The yellow strands are cables that are owned, or will be owned, by Amazon, Facebook, Google, or Microsoft. So they are owned by private companies. And as we see the breadth and width of the internet, and how many people are connected to it, it feels like a public good, yet more and more of its central infrastructure is privately owned. I think there's something clashing there. And it is okay as long as these companies don't press their interests, but I think at one point they will. And besides the privately owned infrastructure, there's also another thing.
More and more states are trying to centralize their internet infrastructure into central access points. The Mozilla Internet Health Report 2019 reported that there were 188 shutdowns in 2018. A shutdown is not a router failing or something; a shutdown is a region or a complete country being disconnected from the internet, basically not able to use it anymore. For example, largely unnoticed in western media, the longest shutdown of a complete region is currently going on in Kashmir, in India. It has been offline for four months. This is the longest internet shutdown ever in a democracy. There have been longer shutdowns of single services, social media for example, but the whole of Kashmir is currently not connected to the internet anymore. And this is just one of many cases in 2019, also currently going on. But even if people are connected to the internet, their connection might look like this: they sometimes only experience Facebook. There was a survey by a researcher called Helani Galpaya in Indonesia in 2015, where she found the staggering result that more people said they were using Facebook than said they were using the internet. At first she thought there had to be something wrong with the numbers, because obviously Facebook runs on the internet. But they are so entrenched in the Facebook ecosystem, basically just using the Facebook app, that they think Facebook is the thing. There is no internet, there is just Facebook. Now, if you are working for Facebook, you can call this a success. I think it's dangerous as fuck, because we know how magnificently Facebook has managed to destroy basically everything they touched. Okay, so this was the basic connection part. The thing with Facebook centralising access to the internet also shows something else, and I want to be a bit utopian in the next section. It's called the web we lost.
"The Web We Lost" is a phrase I kind of stole from Anil Dash, an activist and entrepreneur working on building meaningful technology. He wrote a piece with this title in 2012, and he gave examples like Technorati, which some of you might remember, and which had by then already been offline for years. So this web we lost might be no more than a distant memory for some of us; some maybe never experienced it. He starts the article like this: "The tech industry and its press have treated the rise of billion-scale social networks and ubiquitous smartphone apps as an unadulterated win for regular people." He goes on to say that they seldom talk about what we have lost along the way in this transition. I want to talk about some of the things we lost. And I want to start with a maybe unlikely candidate. This is Tom from MySpace, whom some of you might remember, because he was the first top friend we got on MySpace and stayed our top friend until the end, when MySpace luckily lost our data. An unlikely candidate, because MySpace feels like a centralised platform; we had just MySpace and our MySpace profile. But MySpace enabled something. Like this. Which isn't exactly the pinnacle of web design. But, and this is vitally important, this is personal. This is something someone built. Someone sat there and added HTML and CSS to their MySpace profile and made it their own. They didn't necessarily own the infrastructure, but they owned the design. It was their personal MySpace page. No other page looked like this, for better or worse. My pages probably looked even worse. But it enabled us to learn CSS, to learn HTML, to get a kind of entry point into working with technology. To quote Mike Monteiro again, from the article I mentioned: at the beginning of the World Wide Web, we put our stories and songs and messages and artwork where the world could find them. For a while, it was beautiful, it was messy, and it was punk as fuck. And how punk it has been.
This is a screenshot of a GeoCities page. GeoCities was another hosting platform where you could put your kind-of-important things and work with the web. The page reads, in large Comic Sans letters: "If you study the material on this website, you will hopefully understand what our purpose here on Earth has been." "This page is intended to be useful" is written in smaller letters at the top; I don't know if you are able to read it. Also: "Welcome home." I feel very much at home, and I believe someone had the time of their life building this website. Really, there must have been a ton of fun happening there. And then some usefulness, maybe; I don't know, because I can't look it up. Today, we are more or less stuck with this, or most people are: the standard Facebook news feed. There is basically nothing we can do about it. We can post links there. We can't make it beautiful. And now, of course I'm a designer, so I have talked a bit about design and tried to illustrate this with design, but it's not just design that we lost. To quote Anil Dash again, we have "lost key features that we used to rely on, and worse, we've abandoned core values that used to be fundamental to the web world". He goes on in this article, naming examples where big technology companies cooperated to reach a common goal, which basically does not happen anymore under surveillance capitalism. Twitter builds Twitter, and builds proprietary data formats that you have to embed into your web page for Twitter, and Facebook does the same for Facebook. There's no collaboration anymore; basically every platform tries to win. So we have lost the access to make stuff our own. We have lost the sense of openness, and we have lost the sense of collaboration. And above all, we have lost the ownership of our data and our content, by mostly pushing it into one of the platforms and not building our own infrastructures.
So I would go so far as to say that we kind of lost our voice. But still, the web can be a plethora, an intermingling of all kinds of awesome things. There are still very awesome websites out there. Some hold all of human knowledge. Some help out other people. So there is still awesomeness on the web. And the web can be a plethora of creepiness. We don't need to have experienced rotten.com or similar sites to know that there is stuff on the internet we don't really want to see, that we maybe just want to forget. But some of this is harmful, and we will talk about this in a second. So the web really can be everything. And I think it's very vital that we are aware of this and that we try to preserve it. That we fight against the centralization of the infrastructure and fight against these shutdowns. That we fight for net neutrality, and that we fight for open access to our data and our content. Okay. So it might be easy to say, right, the web is for everyone. But I want to rephrase this as a question for a moment, and I'll give you an example. This is a screenshot of a tweet by Margarete Stokowski, a feminist writer in Germany, one of the most acclaimed and popular at the moment. She just published the paperback of her last book, The Last Days of the Patriarchy. And it took, I think, ten minutes until some guy showed up and asked: why don't you crybabies just do your own thing, without men? Start your own companies and destroy the man. Are you afraid of that, or what? And I chose this example for two reasons. One, I have to give this talk in dark mode because of the screen, and when I gave the first version of this talk I had another tweet of hers, in light mode. I went back, and she had posted just three hours earlier, and her comments were already full of stuff like this. And second, because it's really easy to brush this off and say: ha ha, this is basically stupid, right?
But harassment, be it physical or digital, isn't harmless. And while this single example might be easy to dismiss, the ongoing torrent of hate and vitriol that women, queer people, and people of color have to experience online, as soon as they make their voices heard, isn't harmless at all. Often we talk about free speech and how we have the right to say everything. I want to quote a woman of color on this: Tatiana Mac, who wrote "Canary in a Coal Mine", one of the most important pieces on this topic. We must protect safety over speech, she wrote, and she wrote so in the wake and the aftermath of the Christchurch attack in March. In this piece she goes deep into how technology is failing endangered groups, not just on the World Wide Web but in the digital realm in general. If you read one thing I mention here, read this, because it is really, really, really important. Okay, so we have to protect the safety of those who are endangered online over the speech of those who are already speaking all the time and are very loud anyway. And as soon as you say this, you guessed it, some random white guy comes up and says: you really aren't allowed to say anything anymore, man. And I have a very simple answer to this, which is: shut the fuck up. But with "shut the fuck up" you are not going to introduce some kind of healthy discourse, so, while it is correct, it can't be the end of our work against this. We have to talk about what free speech is, what enables free speech, and whether you really aren't allowed to say anything anymore, or whether you are in fact constantly saying everything. To quote Nesrine Malik and her piece "The myth of the free speech crisis": the purpose of the free speech crisis myth, the myth that you aren't allowed to say anything anymore, is to get people to give up their right of response to attacks, and to destigmatise racism and prejudice.
She goes on to say it aims to blackmail good people. And I want to stress this: we have a legitimate right to object. There is nothing in the world that obliges us to stay silent and not say something against hate, online or in the real world. We have the right to contradict and to fight back. So, yes, the web is for everyone. But we cannot say that everyone is safe online. And of course, this is not only a tech problem. It is not only a tech problem because technology companies rely on marginalised workers in moderation centres to try to filter all of this out, a very real job with all kinds of psychological implications. And it is not only a tech problem because you can't just kick the haters out. Society is not an IRC chat room where you kick people out and they're gone. That's not really how opinions work. Okay, so if it's not only a tech problem but a problem of society, it maybe makes sense to talk about society. I have named the next section of this talk after the people who broke it, and by "we" I mean, among others, white people. I want to start with one of the problems. Problem number one: white people give money to white people to solve the problems of white people. In a very white industry, where money flows first and foremost into the hands of white people, we still have this centering of whiteness in our solutions, in our discourse, and in our technologies. And basically everything else was discarded along the way. Only just recently, I think, is there a reckoning, and an attempt to get this back. Problem number two: male people give money to male people to solve the problems of male people. To this day, companies founded by women have it much harder than companies founded by men to acquire any sort of capital.
And if they do, they are confronted with a much larger degree of scrutiny, and they fail more easily, basically. Now, if one of these things gets problematised, often one or another of the founders or employees of such companies shows up and says: but we didn't mean to. They didn't want to build companies that exclude people who aren't male. And to some extent, I want to believe them, because if they had meant to, they would just be full-blown shitheads. They would be assholes, basically. So I kind of want to believe them, because maybe humans are not the worst after all. Hopefully. Okay, and it's fine that they didn't mean to. But that they didn't mean to does not solve the problem, because the problem here is, to quote Tatiana Mac again, that intent does not erase impact. It does not really matter how you intended your technology to be used, or what your intentions behind building something were. What really matters in the end is the impact on the people who have to feel it. And if they feel discrimination, if they are discriminated against by your products, it really does not matter that you didn't want to build a discriminatory product in the first place. So: tech is not neutral, and it has never been neutral. The technology we build, and the technology we will build, can always be used for bad things. This makes it necessary for us as technologists not to be neutral either. We can't say "we put this out there and then we see what happens", because then shit will happen, basically. We have to take a stand. In 1985, the scholar Joseph Weizenbaum was interviewed, and he said it is not reasonable for a scientist or technologist to insist that he or she does not know, or cannot know, how their technology is going to be used. That was 1985, 34 years ago. But the problem hasn't really changed, and tech hasn't really been good at resolving it so far. Maybe we can change this.
This brings me to my next point. And I want to stress this: Facebook is bullshit. And the Facebook PR representatives who go into publications after their ads are found to be discriminatory and say "yeah, but inclusion is at the heart of our company" are blatantly lying, because the thing that is at the heart of Facebook's company is what makes up 98.5% of their revenue: advertisement. And if they say inclusion is at their core, they know, or they should know, that they have problems. I don't know if they know. And I say they are lying because the intent they claim does not erase the impact their products have had, and it's kind of naive to still think: we put something out there and it will be used for good. It will only be used for good if we care about it being used for good. But the problem with discriminatory technology goes deeper than ads, as if ads weren't bad enough in the first place. We have to be aware of the use of algorithms, for example, to make decisions about people's futures, claiming to predict who will commit a crime: predictive policing. And of the reality that these algorithms discriminate against people of color, over and over again. We have Amazon providing the footage of Ring home security cameras to police departments, basically without any control over what the police does with it. And if you don't control the police, I think most people in this room know that this might not end well. And then we have companies focused on "enabling and modernising" IT departments, like Palantir, who worked with the police in Hesse to build hessenDATA, a large-scale data harvesting project with the local police. And they do this all over the world; it's not just the police in Hesse. They also use algorithms to predict the future. The problem with these algorithms is that the underlying models are wrong, because the algorithms were given data of the past, and out of that past they are supposed to predict the future.
Now, the problem is: if you give an algorithm data from a racist past and ask it to build the future, the future is likely to be racist, if you don't control your algorithms. And this control often doesn't happen. For example, there is the famous case of Google's photo algorithm discriminating against Black people by labelling them as gorillas. Google pulled the label, and still hasn't been able to teach the algorithm to not do this; it now simply does nothing with it at all. The bias is so ingrained in the data. Or, more recently, Apple's credit card discriminating against women by giving women with a higher income a lower credit limit than men with a lower income. Again, Goldman Sachs, the bank behind the credit card, said: but we didn't mean to. But you did. You didn't mean to, but you built technology that did. So it really does not matter that you didn't mean to, and it also does not matter whether you put race or gender as an explicit input into the algorithm. To quote the researcher Rachel Thomas: even if race and gender are not inputs to your algorithm, it can still be biased on these factors. Machine learning excels at finding latent variables. And if you don't control your algorithms, these latent variables will result in a runaway feedback loop: the algorithm reinforcing itself over and over, generating the next set of data on which it learns, until the bias is so ingrained in the data that you can't do anything about it anymore. To quote another researcher, who I think put it concisely and rightly: data from the past should not build the future. We can use data from the past to learn about the past. But what we are currently doing is trying to predict the future, and this has to fail; that is something this data can't do. And Tatiana Mac said that the technology we build is accelerating at a frightening rate, a rate faster than our reflective understanding of its impact.
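Such a runaway feedback loop fits in a few lines of code. The following is a toy simulation of my own, not any real predictive policing system: two districts have exactly the same true incident rate, but one starts with more recorded arrests. The patrol is always sent where past data shows the most arrests, and only patrolled incidents get recorded, so the initial skew in the data feeds itself forever.

```python
import random

def simulate(rounds: int = 50, seed: int = 42) -> list[int]:
    """Toy runaway feedback loop: two districts, A and B, with the SAME
    true incident rate. The patrol is allocated by past arrest counts,
    and only the patrolled district's incidents get recorded."""
    random.seed(seed)
    arrests = [10, 5]  # district A starts with more recorded arrests
    for _ in range(rounds):
        # send the single patrol to the district with more past arrests
        district = 0 if arrests[0] >= arrests[1] else 1
        # an incident happens with equal probability in both districts,
        # but it is only *recorded* where the patrol happens to be
        if random.random() < 0.5:
            arrests[district] += 1
    return arrests

print(simulate())  # district A accumulates every newly recorded arrest
```

District B's count never moves, although its true incident rate is identical: the model never collects the data that could correct it. That is the "runaway" part, and why auditing the data collection, not just the model, matters.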
But we still let this technology evolve ever faster and faster, and we still don't really try to understand what happens there. Or if we do, we do it in niche groups, not as a society as a whole. This, I think, is also a problem of which humans have access to computers. So I want to talk about the means of production for a moment, a very short history of computing. I want to start with a little imagination exercise. For a moment, please imagine a programmer. Okay. I guess it's very likely that you came up with someone who, more or less, plus or minus the moustache maybe, looked like this: a white, able-bodied, male person. And rightfully so, because programmers today mostly look like this. Hi. But next question: for a moment, please imagine a computer. And chances are, again, that you came up with something like the thing standing on the desk here, a modern-day laptop or such. If you know about the history of computing, you maybe went a step back and thought about something like this: room-filling, giant computing machines that could do less than the phone in our pocket, but were the state of the art of their time. I want to go even further back, into the year 1892. In 1892, The New York Times posted a job ad. It reads: "A Civil Service examination will be held May 18 in Washington, and, if necessary, in other cities, to secure eligibles for the position of computer in the Nautical Almanac Office, where two vacancies exist, one at $1,000, the other at $1,400. The examination will include the subjects of algebra, geometry, trigonometry, and astronomy. Application blanks may be obtained of the United States Civil Service Commission." Now, this really does not sound like they were searching for some kind of machine, right? And they weren't. They were searching for humans. And, more specifically, they were searching for women.
Historically, well into the middle of the 20th century, a computer was a human, and usually a woman. They were doing really complex and amazing mathematical computations. At Harvard, for example, they were calculating how stars travel across the sky, and they did so so accurately that some of the data they calculated back then is still in use today. Women were so synonymous with computing that by the mid-20th century it was so much considered a women's job that mathematicians would guesstimate the horsepower of a project by invoking "girl-years". And when they were describing the unit of labor of the mechanical computing machines of the time, they described it as a "kilogirl". Today we talk about kilobytes; back then they were talking about kilogirls. And it all started with her. This is Ada Lovelace. Now, finally, we have a room named after her at this conference, next door, the biggest room here. And rightfully so, I think. She was working in the mid-19th century, and together with Charles Babbage she worked on a thing called the Analytical Engine. It was really an engine, steam-powered and all, which was supposed to do calculations. And she's widely regarded as the first programmer, because she wrote the programs for this engine. Now, moving into the middle of the 20th century in this very short history of computing: we have, for example, women working as telephone operators. And when the first computers came up, those huge machines I just showed, these women had the skills necessary to work on them. So among the women who were programming the first real computers, the machines we today know and understand as computers, was, for example, Grace Hopper, working for the Navy during the Second World War.
Or we have the ENIAC Six, namely Kathleen McNulty, Betty Jennings, Elizabeth Snyder, Marlyn Wescoff, Frances Bilas, and Ruth Lichterman, working at the University of Pennsylvania on a machine called the ENIAC, which is generally considered to be the first general-purpose computer. Officially, they were calculating missile ranges for the Army, but the machine was designed to be able to calculate basically anything. Or, in Great Britain, at Bletchley Park, we had the Colossus Mark II, a code-breaking computer, which broke the Lorenz cipher of the Nazis basically in real time. And here we have two of the women operating it, namely Dorothy Du Boisson and Elsie Booker. And this continued well into the 1960s, I think, when the people working with computers were still mostly women. They weren't punching cards anymore; no, they had keyboards and such, but they were mostly women. And women were also among the people creating the first compilers and high-level programming languages, the things we probably interact with today. And then a question was raised: who is a programmer, really? It was raised, among other reasons, because there was a shortage of programmers. Some newspapers at the time talked about a "software crisis"; it was more like a labor crisis, actually, and I think it still isn't really resolved today. This crisis, however you call it, went so far that NATO held a conference in Garmisch, Germany. They didn't include any women on the guest list, so basically just male people deciding the future again. And here they made the change. Claire L. Evans, in her history of women in computing, wrote: "The most significant change they made arguably was semantic. Programming, they decided, would heretofore be known as engineering." The title we probably use today. The shift in vocabulary was a shift in focus: focusing more on technical skills than on the skills humans have to possess, the so-called soft skills, such as working with each other and collaborating.
But from then on, you had to be kind of an engineer, a very serious thing. In this crisis there were also more concrete factors at play: for example, a lack of childcare, a lack of mentoring, and of course, because we live in a patriarchy, wage discrimination against women. So there were other factors, too, driving women out of the workforce. Factors, I think, that are still very relevant today. And these changes, these problems, they worked. This iconic Apple ad introducing the Apple II is from 1977, so roughly ten years later. And here we see the woman back in the kitchen, doing the household chores and looking very admiringly at her man, who sits in front, very earnestly working at a computer with some kind of economic data, so he's probably destroying the economy, I don't know. But this ad shows that in the public view, programming and working with computers was getting more male by the day. And it still is today. Because what we still have today is, again, white men who hire white men to solve the problems of white men. They hire their peers into their companies, and by that they reinforce this cycle, this focus on a male workforce. I think we have to broaden the access, and we have to break the cycle. Now, there are organisations, like the Haecksen here in the Club, or Women Who Code globally, who are actively working on getting more women and more non-binary people, basically everybody who's not male, into the computing workforce. There are feminist tech spaces, and we saw a talk yesterday on this stage about feminist hack spaces, for example the Heart of Code in Berlin, which try to make programming more accessible and teach it to more people. And what they do is fighting. And this brings me to my last section. I just named it: you got to fight. We have to fight, and this talk has shown the breadth of problems, and these are not even the only ones.
I had to leave so many things out, because otherwise my talk would have been 45 hours instead of 45 minutes, probably. And I want to talk about ways we can fight. The first one is maybe a bit unlikely: I want us to be playful. I want us to work with technology in a way that is playful, that's fun. Because being playful is one of the ways we can reclaim the web we lost in the first place, and it can also create different entry points into technology. Cassie Evans said: keep making fun stuff. She talks about web animation with SVG, where you animate graphics. Cassie Evans does stuff like this, which maybe isn't ... well, it's no real algorithmic magic, but it's super cute and super amazing, and so is the code. It's code that does that: it makes the jellyfish go up and down and smile and blink their eyes. I think it's super cool. And she also does stuff like this. This is a blob that is a bit shy and stops dancing when you look at it. To do this, she uses the Shape Detection API that's built into Google's Chrome browser, available behind a flag. An API that detects faces sounds kind of dystopian if you aren't animating blobs with it, but you can use this technology, which at first sight or first sound seems a bit dystopian, to build good things. Tech can also be plain useless. This is a Twitter bot called EveryDNA. It just combines two emojis and builds a double-helix DNA out of them. This one is a rocket DNA, and I think it was about damn time that we had this. And it's more or less useless. It's fun. It's built with a platform called Cheap Bots, Done Quick!, so there probably wasn't even a lot of time involved in building it. But it's still great. I love it. Every time one of these wonderful combinations pops up in my Twitter feed, I'm happy.
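And it really takes almost nothing to build. The actual bot runs on Cheap Bots, Done Quick!, which generates posts from a grammar, so no real programming is needed at all. Just to illustrate how small such a toy is, here is a rough Python sketch of the same idea; the function names and the rung format are my own invention, not taken from the real bot.

```python
import random

def dna_strand(emoji_a: str, emoji_b: str, rungs: int = 8) -> str:
    """Weave two emojis into a text 'double helix': each rung pairs the
    two emojis, alternating which one leads, joined by a dash bond."""
    lines = []
    for i in range(rungs):
        left, right = (emoji_a, emoji_b) if i % 2 == 0 else (emoji_b, emoji_a)
        lines.append(f"{left}--{right}")
    return "\n".join(lines)

def random_dna(pool: list[str]) -> str:
    """Pick two distinct emojis from a pool and build their DNA."""
    a, b = random.sample(pool, 2)
    return dna_strand(a, b)

print(dna_strand("🚀", "🌝", rungs=4))
```

A dozen lines, more or less useless, and fun. That is exactly the point.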
So, to quote Cassie Evans again: in a world that becomes more dystopian by the day, we can also use technology for good. And it's important that we use technology for good, because there's something in this playfulness that's more than fun. There is something about this playfulness that we can use to teach technology and to build communities. And over the course of computing history, it was mostly not men doing this kind of thing. So, also to the men in the room: please be more fun and build more communities. We can also use technology to educate ourselves and to educate others. We can use it to learn stuff, but we can also, and this is just as important, use it to unlearn stuff. Not just forgetting, but actively unlearning: ways of thinking, for example, that we learned in a racist society and that we have to make a conscious effort to unlearn. We can't just forget them. We can build book clubs or communities in which we exchange ideas about technology, for example. It was Grace Hopper, actually, who already said in 1968 that you need people with more vocabularies: that we as technologists need vocabularies that are not just strictly about technology, to be able to interact with the world as a whole. And the more technology shapes the world as a whole, the more vocabularies we need, and the more important it is that we welcome people who know these vocabularies into our circles. One project that's very near to my heart is Self-Defined. It's built by Tatiana Mac and maintained by a community: a modern dictionary about us, where we define our words, but they don't define us. It's available at self-defined.app. It's basically a way to reclaim language, to make language not ours, because as a white man, language probably already is mine, but formed and defined by the people who are impacted by it.
And this brings me to the next thing: we have to hear and amplify the voices of endangered groups. We white people have to use our safety to stand up for people who are targeted by racism, because we have safety and privilege in this society, and we have to use this privilege. Of course, we as men also have to use the safety we have in a patriarchal society to fight against it. And of course straight people have to use the safety they have as straight people to fight against homophobia, to stand up. If a dude in your workplace makes a homophobic slur, stand up and say something; don't just sit there. Make your voice heard, because your voice is actually heard more. We have to change this, but as long as it is this way, we have to make our voices heard. This isn't strictly about empathy, because empathy kind of requires you to feel the way the other person is feeling. And I think this can be a trap, because I as a white person cannot possibly know how it really feels to go down the street day by day and be targeted by racist microaggressions. I as a man cannot know how it feels to have a Twitter account and be bombarded with all these kinds of bullshit comments, because I mostly just don't get them. So, to quote Kim Crayton: empathy relies on someone being able to understand the suffering of others in order to do anything about it. And she goes on to say: we don't need to understand the suffering of others to take action to minimize their pain. We only need to be aware that the potential for suffering exists. We need to be aware that they are speaking up. We need to be aware of the dynamics. And it is this awareness that makes it possible to consistently act. It's not empathy, because we just might not be able to feel what others feel, but we can always be aware. Now, the next thing: organize. Of course we have to go into the streets, and we have to take collective action.
Not only if you're currently feeling the impact yourself, but by being aware that others are feeling the impact, or might be in danger of feeling it. We have to prioritize communities over competition, because in the capitalist marketplace we are all competing against each other; that is how it works. By focusing on communities, on building healthy communities, over this competition, we can take a step towards dissolving this competition and making it a thing of the past. We have to prioritize communities over companies. Tech companies are very good at building something like a company community, where basically you're expected to live and breathe for the company, and I think that's not right. There are probably vital communities inside companies, and it's cool that they are there, but the company is not your community. The company is someone who has hired you to make money out of your labor. That's not a nice basis for a community. And we have to prioritize communities over nations. We have to be aware that people in other states are basically fighting the same fights we do. For example, GitHub currently has a contract with the immigration enforcement agency in the US, an agency that puts children into cages, deports them, and separates them from their families, which is a very, very evil thing to do. And still GitHub refuses to drop the contract. So, just a reminder to anyone at GitHub who is currently listening: please drop the contract. We are in this together, across national borders or any other boundaries, and we can make this count. I know I'm running out of time, so I'll quickly come to an end. Can I say one last thing before we start the Q&A session? Okay: collective action. We have to take back the streets, and we have to make racists afraid again.
We have to, for example, build big data against technology; this is a piece by Wyatt, linked in the resources, which I will show in a second. We have to be like Rampikette. And we need to act, because if we don't act, we will look like a bunch of overpaid bollocks, and I think that is a very bad look. And I want to give the last word of this talk to Rosa Luxemburg, who in 1919, 100 years ago, said: act graciously, resolutely and consistently. Okay, so: the dolphins again. This is the slide, you'll find all the resources there. Thank you very much. [Applause] Yes, we still have two minutes for questions. There are microphones on the left and the right. When the lights come on... I can already see one on the right side. Hey, thanks for your talk, and thanks for the positive points you made at the end. Very fast. But I think it's hard to keep the fun and the feeling of freedom in an internet that we are losing more and more to private companies. You made the point at the beginning that a lot of people look at Facebook and think it is the internet. And I think it goes one step further: a lot of people look at their smartphone and think this is a computer and this is the internet. But it's just a bunch of applications. And as long as young people get used to the internet through this, I think it is hard to break this feeling. It's hard to break, yes. This is why it's so vitally important that we have another feeling, maybe another approach to technology. Go out there and teach people, young people, middle-aged people; make the case for an alternative and try to build something positive. And this is also where I think fun can be very vital. Thanks. On the left, there's a question. Yeah, thank you a lot for your talk. If you don't mind, I would like to comment on two aspects you mentioned. The first is the notion of free speech.
I actually think that, rather than talking about which speech we should allow, we should just be conscious that there's a difference between the freedom to speak and the freedom to discriminate. There are actually plenty of really problematic opinions which can be voiced without being upfront hateful. And I think this is what this is actually about: that people go back to voicing their really problematic opinions in ways that don't directly threaten individuals. So rather than always mentioning free speech and saying, oh, it's threatened, let's make this differentiation really clear. And then the other point is about the disappearance of women from technology. There's actually something really interesting: as soon as any kind of profession becomes valued, and this value is reflected in actual pay, or higher pay, it starts becoming a male-dominated profession. I just wanted to mention that, because you showed how, as soon as it became engineering, it became a male profession. And statistically, this is true for at least almost all European countries that I know of; we can really observe this phenomenon. Yeah, I actually had this in my speaking notes and didn't say it, so thank you for your point. Is there any question from the internet? No? So we're at the end. Thanks a lot, and a big round of applause for Oskar and this valuable input.