Our keynote speaker is Max, Max Schrems, a personal hero of mine for many, many years, and he should be your hero too, by the way, because he is really defending our privacy rights here in Europe in a big way. He had the guts to sue Facebook twice, and he won twice.

Now, Katrin, it's worth having an introduction to present you. You are here today, thank you for joining us, and you're also doing a keynote tomorrow, so thank you for that as well. Katrin is a technology and climate researcher, a consultant, and senior program manager at the Green Web Foundation, which we'll hear much more about tomorrow. She is chair of epicenter.works, an Austrian digital rights organization doing, I would imagine, really amazing work, so thank you for that. She also co-founded and co-led MOTIF, an institute for digital culture, is a co-initiator of Feminist Futures, doing important work there as well, and was nominated for the Forbes 30 Under 30 list, which is extremely prestigious. So thank you for being here, we really appreciate it. (Thanks for the invitation.)

Daphne Muller is also here. We're lucky to be friends, which is great, and Daphne is part of the Nextcloud team as well. We're very lucky to have her as manager of Alliances and Ecosystem, and she's doing amazing work with our team, so thank you, Daphne. You are also a researcher on the future of technology and its impact on society and the technology industry, with a special focus on privacy, which is extremely important work as well. You've had several publications in that area, which is really amazing. Daphne is also a TEDx speaker; if you look up her talk, it is very enlightening, so I would definitely recommend you have a look. It's about data privacy and platform capitalism and how the two collide. So, all of you, thank you for being here.

I think I have a few rules.
I think the rules would be: no topic is off the books, but we'll have our interests and also our expertise, so feel free to dive in wherever you'd like. The idea is to be less structured and to kind of weave our way through the conversation and ask the hard questions. If you want to speak, feel free to indicate to me, or just go for it if it feels natural. And I would encourage everyone on our instance: if you want to have some discussions there, maybe we said something interesting, please continue that chat on the Nextcloud instance for the conference. We'd encourage that.

Now, I wanted to start with trust, transparency, and togetherness, which is our theme, and I wanted to use artificial intelligence as a jumping-off point, just to ask some hard questions, because I think artificial intelligence is an area where all of these questions are coming up. We're asking questions about environmental impact, and privacy is clearly an implication that we've been talking about. But when I say trust, transparency, and togetherness in that area, I'm curious, from the three of you, what immediately comes to mind, or to heart even, what passionately comes up in you? Katrin, maybe we can start with you.

You mean about the keywords trust, transparency, and together? Well, from a climate view, transparency has become a really big topic with tech, particularly when we look at CO2 emissions of AI, for example, and how to report those. This is a really big area that is still under-researched and where I see a lot of potential; that's what comes to mind on transparency. However, I also observe that transparency definitely is not enough. I would also encourage asking questions around democracy or equity in tech; those are just some other perspectives. Transparency is a part of it, but it's not only transparency that's going to make technology more equitable. So I think the same applies when it comes to questions of trust.
It's also basically thinking about who trusts in technology, who can afford to trust in current technologies. We've seen in the history of technology, and specifically if we look at AI, that there has been a lot of discrimination, a lot of favoring of certain people, and a lot of pushing away of marginalized people. So when it comes to trust, I'm curious about asking: trust in whom, and from whom?

And do you feel like that trust is a bit of a privilege, that it's not afforded to everyone?

Definitely, yes. That's my question.

Any ideas on what we can do about that, I guess?

Well, that's where transparency obviously comes in, but also the question around accountability. Who is held accountable, for example, for data breaches, privacy violations, et cetera? And then also really looking into the bigger system. You were talking about the US quite a lot; if the democratic system is kind of falling apart, then obviously this is also a big issue. That's been one of the themes that recurred in my work throughout the last years. I'm more coming from the digital human rights field, I worked a lot with netzpolitik.org for example, and I really try to ground these conversations in bigger conversations around democracy and justice. That's part of my work, and we can explore it more later.

Max, trust, transparency, together: does that bring up any anger, or maybe passion, in you?

I have to say, right away, I'm typing my notes down on my phone, because sometimes people think, he's texting while he's on the stage. I'm not multitasking, I'm just remembering things; that's usually my biggest issue. For me, I'm wearing very much my privacy hat here, my lawyer's hat; I'm probably the wrong person for the big story sometimes. But I think transparency always triggers me a lot, in the sense that it's kind of a wishful solution to a lot of issues.
It's like putting a bigger sticker on it. What we see, at least in the privacy bubble, is that transparency usually boils down to just having a longer policy that says we may do even more shitty stuff with your data, but it doesn't really solve the problem. It's kind of what the US calls the notice-and-choice principle: there's a big notice, and then your choices are take it or fuck off. My favorite example is in California, where there's a law that anything that contains a carcinogenic material has to have a big sticker on it, and apparently there's even such a sticker on Disneyland in California; I found the picture once. But that doesn't really solve it. So transparency, I think, is a first step, and it's interesting to see what's going on, but typically only real experts, people that really have the time, are able to go through that transparency and make use of it. The bigger issue is how we then solve it, so transparency for me is always a bit of a trigger word, because it's often put forward as if it were potentially the solution.

And on trust and together, I would just say two things in combination. What's really interesting, at least from a privacy perspective, is that we need to be able to share data with other people. That is the trust part, or also the togetherness part, because there is often an answer, especially in the tech community, of: how can we just, you know, segregate it, wall it off, have a technical solution to it?
And that partly makes a lot of sense, but partly we also want to share with other people; we need to have a contract somewhere, or interact with other people, and in a developed society that is usually only possible if you have good trust. You can say: I can give you my most secret stuff, but I trust that you're not going to do shitty stuff with it. That's a very high level of solving issues, but it's probably what we want to have in the long run. Not easy, and not simple, but we usually, for example, mostly trust banks that they don't run away with our money tomorrow, maybe in two months. There's a certain level where our interaction only works on trust, and I think it's interesting how we can develop that further. It can partly also be technical: for example, I can see proof of stuff actually happening in the way that they promise, because I have some code that shows that to me. That's the end of what I'm going to say on that.

Oh, that's useful. One of the challenges I have with trust is that, you know, we give a simplified example of trusting: if I'm trying to have a conversation with you, well, we trust each other, and that's okay. But the reality is that trust, as you explored, goes through many, many, many entities. How do you trust the entire thing? Well, that's a challenge.
Yeah, but we usually overcame that in, let's say, developed societies, largely. I usually give the example: if I go and get on a train back to Vienna, I do not trust that it runs on time if it's Deutsche Bahn, but I do trust that it doesn't derail instantly, and I have enough trust that some engineers somewhere in some room manage that, without me ever checking it. And I think we do that with most things: no one checked if this place is built up to the building code, but we kind of trust that it doesn't fall on our heads, even though there's a lot of concrete above us. So we usually do have that trust in a lot of areas. In the tech bubble, I think we partly still have a good way to go there, but in most other areas of the law we managed that trust. And with new technology that was always an issue: if you look at the history of law, whenever a new technology came around, there was first distrust, and also misuse, some problems, and things that didn't work at the beginning, but usually the more it matures, the more the trust builds up gradually. And we have to say, with a lot of these discussions we're having, we're at the fucking beginning. It's like the industrial revolution, ten years in: yeah, workers lost out, and no one trusted it either, but now, a hundred years later, we have some feeling that you usually get your paycheck at the end of the month, and that built up. I think we have to be a bit positive on that as well, even though we usually deal with all the shit that doesn't work. You have to remember where you are on that timeline: we're still at the very beginning.

Any thoughts on trust, transparency, and togetherness?

For products developed by the computer science industry, most users will have absolutely no reason to trust most of them; there's just no discussion about it. I was attracted to open source and Nextcloud because I was wondering what it would be like to be
surrounded by an environment where people actually have values, because that doesn't exist elsewhere.

Oh, it's not a given either. No, so why should we trust it? I don't know. But I think maybe that's something that's unique in the open source communities that we exist in: you have to build trust. Often these are complete strangers that you're working with on things you're passionate about, and it takes a while to build up that trust. I have friends I've worked with for years whose names I don't even know, for instance, and that's surprisingly okay.

Just to put it in perspective: we have all our legal shit in the Nextcloud, and we had a discussion at the beginning: can we trust that open source thing? None of us has read every line either, and in reality, as a user, you just trust that some of you guys have checked on each other and it somehow works. PS: sometimes it doesn't. But that is largely what you do as a user as well: you trust that these are people that are well-intentioned, that did the best they could, and that if there is a mistake, because it's open source, someone else will find out. That's exactly the trust model you have, and no one has checked that from our side either. And I mean, we do have a second wall around it, to be honest, but that is the level where we felt, okay, that works for us. That's the trust you have in your daily work as well, because if you guys distrusted each other, nothing of that would ever happen. So even in that bubble, there has to be all that trust, in a way.

It's important to take moments to really take it in, you know. Katrin, I feel like you've been writing notes, I see, so would you like to jump in?

No, I was just thinking, it's a bit like infrastructure: once it doesn't work, you actually realize, oh damn.
It doesn't work. It's kind of invisible, but when it breaks, that's actually when we realize how important it is, like a train system, for example: you kind of trust it, but then once it doesn't work, you notice.

And a lot of it is extremely cultural. I studied in the US twice, and part of it is that I love the US to death; it's kind of the second place where I feel at home. But there is a very different culture of trusting, for example, and even within the US. I was first in the south of Florida, where everybody's Republican and everybody goes to church and everybody's Baptist, but in that bubble they trust each other as well, because the fabric there is church and religion and so on. While we usually have more trust in government, more trust in regulation, more trust in: okay, I can go in somewhere and it's not going to fall apart, they don't have that as much. A typical example is when you repair a car in the US: no one ever checked if that repair guy knows what they're doing. In Austria they need to go through, I don't know, a hundred certifications and five years of training to be able to do something on your car. But in the US I needed all these different platforms to figure out: can I actually trust that guy to repair my car so that it doesn't fall apart? Because there isn't any kind of approval. And transaction-wise it's easier for me to just go to any place and know that the guy that's there actually knows what they're doing. I think it's interesting how we organize trust differently in different societies, how we have it in certain areas and not in others, and I think that probably plays out in a lot of this stuff in reality. There are different reasons or different arguments for different ways of doing it, but trust especially is extremely cultural, if you're in a society where you fundamentally distrust each other. Best example:
We had an exchange student from Argentina. She was 18, and she wouldn't walk in Salzburg, the smallest town you've ever seen, for 15 minutes at night, because she was always told she's going to be raped, molested, whatever, at night. She was crying the first time she was out there at night, because she had learned: you distrust society, you distrust anybody out on the street. That's the way she was brought up, while in Austria it's like, let's just go, everything's going to be fine. And I think it's extremely interesting how that differs per culture, per bubble, and so on. Now I'll shut up.

Well, I feel like for us citizens of the internet, that's often the approach we should be taking: distrust everyone, until you have a few organizations that maybe you can trust, and they introduce you to other organizations or people that you can trust. So it's an interesting one.

Yeah, I also wanted to bring it a bit back to the era of the climate crisis, where a lot of these infrastructures, particularly if we look at the internet and digital infrastructures, have just become monocultures. I think this audience knows what I'm talking about. In certain environmental disasters, these infrastructures often fall apart. There's this great example from Hurricane Katrina, where people did not have any access to the internet anymore, except for one small community with an open source internet network that was still able to be resilient at that time, and they were the only ones who still had internet access. So I think that's a really interesting example to bring it back to times of crisis: how can we build these smaller open source alternatives that might be more resilient than the big tech alternatives, to be more resilient and prepared for all the different disasters that we are facing?

You really hit me with the word monoculture.
I have some passions around, you know, local farming and such, but it occurred to me just then, when you said it, that we can approach the internet and the tech industry with that idea of monoculture as well, because we're seeing that sort of playing out, where you have four, five, six very, very large monocultures of tech businesses. I know you may have some opinions on whether that's good for society; maybe there are some challenges there that we need to address. But mostly: is innovation worth the societal cost in the particular model that we have currently?

I'm breaking the microphone by touching it. So the question was: is innovation kind of balanced against the monoculture, is that what you're trying to ask?

Yeah. Monoculture, I think, at least, or prove me wrong, is a bit of a result of the capitalist system and how it's been set up currently, and so the question becomes: well, should we keep going in that direction? No. And so where should we be headed?

I was waiting for the one person saying yes right now. We had a discussion, you and I, not so long ago, about Microsoft's investments in OpenAI, and yeah, it's not really a surprise to me that they are doing that. For example, I have a strange fascination with the acquisitions of Google, because I figured out that Google, or in other terms Alphabet, doesn't really do a lot of innovation on their own. So I was wondering what those 300,000 engineers are doing all day. They bought most of their major successes, such as YouTube and Android. They bought Quickoffice, which is now Google Docs. They even bought their advertising technology, with which they earn all their money, called DoubleClick. So if Google had not acquired any other company, they would not have been much more than a search engine and a very good knockoff of Hotmail. So every time my colleagues are worried about competition coming from big tech,
I remind them that they are just a search engine with a knockoff of Hotmail. And we also don't have to be too worried about Microsoft, for the same reason, because their investments in OpenAI are not a surprise given their business strategy. They actually bought Hotmail. They also bought LinkedIn and GitHub. They bought Skype, which is now Microsoft Teams. They buy a lot of different companies. They actually also bought PowerPoint; did anyone know that? They didn't invent PowerPoint. So it's not a surprise to me that platform capitalism is functioning in this way.

I'm curious too about the environmental impact as it relates to this centralization. You have all the information going to one place now; we've explored the problems with that from a political point of view, but Katrin, I'm curious, from an environmental perspective: can we do better with decentralized solutions? Is that a better option, or are there solutions that you're exploring that we should be thinking about?

Yeah, so at the moment you can't generalize that decentralized solutions are definitely going to have less impact, if we look at carbon emissions specifically, simply because we don't have the data on it.
There has not been enough research done. But if we go back to the topic of AI: on the one side, it's really good that we have these AI technologies, because they kind of help us understand the massiveness of the biodiversity crisis, of the environmental crisis. These models can definitely help us in understanding these crises. But at the same time, especially in the tech industry (at Green Web Foundation we work a lot with developers and technologists), a lot of people are just not aware how much of a carbon impact the internet or AI has. Around two to three percent of global emissions come from digital infrastructure alone, which basically means that it's bigger than the entire aviation industry. So it's a lot.

So are you saying we should stop flying, or that we should stop using AI?

Yeah, I can't give you that advice, really, but I also don't think it's always the best solution to base it on individual users, you know, these individual perspectives. We had that with plastic as well, where it's like: if you don't take this one plastic bag at the supermarket, you're really going to contribute to making the world a better place, while we have these massive fossil fuel companies that are just emitting so much more than single people.
I think we have to have both approaches. So it's not individual versus the whole society or politics; they have to be in conversation, basically. But what we're definitely seeing is that there's a really growing interest in this topic of digital sustainability, but also that there's clearly a lack of data. That's one of the big topics we're trying to work on.

For me, there's an interesting connection between what you say about AI potentially helping to understand the massiveness of these societal issues, and the tendency for this exact solution to actually contribute to climate change in the first place. I find that such an ironic train of thought, because in many different industries AI is supposed to be magic and to help solve societal problems, while actually it is causing the same problems that it is solving, or claiming to solve. It doesn't solve them, and I'm arguing way too often with climate researchers about "yeah, but AI will help to solve climate change." I say: AI is causing climate change. Come on, in the Netherlands in 2019 the data centers were already using three times more electricity than the trains. How is this going to help? Come on.

Well, I think part of the question there also becomes: what's the greater societal value of each, the train system versus AI? And I think often with technology we don't really know, right? So it's important to ask: what might the near future hold with these technologies? Okay, we're fumbling a little bit, because we're brand new at this whole AI thing, ish, sort of, and we know we can do better, but obviously there's a lot of work to do. You have a question.
Maybe I can. Well, actually, what you just said triggered something in me, when you said the word decentralization. If you can decentralize the energy supply, which is where we are on a good way towards, and if we have decentralized AI, I mean, look at this here: if you connect these things together, then AI doesn't have to be a real environmental problem in the end. Probably.

Yeah, definitely, I agree with you. We just don't have the data, that's the thing. It's an interesting thought and I agree with you; I just think we still need a lot more research in these areas. But definitely, decentralization of energy is one. Yes, exactly.

I'm curious, for each of you, in your area of expertise: do you see awareness increasing? Are more people, who aren't necessarily nerds and technical folks like our audience, becoming aware of the problems, and is the desire for solutions also seeing some momentum?

If I may quickly add to what you just said before: I think what's really interesting is this responsibility shift, in the sense of: you're in charge of the plastic bag, it's not the plastic industry. And we have the same thing in the privacy bubble, where it worked really well to say: oh, you are in charge of where your data is, and you should have your little bunker in your basement where you keep all your data, and otherwise you don't have privacy anymore. It's a recurring theme by the industries to shift responsibility from the ones that actually have power over something to the ones that actually don't. And it's amazing to me, because in Austria you still have to study Roman law from 2,000 years ago, and one of the principles, for 2,000 years now, is: if you have power over something, you also bear the responsibility for that shit. So it's like: if you write the code, you're responsible for what it does, and not someone else that uses it. I think that just connects to that question, and connects back
to the awareness part. And that's a bit what I mentioned before about transparency. We do see in the privacy bubble definitely more awareness; people do feel that something is weird and there's something going on. My favorite example: I think every two weeks we have some journalist saying, I have this case where someone was listened to in their conversation, and blah blah blah, are they all spying on our microphones? It's like: no, they're spying on everything around you, so they can already predict what you said yesterday, which is even more creepy. So there is this feeling that something is wrong, but people have a hard time explaining it and really knowing how it works in detail, and that gets back to the awareness part. I think we need general awareness; we need people to generally agree that we need to move on this stuff, but that is different from detailed awareness. And I can tell you, we've worked on Facebook for ten years, and I don't know how the fuck this is working in detail. We even got all their data at some point, but I don't even think that Facebook engineers fully know how the system is working, because they just do their little thing there. So on the awareness part, I think we need to stay on a very high level, to make it comprehensible for people, to make it understandable, and not worry too much, in the sense that people work for eight or ten hours a day, go home, and are not going to want to worry about open source, the environmental impact of their train trip, privacy, and, you know, the other 20,000 things that exist. We need to decide that we are generally aware, but then build societal solutions to that, with experts that know what they're doing in that area. I think that's how we ever overcame any kind of bigger issue. That's how we overcame, I don't know, electricity: it's not like everybody understands how electrical fittings work.
They just understand there's a plug, it comes from somewhere, and they leave it at that, and if it sparks, you probably don't touch it. And I think that's also how, in all these other areas, we need to get there. I hope that's somehow useful.

It sounds like you're suggesting that abstraction, done well, is good for everyone. Yeah.

And Daphne, as I mentioned, are you seeing extra awareness in maybe the academic field? And are you seeing those ideas translated to everyday people? Is there movement in helpful places, or is it kind of stale?

It's kind of fun to observe, also a little bit from the outside, now that I'm working in industry, because on the one hand, critique of AI and privacy has always been there since I've been studying the field. The industry is trying to emphasize the narrative that AI is magic and will become so intelligent that it will take over mankind. Academics say: well, we don't fall into that trap; maybe we should focus on the present-day problems that are more concrete, like discrimination or climate change. And they argue that this narrative of AI taking over the world takes attention away from the real problems that are more likely to happen. Now, there are some developments in academia, but when I got interested in AI ethics for work, I called my favorite professor (he's sitting over there), and I asked him: what's going on with AI ethics? Conveniently, he had just done a literature review on that, and the first sentence was like: well, I'm sorry, this is hopeless. All the frameworks are hopelessly high-level.
There are some toolkits, but they are behind paywalls, so the papers are basically advertising papers for a paywalled tool. None of these methods are ever peer-reviewed or tried in a follow-up study, so they were pretty useless for me as a practitioner in the industry. Then we discussed at quite some length why this was the case, and we think it's because academics seem scared of becoming concrete about what is an acceptable application of AI and what is simply not an acceptable application of AI. They don't want to make up their minds, and that's possibly because many academics also believe in the narrative that limiting developments in AI is somehow dangerous, because it would limit technological progress, and of course, if you limit technological progress, then you must certainly limit human progress, which is a connection that is not necessarily true.

It sounds like there are some similarities with what Max just mentioned, the sort of challenges you're seeing in law as well, where they wrote a thing, and yet in some cases it's not even there at all, or so broad that it's not useful.

Yeah. I mean, in law it's probably different per area. We usually assume that people generally follow the law. That doesn't mean that they always follow it: unless you have an absolute state, a total surveillance state, people will never follow the law a hundred percent, and that's fair, that's part of society. And sometimes people over-follow a law. When Austria still had military service, I opted out of it and was an ambulance driver, and I don't have much respect for red lights anymore when you go around the city. And then you realize, you know, people in the middle of the night, at an intersection where there's no traffic whatsoever, sit in their car in front of a red light and stop, for no good reason, because nothing is around there.
No one cares, but we still have this inner feeling of: that's a rule, we have to follow the rules, the laws, and someone actually running that red light would probably die internally, even though there's no realistic reason. The other way around, when it actually comes to following, I don't know, hundreds of other rules, people don't feel that that's that important, and that's okay. What's interesting in the privacy field, and especially in the digital field, is that this Silicon Valley approach of move fast and break things just got so intense. I was studying in California for half a year, in Silicon Valley actually, at a small university, and we were taught in school to think about: how likely is it that you're going to be caught, how likely is it that there's going to be a consequence, how much money can you make with that? That was simply the thinking, and economically it makes a lot of sense. But as a reaction, we need much fiercer enforcement, which I'm not a fan of; I'm a fan of a government that doesn't look into everything and enforce everything a hundred times. And that is a bit of a societal thing that we see: the more aggressive industry becomes, and the more they use every little hole and go to the last edge, and even go beyond if there's no consequence, the more you have to push back, the more aggressive it gets, and the more all of this escalates. I think that's part of the Silicon Valley culture that we now live in, de facto globally, that we have to deal with. So, for example, when the GDPR came around, they said 20 million euros or 4% of worldwide turnover as a penalty, which is partly billions. I was like: that's crazy, we never have penalties that are that high.
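That penalty ceiling can be sketched as a simple calculation. This is a minimal illustration of the upper fine tier just mentioned (up to 20 million euros or 4% of total worldwide annual turnover, whichever is higher); the real rules, such as what counts as an "undertaking's" turnover and the lower fine tier, are more nuanced.

```python
def gdpr_max_fine(annual_worldwide_turnover_eur: float) -> float:
    """Upper fine tier of the GDPR (Art. 83(5)): the greater of
    EUR 20 million or 4% of total worldwide annual turnover."""
    return max(20_000_000.0, 0.04 * annual_worldwide_turnover_eur)

# For a company with EUR 100 billion in turnover, the cap is EUR 4 billion;
# for a small company, the EUR 20 million floor applies instead.
print(gdpr_max_fine(100_000_000_000))  # 4000000000.0
print(gdpr_max_fine(1_000_000))        # 20000000.0
```

The "whichever is higher" construction is exactly why the cap scales into the billions for big tech: for any company with more than 500 million euros of turnover, the 4% branch dominates.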
But it was necessary, because in that culture, unless it has a billion in it, it's not even relevant; they don't even think about it. And we see that even now in cases where we won. I mean, we had, I think, 1.8 billion or whatever against Meta, which without us wouldn't have happened. Sometimes the question comes up: how much do you have to donate to noyb to get how much penalty for Meta? I think that would be an interesting calculation. But even there, you see, they litigated for another ten years, just to try to drag it out, and so on. And that is, I think, an interesting development we've got to get into: if we don't morally agree on a core, if we don't comply with the rules because we think it's morally right, but instead have absolute disagreement, where they say privacy is morally wrong and we think it's right, it's going to be really, really hard to work on that without getting hyper-aggressive, which is de facto what we do at noyb; we're nothing else than the aggressor in this. But I'm not happy about that development that we see, and I don't think there is much of a way around it right now, unless we can mature in this area and see that people will actually do the ethically right thing more, and not just try to go as far as possible. I hope that was somehow useful.

Do either of you maybe have some thoughts to add to that?

Just, I don't know if you know Logic Magazine; it's a really, really nice magazine on tech and society, and the latest issue was around how to move slow and heal things. I thought that was a really good push in another direction.

Yeah, we once got a sticker saying move fast and fix things.

I'm curious, you mentioned those fines and the billions. Do you still think they're effective at all?

That's also interesting. Going back to it: do we have data on that? We don't. There's a lot of data on individual crime and individual people, blue-collar crime, white-collar crime.
Why do people break the law, break in, you know, molest other people? There's a lot of research on that. There's very little research on what companies do and how companies decide to break the law or not; it's almost non-existent. So we tried to get that data, because we want to be as efficient as we can be in doing our work. So we're now gradually starting to ask DPOs, data protection officers, what is actually going on. I mean, you have a lot of anecdotal research, so to say: the DPOs tell you, "I've told my boss for five years you can't do it, but the answer is: we make 10 million euros with it, so why stop if there is no consequence?" So we're starting to go into that. What is interesting: we actually ran a mass complaint system where we sent complaints to, I think, five hundred different websites for shitty cookie banners, basically, because under the law there should be a yes or no option, but they just do something else, as we all know. So the thinking was: let's show everybody what it would look like if they actually complied with the law. And because they have A/B testing for cookie banners, they test them like fuck until they get more than a 90 percent acceptance rate, we thought: let's do the same thing. So we sent four versions of emails to the companies saying we're going to sue you, and it was interesting to see which versions worked best. I think that's interesting A/B testing, going the other way around. The result: the most neutral one worked best. But we hardly do that. We also hardly know whether the competent authorities actually dare to enforce. Consider that just for the privacy regulators in Europe, we pay 300 million euros per year, 100 million of that, by the way, in Germany. So one third of all the money that goes into data protection is actually spent in Germany. There's zero evidence of anything working there; the German DPAs don't even publish
their decisions. We don't even know if they decide anything, or what they decide, even though they're also in charge of freedom of information at the same time, which is mind-blowing. But we also need to ask how this is doable, how in the long run we can enforce against these big tech companies. What are the dynamics behind it? How are the decisions made in detail? That matters if we want to actually be efficient in enforcing the DMA, the DSA, and the other hundred rules the EU is probably going to come up with. We cannot just pass more and more and more law; we also have to think about how we actually get those rules onto the phone or the device of the individual person. And I think there is tons of research to be done where, with very little money, we could probably be ten times as efficient as we are right now.
I think regulation is part of the answer, but for me it's also interesting to zoom even further out and wonder why so many companies are up for violating the law in the first place, or crossing such boundaries of human decency. So I'm also very curious about the values behind people studying computer science, and the type of values and mindset that we actually teach these kids. I think we are pushing up the hype of computer science too much. I think we are teaching these kids as if they are going to be saviors of the world by collecting more data and building more AI.
If I may add: what was really interesting to me when I was in Silicon Valley, I had first been, as I said, in Florida, super Republican, super conservative, but these guys at least had some values. They were values I absolutely disagree with, but they were values. In Silicon Valley there was this liberal "do anything", but that's not necessarily a value; that's just laissez-faire, whatever. And I was interested because I had that feeling there. Partly, the other thing that is mind-blowing to me:
maybe you all know much more about this, but at least my colleagues in Austria who studied engineering at university did not get any courses on ethics, or on law either. Crazy, right? It's amazing, because in any other area, a friend of mine does social work, like, you know, street workers and so on, and they have half a year of legal training alone: what are you allowed to do as a street worker, what are you not allowed to do, blah blah blah. And that is interesting, because in our area that's part of the maturing process we will probably have to go through. We have it in medicine; any doctor gets training on what you are allowed to do or not. And at least for friends of mine, and that's admittedly a bit of an older generation now, the only thing they were told was: yes, you have copyright in your code, make sure you keep it. That was the legal education they got at the time. Definitely this will also mature, this will also change. But it's the part I mentioned before: if people have fundamentally different moral values, or different views of what's right or wrong, it's very hard for the government to enforce anything. A good way of thinking about it, probably:
I was at the German data protection authorities' conference, and they asked me to speak, to give them an impulse on how we can enforce this. And I actually had a slide of a favela in Rio that I visited way back when. It's basically an area where the government just gave up and said, okay, do whatever you want. It's super hard for democracies and governments to retake that ground; once you haven't fundamentally enforced your rules, it's really, really hard to get back in. And that is partly what happened with the Internet. I mean, those were the funny and crazy 90s, and I was still at the age that had ISDN and the Internet, so I had an idea of: whoo, suddenly you can get MP3s for free, that was fun, and so on. But we're moving into an era where this becomes such a part of our society that we have to make sure democratic rules are enforced there too. That's extremely hard for governments generally: to retake ground once that ground is used to not being regulated. That's going to be an interesting time.
You mentioned government quite a bit there. I'm curious whether each of you thinks that government regulation is the solution to the problems we've been mentioning, like the environmental and societal impacts that we're seeing from technology these days. I think we have five minutes left. Should we take some questions? Or, let's answer that one and see how long it takes, because I promised we wouldn't run too long.
Yeah, I think, I mean, it's a question that goes back to democracy, and also to the rise of authoritarianism across the world. Is government always the right model? There are a lot of countries in the world where you cannot trust your government at all. So I think that's a really important perspective to take into account when we are talking about whether government should regulate more. There are internet shutdowns happening.
There is a lot of facial recognition software being developed, people can't cross borders, and there's a lot of technology built for that. So I'm not sure. I think this idea of government and trust comes from a very European perspective, so I'm just trying to bring in the other perspective as well.
Yeah, and also I have to be transparent: Max and I are both from Austria, I'm sorry for that. I think the problems in the computer science industry are in many ways not that different from climate change. Both are problems that are hard to understand and probably don't have only one solution direction but require multiple. So I think government is one of the pieces of the puzzle. My research focuses on the mindset around innovation and this conflict between innovation and regulating AI or privacy. One of my favorite examples is from crime prevention, this narrative that surveillance is necessary to protect us from terrorism. One of the AI systems currently running in production in the US, used by American judges, is made by Northpointe, and it helps judges predict whether someone will commit a crime again. The accuracy rate of this AI is sixty-one percent, which is not much better than a coin flip. So theoretically you could ask any animal, like monkeys, or my favorite one, the weasel, for a judgment, and you would get an equally good reply. Come on, sixty-one percent, I have seen horoscopes with better accuracy, right? So to break through the hype when we talk about AI within my team at Nextcloud, with Frank and Jules for example, we started to replace the word AI with "weasel", and then we asked ourselves if it was still a good idea or not. And you have to remember, the weasels are racist, sexist, and they potentially also fart a lot, because they emit a lot of greenhouse gases. And then you can judge if it's still a good idea. So, is it a good idea to train weasels to generate a draft for a report nobody will read?
Okay, fine. Is it a good idea to train weasels to blur your background while video calling? I have no issues. But is it a good idea to train weasels to judge whether someone has to go to jail or not? I would say that's a bit problematic. Is it a good idea to train weasels to drive a heavy car through busy city streets? Of course not, someone is going to die. So I don't think weasels should take over the world.
We have a question here, maybe I'll pass the microphone. Thank you. We only have a few more minutes; we'll try to squeeze in a few.
Following the discussion: as my favorite sociologist Niklas Luhmann once said, reduction of complexity creates sense, and I think usability is key. Because if you look at how all the, I call them MAGMA, because GAFAM doesn't fit anymore, how MAGMA actually conquered the world, it's because they found easy solutions, easy-to-use solutions for people. So it's very simple. I mean, people are buying NSA spies and installing them in their homes because, hey, the lights turn on and everything just works. It's fine. So it's easy, it's convenient. So I think usability is key. I mean, if I want to use a solution like Nextcloud as a private person, first of all, a lot of the tech community don't even know it exists. I'm talking to a lot of developers and they don't really know it exists. So marketing and sales and so on are a thing, and it has to become a lot more public.
And it has to become easily usable. I want a one-click solution. I have a business background, I've never written code in my life, but I'm still a fan. And I think this is the most important thing: that you don't think about code, that you don't think about software, but that you think you have a solution for solving problems. I think this is the main key. And if we provide an alternative where we can say, okay, here we have a solution equal to Gmail or the Google suite, or to Microsoft in the corporate area, a solution which is compatible, and from what I found out today it actually is, then I think we really have a great chance to establish an alternative. Because people are, I think two-thirds of people are very mindful about the privacy of their data after Snowden, and if you have a great alternative, and I think we have one here, we saw one today, that actually brings up some hope in me.
Great, well, that's good news, isn't it? Quickly, please. Thank you.
So many of these problems sound like it's because we aren't punishing data collectors; on the other hand, it's too cheap to collect all this data. Why don't we do something that a small German political party wants to do: a digital dividend. All those companies would need to pay for the data they're using and give the money to the people the data was collected on. That discourages them from collecting all that data in the first place, and you can't "go fast and break things", because you already have to pay before you've broken anything.
Yeah, I mean, to a certain extent. I think the taxation idea is interesting in different areas.
I was just recently reading books on how these systems are built so that you're constantly hooked and constantly connected and so on. It would be interesting to say: okay, if the biggest aim you have is to keep people on your fucking phone for six or seven hours a day, which is not going to be good for any brain, why not tax that? And we do have rules; I think we just have to be more creative about them. When Facebook and social media came along, we had discussions that people didn't watch the news anymore, that they weren't informed, that news outlets actually have to pay to get into the feed, and so on. And we did regulate this in other areas. I can tell you that for private TV or private radio, we for example require that there is no more than, I think, six or seven minutes of advertisement; in Europe there is a cap on advertisement. It's still half the time allowed in the US. And we could do the same thing and say: okay, in a news feed only every tenth item can be an advertisement; you can pass that as a law. The same thing the other way around: we have laws saying that at every full hour there has to be neutral news on any radio station. If you want a radio license, even as a private company, you need to provide news; that's part of getting the license. Why not say there has to be some news in every twentieth item in a news feed? Obviously all of that gets a bit more complex, because what's news, what's, you know, conspiracy, blah blah blah. But there are a lot of these options we could think about, and it's interesting to me that this bubble managed to claim: okay, we're above the law.
You can't regulate us, that would be against innovation. But it's not against innovation to say cars shouldn't go 200 kilometers an hour within city limits; that's actually helpful. And I think we need to open our minds much more to the fact that this is doable. And if anywhere, it's realistically in Europe, in the sense that we've managed to pass laws like this that we wouldn't be able to pass anywhere else, and we're large enough as a market that companies more or less have to comply. Those two factors together you don't find in many other places globally. So I think that would also be interesting, and it could inspire other countries and other places to see that it's doable, with their own variant of what they want to do. We don't have to force privacy on everybody, but whatever they want to do. I think that would be interesting to think about. I'm not going to talk for too long; I'll answer the rest later.
I think, since we're running up against time, I have one final question for each of you, maybe just a couple of sentences. Given everything we've discussed today, what can those sitting here in the room, and those watching online, begin to do to help in some of these areas? Katrin, do you want to start?
Yeah, I would suggest, especially for developers and technologists, to check our developer site. I'm also going to show you some more tomorrow morning, but there are quite a lot of cool open source tools. So I think that would be, especially for this community, a really interesting start, to get a feeling for the CO2 emissions of digital services. It's at greenwebfoundation.org/tools.
What to do? I always find that a bit of a hard question, because I usually say we need to change things structurally. But especially for this bubble, what helps, for example, is to point at alternatives.
Typically, Nextcloud is one of the alternatives we can point to, to say there is something else. And to pick up what was said before, there is also the usability side of it: it has to be as easy as everything else. Which, I know, at least our developers in our office hate, because we build software and they love to do fancy stuff, and what's simple usually looks ugly, but even in our team that's usually what gets people to use it. I think once we can develop things like that, where people can see in one line that it's easy and that there is an alternative, that can break down a lot of barriers in the long run. And I think you are one of the examples of how to do that.
Thank you. Definitely, first: contribute to Nextcloud, and I have no business interest whatsoever. And second: remember that more technological progress doesn't mean more progress for humanity if it requires the violation of human rights.
I think we can agree with that. Can we please say thank you to our panelists?