What do municipalities in Germany do against discrimination emanating from their digital strategy? You could claim that, as a solution, municipalities simply avoid digitalization altogether. But that's no longer true, because in Germany algorithms have started to influence our lives beyond the digital sphere. Our speaker, Leonhard Freundler, is a student of humanism. He has a bachelor's in media and computer science, and he will give us a data-ethics overview of whether German municipalities have developed awareness of the risks and whether they act against discrimination. Leonhard, the stage is yours.

But the sound is not coming through. Can you hear me now? Yes, I can, but the audio quality was quite bad. Let's see how we cope. Okay, thank you. After these short technical issues, let's start.

Data ethics in the smart city is my topic. As the herald just said, I have a bachelor's in media and computer science, and I then went into urbanism, not humanism. And I asked myself: where is the interface here? I started reading books on various topics about one year ago, and for the first time I came across this term, data ethics. I asked myself what that means and how it is implemented in cities. Data ethics, the easiest way to explain it, is perhaps the norms and values for digitalization, for IT projects in connection with big data, and the question of what should be taken into account and taken care of. Through various literature I came across the term smart city, which I'm sure you've all heard and which probably means something to each of you. So I thought: smart city, there are so many processes involved here that are digitally controlled. Does anyone stop to ask the ethical questions? Through the books I read, which I'm going to tell you more about later, I found several examples of how this plays out, and I'm going to pick three of them. The first example I got to hear about is failing facial recognition.
These are images of faces that are matched against a database, and this kind of software often has problems with people of color and trans people, whom it cannot recognize as accurately because they are often missing from the training data. What does that lead to? Well, if the police were to use such software to recognize people they are looking for, then Black people would more often be falsely suspected, arrested, and stopped and searched. That's the first example.

Example number two: predictive policing. Predictive policing is the attempt to predict crimes. You feed a piece of software, quite often an algorithm linked to some kind of artificial intelligence, with police data and socio-demographic data with geo-information, and then you try to predict when and where a crime will occur. Often a kind of heat map is shown as an output, and that is then used to plan police staffing and police operations. The claim is that this can work with accuracy down to the minute. Often it's about drug deals, burglaries and things like that. The criticism here is that there is a strong threat of feedback loops: the data that is used as input is influenced by the output. We may have data that originated from racist police planning, and then we find an apparently high density of crimes in certain areas, which leads the program to say: step up your patrols there. So the police find more crimes there, these data are fed back into the software, and that magnifies the trend. These systems also seem very successful, because of course the number of detected crimes rises; what we don't see is what happens in the other districts.

Third example: sexist snow-clearing practices, a very interesting example.
In the town of Karlskoga in Sweden, the town council ran an audit of all its policies for gender bias in 2011. And they found: well, if we clear the main roads, the main arteries of traffic, of snow first, then we are privileging people in cars. That is just a political decision, perhaps. But two thirds of public transport users in Sweden are women. And it turned out that many accidents leading to hospitalization were pedestrian accidents; these mainly happen in winter, and they affect women, because a large proportion of pedestrians are women. So what Karlskoga did was change the order in which they clear their roads of snow: they start with the smaller roads, the pavements and the bus stops. This way the number of accidents decreased, because accidents involving pedestrians decreased, and that actually saved the municipality money.

There are more examples, both with a connection to towns and without. Let's just mention racist credit-approval practices: banks using systems for credit approval that are based on racist data.

So what can be causes of discrimination, both in a municipal context and elsewhere? Well, one is a lack of diversity in the teams that develop the software. With facial recognition software, the problem often is that the developing teams are mainly male and white, and no one, therefore, notices: oh, hang on, we should have tested with people of color as well. Then there is predictive policing; we've talked about the feedback loops already: the input that is fed into the system comes from the outcome of the system. Also from the predictive policing area: data that is itself the result of inequalities such as racism, which leads to areas with a larger share of Black population being more heavily policed, this data being fed back, and the inequalities being magnified.
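The feedback loop described above can be made concrete with a toy simulation. Everything here is invented for illustration: two districts have the same true crime rate, but one starts with more recorded crimes from biased past policing, patrols follow the records, and detections are fed back into the records.

```python
import random

def simulate_patrols(true_crime_rate, steps=50, seed=42):
    """Toy feedback-loop simulation: two districts with the SAME true
    crime rate, but district 0 starts with more recorded crimes (e.g.
    from biased past policing). Patrols follow the records, detections
    follow the patrols, and detections are fed back into the records."""
    random.seed(seed)
    recorded = [30, 10]  # biased history; the true rates are identical
    for _ in range(steps):
        total = sum(recorded)
        for d in (0, 1):
            patrol_share = recorded[d] / total
            # the chance of detecting each of 100 potential incidents
            # rises with police presence, not with the true crime rate
            detections = sum(
                1 for _ in range(100)
                if random.random() < true_crime_rate * patrol_share
            )
            recorded[d] += detections
    return recorded

crimes = simulate_patrols(true_crime_rate=0.2)
print(crimes)  # district 0 accumulates far more recorded crime
```

Even though both districts are identical by construction, the district with the biased starting record ends up with several times more recorded crime, which is exactly the magnification the speaker describes.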
Another problem is decisions based on statistical averages rather than individual data. For example, I might apply for a loan and the bank bases its decision on the average for the population in my district. The people in my district may have problems repaying credit on average, but that says nothing about me as an individual; a decision would perhaps be better based on my wages, my tax returns, data that is actually about me as an individual.

Context factors are important too. In healthcare, for example, there is the sex-disaggregation of data: if I don't separate the data by sex, I have a much harder time assessing how men and women respond to a certain treatment or medication.

Another very problematic thing that leads to inequalities is ratings and assessments by other users. For example, if a restaurant run by foreigners is given worse reviews, not because the service was bad, but because people didn't feel comfortable due to their racist stereotypes, then this restaurant gets worse reviews, leading to fewer people going there, and that might then have further consequences: discrimination through reviews. Another example is the rating of platform workers, for instance Uber drivers being rated and the decision whether they can keep driving being based on that.

So much for that, but how does all this translate to a city or municipal environment? As the herald said already, thank you, by the way, I think we all know that the influence of IT systems is constantly rising in municipalities and town administrations: from the control of certain procedures and town planning up to individual decisions in town administrations. There are interesting examples: the Woven City that Toyota is trying to establish with a lot of digital planning, and in the Arabian region there is the so-called Line.
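The contrast between statistical averages and individual data can be sketched in a few lines. District names, rates and thresholds are all invented for the illustration:

```python
# Sketch: a credit decision based on the district average versus one
# based on the applicant's own record. All values are hypothetical.
DISTRICT_DEFAULT_RATE = {"district_a": 0.25, "district_b": 0.05}

def approve_by_district(district, threshold=0.10):
    """Statistical decision: judges the individual by the group."""
    return DISTRICT_DEFAULT_RATE[district] < threshold

def approve_by_record(income, debt, threshold=0.35):
    """Individual decision: uses the applicant's own finances."""
    return debt / income < threshold

# A reliable applicant who happens to live in the poorer district:
applicant = {"district": "district_a", "income": 3200, "debt": 400}

print(approve_by_district(applicant["district"]))                 # rejected for where they live
print(approve_by_record(applicant["income"], applicant["debt"]))  # approved on their own data
```

The same person is rejected under the group-average rule and approved under the individual rule, which is the unfairness the talk points at.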
Those were two new examples, but there are existing examples that are interesting to look at too. Rio de Janeiro, for example, has very extensive early-warning systems for natural disasters, and Barcelona does a lot of traffic management this way; they are regarded as worldwide leaders as far as existing smart city systems are concerned.

What's the situation like in Germany, though? We mostly have new projects; the whole smart city issue has been slow to take off in Germany. A lot of it is about smart street furniture, as it were: waste bins that report how full they are. Data platforms are created that collect all the city's data, and of course the town administration itself is being digitalized. On the legislative side, there is not a lot that regulates digital projects like these, and not a lot about discrimination in this context.

There are a few initiatives in Germany. For example, the last German government set up a Data Ethics Commission, which deals not just with the municipal area but with discrimination as a whole in the digital space. There is the anti-discrimination agency at the federal level. Mr. Steinmeier, who used to be the foreign minister and is now the German president, started a project. And the interior ministry started a project, the Smart City Dialogue: they wanted to establish experimental model municipalities and dialogue platforms. In that context a Smart City Charter was created, which says what municipalities should look out for as they start smart city or digitalization projects. There are some very general and simple ideas here, such as inclusive practices, and that sovereignty over the data should always remain with the municipalities; the data should not be handed over to large companies such as Google or Amazon.
Another item, of course, is open data: science, journalism, private citizens and NGOs should be able to work with the data that is produced by citizens, as a public good that is used and evaluated. A very important item that various institutions keep coming back to is that there should be an exchange of knowledge and competency centers, so that municipalities and cities can learn from each other. The Smart City Charter remains very general, though; there are no details, just guidelines.

What is more concrete is the Data Ethics Commission of the federal government, not so much focused on cities as on the state as a whole. They formulated a few demands that should be adhered to, the most important being a risk-adapted regulatory system. That means a technology impact assessment is supposed to be made, assessing the impact of an IT system on society, on business, on various fields and issues. From that a damage potential is derived, which is the probability of harm combined with the severity of the harm, and from that the idea is to derive regulations: whether such a system should be permitted, to what extent, and how. They also propose a quality seal, a quality label, so that I as a consumer can find out what the impact will be. Furthermore, they want to establish a competence center on data ethics for the whole of Germany, which could develop new regulations beyond the classical issues of privacy to deal with data ethics questions, and maybe an extension of the protected characteristics. The classical anti-discrimination law that exists is mostly based on five characteristics, sex, religion, party affiliation and so on, and these should be extended.
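The risk-adapted idea, damage potential as probability of harm combined with severity, mapped to escalating regulatory tiers, can be sketched as follows. The thresholds, tier names and example scores are invented for illustration, not taken from the Commission's report:

```python
# Sketch of a risk-adapted regulatory ladder: damage potential is
# probability x severity, and higher potential triggers stricter
# oversight. All thresholds and labels are hypothetical.
def damage_potential(probability: float, severity: float) -> float:
    """Both inputs normalised to [0, 1]."""
    return probability * severity

def regulatory_tier(potential: float) -> str:
    if potential < 0.1:
        return "no special measures"
    if potential < 0.3:
        return "transparency obligations (e.g. quality label)"
    if potential < 0.6:
        return "ex-ante approval and ongoing audits"
    return "partial or full ban"

# Invented example scores for two very different systems:
movie_recommender = damage_potential(probability=0.5, severity=0.05)
predictive_policing = damage_potential(probability=0.6, severity=0.9)

print(regulatory_tier(movie_recommender))    # a low-stakes system passes freely
print(regulatory_tier(predictive_policing))  # a high-stakes system needs approval
```

The point of the scheme is that a harmless recommender and a policing system are not regulated alike: the same formula routes them to very different obligations.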
I read various documents, not just the Smart City Charter and the Data Ethics Commission's report but also material from the federal anti-discrimination agency, and I'm going to summarize a few suggestions for how to avoid discrimination, or at least some of them.

The first point is exchange and competency. Cities like Hamburg that do a lot in the smart city direction are supposed to support smaller municipalities in judging what they should do and how, and to share best practices. Part of that is digital sovereignty: Hamburg, for example, provides access to its developments, so other municipalities don't have to buy software from companies.

Another thing many people want is free access to data and algorithmic systems. The GDPR allows me personally to access the data a provider has stored about me, and we all know that we can demand that it be corrected or deleted; here they want to extend that to provide more access, so that scientists or journalists, for example, can assess the impact of such systems. We want more transparency and a public oversight body that has access. Ideally this should include documentation, to really know what the system can do, and maybe the public institutions could then also issue a seal of quality, a quality label. That is also connected to an impact assessment. Another demand that gets mentioned a lot is that the laws have to be adjusted, because right now they are very focused on privacy and not on the whole field of discrimination. And a demand I find interesting is an algorithmic accountability codex: like the Hippocratic oath for doctors, but for programmers.

In my master's thesis I am looking at three example cities that are hopefully quite representative of, or rather ahead of, German cities. First of all Darmstadt; they have their own approach.
They founded a company that is 100% owned by the city, and this company first of all created an Ethics and Technology Advisory Committee. The company is advised by this committee to make good decisions and avoid discrimination. The first step, and this is why Darmstadt is quite interesting as a case study, is that they established ethical guardrails for how their projects should be run: general points, such as that projects should serve the community and that there must always also be an analog way, not digital-only, but also that an impact assessment is attached to all projects. I talked to people at this company, and they said that's more a target they want to reach; they are not quite there yet. In the advisory committee there is a subgroup that already does this for some projects, but not for all of them. And as in most smart city plans, there is a data platform where all the public data from sensors and other sources is collected. There is not only motivation from the city side, but also an engaged civil society, in the city lab or in the group "democracy instead of surveillance", and they are invested as well.

Then I have Hamburg. Hamburg is always called the furthest advanced, and that's probably because they have been at it for a long time; they have their own state office for IT and digitalization. They have a proper strategy for digital life, and their goal is to have central building blocks for projects that can be reused. They also have guidelines, big guidelines for the university, for example, that were just developed, and their digital strategy, which is quite far-reaching and has a big focus on digital sovereignty. But also on the small scale they have made decisions that protect citizens' data security: for example, they only want cameras that see in infrared, so that they are unable to read number plates.

And the third example is Ulm.
Ulm has its own concept for data ethics, and that makes them a little unique in the German landscape: where others have a broad concept, Ulm has a very specific one. I recommend reading it, especially if you want to go further, because they also give further reading recommendations; it's very interesting and a good starting point to learn more. And, and this is something many people want, with a parallel in Barcelona: if you as a city make contracts with private companies, then share those contracts. Ulm, specifically, made a pact with e-scooter suppliers: the city can make demands about where the e-scooters may be placed, and in return it receives data about the e-scooters, so it gets information about the traffic. So they are in a dialogue; the data does not stay only with the e-scooter companies but is shared with the city, and the city has influence. There is also a similar pact between Vodafone and Barcelona.

Those are my three current examples, and these examples are good; they are very conscious of the issues. But then there are Cologne, Bonn, Dresden: they want to be smart cities, but they don't seem very interested in data ethics. So there is a huge range among smart cities in Germany, but there aren't many laws yet that concern themselves with more than just data protection, and it's almost impossible for all the cities to shoulder everything themselves. Many of them just want to be a smart city because it's a new thing, a hip word, but they don't know what it implies.
And yeah, I showed some examples. The Bertelsmann Foundation does a lot here, but it's still early days; it's just beginning right now. And I think that especially now, with these early studies, decisions have to be made, laws have to be passed, and the foundations have to be laid. It's not just about encouraging people but about providing concrete help in how to really turn decisions into reality. One point that I see all the time and that I really want to stress: there has to be an exchange that is actually used, so that cities not only exchange nice ideas but share concrete projects and help each other concretely, so they know how to make those ideas a reality.

I mentioned some material, and this is the slide with everything I would recommend. If the topic interests you, a good start is this study by the anti-discrimination agency on what data ethics is; it talks about all these points, what the problems are and how we can attack them. I can also really recommend three books on the topic. The study is more than 200 pages; the books may be more easily digested, they are popular science. Especially Weapons of Math Destruction, which talks about data science and what the problems, or potential problems, are. The slides will hopefully be on my website tomorrow; then you can also use the links I put in here.

Last but not least, I am in the middle of my master's thesis, and I hope I can find more examples. My problem is that most of the examples from the books are from the English-speaking world, the US or Great Britain, but there is very little happening, or at least published, in Germany. So if you know examples from the German-speaking area where state IT, or not just state or city IT but IT in general, has been discriminating; and the second question that I have discussed a lot: can sensor data be
discriminating? Can city data be discriminating? Or do you have ideas about that? Write me a mail, or maybe just join the breakout room and we can discuss it. Thanks a lot.

Dear Leonhard, a very warm thanks for this elaborate and in-depth talk. I am very happy that you gave it to us, because it will be interesting and raise awareness for cities that are discovering a new universe, with its many options and the possible consequences of such projects, not just through your thesis, which is of course upcoming, so that they realize that with their data they could discriminate. I hope that your talk will contribute to digitalization projects being pursued with increased awareness, where the risks involved are also taken into account. Thank you for that. We have a question-and-answer pad, and there are some ideas there, impulses maybe, and also, to come back to your request that you are looking for examples of municipalities in Germany beyond Darmstadt and Hamburg, I will come back to that quickly. Would you perhaps tell us what is so great about Barcelona's traffic planning? What do they do?

Well, Barcelona is something else. They are tackling the question of how a smart city can be just, and one question they attacked quite early is that the data of the citizens belongs to the people and not to some company. That has several implications in everyday life, but it is the foundation they built on. One example: the Barcelona city council has a contract with Vodafone. Vodafone is allowed to provide the phones for all the civil servants of the city, but in return, because Vodafone is able to calculate where many people are in which area, they have to provide that data, and then the city can use it. The idea was: why should only Vodafone have this data? And since we were talking about traffic before: Barcelona has an extensive bike-sharing system that is used a lot, and there, too, there is a
person who is responsible for data protection and data management, and with them the city can see how many bikes are used where and how they are moving. Again, these are data that are generated in the city and that should belong with the city and stay with the city; that data sovereignty is actually an interesting point.

Just as an example, take the Luca app, which of course had a role in the fight against corona in Germany. Just by its wide spread it is interesting for, in quotes, "new functions". Well, I'm not going to say it's a threat, but maybe there is an enticement, a temptation, for municipalities that don't have the technological competency themselves to rely on companies, and then of course they have a private organization with a certain interest in the data. So do we need certain safeguards? Is it the GDPR, the European data protection regulation, that can help us there, or are there any local laws, maybe, that we can use?

Well, that's quite difficult, because there are two topics that overlap here, in my opinion. One is privacy and data protection; the other is the right not to be discriminated against. For the first there is the GDPR and the data protection laws in the European realm, plus the Privacy Shield with the US, for example. For the other, the right not to be discriminated against, there is not much: the right exists, but it covers just the five attributes. And the question is: with machine learning, it's quite easy to find correlations, for example between place of living and ethnicity. Ethnicity is a protected characteristic, but if I know where you live, then I can estimate who you are. So data protection is nice, but quite often it's not enough, and it's also not really what this is about. We have to rethink anti-discrimination law: what kind of data can be used and abused, and what has to be protected? Sexuality, gender, ethnicity are obvious, but what else do we need? So yeah, we have to take it further, but as far as I know, that's not being worked on. Thank you. Of
course, that reminds me of the example we had in advance of the talk. A company with good intentions in a large US city wanted to employ only people who lived close to the company headquarters, in order to reduce the impact on the environment. Then it turned out that people of color were the ones living further away, because rents were cheaper there. So there was massive indirect discrimination involved: everything was well intended, but in the implementation they didn't think of the impact. Another question I had: there are systems today in holiday locations such as Venice that track holidaymakers. Do we know of any negative impacts here?

Well, as for what negative impacts are known: I would recommend an excellent article in the New York Times. They bought a data set of movement data from a data broker. There are a lot of apps that just ask for location data and hand it over along with other data, and of course that makes sense for the advertisers that get access: if ads have that data as well, and one place collects all that advertising data, then they can build movement profiles. So the New York Times bought such a data set, followed some of the data points in the set, and found, for example, a senator or some high politician who attended certain protests. If you don't close your apps or turn your phone off, it keeps making connections in the background, and so this person's trace follows exactly the protest route. That's very interesting data, which protests somebody attends, and of course that's possible not only for politicians but also for private people, and that's just as interesting: this person has their home and work and a third place; what could that be? Could it be a strip club or something? So for all of us, movement data is very sensitive. What shops are you visiting? Where are you driving?
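The re-identification trick behind the New York Times investigation mentioned above is simple enough to sketch: the most frequent night-time location in an "anonymous" trace is almost certainly home, the most frequent daytime location work. The pings below are invented for the illustration:

```python
from collections import Counter

# (hour_of_day, coarse_location) pings for one "anonymous" device id;
# all locations and times are invented for the sketch
pings = [
    (1, "elm_street"), (2, "elm_street"), (3, "elm_street"), (23, "elm_street"),
    (10, "office_park"), (11, "office_park"), (14, "office_park"), (15, "office_park"),
    (19, "protest_square"), (20, "protest_square"),
]

def most_common_where(pings, hours):
    """Most frequent location among pings falling in the given hours."""
    locations = Counter(loc for hour, loc in pings if hour in hours)
    return locations.most_common(1)[0][0]

home = most_common_where(pings, hours=range(0, 6))   # night-time pings
work = most_common_where(pings, hours=range(9, 18))  # working hours

print(home, work)
```

Knowing home plus workplace usually narrows an "anonymous" trace to one person, and the remaining pings then reveal, for example, which protest they attended.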
Very private data. At the end of last year, at RC3, we had a very interesting talk about the city of Oldenburg in northern Germany, I think it was Oldenburg: they had a public transport app which kept on tracking even after people had left the public transport. So the potential you talked about is enormous there.

Yeah, but it's also difficult to say what is a privacy and data protection issue and what is discrimination of a single person. I know what demonstrations or protests they're going to and can maybe send them targeted ads, but where's the potential for discrimination of whole groups?

You talked in that context about ethical guardrails, and I think the city of Oldenburg had prepared some guidelines, ethical guardrails. The question here: would something like this be something you could take to the courts and use to claim your rights?

Well, the issue is, as mentioned, that anti-discrimination law isn't that far developed. But it's also very difficult to prove: you first have to notice that you were discriminated against, and even if you notice it, how do you prove it? That's very difficult, also in front of a court; it depends on the case, but in general, how do you prove it? And how can you prove it if you are not a computer scientist? Maybe you can't; you have no idea how an AI or machine learning works, so you would need to know a lot of things, and often the system is protected by copyright and the company just says: well, that's our secret.

Thank you, but I fear that we're running out of time despite the questions we still have, so let's change over to the breakout room; the link to the breakout room is on the schedule page.
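As a closing illustration of the proxy problem raised in the Q&A, that place of living can stand in for a protected characteristic like ethnicity, here is a toy demonstration with entirely synthetic data. The "neutral" rule never sees ethnicity, only postcode, yet it reproduces the bias:

```python
import random

random.seed(0)

def make_person():
    """Synthetic resident: postcode is correlated with ethnicity
    (residential segregation). All values are invented."""
    ethnicity = random.choice(["majority", "minority"])
    if ethnicity == "minority":
        postcode = "north" if random.random() < 0.9 else "south"
    else:
        postcode = "south" if random.random() < 0.9 else "north"
    return ethnicity, postcode

people = [make_person() for _ in range(10_000)]

def approved(postcode):
    # a "neutral" credit rule that never sees ethnicity, only postcode
    return postcode == "south"

rates = {}
for group in ("majority", "minority"):
    members = [pc for eth, pc in people if eth == group]
    rates[group] = sum(approved(pc) for pc in members) / len(members)

print(rates)  # approval rates differ sharply despite no ethnicity input
```

This is why the speaker argues that dropping the protected attribute from the input data is not enough: anti-discrimination law has to reckon with correlated proxies as well.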