So hello everybody, here on site and at home before your screens, on whatever device — welcome to another session of Making Sense of the Digital Society, for once here in Frankfurt am Main, within the festival Politik im freien Theater. That makes this a very special and new edition, because our series has been running for more than five years now; the first session was in December 2017 in Berlin, and we have been going ever since. It is a joint venture, so to speak, between the Federal Agency for Civic Education — in German, the Bundeszentrale für politische Bildung — and the Humboldt Institute for Internet and Society. So thank you for having us here in Frankfurt for this venture out into the republic, so to speak. The structure is pretty straightforward: we have a talk by our guest of about 40 to 45 minutes, then a one-on-one conversation here on stage, followed by questions from the floor — we have a microphone going around to take your questions. And for those of you watching at home, or wherever, there is a participatory tool called Slido where you may ask your questions; we bundle them up and they are read out to us here on stage. I would also like to draw your attention to a very nice tool by the Humboldt Institute for Internet and Society: an online compendium — you can see the URL right there. Most lectures and talks of the series, in its fifth year now, are online as video, in full length, but there is also a whole lot of extra content: podcasts built around thematic clusters, additional interviews conducted after the lectures. It is a gold mine, if you ask me, if you want to see and hear top experts on so many aspects of the digital society — please check it out. One last note, and this one really may be worth considering for teachers and lecturers who are not thoroughly familiar with the subject matter
— who is, really? And if you're not a professional in the field, that includes me, so to speak; at least I'm not a researcher in any way. It is really hard to stay on top when we talk about this digital transformation. Where do you turn if you want to brush up, especially now, amid so many crises of our day? Long gone are the times when I read at least one newspaper in the morning, almost from start to end — when I started college it was actually two newspapers a day that I browsed through in the morning. Then maybe some people consulted the TV newscast in the evening, maybe a radio bulletin at midday, maybe one in the afternoon, and so forth. Today we would call this excessive media behavior, right — so many newspapers, so much TV at night. But was it really that excessive? You're probably right, and my working in the media field had to do with it. But if you compare what I just outlined to what many people do today — consuming or commenting on the internet, whatever the device — my old, professionally motivated media habits all of a sudden seem a lot less excessive. Because daily screen time does not seem to stop growing; because endless doom-scrolling through bad news has nowadays become a phenomenon with a name; because we do almost everything online — and we don't even have to be online: if you walk the streets, there is a whole lot of recording going on, which we're going to hear about tonight. And the pandemic, of course, has increased the time spent online even more. So many of us do not just consume one or two offline media a day but visit very many different sites all the time. While we trawl the net, we are being tracked and surveilled. Of course, we may be able to manage our cookies now in the European Union — but do you really manage your cookies every time you enter a website? I don't. But there is one principle that applies to online as to offline space — one of many principles, of course: that increasing traffic
also means less security. That's the case for roads and cars, and it's the case with internet traffic just the same. Wanting both mobility without end and high security poses a series of logical problems to begin with, and tonight's edition will, I think, shed light on some of them. We have touched on this question several times in the past five years of this series: how to resist massive data trawling and brokerage — what can be done on an individual level, what on a political and regulatory level? We have usually discussed these hands-on questions towards the end of our sessions and conversations, and of course they are notoriously hard to answer — and science does not necessarily have an obligation to answer them either, I think. But to be honest, at those sessions the speakers, mostly professors, confessed it was easier for them to limit their internet traffic and do some form of digital detox because they were privileged enough to have secretaries and research assistants, for example. Tonight is different, though, and I'm quite sure this is not a night about detoxing. Do not expect a manual on how to be good and safe online, or on the road — at least I suspect you should not expect that. But "Resistance in the Datafied Society", as tonight's lecture is called, is probably a good hint that we're going to treat more prominently the questions of what to do, or what would be desirable to do — what we have to be able to imagine first in order to find counter-strategies to online surveillance, data mining and so forth, be it by corporate players or, of course, the state. Our guest tonight has travelled all the way from Amsterdam to Frankfurt by train, and she made it in time — so thanks to public transport in Germany; you can't say that every day. She was born and raised in northern Italy, where she did her first studies. She then moved on to the European University Institute near Florence to get a PhD in political and social
sciences. Currently she is Associate Professor of New Media and Digital Culture at the University of Amsterdam and a faculty associate at the Berkman Klein Center for Internet and Society at Harvard University. But our guest tonight also partakes in — to quote from her fabulous website, go check it out — grassroots engagement with data and data infrastructure, called data activism. For example, she is principal investigator at the DATACTIVE Ideas Lab. Maybe a quick reference to some of her books: almost ten years ago now she published "Social Movements and Their Technologies: Wiring Social Change"; last year she co-edited a volume called "COVID-19 from the Margins: Pandemic Invisibilities, Policies and Resistance in the Datafied Society"; and this monograph is in preparation, I'm told: "Data Activism: From Information to Agency". I think we're all going to get a glimpse of that upcoming book. But now please welcome, from Amsterdam all the way to Frankfurt, Stefania Milan.

Good evening everyone, and thank you very much to the organizers for having me here. It's a pleasure for a variety of reasons, including because this is one of my first talks after maternity leave — so let's see whether I still remember what I used to do before my life changed for good. And you're going to see in the talk several references to thinking about what it means to grow up today in the datafied society. So, meet Robert Julian-Borchak Williams. The man that you see there was, a couple of years ago in Michigan in the United States, wrongfully arrested for shoplifting in an expensive boutique, because he had been misidentified by facial recognition technology. Police forces had decided that some pictures taken by a security camera in the shop — and they were pretty blurry pictures — corresponded to the face of this guy, who actually turned out to be completely innocent and definitely not a shoplifter. The experience wasn't particularly pleasant, and he eventually got
to receive the apologies of the police forces afterwards; but in the meantime he was in jail and had to defend himself. Against whom? Well, of course police forces, of course security cameras, of course surveillance — but in fact here we are talking about decisions made by an algorithm, or a series of algorithms. An algorithm that, as is typical in facial recognition technology, identifies some features of a face — not actually the complete face, but the distances between certain points, like the tip of the nose and the corners of your mouth — and on the basis of these distances, these measurements, matches the picture against an existing database and identifies suspects or candidates. Now, facial recognition technology has been used by police forces for over two decades, although it has long been known to be somewhat faulty. Robert was allegedly the first victim of algorithmic decision-making — or really the first that made the news, that hit the media; in fact, unfortunately, there are many more Roberts around us, as facial recognition technology is increasingly a staple of contemporary society. You may think: well, I'm not Robert, I don't have dark skin, so probably I'm protected from all of this; I live in Germany, where these things don't happen. But in fact facial recognition technology is really among us, and much closer than we may think. For example, it is increasingly deployed in education, in public schools. A few years ago some French schools experimented with facial recognition technology: in Nice and Marseille they were using a tool provided — for free, and this is an interesting point — by the US-based tech company Cisco, whom you might have heard of, and this technology was used to control access at the entrance gate. In Paris, around the same time, facial recognition technology was used by a number of schools to detect facial expressions and eye movements, to see whether students were paying attention to the lecture.
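The matching step described here — reducing a face to distances between a few landmark points and comparing those against a database — can be sketched as a toy example. The landmark names, coordinates, and threshold below are all made up for illustration; real systems use learned embeddings and far more data points, and work nothing like this in practice:

```python
import math

def signature(landmarks):
    """Pairwise distances between landmark points form a simple face 'signature'.

    `landmarks` is a dict like {"nose": (x, y), ...}; the same key order must be
    used for every face so the distance vectors line up.
    """
    pts = list(landmarks.values())
    return [math.dist(a, b) for i, a in enumerate(pts) for b in pts[i + 1:]]

def best_match(probe, database, threshold=5.0):
    """Return the name whose signature is closest to the probe's signature,
    or None if even the best candidate is farther than the (arbitrary) threshold."""
    probe_sig = signature(probe)
    best, best_dist = None, float("inf")
    for name, landmarks in database.items():
        d = math.dist(probe_sig, signature(landmarks))  # distance between signatures
        if d < best_dist:
            best, best_dist = name, d
    return best if best_dist <= threshold else None
```

The threshold makes the trade-off in the lecture concrete: set it loosely and blurry security-camera pictures will "match" innocent people; set it tightly and the system finds nothing at all.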
So it's no longer enough to just sit at the back of the room and hide behind a taller guy, right? Because the software is going to track you down and, of course, report you to the teacher as not having been a very good student. Similar stories were found in other European countries too, for example in Sweden. And those were merely trials — we're talking about 2017–2018 — but this is indeed the future that awaits our children and our students. You might remember, in the pandemic, the famous proctoring software to make sure that students taking exams at home would not cheat; I have an example of that later. But yesterday I was in Amsterdam and I popped into a big fair called EduTech Europe. This fair is held in various countries; the 2022 edition was in Amsterdam, and it ended tonight. The buzzwords there were "emotion detection" and "personalized learning". Now, there was a lot of strange talk about all of this, and as a teacher I often found myself thinking that personalized learning is indeed something we strive for even without technology — community building the same — although there the story was that we really need technology to solve all these problems, including for example teacher shortages. And I found out about a tool called Smile ML — the picture that you see on the screen comes from the description of this tool, which is powered among others by Google for Education — which uses AI, artificial intelligence, to capture human sentiment. The idea here is that the teacher should have constant feedback from the class, because happy pupils are more prone to learning. A suspicious person like me may wonder whether perhaps this is also conducive to becoming happy consumers — but maybe that's just me being a little suspicious. This is the type of feedback that you get, and it is just one of the many components being offered to teachers. We live in an increasingly datafied society. The urban environment is
datafied — think of the smart city. The workplace is datafied — think of on-demand labor, gigs mediated by platforms like Uber or Deliveroo or whatever service delivers food to your doorstep in Germany. But friendships and relationships are datafied too — think of social media, but also of apps like Tinder. And our health is datafied as well: as a woman I cannot but think of period-tracking apps, which are a very good example of being constantly monitored — or actually of deliberately, constantly monitoring ourselves — through technology. The message here, really the core mission of the datafied society, if you will, is to make the world better through data. And information in the datafied society has become a constitutive force: it is a commodity, something that companies trade and thrive on; it is a currency also for the state, which wants information about its citizens; but it is also something that is not only collected but used to shape social reality. Think, for example, of social media content able to steer elections — and of course we think of the Trump election in 2016, but we also have evidence, from the 2021 parliamentary election in the Netherlands, of the role of Facebook, for example: not in massively steering votes, but in promoting homophily — promoting the formation of close-knit communities of people who think alike and, of course, not promoting exposure to diverse ideas. In a nutshell, the message of the datafied society is that data has, or must have, a better idea than us tiny humans — and at the same time that data has no values, that it is objective and knows better, although it is definitely not as neutral as we are told all the time. A lot of people, not just me, have contested this idea, but it is still one of the leitmotifs of the times we live in. Now, the COVID pandemic has
possibly accelerated the datafication of society, because it worked as a test lab to pioneer or repurpose technologies — including, for example, biometric surveillance — on the promise, faulty to some extent, of curbing the diffusion of the virus. We were told, for example, that contact-tracing apps would definitely help us; maybe in Germany they worked, but I can tell you that in Italy the project was very quickly abandoned, for the obvious reason that they were not able to track anything, nor to handle the data in a safe manner. But this is just a tiny example. Digital identity is another case in point: there has been a massive increase — around 80 per cent — in the adoption of digital identity solutions, in the marketplace but also at the state level; that's the case of Canada. When the public administration cannot deal directly with its citizens, it goes for this sort of solution, which might be very convenient: in the Netherlands, for example, through your digital identity you do pretty much anything, from enjoying health care to the less enjoyable business of paying taxes. But of course this also comes with shadows, in the sense that it is not always clear who holds the data, and for what purposes. And given we are in Germany, I cannot but think of the Luca app, and the fact that, although it was promised otherwise, the data were released to police forces — beyond the app's original stated use of curbing the diffusion of the virus — to instead track crime more generally. So the pandemic — and these are just some examples — has on the one hand augmented the governmental demand for problem-solving through technology, but at the same time it has overridden public concerns over privacy risks. We were more prone than before to say: yes, take whatever you want of my data, just so that I can go back to dancing, back to the restaurant, back to the stadium. And in fact that is how technologies like
facial recognition were often introduced in Italy, for example, in the aftermath of the pandemic — assuming we are already out of it, which is probably not the case. When football finally resumed, the Roma football club spent several million euros to implement a system of facial recognition cameras in its stadium, with the byline that now we could finally return to the stadium, because the system was able to track the temperature of stadium-goers. But no one, at least to my knowledge, publicly raised questions of privacy, or of whether this was the most welcome, the most necessary, even the most useful measure to take. And although we have been promised a better world through data, datafication contributes to augmenting inequalities and perpetuating injustice — it hits hardest in marginalized communities and in countries with poor rule of law, and it affects vulnerable communities, vulnerable groups and racialized individuals the most. I mentioned earlier the software used by a lot of universities across the world to allow students to take exams in the solitude of their bedroom at home during the pandemic. How can you make sure that students do not cheat, that no one passes on suggestions, that they are not opening books — or that they don't have the entire book pasted to the wall in front of them? How do you make sure an exam is still fair, which as teachers we are always quite concerned about? Well, the software used by most Dutch universities to monitor these remote exams and ensure students wouldn't cheat discriminated against dark skin tones, which led a student of the Free University of Amsterdam to file a complaint before the Netherlands Institute for Human Rights — and we are still awaiting what the Institute will decide. The picture here actually refers to a similar case — in fact a collection of scary cases: a documentary that, if you have
not watched, I invite you to take a look at. It's called "Coded Bias"; it was released in 2020 and recounts the story of Joy Buolamwini, an MIT student who, also because of her skin tone, had problems using facial recognition software during her schoolwork — which led her (and here we are talking about resistance) to protest big time: not only to end up in a documentary, but also to start an organization, the Algorithmic Justice League, that tries precisely to change this type of situation for a lot of communities of color, and more. But so far my examples were mostly from Europe, from Germany — from what we can, rightly I think, consider privileged countries and privileged situations, although, as we know, pockets of poverty and dispossession exist also in our very rich Western societies. If you go beyond that, to countries with maybe not such a strong rule of law, you can see situations that are even worse, meaning technology is implemented and deployed where there are few safeguards for citizens, and the result is often fairly scary. In India, as my colleague Silvia Masiero of the University of Oslo reported, biometric identification was used to give people in need access to rationed, subsidized commodities — so, food — through a network of ration shops: food that families in need could access through their fingerprint. The use of biometric identification brought the program to a halt, because of the risk of disease transmission associated with fingerprint identification. So you can imagine a situation in which a country closes down, people who often work in the informal economy are not able to do so anymore — and these were people who even earlier, in normal conditions, relied on food subsidies — and all of a sudden, because the biometric identification implemented as part of the digital identity
scheme was deemed dangerous in that situation, all of a sudden they cannot put food on the table. So privacy — all the concerns of the datafied society, you name them — often sounds like a luxury problem: something we have the time, the energy, the education and the skills to think about, and sometimes also the right machines and the right devices. But in fact the consequences are probably even worse for people who don't have all this luxury assistance. So, is it a system failure? For sure, as a sociologist, I feel that datafication changes us and, to some extent, endangers the democratic system as we know it. And I talk about democracy not because it is the only point of reference, the only or the best system that exists — although about half of the world's population resides in a democracy, not all of them working to perfection — but I refer to democracy, and to liberal democracy in particular, because it is the system that constitutionally upholds the highest safeguards for its citizens, in terms for example of human rights. If democracies suffer, imagine what happens in authoritarian countries, where technology is actively used, for example, to surveil people. But what I'm going to talk about now is democracy — and it is, in a way, a system failure, or it could become one: datafication has accelerated the crisis of liberal democracy. Now, we have already heard a lot, in the popular media and in bar talk, that there is a problem with liberal democracy, a problem with voting: we know that in representative democracies voter turnout keeps getting lower, that younger generations are not particularly in tune with the idea of voting, and we have seen some populist governments emerge from these voting exercises. All of this has been
widely studied by a lot of great colleagues, but here I want to draw attention to what datafication does to the system of liberal democracy. I will mention only three elements among a probably much longer list — these are the ones I have done research on, so they offer only a partial depiction of the problem; you can add your own from your own perspective. The first is the observation of the increasing role of industry in democracy — industry meaning a handful of large corporations, monopolists or semi-monopolists, very vertically organized, that play an increasingly big role in state affairs, visible above all in the mediation of the relation of the state with its citizens. And here I am not talking about the problem of lobbying. You might be aware that at the European Union level, in Brussels, people are right now discussing the new AI Act and the new Digital Markets Act — legislation that is supposed to set some boundaries to this magmatic market. I don't have data about lobbying related to these specific acts, but I noticed that an unprecedented number of lobbyists — I think in the order of 4,000 — descended on Brussels during the negotiations for the General Data Protection Regulation, which you are probably all very familiar with: something that came into force in 2018, so already a few years back, but that created quite some hurdles, for example for US-based tech companies, because of cross-border data transfers and so on, and that protects the data of European citizens. So that is one problem, the lobbying and the influence of industry on the formation of legislation. But I am not talking about that. I am talking about the fact that, for example, in Amsterdam — and in fact in most Dutch villages and small to big cities — if I want to talk to my local representative, if you want to talk to the local
administration about a problem with garbage collection, a broken street lamp, you name it, I have to do it through Facebook or especially WhatsApp — which is in fact the same company. So we see more and more intrusion of technology into this type of relation between the state and the citizen. The same can be seen with a lot of COVID-related technology, like contact-tracing apps or the digital COVID certificate: a lot of this software was provided by the private sector, and we can only speculate what this might mean in terms of data protection and privacy, but also in terms of the democratic process of relating to my representatives. The second observation has to do with surveillance — and I'm sure in this series you have heard people much better informed than me talk about surveillance. So today I will only mention that the mass surveillance made visible by, for example, the Snowden revelations — which exposed how national security agencies across the world in fact preventively spy and collect data in a blanket manner on their citizens — this phenomenon, or rather the public's realization that such mass surveillance exists and is so massive, has altered the trust of citizens towards the state, including what has been called the social contract between the state and its citizens. And finally, a third observation of how datafication has accelerated the crisis of liberal democracy has to do with social media and the algorithmic personalization of content, including content of a political nature, operated by social media platforms — which exposes us on those platforms only to opinions like our own and increases in-group behavior at the expense of exposure to different opinions and of the fundamental dialectic exchange which is key to a healthy democracy. But today I had promised not just to talk
about the depressing part of the story, but also about resistance. So: how can we continue to exercise our political agency, our rights as citizens, in such a monitored environment? Is resistance at all possible, or is it just a utopia? I don't have an answer to that, I can anticipate. How can we organize resistance, individually and collectively, in this ever more complex datafied society? And the problem does not only concern resisting the datafied society per se — fighting surveillance in public space: I don't want cameras, so I remove the cameras or ask someone to remove them. It is also about being able to express dissent in a society which increasingly monitors its citizens even when they have done nothing wrong, at least up to that moment. Facial recognition technology — to go back to the examples I started from, and which is my current obsession, given that I'm researching it — really means giving away our identity whenever we are caught on camera: pretty much like walking around town with a passport or an ID card open and pasted to our forehead. Now, you might wonder how many cameras are around us. Well, Frankfurt — or anywhere you are — is probably better off than London, which is infamous for the intensity of its surveillance. In London, since 2020, the police have been implementing facial recognition technology in security cameras across town, although an independent review found that the accuracy of the matches — the matches between people caught on camera and people flagged as suspects — is correct in only 19 per cent of cases, meaning that out of 100 people who are identified, 81 are misidentified and might end up in jail by mistake. And given this situation, someone has calculated — London, you might object, may be a particularly bad case, but ask about your own urban environment — that each Londoner is caught on security cameras an estimated 300 times a day. So imagine, even if only half of
these cameras run facial recognition technology, how much intrusive data is collected constantly about us. In the slide here — this takes us to the United States of America, so a slightly different environment — you see a Black Lives Matter activist named Derrick Ingram, who was arrested a couple of years ago in New York because he had participated in a demonstration, a Black Lives Matter anti-racist demonstration. And the way he was identified was by matching data from facial recognition technology with some Instagram pictures. Which means, to put it very simply, that you might not have anything to hide — and I very much hope that for you — but if you have ever been on a social media platform, you, me, we are all part of one of these databases against which facial recognition technologies then try to find potential matches to identify suspects and culprits. So the situation is particularly scary for us all, despite the fact that, as I said, we might have nothing to hide. And then it shouldn't surprise us that in summer 2019 — and this is a great example of resistance — Hong Kong pro-democracy protesters, you might be familiar with the picture, tried to take down — you can see it at the bottom of the picture — the smart lampposts which were feared to harbor facial recognition technology, which would have given away their identity as protesters. This was a protest against extradition to China, and you can imagine that if you are protesting against China in Hong Kong you definitely do not want to be recognized, because that has potentially dangerous consequences. So kudos to the Hong Kong protesters: it's a great example of resistance. But how can we make sense of this? What I would like to do in the remainder of my talk is to provide some examples of how we can engage in resisting the datafied society — of what good solutions can be. Not all of them are immediate; they're not the magic
bullet that is going to solve all the problems at once, but they are something that, as a society, as citizens, we should definitely engage with and think about. There are five types of action that, as a sociologist, I would call the action repertoire: essentially known practices that social movements and protesting individuals apply from time to time to reverse a given situation or to promote social change. None of this is particularly new as a practice — it has been part of what social movements have been doing for decades, in fact for centuries, in various parts of the world; social movements typically do not reinvent the wheel all the time — but these practices have now been applied to the problems of the datafied society. Most of them make room for both individual and collective engagement: you can resist through your phone, but you can also resist in the street with other people, which is probably always a much more powerful approach. So they are basically of both kinds, individual and collective. The first one, however, is probably a bit more individual than the others, and it concerns self-defense. I was looking for a nice self-defense picture and I borrowed this one from an organization called Front Line Defenders — kudos to them, too, for all the amazing work they do in helping human rights defenders, especially in authoritarian countries, to defend their activities and their persona. Practicing self-defense in the datafied society includes, or starts from, for example, deciding on the settings of your smartphone. Now, the best option would probably be not to have a smartphone, but given that, also for practical reasons, we all have one, deciding on its settings is a good starting point: which apps should have access to location services, which apps should have access
to the camera, and so on. I actually happened to look at this on the train this morning, because I was a bit bored, and I noticed that some apps that had nothing to do with the camera — public transport apps I had installed — had automatically been granted access to a number of functionalities, like location services, that maybe they didn't need, or that I didn't want them to have. They didn't ask me; it was probably in the terms of service, but you don't read this kind of super-lengthy legal text all the time. So while the ideal might of course be rejecting smartphones, social media, bank cards and all that, that is hardly practical; making informed choices is then the best answer. It also means preferring software and services — for example the browser you use so much — that implement the highest privacy safeguards. And no, Google is not one of them, but there are plenty of other examples that you probably use already. The second strategy, the second action repertoire you have available, is subversion. Subverting the datafied society can take a lot of forms. One is the very old-school destruction adopted by the Hong Kong protesters, but it can also mean, for example, swapping travel cards. In the Netherlands, a few years ago, paper tickets were phased out, so you have to have a chip card. This chip card is usually personal — it can also simply be anonymous — but in any case data is constantly collected about your travels in the country. So what people started organizing in the Netherlands are small swap parties where they trade cards, just with the idea of messing up the data collection implemented by the card provider. This is just one example — in the interest of time I'm going to leave it at that, but there are many more creative ways of doing this. The third has to do with literacy and education. It is a continuous learning activity for all
of us, because technology of course moves very fast, much faster than our ability to catch up, to read about it and to be concerned about all the privacy hurdles it comes with. Literacy and education have to do with self-education: I have to educate myself, and continuously, not only because I teach about these things but because I also use many of them. So it is a challenge at every level, at every moment of our life. But it is also something that is extremely important to introduce in schools, and I know that some countries have timidly started to teach schoolchildren how to deal with algorithms, how to understand popularity and virality on social media platforms, for example. We need much more of that, because if we are to form the citizens of the future, they have to be informed not only about how voting works or how parliament works, but also about how to defend themselves in this increasingly difficult context. And how do we keep ourselves informed? I want to give a shout-out to what are called crypto parties. Crypto parties have been defined as "the Tupperware party for learning crypto". I don't know how many of you are familiar with Tupperware, this probably betrays my age, but essentially Tupperware are plastic boxes that our mothers used to buy, and they would buy them from each other, so it was like a pyramid scheme: they would get together and tell each other all these secrets about this wonderful plastic stuff. I say it with a bit of irony, but it really was coming together, over coffee or a beer, to exchange information about something of vital importance. And crypto parties do exactly that: they gather people who want to learn more about defending their privacy online, for example masking their communications. In the Netherlands they are called privacy cafés, and before the pandemic at least they used to be run mostly in public libraries, because that is a great public space, open to all, but also able to attract various publics, so not only the hacker-minded or those who are really obsessed with technology, but also those who simply have legitimate questions and do not know where to find answers. So literacy and education are key to resistance as well, and if we want to nurture resistant citizens, we have to start from school.

Then there are counter-imaginaries. This is probably a bit less familiar to a lot of us, but social movements have long been trying to change the terms of the debate, to promote norm change across a variety of fields. That's what the environmental movement did, for example: all of us today feel a bit bad about flying, but maybe we didn't feel as bad about it 15 years ago, precisely because the public debate on, for example, the environmental cost of flying has changed. Similarly, there are groups that work as intermediaries of a sort, experts, progressive developers, people engaged in digital rights activism, who are actively trying to change the terms of the debate for lay people as well, to help people understand technology and its challenges by giving them an alternative perspective on a given problematic aspect. They interpret the needs and serve the interests of the affected social groups by serving them a different story, and they make apparent how to respond, how to engage in a given situation. Now, this is actually a bit of a funny example: this is CV Dazzle. I picked it because it was recently also in Vogue, so it has reached a certain level of popularity. As you can see, it has to do with makeup, essentially resisting facial recognition software by using a certain hairstyle or painting your face in a certain manner:
it explores how fashion can be used as camouflage from face detection technology, which is the first step in automated face recognition. As you can understand, this is not a very practical way of going around town to hide from facial recognition technology; at the very least it would be a lot of work every morning to set all of that up. But there is a type of workshop that this project, CV Dazzle, organizes, and they serve the purpose of showing people how the technology works and then making them think, and play with it themselves, about the dangers and the intrusiveness, to understand how they feel about this type of technology, and eventually also to try to make informed choices. Now, it's not that we can always make informed choices about facial recognition technology, but this leads me to strategy number five, action repertoire number five, the last one, which concludes my talk today, and it has to do with advocacy and campaigning. Ultimately this is the most collective of all these practices; it really showcases the power of coming together as a group to protest and demand change. This is a campaign that started a few years ago, a European campaign for the most part, and the goal was, and I say "was" for a reason I want to explain in a minute, to ask the European Union to ban facial recognition technology in public space across the Union, so in all the member countries. I say it "was" in the sense that there was also an attempt to start a European Citizens' Initiative, which is the referendum instrument of the European Union, although a very, very complex one: the goal was to gather, in the space of 12 months, one million signatures from at least seven member states. Unfortunately the campaign failed to do so, although it even got an extension because of the pandemic; as you can appreciate, facial recognition technology is not yet at the top of the concerns of most citizens. But this campaign is a great example of the creation of counter-imaginaries, because they were trying to present facial recognition technology as dehumanizing; it is also an example of how to educate yourself, and continuously, about legislation, for example; and it is an example of self-defence as well. At the moment the exercise of Reclaim Your Face, this is the name of the campaign, is still going on: the focus is no longer on gathering signatures to ask for new legislation to ban facial recognition in public space; the energies are concentrated on trying to influence the creation of the AI Act, a good part of which is about facial recognition technology. Ultimately this is a good reminder of the fact that if we don't like something, we can resist as individuals, we can resist as a group, but probably the most fruitful manner to resist is to ask for better laws and rules, to inform ourselves and others about risks and challenges, and to mobilize against measures that we find unfair, discriminatory, or simply invasive of our privacy. This is not an exhaustive collection of strategies of resistance, but I hope I inspired you a little bit, and there is more on this website. Thank you very much for your attention.

Thank you so much for your talk, and for teaching us, so to speak, Stefania, five ways of resisting datafication. I'd like to talk about that in a little more detail in a minute, but I'd like to start out with an observation I made ten days ago in this very city of Frankfurt, and it sort of ties in with an example you gave, of, was it AS Roma or Lazio, some Roman football club? For another event we're having tomorrow, on cryptocurrencies, trading apps, sustainable finance and so forth, I was trying to do a street survey with my phone, and I was asking people if I could film them, asking them whether they knew about crypto, whether they were
using crypto, what they did with trading apps, whether they thought that banks should be regulated more tightly, especially here in Frankfurt, and so forth. I spent eight hours on the street and I had three hits, which was extremely low, and I had to start to cast friends of mine, and friends of friends of mine, so that I could actually find enough people to edit three different clips on those subjects. And I was really surprised, because I've done that before and it was much easier. I mean, it could have been because I didn't have a huge camera, I didn't have a sound man, I didn't have this authoritative look; it was just some dude with a selfie stick and a telephone, which is not very imposing, right? But then I started to ask myself: are people actually becoming more sensitive to facial recognition? Because spending eight hours at about five different spots here in Frankfurt and having three hits is a really low turnout. And then I was thinking of your example of the football stadium, where I thought: are hooligans of a Roman football club actually letting that happen, facial recognition in their stadium? That would be almost unthinkable to me, especially after this experience here in Frankfurt. Are people getting more sensitive when it comes to facial recognition? Is something happening there?

Well, I wish people were becoming more aware and more critical, but from my perspective, from my point of view, which is partial anyway, there is not enough awareness; people are not scared enough. Probably it's also because a lot of the examples that we use often refer to, for example, racialized individuals, people with a darker skin, and a lot of the people in the society where we live feel they have nothing to fear, precisely because the examples are often of people who are different from the majority. And facial recognition technology is known to be biased, not really "against", but, in fact, that's what happens, "not optimized", that's the right terminology, for darker skin tones. So that's probably the message most people get; it's a sort of newer version of "I have nothing to hide": it's not really going to affect me that much, therefore I don't worry. I have to say, though, that I think a lot also depends on, let's say, national culture. Now, national culture is a bit of a stereotype, but I can tell you that, for example, one time we were traveling with our van in Germany and we wanted to pay for a campsite by card, because we had run out of cash, and the guy was like: no, that's not an option. And I asked: how come we cannot pay by card? And this guy went: well, because we are all surveilled. And I wanted to kiss him, right? Because, wow, this is a conversation in the middle of the night in a van park, more than a campsite really, and I had found someone who was concerned about this. So I do believe, it seems to me at least, that for example the peculiar history of Germany, with the experience of the Stasi, has remained ingrained somewhere in people's minds, and you, and by "you" I mean German people, seem to have much more awareness of surveillance and its risks than is the case, for example, in Italy. Take the case I mentioned of the football stadium of Roma FC: I couldn't find a single critical piece that would question the introduction of this technology from the privacy point of view. Literally all the reporting I could find was: hey, that's great, we are advanced, we are ahead of others, it's so cool, technological innovation, we can go back to the stadium, wow. And the fans didn't resist at all, the hooligans. That would be hard to imagine in Germany, actually. And this was presented not as something used by police forces; it was presented as: we use it to track whether you have a temperature, so that we can kick that person out, so it doesn't
in fact, you would call it okay, and that's how it was presented: it was a private enterprise doing that, and it was not linked, at least officially, to the issue of security.

I think it's very interesting that you mention a cultural difference there. On the one hand I would say yes, on the other I would still doubt it. I mean, of course there was the GDR, which was a small country, as we know, compared to western Germany, but there was also heavy resistance to the censuses in the 80s here, where they burned down community houses and everything; it was quite hard. But then, on the other hand, we all know what traces we leave voluntarily on the internet, and I don't think Germany is any different from any other nation that is online, so to speak. So I really don't know how to make sense of that. Which brings me, of course, to the five ways of resistance you outlined tonight. It's basically about increasing agency, which sounds good, but again, what we saw after the Snowden leaks, that was 2013, when crypto parties actually started to become a little more prominent, at least in Berlin, where I come from: in the end I'm not sure how big or thorough that change actually was. I know hardly anybody who even encrypts their email or things like that. So it's not really a success story, even after a massive leak like the Snowden leaks, and that's almost ten years ago, imagine that. And what you advocate for is very interesting on very different levels, but I also think it's a lot of work, a lot of work for everyone, especially for those who have too much work on their hands already, be it for money, be it care work, underpaid work and so forth. It's a lot to do for people if you delegate resistance to the individual level. In the end, what do we make of this?

Yeah, you're absolutely right, the Snowden revelations haven't produced enough change, definitely not from my point of view. I remember presenting a funding application right after 2014 that was based exactly on the hypothesis that the Snowden revelations would really change our perspective on surveillance. People like me had been studying those things, and been exposed to certain types of activism, for a long time; we already knew it, in a way. Maybe we didn't know the details, the extent of it, but you had learned over the years to be suspicious of all of this. But the fact that, for the first time, a surveillance program was in the daily news, that it was of concern to, for example, the then president of Brazil, Dilma Rousseff, who went to the United Nations, that it really became a topic of public debate, was unprecedented. Unfortunately, you're right, it hasn't produced enough change, or let's say long-lasting change. We might be aware, we probably all are by now, that we are surveilled to some extent, but we decide we are okay with that, because at the end of the day it's convenient, right? Our digital life, where I go to Frankfurt and my phone supposedly knows the food that I like and I get some personalized recommendations about my ideal restaurant: why not, right? But indeed there are also a number of social groups, individuals but also communities, that are far less privileged and definitely cannot say no to certain types of surveillance. Think for example about Uber drivers. You probably don't have Uber in Germany, right?

We do.

Oh, you do?

Some cities do, some cities ban them, some don't, it depends.

And those drivers are bound to the platform, because that's where their work comes from. So again, resistance should not be delegated exclusively to the individual level; that's why I ended with a sort of call for reform. But at the same time it's also important to train the generations of the future in making
informed choices from the start. For us, who were not raised with these tools, it will probably take more energy and more time than it takes kids born in, say, 2015, but we also have to empower them from the start. That's why literacy matters: we basically have to embed this type of concern in school education, and give people the tools for making informed choices.

Let's stick to those five ways just a little bit longer, if you allow, Stefania. We've had managing the settings of your apps, checking what you allow your phone to do, whether an app asks for your camera when it doesn't really need it; you talked about number two, obfuscation, pretty much; number three, literacy, and you just mentioned crypto parties again as an example; four, counter-imaginaries, where you mentioned projects like CV Dazzle that were covered in Vogue even, where you might expect some sort of cultural phenomenon, that it becomes cool to dress like that or to do your haircut in a certain style; and five, pretty much basic political resistance, with Reclaim Your Face. Now, this reminds me in some ways, it's not identical, I'm just trying to make this connection here, of discussions we had in the 90s, with people like the Chaos Computer Club, where everybody thought: oh, we all have to learn to code, that's our gateway to heaven, right? Of course, hardly anybody learned to code, or not many people, and the apps and the interfaces just got more and more convenient, and that's where we are now. What you advocate for is in a way a call to reverse this whole development towards convenient interfaces: in many cases it is actually to look behind the interface, to see how it is wired, and for that you

just have to use a whole lot of time, right?

Yeah, this might reflect my obsession with infrastructures. I'm very interested in understanding how things work, and I do believe that informed choices go through that privileged path, if you want. It's only by knowing, well, not exactly how things work, because it is impossible, even for the most expert software developer, to know how, for example, the algorithms of any social media platform actually work, precisely because they are proprietary software; we can only reverse engineer them, if you want, try to infer the source code from their behavior, but they are way too complex to actually do any of this type of exercise. That said, I do believe in the mindset that encourages people not to take things, and whatever you put in place of "thing", an app, a device, a technical solution, at face value. That is the attitude: you might not actually learn to code, but you can learn to ask critical questions. It is really a sort of practicing of suspicion, asking yourself: what is this for, how does it work? It might not take you to actually learning to code, and even if you do, it is impossible to understand most of the machinery that surrounds us anyway, but at least you have the ability to make some informed choices. We are never going to be completely clean and pure, unless we live, and probably not even if we live, on top of a mountain, right? But if we want to be citizens who understand, for example, how to process information today, which is very different from how you used to process information when you were at university, then we have to understand how the machine works.

When we talk about literacy, especially on a night like this, which is, again, a joint venture between an academic organization like the Humboldt Institute and the Federal Agency for Civic Education, we talk about the main state
apparatus, which is the schools, right? And we see very ambivalent signals from a lot of schools; you mentioned some of them in other papers I've read by you, where it is sometimes even mandatory to use WhatsApp or Facebook or Google Drive or whatever to participate or to take tests and so forth. Now I would like to talk a little bit about the regulatory level there, because there are things proposed by certain commissions on a European level that want to ensure what they call interoperability. It sounds complicated, but it's very easy: if you have the WhatsApp messenger app, you have to be able to send a message to, say, Signal or any other messaging app. If you think this through to the end, it's much more far-reaching: you'd have to be able to exchange files between Spotify and Apple Music and so forth. To ensure interoperability, what you use, what you buy in one place, you have to be able to send to another place. Now, is this a promising regulatory way to counter what's happening in many schools that force you to use certain platforms and certain apps? Do you think that what is being tried in Brussels is bound to succeed?

Well, I'm not one hundred percent sure that interoperability is the answer, because in one way or another you are always bound to a platform. We might want to use certain platforms, certain services, like chat messaging; I mean, they are very convenient: my family of origin is in a different country, and I don't even have to make an international call anymore, I can just call via Signal or Telegram or WhatsApp, whatever. So again we are back in the trap of convenience. But then, thinking in particular about schools, we should really ask ourselves: do we really need all that technology, what does the technology do, what is it a substitute for, and can we perhaps do without it? Yesterday I mentioned I was briefly at this edtech fair, EduTech Europe: basically companies paying a whole lot of money to have a stand there, at a fair that is open essentially to ministries, but also to public procurement agencies that then buy all the expensive equipment and software for the schools in a country, or for individual universities and educational institutions, and so on and so forth. And I heard Google talking about how we need more standardization, which is a different story, right? Standardization is not interoperability: interoperability is machines talking to each other, exchanging content; standardization is really machines working with each other, understanding each other, so at a more basic level. And when I heard Google talking about standardization, in my mind it essentially read like: well, this actually means you should use our products more; we provide an excellent platform, which is actually true, I mean, talking about Google Classroom, it is a very well functioning platform with a lot of functionalities, and you should come to us and embed your product in there. So to me, ultimately, the solution is really, again, taking a step back and asking ourselves: do we really need, for example, this emotion detection in real time, so that I as a teacher can see whether my pupils are following, or whether they're sad, or their dog died that morning? Maybe the information is important, but maybe it's not strictly necessary to go through the technology; maybe I can simply talk to people, right? So, I'm not sure I'm answering your question, but I'm not sure that interoperability is the solution. It probably allows you to, not really break, but hinder a little bit, shake a little bit, some monopolies, and that definitely allows for more choice, so that to speak to my family members who use
WhatsApp, I don't have to install WhatsApp; I can stay with my preferred choice. That helps, but it's only a cosmetic type of approach, I think.

One last question before we open this up here to the public and to the people watching at home. One of the core sentences of your talk, I think, was that datafication is endangering liberal democracy, something we have discussed often here in the last five years; it is certainly a big issue. But I always like to ask: can it also, when coded differently, lead to actual empowerment of the marginalized, as in controlling the state, basically asking the question in whose hands the tools are? I mean, in China, for example, we see that datafication and the monitoring of big companies also leads to less pollution, things like that, so there are really positive effects even of this massive and very authoritarian datafication that we see in China. But that's not really what I mean as a way for Europe. I'm asking the plain question: can we actually put those tools in the hands of the marginalized, to counter this datafication with a different datafication?

Yeah, for sure. I decided to focus my talk today on resisting the challenges of the datafied society, but indeed the datafied society also offers a lot of unprecedented opportunities. One example that I have in mind now is InfoAmazonia, a coalition, a network in fact, of activists, environmental activists, anti-deforestation activists, and journalists and storytellers and teachers from the various countries around the Amazon, the Amazon forest being the biggest in the world and crossing several countries. They use tools of datafication, if you want, for example drones and apps of various kinds, to monitor deforestation and then provide information, trying to lobby against this type of phenomenon, which is massive, especially in Brazil. One thing is to say, hey, there are a lot of trees coming down; a completely different thing is actually providing evidence of something which happens in fact very far away from any city, and from the public purview, where a lot of people live. So that's one example. Another one also comes from Latin America, for some reason: in Argentina, people knew for a long time that there was a problem with gender violence, domestic violence, violence against women, but there were no numbers; there was no attempt by police forces, by the state, to keep track of the phenomenon. And how do you advocate against this, how do you try to create a different culture of respect at home, if you don't actually know the contours of the problem? So what people did there, a feminist group came together to literally count the deaths of women in the country, and then, using software that is freely available everywhere now, to produce data visualizations, and these data visualizations made the problem apparent, and the problem became a public concern, an issue that people all of a sudden cared about. So it was no longer only my experience of my friend being killed; all of a sudden it is something that concerns us as a society. And there's plenty more: for example, collecting evidence of the Syrian conflict. There is a project based in Berlin, called the Syrian Archive, that collects evidence of violence committed by all sides of the Syrian conflict, and this evidence maybe one day will be used in court, in The Hague for example, in the international court, as happened with the Yugoslavia tribunal. So there's a lot of work to do, in the sense that we also have to rethink the tenets of, for example, the legal system: what evidence produced by me on my phone
can be used in court, and under what conditions, for example. So there is a lot of work to do at various levels, but we definitely have a lot of tools today that allow us to turn all of these developments in our favor.

Thank you for those examples once again, Stefania. So let me ask you here on the floor: there's a microphone going around, Stephanie has it, where are you, Stephanie? There she is, in the back of the room. Any questions from the audience here on the floor before we go to the digital tool? Feel free, questions or just comments on what you've heard. There's one; otherwise we just switch to Slido and see what's happening there. Please, Sascha.

Yes, thank you for your talk. Not that much of a detail question, rather: would you like to share maybe some of your principles or strategies for how to cope with surveillance and the problems coming from it, or what you do? As an anecdote on that: I'm trying to circumvent surveillance, and one of the few steps I took was changing my search engine, but that often leads to funny moments with my wife, where she really asks me to please stop using this engine, because the results are not as good as she is used to when she's working on my device or using my smartphone. Do you have principles for how to change your private environment? There are many different settings, like the kindergarten, or school, or with colleagues: how do you communicate with them, do you have any tips there?

That's a good question, and, well, I don't have the magic wand, I don't have a perfect answer, and mine as well is in part a history of failure, in the sense that we all start with the best intentions, with a very secure stack, trying to make our life airtight in a way, but then it doesn't work as planned. Why? Because you might encrypt your emails, but then your interlocutor doesn't, or you might encrypt your emails and lose the password. It happened to me twice, so then you have to start again; it was such a secure password that I couldn't find it anymore. So even for people who, like me, have a considerable amount of time, or let's say I study this stuff, so I can make room for experimenting as well, and in that sense I'm privileged, it doesn't always necessarily work. I have to say, when it comes to the daycare, which our daughter just started this week, we did pick a daycare that does not communicate with parents via WhatsApp. What they do instead: they don't send you pictures, so while the kid is there you don't know what is happening, but they actually write a paper diary; those still exist. So that is, I think, a positive observation, that this has also reached this type of environment. I'm reminded of a colleague: the classroom of her child was using WhatsApp to notify parents of COVID-positive cases and then telling everyone to stay at home, and she only got the message, because she doesn't use WhatsApp, two weeks afterwards, when it was not exactly relevant anymore, but it could potentially have exposed her family and others to the virus. But talking about positive experiences, I can probably mention the one of our research group in Amsterdam. Just to give you a sense of the effort that sometimes goes into something like this: we spent about two years creating a completely secure stack to do our research and collaborate with each other, also remotely, via encrypted channels, using only certain platforms. For example, what do you do when you author texts together and you don't want to use the mainstream docs that are then searchable by search engines? You have to use alternatives, and there are alternatives for that as well. But sometimes, at least in the case of our research group, it requires a bit of financial effort, to pay for a password-protected
service and to you know run our own servers ourselves but it is increasingly difficult also at the at university level because um they actually want us to use for example exclusively microsoft services now including for storage so um you know this is to say that uh you know i could mention software but um software comes and goes sometimes especially open source software sometimes not so well maintained is definitely not as funky as uh you know other more beautiful looking uh commercial software and uh it requires a lot of patience and often we have to be prepared to uh failures again there's a time factor right you have to privilege enough to have time in your hands to actually uh stay on top and to maintain those things thank you for that question is there another one from the floor thank you here third row please thank you so um first of all also from my side thank you for the talk so far and you mentioned that um we in germany have maybe a bit further advanced awareness to all towards uh surveillance because of our um history with stasi and so on um and how would you maybe um compare with the different countries internationally the different areas of the world concerning their surveillance surveillance awareness maybe well this is a million dollar question in the sense that also my observation about germany was not based on any empirical data was based on my conversations and by you know for example noticing how um when it was that the with the data retention legislation so it was around i think 2007 so a piece of EU legislation which for the most part went completely unnoticed because it was uh very dry and you know very boring type of of of topic um in germany there were a couple of demonstrations gathering i think one even 30 000 people which is something unheard of in any other country right so um my then observation about you know the higher level of awareness about these topics um in germany is only based on this right so i don't have any empirical data 
and I also don't have empirical data about other countries, and as a researcher I should probably only talk on the basis of empirical data, but in this case I'm not. We can, however, notice, for example, the Eurobarometer, which is basically the continuous polling instrument of the European Union that monitors a number of aspects of people's beliefs and preferences and things like that, and they also monitor, for example, digital skills and digital awareness. Essentially they ask people not even "do you have a critical attitude towards something" but "how literate do you consider yourself to be" vis-à-vis various types of tools, and you can see that some countries, especially in Northern Europe, are leading, I don't remember the exact list now, and some countries, especially in Southern and Eastern Europe, are lagging behind, and there might be a variety of reasons. I referred to the Stasi because, again, thinking about the data retention demonstrations: I still have the t-shirt that was produced, though I don't even remember which Berlin group made it back then, which used some of the design of those times to bring under the spotlight the then minister of, let's say, information, whoever the person in charge of implementing this legislation in Germany was. For me it was extremely impressive to see how they had made sense of history, or at least history in the sense of something that people might not necessarily have lived through but are familiar with, at least having heard about it, to present a topic, to make sense of something very complex in today's society. But yeah, there's a lot of work to do here as well, that's for sure.

Thank you. Let's look at Slido: is there anything going on on Slido? Who's doing Slido? There you are, please. I think we need some light, don't we have some light on you here? No, no light anymore. Good night.

I want to thank you for your time and for sharing. I think you started by saying that data is becoming increasingly less accessible, and I would like to ask you to expand on that, especially how, by being a commodity, it can promote inequality by not being accessible to the majority of people. Was I clear?

Is the question... no, I actually missed the first word. What is not accessible?

Technology, if it's becoming less accessible. I want to ask you to expand on that.

Yeah, that's an important question, because it goes back to, it allows me to reflect on, something that is no longer as popular as it was at the turn of the millennium, let's say about 20, 25 years ago, which is the issue of the digital divide. Is that the direction you want me to go in? Good, so I'll start from there and then come to the present day. The digital divide refers to an extremely important problem, which is the fact that some people have access to technology and some others do not. Now, what we see today is that we no longer care about the digital divide; at least there are not many policies about this anymore, also towards developing countries, and a lot of it is left to the private sector to solve. There's still a problem of access, there's still a problem of privilege, there's still a problem of lack of devices and machines, but what we see is that the industry is trying to solve it. For example, you might remember a discussion recently about the zero-rating problem, not really recently, about ten years ago by now, I guess. Zero-rating services were essentially services offered, stripped naked in a way, to people who couldn't pay. For example, in India, if you couldn't access data on your phone, if you didn't have a subscription for data traffic, then by installing Facebook, for example, you would have access to some data through Facebook
which then meant that, for certain African countries for example, and there were actually interesting studies about this, it's pretty depressing, it almost makes us smile, but it's unfortunately a sad reality for a lot of people: the news then was Facebook, right? Because that was the selection that reached them through the device, and the only type of interaction they had with the device, which was the zero-rated, meaning free, data service offered by Facebook. So this is one dimension of the problem, and then the other dimension of the problem is the issue of visibility and invisibility. Now, we tend to think that privacy is very important, and we hold it in high regard, and rightly so, but not everyone has the possibility of saying no to certain services. Remember that I mentioned, for example, the case of India, where there is this digital identity system. You might also want to stay out of that, but what if you are poor and need to put food on the table? You need those funds, and then the only way is to give out your fingerprint, for example. In that case privacy becomes a secondary concern, and being visible to the state becomes extremely important. This also concerns a number of communities in our rich societies, for example migrant communities, or the so-called undocumented migrants, of whom there are many in our societies as well, who, because they were afraid of being reported, maybe didn't go to get a COVID test during the pandemic. But at the same time, not being visible to the state also meant that they couldn't get access to, for example, any subsidy or income supplement that the state might have made available. So in a way the problem of what was called the digital divide has spiraled into many new dimensions, and, in a nutshell, technology is definitely not the solution to the problem of inequality; that would be solutionism, right?

We're going to the microphone here. I'm going to switch to Slido, the participatory tool of tonight, and see if questions came up there.

Yes, some questions came up there. One you tackled somehow with your last answer already, and it was a topic that came up again and again through your talk: how do surveillance practices affect social injustice and the social divide? Privileged, wealthier people tend to be more able to address the problem and resist, on a small scale, e.g. by using devices like Apple's and such. That's one question; maybe you can elaborate on that a little further. Another question I'd just like to add, and then you can decide: surveillance has become commonplace, especially among the young generation who has grown up with it. How can we help them see the need for protection? I think the first one we just covered pretty much, the social divide, so to speak, the inequality reproduced by surveillance technologies, of course. But the latter one would aim at what we've talked about in terms of literacy, I guess, right? How to teach it.

Well, yes. I want to start by referring to the work of Lina Dencik and colleagues at Cardiff University, at the Data Justice Lab. Lina in particular wrote about what she calls surveillance realism: essentially, we live so submerged, immersed in a surveillance society, that we have lost the ability to notice and to care, essentially, but even to imagine any alternative. So at some point surveillance, or whatever monitoring, being spied on by cameras, becomes a given and you sort of surrender; you're like, okay. Once again, literacy is indeed key, and there it's a bit of a challenge, but I always find it interesting to let people experience firsthand the drawbacks, the problem. I'll give you an example, and I have to refer to the work of a great developer, Claudio Agosti, also a collaborator of ours in Amsterdam, who wanted to make people reflect on
their information diet. The idea was: we're not going to get people off Facebook or YouTube or any other platform, so let's show them firsthand what they're missing out on, what this software, this platform, does to them. So what he did was create software that gathers data about the content served to you, and then shows you the content you have been given compared to the content that another population of reference has been given, so you can see what you are missing out on. In a way, what I find interesting is to give people much more than theoretical considerations, which are, well, theoretical, and to make people experience firsthand what it means for them. Another great example is Amazon and the dynamic pricing that Amazon implements. You probably know by now that you may get a different price on Amazon depending on your IP, so your location, and on what you're browsing from: your browser, your computer. For example, if you are an Apple user, you're likely to get a slightly higher price, because you are automatically placed in a wealthier sector of the user population, and so on and so forth. And when it comes to money, people tend to be a bit more sensitive, right? Political content? Yeah, whatever, I have my own opinion, I'm not going to change it. No one really likes the idea of being steered around, taken by the nose by a platform, but when you actually show how this affects their buying habits, then it already becomes more concrete and more interesting, especially for those who actually earn money. So it might not work very well with school children, but it might work better with the older population. Claudio Agosti did that experiment; it's actually in a video produced for the Italian broadcaster RAI. Software was developed on purpose to, let's say, reverse engineer, although the term is a bit incorrect, the algorithms that implement dynamic pricing on Amazon, and the evidence produced is there on camera. I'm sure this made a lot of people think about what algorithms do to them.

Thank you, Stefania. We usually close with the R question, the regulation question, but since we've covered some of that already, I'd like to ask a different question, play the devil's advocate a little, maybe, right at the end. In Germany we have quite a different app than in Italy. It's called, you see it blinking red now, everybody who has it on is probably red, it's called the Corona-Warn-App, right? The interesting fact about the development of this app, at the time when it was developed, I think in the summer of 2020, although time frames are sort of blurry by now, with two and a half years of pandemic, I think that happens to everybody, it's not just my age: the interesting thing was that the data security proposed by the tech giants, in this case Apple, was actually much higher than in the models initially favored by the state. And some people, quite many, said that the lack of success of this tracing app (it doesn't just do tracing, it has learned to do many other things too) was actually due to its high security standards, which were extremely high. I mean, many data activists said, well, let's favor the tech giants' model, because it's a lot better than what the state initially proposed. And of course we know all about those PR campaigns for the new iPhone and Google and so forth; they're all about privacy, right? It's really stressed. So I wanted to ask you as a closing question: do they learn? Is a change actually imminent with those tech giants? The Corona-Warn-App would be one example; I'm not sure if it's a contingent example or not. What would you say?

Well, the other example is WhatsApp, right? WhatsApp was not encrypted
and, I mean, there are various analyses of how tight that encryption is, but let's leave that aside. You know, Signal was encrypted, and then WhatsApp actually used similar technology to also go encrypted. In a way, the industry often leads the way, but it also often responds to the needs of the market and the desires of consumers: if you want to sell a product, you have to satisfy people's desires. So if society changes, and for that you probably need literacy and awareness-raising and campaigning, then the tech giants might adjust, or might even lead the way, because theirs is probably the best technical solution. I am probably a bit suspicious of the good deeds, the good agenda, let's say the global common good agenda, although there are actually several examples of that coming from the industry as well. It's not for me to say, but there is hope also in the industry, for sure, if that's the question. But definitely we have to become more aware and more concerned users and consumers to contribute to driving this change.

Thank you so much for that closing statement. Thank you for being with us. This was the last edition of 2022 of Making Sense of the Digital Society, for once exclusively in Frankfurt am Main. Thank you for being with us here and online, thank you for making all those travels for us, Stefania Milan. Have a good night. Thank you.