Hello everyone. I can't see you; I hope you can see me. It's getting better already. Hi, thank you for turning up under the circumstances that we have now here in Germany. You know what that means, and so you really appreciate it. Most of you are probably at home in front of your screens. Here's how this is going to roll out: first this short intro, then the lecture by Helen, then we're going to have a one-on-one conversation here on stage, and then it's your turn, both here in the live audience and via a digital tool that enables you to participate in the discussion. It's all going to be in English tonight, as most of the time in this series, Making Sense of the Digital Society. But of course, if you don't feel comfortable enough to ask your questions in English, you may do so in German; there's simultaneous translation for Helen, she's got this little headset, and if that doesn't work, I will try to translate what's being asked. Don't worry, it's going to be bilingual in that case. Okay, but before the proper intro to this evening, I would like to show you something brand new that was actually launched today and has a lot to do with this series. May we see the slide? This is a digital compendium created by the HIIG, the Humboldt Institute for Internet and Society, and co-funded by the bpb, the Federal Agency for Civic Education, and it can be used for various educational activities as well. It is based on this lecture series, actually, but goes well beyond it. As you can see, we have grouped the recordings of the individual lectures into seven major clustered topics, which you can see now, on the digital society. It's in German and in English; most of the podcasts you can find there, clustered thematically, are in English, with just one, actually, that is in German. There's a lot more 
podcasts by the HIIG, accumulated over the years during this series and before it, that you can check out there. It's really a great repository of all things digital if you want to tap into that, just launched today, everything at once, so you have a lot to do if you want to check it out, and we certainly hope it's going to last quite a bit. In addition, further material such as blog posts, journal articles and podcast episodes provides quite a complementary insight into the current discourse on digital transformation. Thank you to the people who made this possible at the HIIG, namely Christian Graufoegel, Filina Janus and Yuri Bada, who did most of the editing. Thank you so much for putting this up. I hosted just some of these and lent my voice; that's about all I did. All the other work was done by the three people I just mentioned. Thank you so much; please check it out. It's been up since this afternoon, 3 p.m. I think, and it's going to stay up there for many more years. Okay, now to tonight's session. In this long-running series we have talked a lot about algorithms, machine learning and artificial intelligence, about surveillance capitalism and datafication, about the ethics of robotics and artificial intelligence. Almost always we have discussed at some point machine bias, or discriminatory practices being reproduced, sometimes even amplified. What we have talked about a bit less is the people affected by said bias or said discrimination. This will change with this lecture, to some extent, since our speaker is looking to tell some stories of marginalized people in their seemingly mundane daily interactions with data. How political are feelings or emotions in relation to data? Does there lie a form of agency in how people, or they, or we, think about what is important in their lives concerning data mining? And, as an echo of our last session in this series here in Berlin: 
What role does trust play in this, or maybe distrust, as we will see? Talking about trust and distrust these days, with the skyrocketing numbers we have here in Berlin, I might want to say this is a 2G event, as we say in German, which means everyone here is either vaccinated or has recovered from COVID; many are tested additionally, I know I am. Back to our topic and to our speaker, who made the journey from Sheffield to Berlin in order to be with us. Helen Kennedy is Professor of Digital Society at the University of Sheffield, where she directs the Living With Data program of research; a great website to check out, with a lot of material there. I quote from this website: "Advocates of datafication argue that data-driven change results in a wide range of benefits. Critics are concerned about the harms and risks that result from widespread datafication. What do citizens and members of the public think?" Quote end. That is the basic shift of perspective that informs a lot of her work, as far as I can see. It is a shift that, at least methodologically, is heavily informed by cultural studies, not quite surprisingly, since our guest did some of her studies at the famous Birmingham Centre for Contemporary Cultural Studies. She then worked at the University of East London for 11 years and went to Leeds for seven years before she arrived in Sheffield. Her latest monograph, from which we will hear some ideas, I think, is called Post, Mine, Repeat: Social Media Data Mining Becomes Ordinary. Here 
she is: please welcome Helen Kennedy. Okay, thank you very much for that kind introduction and for the invitation to talk here. I'm honored to talk in this lecture series alongside people who I am in awe of, and I'm excited that it's in person and that there are actual people in an audience in front of me. I'm going to talk about everyday life in times of datafication. So, we're living in times of datafication, a clunky, contested but also quite helpful term for referring to the quantification of things previously experienced more qualitatively: working, studying, communicating, shopping, maintaining relationships, keeping fit and healthy, moving around, democratic participation, accessing public services. In times of datafication, these are quantified, visualized and analyzed. So datafication is everywhere and it's everyday, and there's lots that should worry us about this: increased surveillance, threats to privacy, new forms of algorithmic control, and the expansion of new and old inequalities and forms of discrimination. These are things that have been discussed in other lectures in the series. Nick Couldry, for example, uses the term data colonialism to talk about a new social order based on the appropriation of human life through data, which he describes as a hollowing out of the social world for endless exploitation and manipulation. Shoshana Zuboff uses the term surveillance capitalism, suggesting that people are a mere appendage to the digital machine, while their data function as a source of value in lucrative new markets that trade in predictions of human behavior. So what these writers do is draw attention to the fact that datafication is far from a purely technical phenomenon; rather, it's integral to social, political and economic forces. As Lina Dencik puts it, datafication determines decisions that are central to our ability to participate in society, in domains like welfare, education, work, crossing borders. I should move my hands less. So the politics of datafication are also 
everywhere and everyday, and this means that if we pay attention to everyday experiences of living with data, this can help us make sense of the politics of datafication and of data-related inequalities. So what I want to argue is that ostensibly apolitical, ordinary, everyday engagements with data are as important as more obviously political phenomena if we want to understand and intervene in data power. The ordinary is political, as feminist scholars have been telling us for a long time, and a lot is lost if we overlook this. And while concepts like data colonialism and surveillance capitalism focus on powerful actors, and it's important to do that, it's also important to ask how ordinary people experience and live with data as part of their everyday lives. Now, some people are uncomfortable with the term "ordinary people" and its assumed opposite, experts, because everyone is expert in something, especially their own lives, and experts are also ordinary people too, some of the time. This is a reasonable point. But as Toby said, I am influenced by what cultural studies had to say about the ordinary back in the 1990s, when cultural studies researchers argued that focusing on ordinary activities in the daily round is a political gesture. What cultural studies did, starting with Raymond Williams' famous 1958 essay "Culture is Ordinary", was to ask how and where power operates within ordinary cultures, and those are the questions that interest me: where and how does power operate within ordinary cultures of datafication? 
But ordinary people are not all the same, and people's everyday lives are also not the same, as was also kind of hinted at in Toby's introduction. Already socially disadvantaged populations are more likely to be discriminated against in data-driven systems because of structural inequalities, so we need to bear these in mind when we look at ordinary, everyday encounters with data. And what I want to argue is that by focusing on everyday life, we can seek out acts of agency which may be small-scale and mundane (all of the ones that I'm going to talk about are), but which may also be more significant. What this does in turn is to suggest that datafication doesn't simply shape social life; it's also shaped through everyday practices, something that I think concepts like data colonialism and surveillance capitalism can't really accommodate. And this everyday-life lens also opens up a space through which we can look at the emotional dimensions of living and engaging with data, and there's a politics in that too, 
I want to go on and argue. So, if you haven't worked it out already, I'm going to tell some stories about people's everyday lives with data and datafication, and I'm going to focus on three different incidents from totally different projects that I've worked on over the course of the past ten years, which come under this umbrella of living with data, and which you can find out more about on the website that you can see here. So the first story that I'm going to tell you is about BBC iPlayer introducing a requirement that people sign in to access its services, which happened in 2018. Before that point in time you didn't have to sign in to access BBC services. One of the reasons you started having to do that in 2018 was that across Europe at that time, public service media were interested in using data to deliver personalized services, and one way to gather such data was to require people to sign in, and that's what the BBC did. So we undertook research which examined what disadvantaged groups of people made of this decision. It was really striking how people who were living with poverty navigated our conversations about BBC data practices in focus groups. Formerly vocal people fell silent when we moved from talking about media use in general, of which they had experience, to talking about data-driven services, of which they didn't. For example, Jason is a white British man in the 35-to-44 age group. He has a disability and he's unemployed. He didn't have regular access to a computer, a tablet or a smartphone, and so he tended to access media in analog form, like reading a physical newspaper. He was at the Center for the Unemployed, where we carried out our focus group, in order to use a computer, which he otherwise wouldn't have access to. And he contributed to the early stages of the focus group, when the discussion was about media use, but as it moved to be about perceptions of data practices, he withdrew from the 
conversation. He didn't have access to the understanding that comes from experience, and this had a silencing effect on him, a silence that seemed to come from a feeling of shame about his own exclusion. Chris is in the 55-to-64 age group, is non-binary, of multiple ethnic groups, British, has a mild learning disability and is also unemployed. Chris loves BBC radio and had passionate opinions about it, but not about the data-driven services, which they couldn't access in the absence of a TV license or a personal computer. They described themselves as "one of the poor mouses, you know, who's too poor to afford the TV license, so I don't watch any television at all unless at a friend's house." And Toby tells me that you know what a TV license is here in Germany and that I don't need to explain that. So here what Chris is doing is responding to a question about perceptions of data practices by talking about their experiences of poverty, and other people in this research did that as well. At the time that we undertook the research, a common concern in online discussion forums about the requirement to input a zip code on registering for iPlayer was that this data could be used to enforce TV license payment in the future. We asked our participants if they shared this concern, or if they thought it was a good thing that sign-in data could be used to monitor TV license payment. During the discussion at an older women's craft group, Brenda, who's in the 65-to-74 age group, a white woman who is retired and who has a disability, made reference to her own limited economic resources. When the question about zip codes was asked, she responded by talking about poverty, which in her view often led people to not pay the TV license. She said: "It's probably all the young ones that are out of work who aren't paying their TV license. Where's the jobs for them, for a start?" 
She didn't care how TV license payment was monitored; what she cared about was the relationship between its monitoring and inequality. Her awareness of how poverty impacts lives informed how she felt about data practices. Virginia, in the 75-plus age group, a white British woman who's retired, expressed a similar view: "I don't object to signing in, but then I can afford a TV license, so there's nobody going to come knocking at my door." She felt angry that limited access to devices and to funds to pay a license fee excluded some people from engaging with BBC services, which are supposed to be for everyone; public service media, right? So feelings about BBC data uses intersected with sympathy for people living in poverty, moving the discussion about BBC data practices to a broader discussion about poverty. Chris, Brenda and Virginia identified a relationship between data practices and social inequalities without necessarily articulating it as such. So what I think these people are doing is talking about perceptions of data practices in ways that "connect data back to the social and political reality from which they're produced"; that's a quote from Catherine D'Ignazio and Rahul Bhargava. And what we could say is going on here is that they are de-centering data from our discussions, and this is something that Seeta Peña Gangadharan and Jędrzej Niklas say is necessary in order to understand the role of datafication in the production of inequalities and how these two things shape each other. But what Gangadharan and Niklas are saying is that we researchers need to de-center data in our research practice, whereas what we see here is ordinary people doing exactly that: they're acknowledging everyday datafication's connection to larger systems of structural inequality. And I would say that we found similar concern relating to inequalities amongst participants in our current research, Living With Data, which has included carrying out a survey 
of people's attitudes to everyday data practices in the public sector, including in welfare, health and media use. What we found in that survey is that Black and minority ethnic, or BAME, people trust the police with their personal data less than white people do. We found that LGBTQ+ people trust health professionals with their data less than heterosexual cisgender respondents do. And importantly, we've also found that those who don't have direct experience of inequalities are still worried about them: 86% of our respondents are concerned about data being used in unfair ways. Now, fair and unfair can have lots of different meanings, but one such meaning is equal or unequal. So what I want to suggest is that inequalities cut through datafication in myriad ways, and this affects the feelings both of those who might be negatively affected by the consequences of structural inequality, but also of those who care about these things. And I see these survey findings as a bit like Brenda and Virginia's insistence that we have sympathy for those living on the margins, those already excluded enough. Now, that statistic from our survey, that 86% of our respondents are concerned about data being used in unfair ways, brings me on to the topic of fairness, and how thinking about the fairness or unfairness of data practices helps us make sense of digital society. In my research, fairness is a term or a concept that has surfaced when we've created a space in which people are invited to say what they think about datafication in their own terms, using their own words, and this is something that I found in very early research that I undertook in 2013 and 2014, looking at what social media users think about social media data mining. What we found in that research was that the type of data tracked and gathered, the purpose of the data mining activity, the extent to which social media activity and its data are perceived as public or private, views about anonymity, views about consent, all influenced how 
people viewed social media data mining. People assessed these factors, weighing them up and asking themselves: is this specific data mining practice fair? And so we can see examples of people showing that their evaluations varied depending on the platform on which the data mining was taking place. Here we see Hugo, a male participant from Spain who was 32 and unemployed, saying: "If it's on Facebook, yes, I am concerned about data mining. If it's a forum, I don't care. But a social network is something more personal." So he's telling us that the platform matters. And some people differentiated the types of social media data that they felt it was acceptable to mine: there was greater acceptance of mining data shared when you set up a Facebook account or like something than when you wrote a personal or an intimate post, which was seen to be kind of less minable. And the uses to which mined data was put also mattered: for some, mining for targeted advertising was a fair exchange, but for others it wasn't, because it's intrusive. And this point was nicely captured in a mildly self-mocking quote from Frances, a female participant who was 49, British, a welfare rights officer. So, self-mockingly, she said: "If they do something with the data that I agree with, then it's okay. If they're going to do something 
I don't agree with, then it's not." So what we have here is people reflecting on whether data practices are fair or not and arriving at different conclusions, differences which don't necessarily result from different conceptions of fairness, but from distinct evaluations of the fairness of a data practice: a data practice is considered fair if it meets users' expectations about the collection and use of social media data. Now, it is by now fairly well established, and has been discussed in previous lectures in this series, that data-driven systems are imbued with the values of their designers, and a number of commentators have shown this. I'm thinking, for example, of the work of Joy Buolamwini: her spoken word poem "AI, Ain't I a Woman?", which if you don't know it, look it up, it's brilliant, and it's only two or three minutes long, shows that facial recognition technology has not been trained on enough Black women's faces and therefore fails to recognize them as Black women. And this is just one example of the imbuing in data-driven systems of the values of designers. Data scientists themselves are increasingly aware of these problems, and alongside legal experts, ethicists and others they are working to address these issues. One way that they're doing this is through FAT initiatives, and FAT, which you can see on the slide here, stands for fairness, accountability and transparency: looking at fairness, looking at accountability, looking at transparency as possible ways to solve these problems. Now, these are well-intentioned efforts, but they're criticized by critical data studies scholars who say that focusing on whether data systems can be fair is the wrong place to focus, and that instead we should be asking: do they shift power? 
The criticism is that FAT initiatives seem to be premised on a belief that inequality can be designed away: if we tinker with the algorithm, we can remove it. And this appears to conceive of inequality, bias and discrimination as technical rather than social problems. What's needed, critics argue, is recognition of the structural inequalities that lead to bias and unfairness in the first place. So these are important criticisms, but what I want to suggest is that talking about whether data practices are fair or not is how people who don't have a more politicized or social-scientific vocabulary, of social justice or inequality or power, for example, sometimes express their political concerns. And I also want to suggest that people's evaluations of the fairness or unfairness of data-driven systems, in which feelings and values play an important role, can be seen as small-scale acts of agency of the kind I was talking about at the beginning of this lecture. And they can be seen as acts of agency if we understand agency as a reflexive process, as well as understanding it as what people actually do. There are a number of scholars who suggest that we should see it in this way. Nick Couldry, for example, describes agency as involving reflection, making sense of the world so as to act in it. J.K. Gibson-Graham define it as the continual exercising, in the face of the need to decide, of a choice to be, act or think in a certain way. So if we understand agency in these ways, then forming judgments, evaluating and mobilizing values are foundational to agency. I'm moving on now to my third story, and I want to start it by noting that there have been a lot of emotions in the stories that I've told so far: in people's thoughts about the link between data gathering and poverty, and in people's evaluations of the fairness of data practices. In 2014 and 2015 I worked on a research project called Seeing Data, which explored how people engage with data through 
visualizations and visual representations of data. In this project we found that emotions shape, as well as being shaped by, reflections on and evaluations of data, and we also saw that this entanglement of emotions and values, something which I've suggested is foundational to agency, can be a starting point for imaginings. And here what I also want to argue is that imaginings can be seen as agentic, or as relating to agency. So one particular visualization that we talked about in focus groups was called Migration in the News, and it visualizes data about the ways in which the British press describes migrant groups. So, would anybody like to guess what word most commonly precedes the word "migrant" in the British media? Not "economic", no. It's "illegal". Yes, "illegal", good answer. The British press is obsessed with making a link between "illegal" and migrants. I can show it to you now, but I don't think that this actually gives it away. This is an interactive visualization, and those are notoriously hard to grab screenshots of, so this isn't really very representative of it, but it gives you an idea. So this is a visualization telling us about how the British press talks about migrants and migrant groups, describing them predominantly as illegal, and it's a visualization that evoked strong feelings of empathy in two focus group participants who worked for a civil society organization: Sally, a woman who's 48 and white British, and Horace, a man who's 27 and white British. What this visualization did (I'll show you another screen grab from it here) was prompt them to imagine the experiences of those migrating to the UK and encountering British media portrayals of migrants, and to feel strongly about this imagined experience. So what Sally said was: "I felt really bad. 
There's so much negative stuff in the press about refugees and migrants. It just makes you feel a bit for people who are refugees or migrants who are coming to live in this country, and then go out and buy a newspaper and all the articles are negative and they're portrayed as scroungers and all the rest of it. You just feel that it's such an unfair, biased view." What Sally and Horace did was describe the feelings that the visualization evoked as feeling guilty for being British and ashamed of the media as a whole. Here's what Horace said: "I'm kind of ashamed to live in a country that, even though these people have given up their lives and come over here and given so much to us, by and large we constantly belittle them and shout them down. I'm a little bit ashamed to be in a country that has a media like that." So both Sally and Horace already knew about the causes of migration and asylum seeking from their work in the civil society organization, but the visualized data provoked strong feelings of sorrow, shame and guilt, as we've seen. Although they were aware of the situations that migrants and refugees might leave behind, the visualization provoked Sally and Horace to empathize with migrants and their experiences anew. It translated data back into people for Sally and Horace, bringing the humans who are the subject of the visualization close to them. The visualization made it possible for Sally and Horace to feel the data, to make sense of them in a way that is emotional. And as I've said, there's often an emotional component to engaging with data, and we've seen that in the other stories that I've told. We've also seen that these emotions inform evaluations: of gathering data to monitor TV license payment, of whether social media data mining is fair, or of how the British media portray migrants. So here we see that feelings about data also lead to imagining the experiences of others represented in the data. Sally and Horace, like Brenda and Virginia, connect personal lives to 
broader debates in ways that center inequalities, through reflexive processes that I'm describing as small-scale acts of agency. So, turning back to our Living With Data survey: one of the things that we asked people about there was what they think of the NHS COVID-19 data store, set up to hold all COVID-related health data in one place to contribute to coordinating the response to COVID-19. And although we found that 78% of our respondents were comfortable about their NHS patient data being added to the COVID data store, that high statistic masks concern that was revealed in comments in free-text fields in the survey, where there was more concern about data sharing in this example than in any other example that we asked about in the survey. More than half of these expressions of concern were about the involvement of commercial companies like Google, Amazon Web Services and Palantir in the data store, and the absence of publicly available information about the nature of their access to sensitive personal health data. So people were concerned about how these companies are involved and about not really knowing what kind of access they had. And these concerns also led to imaginings of a different kind, in this case imaginings of what might happen: 
negative future scenarios, many of which involved commercial organizations profiting from leaking, misusing or selling data in the future. Here are some of the things that survey respondents said: "This data could be shared with pharmaceutical companies and misused by them." "There'll be breaches, as overworked staff could make mistakes." "The data could get into the wrong hands." "Maybe they will share it somewhere where it's not safe." So, concerns and imaginings like these can in turn lead to distrust, which Toby promised you I was going to talk about. Now, distrust is usually seen as a problem, because trust is often seen as a positive emotion. For example, in the UK, the Royal Statistical Society has claimed that we're experiencing a data trust deficit, and this is framed as a problem to be solved. However, assuming that trust is desirable can actually delegitimize appropriate distrust. Maybe in this instance we might want to argue that it's appropriate to distrust the arrangements made by the UK government with firms like Google and Amazon and Palantir when they're not telling us the nature of the agreement that they have reached. And this kind of distrust, resulting from feeling, evaluating and imagining, can also be seen as a kind of agency, as previously defined: a reflexive and evaluative process. Distrusting is a choice resulting from structural inequalities, in this case inequalities around access to knowledge and information about how data systems work, specifically the COVID data store. And it's this conceptualization of distrust as an agentic choice that leads American sociologist Ruha Benjamin to say that the problem of distrusting citizens should be recast, or reformulated, as an issue of social justice. I might have a drink of water, I'm coughing a bit. Oh, that's impossible to open. Yes, it is. Sorry, I need help opening the bottle. Sorry, it's a water interlude. Here we go. Thank you. Thank you very much. 
I didn't think the water interlude would be as long or as complicated. Okay. Anyway, that's the end of the stories bit, but I'm going to talk a bit now about what I take from these stories, and kind of remind you, and remind myself, that I started this talk by saying that by focusing on datafication in everyday life we can see acts of agency, and that all of the acts of agency I was going to talk about were small-scale and mundane. So let me recap: where is the agency in what I've presented? And maybe I'll change the slide to a slide of my book cover, just so that you're not distracted by the quotes that are up there. So I see the ways in which participants responded to the questions we asked them about uses of data gathered from signing into iPlayer as agentic. They took control of the questions we asked, responding to questions about data practices with answers about inequality. They de-centered data, as I've said. Now, I think that these kinds of answers can often be seen as digressions in empirical research: you're not answering the question that I asked you. But what I want to say is that they can also be seen as a kind of agency in the research process, through which people link our research questions to their experiences and to the things that are important to them. And like digressions, imaginings can also be seen as small-scale acts of agency. Imaginings could also be seen as digressions; we have digressions as a code in our code book and in our analysis. But actually what imaginings are doing is filling in the blanks: the stories not told, the information that's missing. We don't know what's going to happen with our sensitive personal health data in the COVID data store, so we fill in the blanks, and this again is agentic, I want to suggest. In using data as a starting point for imagining what it's like to be a migrant encountering the worst excesses of the British media, Sally and Horace also de-centered data. 
They also focused on what mattered to them: the experiences of migrants in the UK. And in the absence of transparency about what kind of access to personal data Google, Amazon and Palantir have, for how long and for what purposes, as I've said, people fill in the blanks. This sometimes leads them to imagine the worst; it sometimes leads them to a position of distrust, and again I've argued that we could see this as agentic. We could say, at a very simple level, that asking people what they think about different data uses invites them to moments of agency: it invites them to evaluate, which involves mobilizing agency. And in building up this argument through the stories that I've told, I'm personally influenced by British political philosopher Andrew Sayer, who's written a great book called Why Things Matter to People. What he argues is that in the social sciences, people's evaluative relation to the world is often ignored, but values and feelings need to be taken seriously, in the way that I'm attempting to do in this lecture. He says that people's evaluative relation to the world constitutes ethical being in everyday life; it forms part of social struggles about how to live, about what is a just, virtuous or good life and a good society. And I think this is reminiscent of my argument earlier that agency can be understood as involving reflection and as acting ethically. So what I'm trying to persuade you all of is that forming judgments and mobilizing values are ethical reflections, and they form part of agency. But I do want to add that, methodologically, talking to people about these matters isn't straightforward. People don't feel invited to small-scale moments of agency if we ask them their opinions about things they don't know much about, and data systems, data-driven processes, datafication are often things that people don't know much about, in part because they are opaque by design. So people can't weigh up the fairness or unfairness of data-driven systems if they're not familiar with them. So 
this is a methodological challenge, I think, and the way I respond to it in my research is to show people data uses, rather than expecting them to be able to evaluate data uses without understanding them. So in the research about social media data mining that I talked about, we presented participants with real-world examples of uses of social media data mining. On Living with Data, we provide participants with information about the claimed benefits and the claimed harms of data-driven systems. You can't conclude that people don't mind about a data-driven system if they're not aware of what the potential harms might be. So, for example, again in relation to the NHS COVID data store: we said that it aimed to help national organizations responsible for coordinating the COVID-19 response, but we also said that patient data groups are concerned that not enough detail has been provided about contracts with partners to fully understand who has access to data, for what purposes and for how long. So what I'm saying here is that methods matter. And, linking what I've been talking about back to the politics of all of this: agency matters if we want to establish more just forms of datafication. If current data relations can be harmful to ordinary people, and to some people more than others, as notions like data colonialism and surveillance capitalism suggest, then alternative data relations are needed, and we need agency to achieve these alternative data relations. I want to suggest that there's little scope for agentic engagements with data in the visions of datafication provided through notions like data colonialism and surveillance capitalism. Yet there is agency out there, identified not just in my work but in loads of research.
So, for example, Jean Burgess and others have a book coming out called Everyday Data Cultures, which looks at agentic engagements with all kinds of datafied technologies, from music listening to sex tech. There's been loads of research about everyday health self-monitoring practices and the place of agency in that. There's research about how vloggers and influencers and others negotiate algorithms on the platforms on which they're operating. And even in work on digital exclusion we can find agency. Now, digital exclusion is not usually seen as an agentic choice, but rather as a consequence of limited access to technologies and of marginalization. But what Seeta Peña Gangadharan says is that digital exclusion can be a form of refusal of technologies, based on evaluations of their harmful pasts and of histories of marginalization. This affirmative take on digital exclusion sees agentic possibilities in self-exclusion from technology and technologically mediated society. So digital exclusion is an agentic choice, according to Gangadharan. We might want to argue, then, given these small and mundane and other types of acts of agency that we can see around us, that all-encompassing notions like data colonialism are empirically inaccurate in their failure to take account of these agentic engagements with data. In my book Post, Mine, Repeat, which was published in 2016, and an image of whose cover you've been looking at for quite some time if you're here in the audience: in this book, which focused on the increasing ordinariness of particular kinds of data mining,
I argued that as datafication becomes more ordinary, new data relations emerge. This phrase, data relations, came into being in a conversation with an excellent British sociologist working in data studies, David Beer. What I was arguing was that these new data relations that were emerging are increasingly integral to everyday social relations. But, importantly, new data relations are undetermined; that's a concept used by Noortje Marres and Carolin Gerlitz to mean not yet settled, not yet stabilized. I said that in 2016, and I think it is still the case. So in my book I paraphrase the question that Andrew Feenberg asks at the start of his book Transforming Technology: must human beings submit to the harsh logic of machinery, or of data colonialism, or of surveillance capitalism, or can technology be fundamentally redesigned to better serve its creators? And my answer is no: it's not the case that we must submit to technology's, or datafication's, or digital society's harsh logics. There is agency out there. But the small moments of agency that I've discussed here don't mean that the harsh logics and unequal data structures have been torn apart, or that they're not powerful, or haven't gained power. Agency in the face of data relations, like data relations themselves, is also undetermined. So the stories I've told highlight how inequalities cut through data-driven systems in multiple ways, affecting the feelings of the structurally disadvantaged and those who care about these things. But I also want to say that surfacing this point is not an inevitability in research into everyday life in times of datafication.
Okay, so this is another methodological comment from me. Methodological choices matter, relating to whose stories we seek out and listen to, and how we engage people to tell their stories. If we speak only to the white, able-bodied middle classes, we can expect a different kind of story to emerge than the ones that I've told here. The stories I've told also highlight how emotions matter in everyday engagements with data. In the US, the academics and activists Catherine D'Ignazio and Rahul Bhargava, whom I quoted earlier, have found that emotional connections facilitate engagement in the data projects that they have organized. For those of you who can see the slide, here we see a large-scale data mural that they worked on with some disadvantaged groups in the US. So they're saying that emotional connections facilitate engagement, and for Catherine D'Ignazio, legitimizing affect is an important principle for inclusive data work. Working with emotional responses in projects which seek to engage socially disadvantaged people on data-related issues, as D'Ignazio and Bhargava have done, might lead us towards more inclusive datafied societies, at least for some groups in some contexts. The fantastic media researcher Sonia Livingstone complains that there's too much focus on power and not enough on lives in research into digital society; or too much focus on structures and not enough on agency, we might say. She's critical of theoretical imaginings of people's passivity which ignore their actual and possible agency. To make her point, she quotes a Washington Post article about The Circle, a film based on Dave Eggers' dystopian book about the power of internet companies. This article says that theoretical imaginings that ignore agency view people as distracted into idiocy by the insatiable demands and worthless pleasures of the internet. Where are the people in these kinds of theories, Livingstone asks. And like Livingstone,
I'm arguing that in order to make sense of digital society and to understand everyday life in times of datafication, we need to center people. We need to center the experiences, thoughts, feelings, values, evaluations and judgments of ordinary people. We need to listen to people's stories. And, as Toby hinted, I would say there have not been many people's stories in this lecture series so far; not that I have looked at all of the lectures, but I have looked at a handful of them. So I want to close by saying that I've quoted people in this lecture, but I haven't gone "quote, end quote", because I think that might have been a bit disruptive. But there are quotes in here; these are certainly not all my own words. And I also want to acknowledge that I don't research alone. I research with amazing teams of excellent people, and if you visit the project websites you can find out a bit more about who they are. Thank you very much.

Thank you, Helen, so much for this very inspiring, very clear-cut, very clearly delivered lecture. It's always nice for non-native speakers to have such a clear lecture too; I think many more people can understand it that way. Well, obviously we're going to talk about agency for quite a bit here on stage, for about ten or fifteen minutes, until it's your turn here on the floor and through the digital tool, so ask your questions now. I was wondering, to start, Helen: what does this model of agency you propose mean for science, actually? Would it be correct to say that the agency of the interviewed sort of rises where the sovereignty of the researcher wanes? So, as a researcher, you have to give up a certain kind of sovereignty in order for the agency on the other side to rise. And what would this actually entail for your work?

Thank you. Do I need to turn it on?
No, it's on.

Um, no, I don't think that listening to people's stories, giving people space to say what they want to say and talk about what they want to talk about, necessarily diminishes the agency of the researcher. And I don't have much else to say, really. You know, I work with teams of people where we prepare really carefully what we want to go and talk to people about, and think about who we want to talk to, and we tweak and edit and iterate our research questions, and we invest a huge amount of energy and effort into that. And then we go and talk to Brenda and Virginia and Chris, and they tell us what they want to tell us. It's not diminishing of our agency. You know, I don't have to interpret what they are saying in the ways that I've interpreted it, right? I've got some agency there. What I could have done is I could have said: oh, digression, digression, digression, not going in the data set; here's my data set, let me analyze that. So, you know, there are forms of research which are participatory, like co-produced research. I wouldn't claim that I am doing that. I think that's great and really interesting work, but I haven't done it myself. You know, I haven't generated a research project from the ground up with communities. I think that is a very valuable approach; I just haven't used it myself. So those are my multiple thoughts on researcher agency. It's not something I've ever thought about before, but I certainly don't feel the absence of agency when I do my research. It's really important to listen to what people have to say, isn't it?

Oh, sure. I didn't mean so much diminishing your agency, but just giving up part of your sovereignty. I think that's a different term, really. I didn't think it was a bad thing; I think it's wonderful to lose a bit of your sovereignty, actually. What does that mean for you,
giving up your sovereignty?

Sovereignty? Well, it just means less teleological work in this case, I would say, right? The outcome is more open, more uncertain, more opaque in what you do. That's what I mean by losing your sovereignty, for another kind of agency to rise.

Yes. Well, I'm all for what you're describing.

Let's talk a little bit more about a term which is very interesting, and which I'm not sure I have grasped completely yet: decentering data, or decentred data. What does that actually mean, and what does it lead to, very concretely, in your research? What, then, is decentered data? Well, it is what you call the digression. It is a different answer from what you may have expected in the first place. It puts different issues on the table that probably haven't been on the table before, or hadn't been on the table. That's how much I understood at that point. But decentering data, that's saying a lot, right? I mean, where does this decentered data actually land in the research? Is it always taken into account in the end when you present your research? Or are there certain instances where you say, no, this really is off topic, and you just leave it out? What do you do then?
Yeah, okay. So this concept of decentering data is something that, as I said, Seeta Peña Gangadharan and Jędrzej Niklas write about in a 2019 article, I think in Information, Communication & Society. It's also something that Lina Dencik writes about in a book chapter in a book about citizen media practices; I can't remember the exact title of the book. So what they're arguing, and I am with them on this, so I'm arguing with them, is that if you want to find out about the role that data is playing in the reproduction of structural inequalities, in discrimination, in bias, in decision-making, by the police, by social services around child welfare, for example; if you want to find out about the role that data is playing, don't start by asking questions about the role that data is playing. Okay? Because, and this comes back to that point about talking to people about things that are hard to know about, the role that data is playing is not always the thing that matters most to people. So if you talk to them about that, or ask them a question about that, you might not get a very interesting or informative or useful answer. But if you ask people: what matters to you? I had my child taken away from me. Well, how did that happen? It happened because of an automated decision-making process that misinterpreted some data, because of the training data that was fed into it, which didn't account for the experiences of people like me. Then you find out something interesting.
So it's a kind of call for a methodological approach, for those of us that are interested in studying the role of data in society, of not asking people questions about the role of data in society. It's something that we tried to do in the survey that we did. So we asked about people's attitudes to various different aspects of data uses. You might say: oh, does this concern you? And people might say: oh yeah, that really concerns me, and then go away and never think about it again for the rest of their lives. So how concerned are people? What is the place of data-related concerns in people's lives? We asked some questions on our survey where we asked people to rank concerns, and these were taken from a national "what concerns people" poll, and we included data uses as one item. We had a list of broad concerns, of which one was data uses, and that landed in the middle. And then we had a list of more narrow and more specific concerns. There were 15 items in the list, and interestingly, data being used in unfair ways was the third most concerning, after the economic costs of Brexit and something to do with COVID and the NHS; I can't remember the exact wording. But the least concerning concern was "my personal health data being used to respond to the COVID crisis". So that told us interesting things. So decentering data is about putting people's concerns first in order to find out about the role that data plays in society. But it also links to these bigger questions about researching concern. So, I guess, yeah, they're kind of empirical challenges.
I guess, and it's certainly a shift in perspective. But when is the point reached, do you think, where those small acts of agency, as you called them, become sort of empowering? Because, and I think you quoted this, 86% of your respondents said they were concerned about data being used in unfair ways. Most people are; I think we're all in this together. Pretty much everybody in this room will probably say yes to that question, right? Yes, we are concerned about the ways our data are being used and how they're sold and so forth, because we don't know, because those systems are opaque by design, as you put it. Now, okay, we say yes. Is this a form of agency too, and where has it gotten us? Because I don't think it has gotten us very far, right?

Yes. Well, good question, and I don't know if I've got the answer. So if the question is: when do small-scale acts of agency become empowering? Yeah, it would be really interesting to try to find a way of mapping that, wouldn't it? And I guess this is a much more empirical and story-based lecture and conversation than a lot of the other ones in the series, so for me that's a kind of empirical challenge: like, how would you do that? But, you know, there's a big research project in the Netherlands about data activism; it's been going on for years, and the person that runs that project and me have a really productive dialogue about how what I'm studying meets what she's studying, which I think is your question: how does the everyday meet resistance? And I don't think there's one clear and simple answer to that question, but I think they are in dialogue with each other. And when you ask: when does reflecting, evaluating and deciding that you don't like this become empowering?
Well, I would give an example. Not this summer, but the summer before, in the UK, an algorithm was used to decide the A-level results of students, 18-year-olds who then did or didn't get to go to university on the basis of that result. Written into that algorithm was the previous achievement of the school, so students from elite schools got good marks and students from schools in poorer areas got less good marks, based basically on social class. And people evaluated this and decided they didn't like it, and took to the streets with placards that said "fuck the algorithm", and the government reversed its decision: algorithmically decided grades were replaced with teacher-assessed grades, and people got the grades that they deserved, not the grades that an algorithm thought they should deserve. And I would say, you know, things like the GDPR exist because of things that started as small-scale evaluations of whether things are fair or not, leading to organized activism and advocacy. So I think that you can trace links between the types of things I'm talking about and bigger achievements. Which is absolutely not to say that datafication is not a problem. It is a problem; that's why we need agency.

Another very interesting point, I think, was when you talked about FAccT, right, about fairness, accountability and transparency, where critics say: well, this would be a typical case of what Evgeny Morozov, as one of the more prominent speakers on the subject, called solutionism. You cannot design away inequality by inventing digital tools, or by letting big consulting firms do the talking until all that inequality is sort of swallowed by the public. That's true. But don't you think that something can still be done by tinkering with the algorithm, so to speak, as you called it? So that we sort of need both. We actually need that question:
do they shift power, when we talk about the systems and how they reproduce inequality, or not? That's one question. But the other question is: well, we can fix a little bit of something if we fix the systems and if we check them with programs like the ones you just referred to, like FAccT. I mean, it's not exclusive, or did I get that wrong in your lecture?

Yes, I agree that we do need to tinker with the algorithms, right? That's not all that we need, but it is certainly a part of what is needed. What I was trying to do in the lecture was to show these cycles of debate where, you know, we know now that data-driven systems, algorithmic systems, automated systems have discrimination written into them, and so data scientists are trying to respond to that. And that's a good thing, and they're not doing it alone; they are doing it with more critical thinkers, or ethicists, and so on. And I think that that is absolutely well-intentioned and intends to have positive effects. But that in turn gets criticized, because they're not asking the right questions, or because they conceive of inequalities in certain ways. I wasn't personally taking a position on that cycle of debate in the lecture, but I can do that if you want me to do it now. What I was trying to say was that I think something is lost in that critical position that discounts fairness as a useful concept in this space. That was the point I was making. But, yeah, I mean, there are people like Joy Buolamwini, who I referenced, who is part of the Algorithmic Justice League, I don't know if people have heard of that, and they produce resources for data scientists and technologists, and, you know, do the things, tinker with the algorithm, right,
in order to try to address, to pick away at, the problems. So even those people who are critical of these things are still trying to work out where there might be some minimizing of bias and discrimination technologically, as well as looking at other factors. Not everyone; some people are standing on the sidelines shouting. But lots of people are engaged in dialogue.

Thank you so much. I think we'll start taking questions from the floor before we go to the digital space, so to speak. Is there a microphone out there? I cannot see that much, I've got the wrong glasses on tonight. Where's the microphone? It is back there. So anybody with a question or a comment, please speak up now. It's those little pauses that make it interesting. There's one in the back and one in front; let's start in the back and then move to the front. Sorry, now I've got it; she's keeping the microphone.

So, thank you so much for your talk, it was very insightful. I was wondering, because you also mentioned the GDPR, and that this actually evolved out of activism and so on: how do people view that, or is there any insight from your research into how this influences agency, or the perceived agency of actors, if new systems are also kind of designed away from them? Because I think in a lot of countries the GDPR was perceived as something people now have to do additionally, that actually doesn't really give them more insight or anything like that. So is there any kind of insight you can give on that? Thank you.

So the question is: how do people feel about the GDPR,
which was designed without their involvement or consideration?

Not how they felt about it, but how that influenced their agency, or their perception of their own agency.

So how the fact that the GDPR was designed elsewhere influences their perceptions of their agency?

All right, so I'll give it one more try; I think I'm being very clear, no? How the GDPR, already being there and actually making things less transparent for a lot of people, and more overwhelming, is influencing the perception of agency.

Yes. I mean, really, whatever the question is, I don't have an answer, because I haven't researched that. I haven't researched how anything to do with the GDPR affects people's sense of their own agency. But I think that underlying your question is a note that the GDPR is not simply and only a force for the social good, right? It has brought with it a package of complexities. And I know somebody who was wanting to research the effects of the GDPR on a small local rural school, and I don't know if they ever got to do that, but that is really trying to look at the kind of question that you're asking. Anecdotally, a head teacher of a school told me that parents know that it really annoys the school if they put in a subject access request, because it takes them a load of time. So if parents are disgruntled with the school, if they're annoyed with the school, they'll put in a subject access request, because they know that that will suck away the school's resources. There's a bit of agency there, in what the parents are doing. But I think, yeah, there's a lot to be researched around the effects of something like this, and all sorts of particular aspects of the GDPR are still to be researched, I think. So, for example, we have a right for a decision about us not to be made only by an automated system, right? There has to be some human in the loop. How do you implement that?
What's that like in the world, on the ground, in people's everyday lives? I think there's a lot about the everydayness of the GDPR that we could research, and I don't know if any research has been done.

Jeannette, please.

Thank you, Helen, for this really interesting talk. What I found very impressive is your finding that the people you want to interview about datafication might lack the language and the experience to answer the questions you are interested in. That is probably true for a lot of empirical research on digitalization: it concerns things that are very difficult to experience in a practical sense, so people who are not reading about it all the time indeed lack the vocabulary to answer the questions, and perhaps even to understand the questions. And this raises a lot of methodological issues. But what you call, how did you call it, decentering data? I wonder whether people might not feel manipulated if you present them with stories to sort of approach the issue you are interested in from an angle of their life. Is it that they then understand what you mean, or is it that they wonder what you are after? That there could also be agency in escaping your questions?

Okay, very good point. I think, to start with, I wouldn't frame it as a lack, okay?
I think that's a negative framing, and I don't think that I was describing people lacking knowledge and lacking vocabulary. I think I was describing people having a different vocabulary, using a different vocabulary to the one that I might use, for example. And I think what people are bringing to the research moment is the knowledge that they do have, which is knowledge about their own lives. I suppose my answer to your second point is that I think we have a personal, political orientation in the research that we do that would lead me to try to make sure that neither me nor anyone in my teams was manipulating anybody. Okay, so there are ways of doing it. There are ways of doing it where you absolutely don't tell people what it is that you're interested in, and there are ways of doing it where you start with what matters to people. Andrew Sayer's book is called Why Things Matter to People, and he's saying: let's focus on what matters to people. So if you start with what matters to people, you can bring data into that. And anyway, I am interested in what matters to people and how data fits in, and I think a lot of researchers in this space are. It's not the case that they're not interested in what matters to people, that they're only interested in data, and so feel like they have to talk about what matters to people in order to get to data. So I think there are ways of doing these things, and there are orientations that we have, political, personal, ethical, moral, that I think would address the concerns that you raise.

I have a follow-up to that question, Helen, sort of. Because, hopefully without being discriminatory, it is probably safe to say that social life transformed into data has become much more complex nowadays, and much more opaque, than in previous periods of history, because this is basically a modern project, right? Social life being transformed into data: when we talk about street names and the European city being remodeled, about the postal system, about taxes, when we talk about the census, there was always resistance, right, public resistance, very violent resistance in some cases, against these attempts at exactly that, transforming social life into data. That data was then opaque too, right, but everybody could see how it was done: by labeling a house with a number, by knocking at your door and asking, how many people live here, what's your profession, what's your age and gender, and so forth. I mean, post offices burnt in Germany in the '80s, even, you know, and that's not that long ago, because of a census that basically gave away no information compared to what people share nowadays online; and resistance was very vivid. My question, then, as a follow-up, would be: do you think it has actually become harder to resist those kinds of practices, which are common to power, because that's what power structures do, right? Has it become harder to resist these practices because they have become so opaque, actually, and complex? I mean, I don't know what's happening to my data, and I do these things, like this event, all the time. I don't know exactly either; I lack the vocabulary, to be honest.

Well, I would have to do a careful historical comparison to answer the question, is it harder? But I can answer the question, is it hard?
Yes, it is. Yeah, it's opaque, isn't it, and one of the things that people were expressing concern about, as I described, was the opacity of this. And I'm sure, I haven't watched the video, but I'm sure that José van Dijck will have talked about this: not only is it more opaque and more widespread that aspects of life are turned into data, but it is much more consequential, because data-producing platforms are providing public life. They are increasingly the foundations of public life and public services. So not only are they more opaque and more widespread, but they're also more consequential, making it simultaneously really important to resist and complain, but also, as you say, very hard to do so.

Let's take questions from the digital realm. Christian, are you ready? You have some on your screen?

Yes, we have one question from Slido. You said that we should focus on people's concerns, on what matters to them, to find out what role data plays in our society. But how can this be combined with big concepts and transferred to a political level to solve these concerns?

How can this be combined with big concepts and transferred to a political level to solve these concerns? Two separate things, I would say. Let me talk about the transferring to a political level first. So that's a question like: and therefore, what should we do? Okay, so given what you've said, what should be done?
So, you know, some of my research is funded by charities who are really committed to social change, so they want us to come up with recommendations about what should be done, and so we do. The types of things that we have said should be done, here's something that policymakers could consider: stop gathering data. Stop doing it, okay? Stop involving commercial organizations in opaque, profit-making ways. Enter into open dialogue with different publics about what you're doing. Communicate clearly about what you're doing, although that alone is not enough; that's the transparency problem. Don't build policy on the basis of one study alone; do an evidence review of five years of studies before you build policy. So, you know, we say things like that. There are some answers to that question of what could be done. But I do wonder what happens to those kinds of statements. We have people from policy on the advisory boards of some of our projects listening to what we say. They are listening, but still, do we see change happening? I think that's another process that needs to be tracked. What happens to the recommendation that comes out of research into public attitudes, which is one way you could describe what I'm doing? What happens to it? Where does it go? Where does it end up? Can we see that it has ended up in policy of some kind? I don't know. I personally actually think it's easier to influence practitioners. So in the research about the BBC that I talked about, we worked with people from the BBC. They're interested in what their audiences have to say about what they do with their data. I think sometimes, if you want to make change happen, you can go more effectively to practitioners. Now, that other question: how can we link this to big concepts? Are my concepts not big enough?
You know, I think there were some big concepts in there, the big concepts that I think are appropriate to what I'm talking about. I'm looking at you, Christian.

You didn't even ask the question yourself; you were simply reading out a question asked by somebody else. I'm just looking in your direction. He's just the advocate; that's his job. More questions from Slido?

That's it.

Okay. Well, I have maybe one more, maybe two more questions; let's see. I also thought it was very interesting when you talked about imaginings in relation to visualization, because there are many, many traps there in how we visualize data, which can also be very manipulative. To have a follow-up to Shannette's question, I'm asking again in terms of scientific practice: do we need more interdisciplinary work there, between the fine arts, art schools, graphic designers and so forth, and data scientists? Is this something that is already happening in your department or in your research? What would you think would be a good trajectory, good means to overcome the fact that we apparently don't have enough visualizations that enable people to engage with data in a more emotional way?

Okay, so what's the solution to the problem of not having the right kinds of visualizations, is that what you're asking?

Yeah: should there be more transdisciplinary work with other departments, the fine arts, graphic design and so forth, with people who do that on an everyday basis?
Yeah. First of all, I think we have to be careful about describing data visualization as manipulative.

It can be; certainly we see that in the tabloid press every day.

It can be, but to visualize data, choices have to be made. And I think there are some really good visualizations where choices have been made, so we end up with this visualization and not that visualization, which we could have had if different choices had been made. I do think that the distinction between good visualization and bad visualization is quite an important one, because good visualization practice would involve explaining those choices, explaining how they have been made.

And yes, I think interdisciplinarity is good. I am interdisciplinary; I don't know how to describe what my discipline is. There are collaborations like that going on in the UK. There's a big project on art, data and health, funded by the Arts and Humanities Research Council, working with artists, and other projects that have brought visualizers or designers in. I wasn't really sure what you thought the problem was, but I do actually think it would be really interesting if an artist taught a data science course.

That's what I meant.

Yeah, I'd love to see that, and to see what the outcomes would be. These things are happening; there are projects of that kind. Catherine D'Ignazio, who is based in the US, used to describe herself as an artist, but she now describes herself as a data visualizer because it opens more doors.

What's the difference?
Yeah, but she does bring her art training to working with data and working with people, and I think it's fundamental to the great work that she does.

A very last question, on a cultural factor maybe, because, you know, we're here in Germany, and I think it's safe to say that the German-speaking countries in mainland Europe are very data-sensitive, very skeptical about data. We see that playing out here in a very bad way right now, because we have very low vaccination percentages and so forth; that's pretty much what is happening, and it has to do with data sensitivity and with people resisting that kind of power. If you look at the US, where people apparently are, or used to be, much less sensitive about what happens to their data: do you think you'd get a different outcome with your kind of research if you did it in the US rather than in mainland Europe? And how would you describe this general sensitivity in the UK to what happens with your data, in perspective to those two contexts?

Yes, well, I would certainly agree that context matters.
That's one of the big things that emerges from a lot of the research that I've done: context really matters. That might be national context; it might be context in other senses of the word. It definitely matters, but I would refrain from saying that all of these people in this group feel this way about data. So I wouldn't say people in the UK feel this about data, precisely because context matters: what data, used by whom, in what context, for what purposes, with what effects, makes a difference to how people feel. And that is in part why I do the research that I do. We need to differentiate; we need to understand that there is no "the people" and there is no "the data". We need to be specific about who we're talking about and what types of data, what types of data practices, in what contexts, we're talking about. So there may be a generalized kind of skepticism in one place and not in another, but when you drill down... I've looked at some international comparisons, and I haven't seen any patterns that would seem to be about national context. I think specificity rules.

Okay, that's a good ending quote. Thank you very much, Helen Kennedy, for being with us from Sheffield. Thank you all for being with us here on site. See you next year in this series, I think. Thank you.