I work at Cardiff University, and I've been asked to talk about a project we completed at the end of 2018, which tried to map the development of citizen scoring in the public sector: the way in which data is collected, analysed and used to produce some kind of assessment, profile or, sometimes, a numerical score that informs decision-making in public services.

Briefly, a little about who we are and how we approach these developments. This is very much a collective effort: about ten or eleven of us work at the Data Justice Lab, and our broad research focus is on data-driven systems across many different areas of social life. The public sector and public services are one area we've looked at, but we've also examined issues relating to migration, health and the workplace, and many of the issues that come up in one context translate across others, so thinking in broader terms can also be helpful. We've been around for a couple of years now, and our particular angle on these issues is their societal implications, what we talk about in terms of data justice: the social justice implications that sometimes emerge quite prominently.
For us, that means pushing the debate beyond questions of privacy, which has often been the dominant frame, and beyond the protection of personal data, to consider other dimensions of social justice: questions of inequality, discrimination, transparency, democratic process and so on. The notion of data justice is really an attempt to broaden the scope of what we're talking about. We are social scientists, not technologists, and that shapes our approach: we look at practices, at how data-driven systems are integrated into different social and institutional contexts.

The project I'll talk about is called "Data Scores as Governance: Investigating Uses of Citizen Scoring in Public Services", and it focused on the UK. We chose the term "citizen scoring", which seems to have become more widespread now but was not used much at the time, for two reasons. First, to allude to the trend in financial services around credit scoring, and to how that logic of scoring people has migrated into spheres of governance. Second, to engage with the rather skewed debate around citizen scoring, which had focused mainly on China and its social credit system, the idea of scoring citizens for trustworthiness and so on. Those developments were often presented as something that would only happen in authoritarian contexts, something alien from a European perspective. We wanted to make the point that, although the context is very different, the logic of scoring citizens through data-driven systems is emerging across European countries as well. Broadly speaking, citizen scoring is about assessing, profiling and categorising populations in ways that sometimes result in a numerical score, but might also take the form of a category or an assessment; we grouped all of this together.

Although there have been requests for a list of the algorithms used in local and central government in the UK, no such list yet exists, so we tried to create a list of sorts and to map these developments by combining different methods. We sent out 423 freedom of information requests to all councils and local authorities in the UK, asking very generally about their uses of data analytics and algorithmic decision-making. That is a tricky thing to do, because one of the things we found is that many local authorities do not know whether they are using these systems, and there is no shared understanding of what these terms, or what we might think of as citizen scoring, actually mean, so it is hard to create a comprehensive list. From those 423 requests, we identified 53 councils that mentioned some use of predictive analytics in particular, which is what we were especially interested in: not simply collating data, but trying to make a prediction about an individual, a household or a population's needs, the element that "big data" alludes to. What we also found in
reviewing these freedom of information requests was that a small number of private companies supply risk assessment tools to a range of different local authorities and councils, so we are seeing new public-private partnerships emerge around these developments. A few councils do develop their systems in-house, for example Bristol, which I'll talk about, where the system has been developed by data scientists situated at the council, but generally speaking a lot of this is outsourced to a handful of private companies such as Xantura, Capita and others. The areas where these systems predominantly appeared in the freedom of information responses are the ones already mentioned: benefit fraud, child welfare and, a big one, policing.

In addition to the freedom of information requests, we looked into a few case studies to get a sense of how these systems are actually used in practice, and some of the news coverage of citizen scoring draws directly on the research in this report. Let me give two examples to illustrate what we mean when we talk about citizen scoring.

The first is at Bristol City Council, where a system is used to try to predict the likelihood of child exploitation. It is built in-house and integrates 35 different datasets on social issues (at the time of the research; that may have changed since). On that basis, all children and young people in Bristol are allocated a score from zero to one hundred, based on how closely their characteristics and behaviours match those of previous victims of exploitation, which is supposed to give some indication of the likelihood of child exploitation of some form. So it is about integrating data into what is often described as a data warehouse or data lake, and then building a predictive algorithm on top of that which allocates an individual profile or score to all children and young people in Bristol.

The second example, from policing, is Avon and Somerset Police, whose area also encompasses Bristol. They have contracted with the software company Qlik, whose Qlik Sense product was initially intended as a performance assessment tool but has been repurposed to rank anyone on the police database, previous offenders, but potentially also victims of crime, from high to low on the likelihood that they might reoffend, based on data about what are predicted to be the likely risk factors for reoffending. Those are the types of developments we are talking about when we talk about citizen scoring based on data-driven systems.

To give a little context on where this development comes from: for these case studies we interviewed a number of people who worked in the councils or the police, so public sector workers, and we also interviewed stakeholder groups from civil society, people who work with service users or with impacted communities, including disability activists, welfare rights activists and community activists, as well as digital rights activists, to get a sense of what they felt were the issues with these
kinds of developments and the challenges they pose. To highlight some of the context in which this is happening: some of it will be unique to the UK, and some of it will be relevant across different countries.

One issue is what we might call an interpretive or regulatory vacuum. Among public sector workers in different local authorities and councils, there is no shared understanding of what "data analytics" or "algorithmic decision-making" refers to, nor of what it is appropriate to do with data. We found a real heterogeneity of data practice: in some instances, councils felt it was appropriate to integrate data across the council, but not necessarily to create predictive scores, for example. So there was no shared understanding of what data is, or should be able to do, and we cannot always generalise.

Another really important element that came up in our research is the prominence of the austerity context. In many cases, austerity was given as the rationale for why these technologies were being introduced. Public services in the UK have had significant funding cuts since 2010 and, as a result, have had to argue that they are changing their practices in ways that seem more efficient. These technologies are often sold as a way to target resources more efficiently or more effectively by focusing on those most in need, by identifying the people services should be concentrating on. So the austerity context was incredibly prominent in the rationale given for why these systems were being contracted or developed. Generally speaking, what councils said they wanted to achieve, as one council described it, was a "golden view" of their citizens: more, and more granular, information about individuals and households, so that they can do their work in a way that targets needs better. This idea of more granular information was key to what they saw these systems as providing.

The challenges they highlighted were predominantly what we might describe as cultural and technical. From managers' point of view, a key cultural challenge was that public sector workers have historically been reluctant to engage with these technologies, so there was no culture of readily integrating them into practice. In some instances, they said, for example, that social workers will never really go along with this, because they feel the only people who should speak about needs are families themselves. So there were questions about organisational and professional culture that some saw as a challenge to integrating these technologies in these contexts. The other key challenge, perhaps partly linked to that, was technical: error rates were often very high in many of these systems, yet the systems were still used. The errors are known to be high, but the systems get used anyway, and often that comes down to very poor data quality: the way data has been collected historically is inconsistent, has a lot of flaws, and does not necessarily fit these models.

On impact assessment, we asked about the extent to which councils consult with citizens about the implementation of these systems and what kinds of impact assessment they carry out. They do, of course, have to carry out some form of privacy impact
assessment and data protection impact assessment, but there was a lack of any broader impact assessment, including of how these systems might be changing professional practices: are people making decisions about the allocation of resources and the approach to public services differently as a result of these systems? That kind of impact assessment was not there, nor was there any impact assessment with what we might think of as impacted communities, of how they might respond to the implementation of these technologies.

That is an issue, because when we spoke with civil society and stakeholder groups, we found they had a number of more substantial concerns about the use of these technologies. At one level there are surveillance questions: the sheer extent of data collection and sharing needed for these systems to work, concerns about who would have access to the data, and the invasiveness of these technologies. Another concern, by now a familiar argument, which was highlighted in our interviews with civil society, is the extent to which these technologies tend to entrench forms of bias and discrimination, in many cases because they are based on skewed datasets: data is collected on some groups more than others, so some groups are over-represented and others under-represented, and it is difficult to interrogate those kinds of biases.

What came out even more strongly, and I think gets talked about less, is the way these systems tend to target, stigmatise and stereotype certain groups. The profiles and scores are based on what "people like you" tend to do, on group traits used to say something about an individual, and there was concern that this leads to a form of stereotyping, but also that by attaching a risk score to an individual you engage in a form of stigmatisation. As one interviewee put it: what right do we have to attach a risk score to someone, to label someone as a risk? This came up quite a bit, and I think it does not get discussed very much.

There was also concern about the extent to which it is possible to challenge these models once they have been implemented, both for the professionals who make use of these scores and for citizens and service users. In part, this is linked to the austerity context, which shapes the opportunities for social workers, for example, to push back on the implementation of these technologies if they feel they do not work, or to challenge what a model says on the basis of other kinds of expertise and knowledge. So the question of how empowered someone is within these institutional contexts was raised as a concern or possible challenge. And in many cases, indeed in all the cases we looked at, citizens do not know that they are being scored or what their score is, so how can they challenge decisions made about them on that basis? That was a real concern too.

More generally, and I will not go on much longer, there was a recognition among many groups that the issue is not necessarily the technology itself, but that the technology is advancing certain policy agendas. As I said, the austerity context means these technologies are being implemented in a context of service reduction, and in many ways what these technologies were seen as doing is advancing that policy agenda of shrinking public services
and reducing public services. So this is really about politics, about policy advanced via technology, more than anything else. That connects to broader questions about whether we need to think about the integration of these technologies in terms that speak to transformations in state-citizen relations, points that have been raised by other scholars who have looked into this as well. To summarise some of the key ideas:

First, what happens to expertise when we start relying on these technologies? There is a question here about the extent to which expertise gets transferred to what we might call private calculative devices, systems developed in the private sphere. It is not clear what expertise informs these risk assessment tools, what gets to count as expertise in that context, and what happens to domain-specific expertise among professionals.

Second, something that gets highlighted a lot is that these systems position citizens as potential risks. That is not to say that risk management has not been a key feature of public services for a long time, but these systems tend to be optimised predominantly to catch risk, so the citizen as someone who is potentially at risk, or a risky citizen, becomes ingrained in what public services come to mean.

Third, there is the question of what kind of social knowledge gets captured by these systems. These risk assessment tools attribute risk factors to individual behaviours and characteristics, and there is a question about what that means for social policy when that becomes the definition of risk: whether focus shifts away from broader social and structural issues, such as inequality, increasing poverty or racism, and towards individuals and individual households as the source of the problem. This individualisation of social problems is another issue that has been highlighted in this context.

And finally, we are moving away from looking at the underlying causes of social ills, for example the underlying causes of crime, towards a continuously pre-emptive mode of governance that targets individuals and families on an ongoing basis, away from what we might think of as preventative measures and towards pre-emptive ones. What does politics come to mean in this regard, when the deliberative process that shapes policymaking is removed in favour of the more operational logic that pre-emption invites?

I think I am just within my time, so I will finish there. Thank you.