Hello, welcome to the session. So this is Surveillance 101 — everything you need to know — or maybe something slightly different. But I'm going to kick off with a little bit of video. Yeah, it's not happy about that. So why don't you just wear sunglasses? We'll come back to that. So I want to give a bit of background to this presentation and where it came from, if I can go to the next slide. So, 2019. I was working for ALT at the time, and it just happened that we were having a conversation with Jim Groom. Jim Groom — nice guy, bit crazy — and I started talking to him about a project I had done with video surveillance. He was intrigued, and he ended up inviting me to the Domains conference in 2019, and this is essentially the presentation I gave back then. It was in Raleigh — North Carolina, or South Carolina, I forget. Then, obviously, the pandemic hit. I'd have loved to have done it in the UK too, but yeah, it never happened. Now I have the opportunity to reshare it with you and also to update it in terms of what's possible now. So, let me just go back to this. In case you haven't appreciated it, there's quite a bit going on here — and it's radiating quite a bit of heat as well, so someone might want to have a fire extinguisher at the ready. So this is where it started; this is the project I was talking to Jim about. I've been very fortunate that, a number of years ago, I did a lot of development on Google products, and Google have a Developers Expert program. As part of that they pull together experts from different product areas, and there was a guy I met called Nico Miceli, a Google Analytics expert. Now, Google Analytics is about capturing website data and doing your analysis — but Google Analytics is actually a huge measurement machine. You're not restricted; it doesn't have to be websites. There's basically an API where you can ping events into it. And that's exactly what Nico did.
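For context, the API in question was the (now-legacy) Universal Analytics Measurement Protocol, which accepted raw HTTP hits — so a sensor only needs to POST a small payload. Here's a minimal sketch of that idea in Python; the tracking ID, client ID and event names are placeholders, not Nico's actual code:

```python
import urllib.parse
import urllib.request

# Legacy Universal Analytics Measurement Protocol collection endpoint
GA_ENDPOINT = "https://www.google-analytics.com/collect"

def build_event(tracking_id, client_id, category, action, label=None):
    """Build a Measurement Protocol 'event' hit as a parameter dict."""
    payload = {
        "v": "1",            # protocol version
        "tid": tracking_id,  # GA property ID, e.g. "UA-XXXXX-Y"
        "cid": client_id,    # anonymous client/device identifier
        "t": "event",        # hit type
        "ec": category,      # event category
        "ea": action,        # event action
    }
    if label is not None:
        payload["el"] = label  # optional event label
    return payload

def send_event(payload):
    """POST the hit to Google Analytics (call this from the sensor loop)."""
    data = urllib.parse.urlencode(payload).encode()
    return urllib.request.urlopen(urllib.request.Request(GA_ENDPOINT, data=data))

# e.g. a motion sensor firing on the Pi:
hit = build_event("UA-XXXXX-Y", "raspberry-pi-1", "nursery", "motion_detected")
```

Each time the sensor fires, a hit like this lands in Analytics as an ordinary event, so the standard reporting graphs do all the visualisation for free.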
He's got his Raspberry Pi, his Arduino board, a motion sensor and a couple of lines of code, and basically he's got a baby monitor: he was using the motion sensor to send data into Google Analytics. And I thought this was really interesting. In keeping with ALT's values of us exploring things, I wanted to explore it myself. It's probably the one project I've done that definitely got me into more trouble than I wanted. So I took my Raspberry Pi with this webcam, and there's an open-source library called OpenCV, which is used for vision processing. I basically replicated what Nico did, but detecting faces. So this is DevFest 2019: we had this at the front of the audience all day, and we got a graph of how many people came in and out. The thing I wanted to stress to people is that it was face detection, not face recognition — you're just detecting that there are faces. But retrospectively, people were concerned — and with Google Analytics thrown into the mix as well, understandably so. As part of this project, though, it made me think about how accessible this type of technology is. In terms of how to do it, it's essentially six lines of code with the OpenCV library. And in terms of other people doing this, OpenCV is actually used in a lot of other places and applications. There's a long history of us trying to explore what can be done with facial identification, detection and so on, going back to the 60s — it's an area that's interested people for a long time. I can't remember your first digital camera, but if it had that option to save your family members' faces, it was essentially using OpenCV-style technology behind the scenes. Obviously, not everyone is going to want to write even six lines of code. But the face has essentially been turned into a service, which — with the other conversations we're having now around AI — I think is quite interesting.
I think, in terms of what happened with face detection and face recognition around 2019 and the response to it, we're seeing similar sorts of conversations around AI in general now — so, looking back to look forward. So: face recognition as a service. Basically, other people take something like OpenCV, or their own version of it, put it in the cloud, and developers can use it as a service via API calls. There are a couple here — some of them scarier than others, as I'll show you. But I thought I'd do a bit of playful stuff with this. So, do we have any volunteers who are happy to have their identity passed to Google? We've got one — I need a couple. Yeah, excellent. We're going to play Domain Invaders, so if you guys want to come up here. So, we're going to look sad into the camera. There we go — sad faces. Now we're going to shuffle that way slightly and do a happy face. There we are: we are domain invading. Thank you very much. I've got a download option here. All this is using is the webcam, and it's making a call to the Google Vision API. Anyone can sign up to use the Vision API as a service — Google are quite happy to take the money and run, essentially. But just jumping in here: it's analysing the faces and looking at various aspects, like emotions, and coming back with a threshold — is this person happy? Is this person sad? And thank you, Dom, for this pic. Do you remember this one? Was it Eden? I've never gone beyond this point — I've obviously made different Domain Invaders. But I rounded this off at the time because there were lots of press items around this technology, and people considering, or speculating, or suggesting how it might be used.
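Under the hood, the Cloud Vision API's face detection is a plain REST call: you POST a base64-encoded image and get back, per face, likelihood ratings (VERY_UNLIKELY through VERY_LIKELY) for joy, sorrow, anger and surprise — the "happy or sad" thresholds mentioned above. A minimal sketch of building such a request (API-key handling simplified; the image bytes here are fake):

```python
import base64
import json
import urllib.request

VISION_URL = "https://vision.googleapis.com/v1/images:annotate"

def build_face_request(image_bytes, max_faces=10):
    """Build the JSON body for a Vision API FACE_DETECTION request."""
    return {
        "requests": [{
            "image": {"content": base64.b64encode(image_bytes).decode("ascii")},
            "features": [{"type": "FACE_DETECTION", "maxResults": max_faces}],
        }]
    }

def annotate(image_bytes, api_key):
    """POST the request; the response contains faceAnnotations with
    fields like joyLikelihood and sorrowLikelihood for each face."""
    body = json.dumps(build_face_request(image_bytes)).encode()
    req = urllib.request.Request(
        f"{VISION_URL}?key={api_key}",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

request_body = build_face_request(b"\x89PNG fake image bytes")
```

That's the whole barrier to entry: a webcam frame, one HTTP POST, and a billing account.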
Which is — I look back at this and it's quite disturbing, really. What do we learn from this? But as part of this, you see these small-scale examples, which hopefully go nowhere; at the other end of the spectrum, you've got something very different happening. In terms of training these models, there are various datasets out there — this one is 110 million celebrity faces that have been coded — and there are various of these datasets lying around. What's quite interesting — can people remember? — I think it was around 2019, there were news stories. In terms of timing it was slightly ironic, because it kicked off during the Creative Commons conference in Lisbon: the idea that big companies were taking Flickr data that was Creative Commons licensed and using it to train their models. The issue with that, though, is that it had nothing to do with the Creative Commons licence. What had happened was that a researcher looking for a large dataset went to Flickr and thought they would do the good thing of taking Creative Commons licensed images — but fair use doctrine meant they didn't need to; they could have ingested and used fully copyrighted images. And given the current conversations we have around AI, I think we're still at that point where anyone can just point something — particularly in the US — and suck that data up. Even now, I feel, we don't have controls that prevent that from happening, which is quite disturbing. I'm contributing to the dataset; I'm tagging myself. We're all tagging ourselves, just helping the next model along. If you're interested in finding out about the different datasets that are available and how they're being used, there's a really good project — originally funded by Mozilla and still going today — called Exposing.AI. They basically catalogue public datasets, mostly image-based.
So if you want to read more about the Oxford Town Centre dataset, with its 2,200 identities, or the Duke University dataset, they're trying to expose what's happening and make people aware of the data that's available. Meanwhile, over in China, there are alarming signs of where this can potentially go if it's not properly legislated. [Video clip:] "Social credit. How she buys, how she behaves, gets tracked and scored to show how responsible and trustworthy she is. It's called the social credit system, and in one version now being tested, a person's reputation is scored on a scale of 350 to 950. With a good score of 752, she's okay with it — in fact, most people are. Is it a mechanism to push people to become better citizens? It's big data meets Big Brother: expanding how the government monitors, understands and ultimately controls its 1.4 billion citizens, thanks to advances in artificial intelligence and facial recognition and a web of more than 200 million surveillance cameras. Aren't people bothered by privacy concerns? 'I think a lot of cameras to keep the safety is really good. We can accept it.' Companies are experimenting with the algorithms to help the government create a new national social credit system. The government also has pilot projects: citizens are required to do hours of unpaid work to get benefits, and scores are docked for things like — literally — a messy yard, gossip, even jaywalking; video of offenders is shown on the local news." Have you come across social credit before? I see some nods. So, again, this is another one of those stories that was big in 2019, and it had gone quiet since then — up until, I think it was 2022, when legislation suddenly appeared on a Chinese government website saying they were going to make this law. [Audience comment.] Social credit — really? I did not know. Thank you for that.
So obviously China, as part of their social credit system, are not only developing the tools and mechanisms to do this but generating the data as well. This is one of the Chinese facial recognition systems, called Skynet — and just to make that slightly more obvious: as featured in Terminator. I don't know if that's just a really crazy joke. Anyway, you can find out more about that in the New York Times, so I think it's a reliable source. I think it's alarming what can happen without regulation — and actually with state sponsorship in this area. And it's not just social credit; they're using this technology in various aspects. There's one in particular where you're talking about state control of toilet paper. But there's another one in terms of minorities — monitoring minorities — so again, we're getting into really scary territory in terms of what they're potentially doing with this technology. And it's not just in public spaces; it's in supermarkets. The extent is quite massive. Flipping across to the US, similarly, face as a service has been used in various ways. Again, around 2019, stories started emerging around things like using it to track people at concerts. And there was this period where there was a vacuum of legislation and regulation that didn't restrict this in any way — there was nothing there. Similar to social credit, it was starting to trickle down into other parts of people's lives. But the wheels did start turning; people did start wanting to put the brakes on this. So there were a number of small victories — sometimes at state level, and California was particularly on the ball in terms of starting to protect people's concerns around this.
And then, at a similar sort of time, we've got this kicking off in the UK. Do you remember this? [Video clip:] "Live facial recognition: police cameras in a busy London street, and everyone gets scanned. Here's what happened. He didn't want to be caught by the police cameras, so he covered his face. Police stopped him — and they photographed him anyway. [crosstalk] Police said this was disorderly behaviour, so they gave him a fine. [crosstalk] They gave the man a £90 fine. There you go — facial recognition, £90. Well done." So the Met Police and South Wales Police had just started trials — let's see if I can get this on the slide. At the time there was guidance on the use of surveillance, but there was this legislative hole in terms of usage. So it went to court, and South Wales Police actually lost the case. As a response, the original guidance, published in 2014, was updated to reflect the findings from the court case. So it's good, I suppose, that legislation caught up — but there was that period where we were in the unknown in terms of what was going to happen next. Who's right? Who's wrong? The other aspect of this, just to think about, is that legislation can work for us, but it can also work against us. Burka bans — and it's surprising how many countries in Europe have some sort of burka ban — preventing the wearing of the veil in public places or schools, are actually quite widespread.
So my fear is that we start legislating against covering our faces or wearing sunglasses. I think it's definitely something where we, as the public, have to keep our politicians pointed at where we want to be. But the danger is — think back to that social credit piece, where it was "more cameras will make people feel safe" — the danger is that people will lean towards "I want to feel safe" rather than having the freedom of expression to cover their face if they want to in public, and to walk in public places without fear of being captured by cameras and then processed. And just thinking about this: recently, in my town, a group of youths started hanging around the medical centre and abusing the staff there. A lot of the immediate responses were "let's put surveillance cameras in" — but what's that going to do? They're just going to move on to another place. What we need to do is invest in understanding why those people feel they can do that, and deal with the drug problem. So this is back to Audrey Watters, 2014. My position on this is, like the Luddites', that it's not necessarily the technology — but when we get to a point where we're no longer caring about how the technology is being applied, that's when we seriously need to start rethinking what we want to do with it. And there are actually positive uses of face detection and video surveillance; I think we have to keep in mind it's not all negative. So Kairos is one of the companies I showed you — face as a service — and this is something they've put together showing positive benefits of facial recognition. It's not just humans; it can be animal welfare as well. And this is quite an interesting one: Scope published a report recently on assistive technology and gamers, and the barrier for a lot of people is that it's very expensive equipment, hard to install and hard to access.
So Google have come up with GameFace — basically a downloadable piece of software that uses your webcam, your head motion and your facial expressions to give you a game controller, essentially. So it's not all bad. And in this room, I think we all know technology is not neutral: we can decide how it's used and how it's applied, and make sure we have the restrictions we need in place to prevent the bad stuff. So, I just want a final piece — who's familiar with They Live? If you take one thing away from this presentation: make sure we can always wear sunglasses. So, final one — final experiment. I mentioned Kairos. Oh, have I broken everything? It was going to happen, wasn't it? Hopefully I can run on and save the day — yes, I can. Did you see the video? Was it working? Good. So, the final experiment: I'm going to They Live myself. Sunglasses off. This is the data that comes back from Kairos — I don't know if you can see it, but it's got my ethnicity values. The other thing it can detect is whether you're wearing glasses or not. So: glasses on. On or off? Come on, I've got eleven of these. On or off? It's not always correct. We'll do one off. Oh, my eyes have turned into sunglasses — it's a big transformation. See — well, don't use the Kairos service, it's terrible and evil. It worked fine earlier; I'm going to blame the late night. So let's go back here. So what? It's messy. But ultimately, I think it's useful to be aware of what's going on in terms of both the technology and the legislation, and how they play together. And thinking about some of the conversations we're increasingly having again now around AI, I think we should be looking at how we can respond quickly and proactively to protect ourselves. And that's me.
Thank you for your time. I welcome any thoughts, reflections or discussion. [Audience question, partly inaudible — about how accurate facial recognition actually is.] I think that's it — as you've seen with the sunglasses piece — and I think that's one of the issues. It's useful to look at when the Met were using this technology: there was some data published about successful arrests, and the numbers were relatively low given the thousands of people they scanned. So currently accuracy is not high, but as we see larger datasets, more compute and higher-resolution cameras, they can make it better — which is a bit worrying. [Audience comment, partly inaudible — about use in universities and consent.] I think, personally, I'm conscious that I'm sharing fewer images of myself online in public forums, but my Google Photos is just a huge data bank of me, my life, my family, and I've kind of accepted that Google are probably mining the hell out of that. Google actually got into issues with the algorithms it was using in Google Photos and the biases they had, so it's obviously refining that model with the data it has. So I don't know what the answer is. And it's not just photos — all media is potentially mineable now. [Further audience discussion, partly inaudible.] So, I don't know if it's worked back there.
[Audience question, partly inaudible.] Yeah. Recently Google put me on some trusted tester programs, so I've seen what they've got in store. I suppose I'm less fearful of this particular area; there are other areas I'm more concerned about in terms of the future — I mean, burning to death is probably the one at the top of my list. I think the thing is making our government work for us on this. In the case of the South Wales Police, the courts worked: they found the police had gone beyond what they should have done. Government worked in updating the legislation. But it's that time delay: you need due process, but it's out of line with how quickly technology is changing, so it's always on the back foot. That's the concern. [Audience comment — about behaviours normalising once people are used to their data being taken, and not just thinking about the guy covering his face up.] Yeah, people just aren't aware of that. I think part of it, back to that social credit piece, is "I've got nothing to fear — the cameras are there to protect me." There is a danger of falling into that headspace, thinking it's going to save me, it's going to make my life better. But it could go to the point of social credit, and it's not there. [Audience comment.] Even where there are gaps or broken pieces in legislation, we as institutions still have to find our own versions of that — our own policies. We still haven't made that a priority.
We have these terms and conditions that students sign up to without ever reading: that they will be recorded, and it might be in ways they haven't foreseen — like their faces being analysed, as we've seen. [Audience discussion, partly inaudible — about a colleague's research over the last couple of years on recorded teaching, gestures and how people interact, and the special conditions around that data.] Yeah — and institutions haven't really been on point with that. It's interesting as well in terms of the balance, because I was working on a project where we did an equality impact assessment of video capture, looking at who it could potentially benefit — people who had other life commitments or mobility issues. So it's not simple black and white, which I think is the tricky bit: it's balancing the rights of individuals with the benefits for individuals. Thank you.