Over the last years we've been uploading more and more photo and video data to the internet, either voluntarily through YouTube, Facebook, and Instagram, or without our knowledge through security cameras, and this has been coupled with an increase in machine learning and AI research. I think we are all quite concerned about the applications, but it goes further than that, and our next speaker is looking at what the implications are for our society as a whole, for our culture. He's an activist, a digital rights advocate, and a tech ethicist. Please welcome to the stage Matthew Stender.

Just to let you know: if you want to move up, that would be awesome, and if you want to leave at any time, no worries. I know it's late in the day here at camp and everybody's probably getting a nap in before some party tonight. So, while we're here, let's talk about the future. My name is Matthew Stender. I call myself a tech ethicist. I'm really interested in both the cultural and ethical implications of emerging technology, this question of what it means to be human in the 21st century. And other questions too: what is the social contract between humans and the machines we've built, machines that are becoming more than just tools? What social contract can we reasonably expect? What is a reasonable expectation of privacy when we live in an age of ubiquitous surveillance?

So, my talk today: the first half is going to be an update on what is going on with facial recognition technology, how it's being deployed in countries around the world, and how our faces are becoming an access control vector. And what does it mean that an image, once it's captured in a digital format,
can continuously be reprocessed, with new layers of information added to it?

All right. I first got really interested in the idea of ubiquitous surveillance, and particularly facial recognition technology, because I feel it's an interesting nexus of sociological, psychological, and technological forces. What I mean by that: say 50 or 60 years ago, walking down the street, you could in some ways be anonymous. Maybe somebody could see you, but there was very little record of you being there. Now, with 24-hour-a-day CCTV cameras, those with access to these databases of visual information can rewind. They can also see where their other eyes are around town and actually map out our trajectory through the real space we occupy.

So with that, here are six things from a longer talk that I'm giving in an hour and a half, but I wanted to start with this. I've been working on a theory I call #MIMICS. When it comes to proprietary algorithms and proprietary platforms, I'm very concerned that there are a lot of forces that can be exerted on our information flow, forces that degrade the signal-to-noise ratio between information sources and us. This is a theory I'm continuing to develop; I call it MIMICS, and it stands for monitor, index, manipulate, intercept, censor, and silo. These are six forces through which the black-box nature of proprietary algorithms can interact with our information and eventually nudge us into consumption decisions, or other decisions, without us even knowing they were acting on us.

So let's take a look at the mass proliferation of the image and video technology that surrounds us.
It is pretty mind-blowing that nearly a billion people log into Facebook every day, and that 1.8 million snaps are created in an internet minute. Snapchat is new, and I think Snapchat has been very impressive in the way it has been able to resist purchase by Facebook; it has now had its IPO, and we'll see how that goes. I don't personally use Snapchat, but you've probably all seen the Snapchat interface, in which an augmented reality layer uses vector mapping on a face to create filters. This is actually a pretty sophisticated piece of technology, vector mapping. I think it's a pretty exciting thing, and I'm going to talk a little bit about it later. But it goes to show that as commercial products are put out that we use for fun, there is serious technology being developed that goes into something like Snapchat.

So, I mentioned this a little earlier. These are some bullet points paraphrased from a piece in The New Inquiry by Trevor Paglen called "Invisible Images." When I read this piece, it got me interested in this dichotomy: we think of an image as static, but the systems we put it into can continually churn on it and draw out information that was not apparent the first time through. Paglen begins by pointing out that a photograph shot on a phone creates a machine-readable file. And I think this is a really interesting first point: we are not creating images that are human-readable first.
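That machine-first structure is easy to see if you walk the bytes of a photo file yourself. A minimal sketch (the byte string here is a tiny hand-built stand-in, not a real photo): a JPEG is a sequence of tagged, length-prefixed segments, metadata records a machine can parse long before any pixels are rendered for a human.

```python
def jpeg_segments(data):
    """Walk the marker segments of a JPEG byte stream.
    A photo file is a series of tagged, length-prefixed records:
    structured for machines first, viewable pixels only later."""
    assert data[:2] == b'\xff\xd8', "not a JPEG (missing SOI marker)"
    segs, i = [], 2
    while i + 4 <= len(data):
        marker = data[i + 1]                               # segment type tag
        length = int.from_bytes(data[i + 2:i + 4], 'big')  # segment length
        segs.append((hex(marker), length))
        i += 2 + length                                    # skip to next segment
    return segs

# A minimal hand-built JPEG prefix: SOI, then an APP1 (Exif) segment.
fake = (b'\xff\xd8'                       # SOI: start of image
        b'\xff\xe1\x00\x08Exif\x00\x00')  # APP1, length 8, "Exif" header
segments = jpeg_segments(fake)
```

Running this on a real photo would list the Exif segment (camera model, GPS coordinates, timestamp) ahead of the image data itself, which is exactly Paglen's point.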
We are creating machine-readable files first, and we actually need another application, a photo viewer, in order to see visually what our phone has captured. And secondly, images do not need to be turned into human-readable form for machines to do something with them. There may never be a human-readable file, but the information that's captured is still being processed.

The implication is that the automation of vision at this scale, the 1.8 million snaps per minute we saw in the first slide, is something we've never seen before. Think of the computing power it used to take just to put images into a CCTV system and manage the information flow. In an analog sense, people needed tapes, VHS cassettes, maybe DVDs, whatever it may be. Now, with solid-state memory and other advances, we're able to put so much information into one small device.

So I wanted to talk a little bit about the ecosystem here. What we're seeing now is a mode of technology development that is diverging in two directions, at least in my opinion: the corporatization of services on one side, and nation-states developing technology to further their interests on the other. Carnegie Mellon had, at the time, the world's experts in facial recognition, vision technology, and neural networks.
These people were all based at Carnegie Mellon. So what did Uber do? It poached 40 of the top researchers in the field, and now they're working for what I would consider a pretty shitty company. It's an interesting interplay: there are now commercial and economic motives for the brightest minds in the world to put their talents not toward the betterment of humanity but toward individual corporations. On the other side of this are nation-states. Baidu and other companies in China are now doing some amazing things, as are Facebook and Google, but China right now has not only some of the top supercomputers; it is also pioneering machine learning, neural networks, and other things on its own account.

Why I find this interesting: if we're talking about modes of technological development, what do we expect? What are the motives that drive the entities creating this technology? If we look at China, even though it is Chinese computer scientists and others programming these systems, I find it quite interesting that the linguistic boxes into which code is entered, the programming languages, also carry what I consider a cultural DNA: each character representing a syllable in Chinese is different from a string of characters in a Latin alphabet. So I think it's really interesting to consider the ways different entities are pioneering different technologies; they all bring something different, they all bring their own motives into the mix.

So here are three things that I believe are going to be important factors in what I consider the next epoch.
This is an epoch of a changed relationship between humans and technology. I'm going to talk about these three things, among others: how machines see us, how we see each other through the lens of machines, and how we view what we now consider machines. So let's delve into this a little bit.

Personally, I lack the agency, the ability, to control how a CCTV camera, or Facebook or Instagram running neural networks on uploaded digital content, sees me, what kind of sovereignty that leaves me as an individual, or how I'm packaged together with groups of similar people. And I believe this really does impact our identity. I'll get into this a little more later: as AI is incorporated into new platforms, this will become increasingly pronounced. We see it all the time: "AI is going to do this, let's outsource decisions to it."

So what about you? I'm up here, you're down there, and we can all see each other in this room. But think about un-networked humans, what I call the next billion: the next billion people who have never touched a networked device. Soon, more and more people every day will have a phone in their hands. Places like India and Bangladesh have a high rate of digital adoption just by the scale of their populations, but there are still a lot of un-networked humans. So for somebody who is not even on Facebook or Snapchat or these other systems, what is their identity? What does their sovereignty of self look like when they're still being passively captured, whether in the background of selfies, on CCTV, or soon by self-driving cars carrying both LiDAR and cameras? What ability do people who are not inside these corporate silos have
to actually control the way they're seen? Ultimately, when I'm thinking about technology, what are the communication strategies we need to develop? And with this new body of epistemic reality, this new way we learn things, how will it inform humans when we're dealing with not just one type of AI but many different types, all with optical nerves that are essentially digital eyes?

So, machine learning and neural networks. MegaFace is a database set up by the University of Washington, and it is one of the most important test datasets. It's a training set with over four million photos and nearly a million unique identities. There's some overlap, so there are multiple photos of the same people, which is what you need to train these systems. I find it quite interesting that one database has become kind of the standard for machine learning. Later I'll talk about how this is problematic, and why we need to further interrogate test data and training data, but I wanted to set this up because a lot of the contests that are run use this MegaFace database.

Every day, it seems, there are new developments in facial recognition technology, and a lot of them are made for corporate, capitalist ends. Google is now linking online habits and offline buying habits. Facial recognition technology was being rolled out across the UK to track consumers. And as I mentioned, Instagram has rolled out a neural network that looks at the background of photos to see where brands appear, so they're able to match an advertising ID to particular photos. Again, it's kind of weird: I upload something, and all of a sudden it exists outside of my control and can be crawled by multiple processes. I'm not going to talk so
much about emotional measurement, but there are a number of factors. Emotional measurement, at least in this example, uses at least ten data points per frame, the vectors on the face. It is able to work in real time as you walk through a train station; with at least eight frames per second, it can tell the mood people are in. So this exists, but how is it being used? Well, here are two things that have come out in the last couple of weeks. We have new technology now being deployed on CCTV cameras. But I think the most interesting example is this: Disney has a new patent for putting an infrared camera behind the cinema screen to look out at you and see whether you're smiling or not. So there's real-time analysis being done on you even though you are paying to engage with an entertainment product.

And I think this is really interesting. What are the assumptions once we've paid for something, once we have been a consenting economic actor? Have we given away our rights? Are we okay with buying a concert ticket, paying 50 euros, while at the same time there are drones and cameras watching us and tracking us, and we're using our wristbands to check in? I think the agency we're able to exercise, the ability to freely exist inside a space, is perhaps (hopefully not, but maybe) diminishing. So sovereignty over our identity as we move through place and space will become more important.

Right now a lot of things are fixed.
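The emotional-measurement pipeline just described, facial vectors per frame aggregated over roughly a second of video, can be sketched in a few lines. This is a toy illustration, not any vendor's actual model: the two landmark features and the threshold are made up, standing in for the dozens of real facial vectors such systems fit.

```python
def frame_valence(landmarks):
    """Toy per-frame score: mouth-corner lift minus brow furrow.
    Real systems fit ten or more facial vectors per frame;
    these two invented features stand in for them."""
    return landmarks['mouth_corner_lift'] - landmarks['brow_furrow']

def mood_over_window(frames, threshold=0.2):
    """Aggregate at least 8 frames (one second of video) into a mood label."""
    assert len(frames) >= 8, "need a full one-second window"
    avg = sum(frame_valence(f) for f in frames) / len(frames)
    if avg > threshold:
        return 'positive'
    if avg < -threshold:
        return 'negative'
    return 'neutral'

# One second of a "smiling" passer-by, as eight identical toy frames.
smiling = [{'mouth_corner_lift': 0.6, 'brow_furrow': 0.1}] * 8
mood = mood_over_window(smiling)
```

The point of the sketch is the shape of the pipeline: per-frame measurements are cheap and noisy, and it is the aggregation over a short window that yields the "mood" a station camera or cinema patent would act on.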
They're stationary. We have Android and Apple devices increasingly running more sophisticated technology on on-device GPUs, which then ping back to servers. But once self-driving cars and automated vehicles come online, we're going to see more than LiDAR, the laser-based counterpart to radar used for cartography and positioning in space; we'll also see more images passed upstream once Tesla or Lyft or Uber incorporate more cameras on their vehicles. When we're on a street, we may get passed by four cars that all have a photo record of us being there.

So now I'm going to run through about ten different cities, to show that we are now in the future. New York City is adding facial recognition and license plate readers to its river crossings: you go from one borough to the next, and they're going to know who you are. This is not just street-level surveillance; it's more than that. License plate readers are quite interesting in the sense that you can be parked somewhere and they can know something about it. You cannot draw many conclusions from one particular license plate in one particular location, but by placing readers at crossroads, at nexuses, at bridges, those that have the power are now able to link sightings together and see how often we go where; they're able to create maps of our daily commutes, these sorts of things.

I don't know if you've heard of Palantir, Peter Thiel's big-data company. It actually just got into a fight with the NYPD, and the NYPD is no longer using it as a service provider. But as of articles from October 7th, 2016, Palantir was still a client of New York City.
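The leap from individually uninteresting plate reads to commute maps is just a grouping operation. A minimal sketch, with invented plates, reader locations, and timestamps:

```python
from collections import defaultdict

# Each reader emits (plate, reader_location, timestamp) sightings.
# All values here are invented for illustration.
sightings = [
    ('AB123', 'Brooklyn Bridge', '07:58'),
    ('XY987', 'Queens Midtown Tunnel', '08:02'),
    ('AB123', 'FDR Drive', '08:21'),
    ('AB123', 'Brooklyn Bridge', '18:05'),
]

def trajectories(events):
    """Link isolated sightings into time-ordered per-plate routes."""
    routes = defaultdict(list)
    for plate, where, when in sorted(events, key=lambda e: e[2]):
        routes[plate].append((when, where))
    return dict(routes)

routes = trajectories(sightings)
# One plate's day now reads as a commute pattern: out over the
# Brooklyn Bridge in the morning, back over it in the evening.
```

No single record in `sightings` reveals much; the mapping of a daily routine only appears once readers at multiple chokepoints feed one database, which is the speaker's point about bridges and nexuses.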
So we don't know; maybe if you were in New York in the last six months, Palantir has a record of your face, but we don't know that.

Another one, and this is an example of a vulnerable population being used as a petri dish for this technology: New York City is going to use facial recognition to track homeless people. So if you don't have a home, your ability to not have your image taken is somehow decreased. Again, back to the social contract: we don't have a reasonable expectation of privacy when it comes to facial recognition technology.

Berlin, where I live: at Südkreuz station, facial recognition technology has now been rolled out in a pilot program. It is highly controversial, but controversy and public backlash did not stop them, and this in Germany, which is actually quite a privacy-centric country. That goes to show that this is happening everywhere.

Milan: the Milan central train station has a very interesting new facial recognition program, but this one is not for surveillance; it's a new ad platform. Advertising totems in Milan's train station have facial recognition in order to collect passenger data, without authorization. Again, going back to the different modes of development of these technologies: is it commercial development, national security development, development for economic advantage, or something like emergency and disaster response? There are different ways this technology is being used, but as it gets more and more sophisticated, it becomes easier for large datasets or the underlying code to be transported between contexts.

A few more. Japan is granting aid worth 3.4 million dollars to install face recognition systems at Pakistan's airports. Amsterdam, not far from here, is testing facial recognition at boarding gates. Australia said: what if we didn't have passports anymore? What if your face was your ticket?
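"Your face as your ticket" boils down to matching a gate capture against enrolled face embeddings. A toy sketch of that matching step, with tiny invented four-dimensional vectors standing in for the 128-plus-dimensional embeddings real systems use:

```python
def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb)

# Toy 4-dimensional "face embeddings" for enrolled passengers.
enrolled = {
    'ali':   [0.9, 0.1, 0.3, 0.0],
    'berta': [0.1, 0.8, 0.2, 0.4],
}

def board(gate_capture, threshold=0.95):
    """Match a camera capture at the gate against enrolled passengers,
    letting someone board only above a similarity threshold."""
    name, score = max(((n, cosine(gate_capture, e))
                       for n, e in enrolled.items()),
                      key=lambda t: t[1])
    return name if score >= threshold else None
```

A capture close to Ali's enrolled vector boards as Ali; an ambiguous face matches nobody. The threshold is the whole policy question hiding in the engineering: set it low and impostors board, set it high and legitimate travelers are flagged.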
Again: British Airways is using biometric scanners for faster airport check-in, and the US administration has also fast-tracked facial recognition in airports.

China. China is now incorporating 600 million CCTV cameras. That's a shit-ton of CCTV cameras. And there are some quite interesting stories; this one is one of my favorites. Somebody was stealing toilet paper, so: we could buy more toilet paper, or we could install a draconian facial recognition system to stop these people. For me, this is one of those things about how policymakers and corporate entities work, or fail to work, constructively to solve problems, and I feel too often the answer is: let's let technology take care of it, let's slap some AI and video cameras on it, that will do the job.

And it doesn't stop there. China is now testing predictive AI to stop crimes before they happen. This is literally out of Minority Report, right?

Ireland, where a lot of the large corporate headquarters are based, is now linking driver's licenses to vehicles, so faces get automatically inputted; a whole registry can be inputted and matched.

India: the Aadhaar program, the new biometric ID. Has anybody been following the case of India's new IDs? I find this really interesting, as is the currency swap, when the 500 and 1,000 rupee notes were changed overnight. The Modi administration did the largest currency swap in the history of the world without anybody even knowing it was coming. It was actually an impressive feat of logistics and macroeconomics, and the country didn't collapse. I was blown away when I saw the scale and size of this currency exchange: we're taking all these notes out of circulation tonight, these are the new notes going in, and within a few weeks the old notes were no longer accepted.
Why I think this is interesting: no technological advancement exists in a vacuum. It doesn't exist in a bubble; there are other factors going on around it. So when everyone in India, 1.2 or 1.3 billion people, now has a biometric card that is essential for accessing public services, these things become more pressing: the ability of those in power to oppress those who don't agree with them becomes more worrying.

So, as we see here, there are a couple of case studies from the US. Oakland has all this information from its street-level surveillance programs, and it doesn't really know what to do with it. Right before the most recent inauguration in January, police surveillance cameras in Washington, DC were owned. We don't really know why, and there's actually been some pushback from the DC Metropolitan Police and the FBI about saying much about it. As far as I know, nobody is quite sure who did it, or at least it hasn't been publicized. We talked about the bridges in New York City, but think about the information from even one day of being able to move laterally through a CCTV network in a city the size of Washington, DC. You're able to track diplomats, visiting heads of state, other important figures. And with this sort of information, even a one-day snapshot.
Maybe it can't tell you everything, but you're still able to understand movements of people; you're still able to do social mapping and network mapping. And so I think that when it comes to the vast amount of information, of still and moving images of people, there is a lot to be done around privacy controls.

There's a theory I'm working on that I call the "drone ombudsman." An ombudsman is someone responsible to the public at a publication; we can think of them as a public editor. Right now, cities may have data protection officers, but drone data and CCTV camera data often slip through the cracks, and policy becomes about information retention and how to dispose of it. It's not actually interrogating why all this information is flowing into a centralized database in the first place.

So now I'm going to talk about some other things going on. Taser, which you probably know from pop culture as a non-lethal weapon: traditionally their products come in two forms, a handheld device that delivers an electrical charge, and one with nodes that shoot out and electrocute the person being targeted. But Taser is now getting into AI, machine learning, and visual analysis. This really concerns me. If a police officer wearing a body camera walks by me, why am I in the system at all? In the US, at least, the assumption is innocent until proven guilty. But I really feel we are entering a new era, a change in this relationship, in which you are merely "not guilty yet." We still want to know where you are and where you were, and eventually the technology will get so good that we can map out where you're going to be.

And they are not the only one. A lot of companies, whether on social media (Geofeedia, for one) or elsewhere, and very soon in other sectors, are bringing new frameworks of processing power,
of neural networks, and of other machine learning to tools that have traditionally been rolled out as static; now we're creating a dynamic data-collection framework.

So let's talk a little bit about this relationship: more and more, our faces are being recognized. I found this startling statistic about 68 major city police departments in the US: two-thirds of them have body cameras worn, but we don't really know much about them. There's no transparency around the way these tools are being used, or about what happens to the data after the fact. Is there a shelf life on this data? Does it get expunged or overwritten at a certain point in time, or are people just sitting on hours and days and months and years of information that can retroactively be analyzed? Again, the idea that no image is static anymore: all the images captured by police body cameras can eventually, even if the technology doesn't exist today, be repurposed at a later date to draw conclusions that weren't technologically available at the time.

So how big of a problem is this right now? 117 million Americans have their face in some sort of database, and most don't know it. There are something like 20 states, I believe, that have willingly put their driver's license databases into the federal system, with very little oversight. And as we've seen, sometimes democracy is pretty fucking stupid, or at least populations voting on things are not in a position to make an informed decision, because of propaganda or whatever it may be. But we are putting the fate of our faces in the hands of institutions that may not always have our back. So again, are we really still
innocent until proven guilty, or are the systems being created now really making us guilty until proven innocent? Right now there's active research being done to identify who's guilty before the fact. Again, some Minority Report shit.

This is further complicated by a few things. Apple and other companies are working on 3D modeling: taking 2D images and creating a third-dimension vector, to project things that might not even be captured in the original frames. We have drones all over the world, and people are now working on real-time target identification algorithms for the United States Air Force. I want to take this moment to say that a lot of the time, the datasets being used for these trainings are not neutral datasets. I don't even know if there is such a thing as an unbiased dataset, but that's a different conversation. At 9:30 this evening I'll be talking about proprietary algorithms, so I'll get into that more then. But I'm really interested in the scalability and proliferation of this technology once it's available: whether through espionage or through private-public partnerships, it will eventually find its way into ungovernable, untransparent, unaccountable systems without oversight.

Right now we may feel comfortable that there's a chain of command in the US
military, and that if they're going to use a drone based on some sort of image recognition technology, there's a high degree of confidence that these are bad people, or in the words of the president, "bad hombres." But we really have very little assurance. A lot has been written about the mass civilian casualties of drone strikes: trying to get one high-value target and instead blowing up fifty people at a wedding. These things are no longer science fiction. They're actively being deployed on battlefields, and they won't stop there.

So, as I mentioned earlier, this is one of the things I think we need a broader conversation about: the ability of a single device, with very little connection to a central server or server farm, to perform more and more sophisticated processing on the device itself. There are GPUs now being rolled out that are able to run some degree of neural network locally, which takes the burden away from processing at home base, at headquarters, on the server. As more and more work can be done on our devices themselves, the processing load on the server side actually decreases. And I think the miniaturization of these technologies is also a threat vector in and of itself: you no longer need massive teraflops of central compute to run all of this if you can front-load the computing on the device itself, so that by the time the data gets to the server, to the database, the inference has already been drawn. Caffe2 from Facebook, for example, was open-sourced recently, so this is out there, and Google is doing it as well.

So let me do a little remix now and say that right now, seeing is believing, in some ways.
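The on-device trend described above depends on shrinking models until they fit on a phone or a camera, and one standard trick is weight quantization. A toy sketch, not tied to Caffe2 or any real framework: squeeze float weights into signed 8-bit integers with a single scale factor.

```python
def quantize_int8(weights):
    """Map float weights onto int8 values in [-127, 127] with one
    shared scale: a common trick for shrinking models to run on-device."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 representation."""
    return [v * scale for v in q]

# Invented example weights, standing in for a real layer's parameters.
weights = [0.82, -0.45, 0.03, -1.27]
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)
# Each weight now fits in one byte instead of four or eight,
# at the cost of a small rounding error per weight:
max_err = max(abs(a - b) for a, b in zip(weights, approx))
```

A quarter (or an eighth) of the memory and bandwidth per weight is exactly what lets "some degree of neural network" run on a device's GPU instead of a server farm.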
We see an image and we're not necessarily skeptical of it. But very soon, I believe, we are going to have to question more and more of the images we see. There's the relationship between us and our technology: what does it look like to trust an image? Soon our eyes may lie to us. The Economist wrote a really interesting piece about this last month, published July 1st, which went into, for instance, taking footage of a person and making them appear to say or sing something they never did. How do we know that a person is who they say they are? What pixel resolution do we need in order to ensure that what we're seeing has not been manufactured, has not been doctored, doesn't have a vector layer on top of it that makes it look like somebody is saying something they in fact never said?

This goes hand in hand with another idea: as we get more precise about tagging and indexing, and even creating metadata around images, machines need less and less training data in order to quantify characteristics, if not state in a human-readable form, what something is.

There's some interesting work being done on satellite images: comparative analysis between time frames, say before a flood and after a flood. More and more automated systems are coming online to match images up and see what has changed. We're doing this from low earth orbit, but the same technologies will probably eventually roll out to things like CCTV cameras. And as I mentioned, a system no longer has to see your whole face; maybe just a profile of you. The iPhone is working on this now.
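The satellite comparison mentioned a moment ago, automated before/after analysis, can be sketched as a simple grid diff. This is a toy with invented brightness values on a 3x3 grid; real change-detection systems work on georeferenced rasters, but the core operation is the same.

```python
def changed_cells(before, after, threshold=30):
    """Flag grid cells whose brightness shifted by more than `threshold`:
    a toy version of automated before/after change detection."""
    flags = []
    for r, (row_b, row_a) in enumerate(zip(before, after)):
        for c, (b, a) in enumerate(zip(row_b, row_a)):
            if abs(a - b) > threshold:
                flags.append((r, c))
    return flags

# 3x3 "satellite tiles": invented brightness values before and after a flood.
before = [[200, 200, 90],
          [200, 180, 90],
          [ 90,  90, 90]]
after  = [[200,  60, 90],
          [ 60,  60, 90],
          [ 90,  90, 90]]
flooded = changed_cells(before, after)
```

No human looks at either image; the machine emits only the coordinates that changed, which is why the same approach transfers so readily from orbit down to a CCTV feed.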
And I think the iPhone 8 will have even more sophisticated technology, able to render and re-render people's faces based on an incomplete sample.

So this gets to what I really want to get across: the emergent, so-called social contract that has been forged between us and our technology is operating under asymmetric pretenses. My relationship to technology is not the same as technology's relationship to me. Our physical bodies have a contract with technology even if we don't know it, and it comes in the form of captured images, services rendered, locations. But we're never really able to peer inside and see the technology that sees us.

So, quickly, who's developing a lot of this on the private side? Megvii in China, with Face++. Blippar, which is an augmented reality company but is also using facial recognition technology. And FindFace, which is creepy as fuck. FindFace uses VK, the large Facebook clone in Russia, and it has been used to identify people. The two male co-founders basically said, as part of their marketing scheme: you can take a photo of anybody, run it through our system, and then you can find the profile of that cute girl you saw at the bar and discover what she's into. It's really creepy. And if we look at how this real-time face recognition is being marketed, people are smiling, and it's women, like here.
Oh, this is interesting, but there's a phenomenon here that I feel is unhealthy. There are males, tech bros if you will, designing technology and rolling out marketing strategies, but not giving people the chance to say: hey, maybe I don't want to be in the system. Just because I have a VK profile, or just because I have a Facebook profile, does not mean that I've consented to be discoverable and public. But this technology rolls on. There's some really interesting stuff going on with facial recognition, or image recognition of the human body generally: automated dermatologists, superhuman vision, these sorts of things. This goes to the point I was making, that there is real-world replication bias: the bias that occurs in real life has a knock-on effect and replicates inside our technologies. If we have no Black people, no women, no people from particular backgrounds inside a database, the system is going to be less and less precise with that smaller amount of training data. But who's investigating this? Who are our data interrogators? Who are our data ethnographers? I insist that we have not built these roles into the technological development cycle, and that we have to do more, as activists and as actors, to think about how we inject stronger safeguards into these systems, and into the data as well.
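The point about sparse training data can be made concrete with a toy audit. The numbers and group labels below are entirely invented; the sketch just shows the kind of per-group accuracy breakdown a "data interrogator" would compute, where underrepresentation in training data tends to surface as a gap between groups.

```python
from collections import defaultdict

def accuracy_by_group(records):
    """Classifier accuracy broken down by demographic group.

    `records` is a list of (group, predicted, actual) tuples.
    Uneven representation in training data often shows up as
    exactly this kind of per-group accuracy gap.
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, predicted, actual in records:
        total[group] += 1
        if predicted == actual:
            correct[group] += 1
    return {g: correct[g] / total[g] for g in total}

# Invented audit data: the model does worse on the
# underrepresented group.
records = [
    ("group_a", "match", "match"), ("group_a", "match", "match"),
    ("group_a", "no_match", "no_match"), ("group_a", "match", "match"),
    ("group_b", "match", "no_match"), ("group_b", "match", "match"),
]
print(accuracy_by_group(records))  # → {'group_a': 1.0, 'group_b': 0.5}
```

A report like this is cheap to produce, which is part of the talk's argument: the missing piece is not tooling but the institutional role of someone tasked with looking.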
How do we make sure that the data we're using for testing and for training is as objective as it can be? There are some things being done: as more and more adversarial glasses and patterns come online, there are some really interesting ways to resist. And Silicon Valley has been a little more open to the idea of bringing philosophers into the loop. But consider the consequences of failure: the replication of historical inequalities; the undermining of democratic and judicial institutions; profit maximization as the basis of future technological development; the amplification of a global technological hegemony that is able to rival nation states; the removal of humans as the primary decision-making force; and the proliferation of lethal autonomous weapon systems, LAWS. I just have a couple of slides left, and then I'll open it up for a quick Q&A, so if anybody has any questions, start thinking about them now. Some closing thoughts. I would not say these are my theses for combating these sorts of things; they're more food for thought. We have to do more in the due diligence of diversity: we need to bring more people and more backgrounds into the decision-making process, to turn a critical eye onto the systems that are being created. Do we want technology to follow something like the Hippocratic oath that's used in medicine, to do no harm? Do we want to distill into the technological DNA of machines: do not kill, do no harm? What about the Magna Carta as a defining charter, as something to build a theoretical technology on? Do we want the idea that in order to be found guilty, a judge has to find you guilty? And what about a new valuation system? How do we put a value on this technology when advertising and surveillance have their own revenue streams? What does it mean? How do we put humans back at the center of the equation?
So, if humans are going to be out of the loop, which I hope they will not be: the introduction of digital imaging into AI has brought about a new type of relationship. The proprietary nature is what I'm most scared about, though even when it comes to open source it's sometimes very difficult to audit code. But take proprietary AI that uses our images to formulate opinions about us, that tries to nudge and drive our habits and define our identity. If a state did this, if a state structure did this, we would probably be up in arms. But because this has happened under the surface, because we don't see it, there are no front-page headlines, there are no protests in the street. And so unless we can create a new way to give humans the capacity to have sovereign power over their image, the social contract of the future will be negotiated not with a unified entity, but with a multiplicity of for-profit technology companies that are oftentimes non-state actors. In a few years, if we have Amazon drones flying overhead, and Teslas and Lyfts and Ubers all driving by, and we're in the backgrounds of selfies on iPhones and Androids being uploaded to Facebook and Instagram, this is what I call a distributed, privatized governance framework. There is no one I can approach to arbitrate when I say I don't want this image in your database. I can't just negotiate with technology in general.
I have to negotiate, if I can, and even now there's no solid way to do this, with each of these different corporate actors. I find that very problematic, and something that I think we have to elevate in our calls for a more open and transparent technological future. We have to reevaluate what we feel comfortable with when it comes to the social contract. All right, that's all I have. If anybody has any questions, I think we have a couple of minutes still. Questions, comments, thoughts? We can chat now. First of all, thank you. Scary, but quite interesting. Well, let me say this: I do believe that we can make a better future. I am not jaded. The powers of the world, companies with half-a-trillion-dollar market capitalizations and nation states that have the capacity to end all human life, these are big challenges, but I'm not resigned to the idea that this is what the 21st century is going to be. And I think that together, working on philosophical and ethical frameworks, the question of what level of enforcement we are comfortable with to mandate transparency inside proprietary algorithms is going to be a large conversation, but it's something we have to start now. Please line up at the microphones. Hey, thank you, great talk. I missed the beginning of it, but then again, I'm also in this field and I know a lot of the things you talked about. And it's funny, usually I'm the one on stage saying this, and now it's someone else; that's really interesting. But I wanted to give some feedback, which was: you say that the human is out of the loop, but what I found, and I describe it as math washing, is that the human is not really gone, it's just hidden. These algorithms are designed by people; they are to a degree self-learning, but also to a degree designed by which data goes in. So they're not really gone, they're just hidden.
So in that sense, there are still accountable humans; humans can still be held accountable for these systems. So I'd say two things on that. One, I think it's interesting if we look at the technology causally, on a chronological time axis: there are humans interspersed as intermediaries. We have the coders and the programmers, those who deploy these systems, the people checking in to see what information they want, whether it's police body cameras or whatever. But I think that once this technology gets out into the wild, the human capacity decreases because of the automated dashboards. Geofeedia and these other social media monitoring platforms are very sophisticated, and pretty much very limited training is required to use them now. So I am not saying that humans are completely out of the loop, but at the same time we're making more and more user-friendly tools that decrease the amount of technological know-how needed to use them. And they're able to be exported. We talk about Wassenaar, and malware, cyber-weapon malware research, and export licenses, but that's mostly on the digital-munitions side of things. What's to stop a company in the US or China from exporting a very powerful facial recognition platform that's already there in a dashboard, where you just click "here's our target"? So I absolutely agree, but these systems are being designed for people who are not technologically advanced. Basically, I worry that if we have not built in the benchmarks by which data has to be erased, and information has to be reviewed, the mass accumulation of information inside servers has a couple of drawbacks. One:
It's a vulnerable to being compromised What happens if with the opm hack in the u.s. What happens if a massive new york city database gets gets hacked and now A state an apt like now is able to do their own mapping and so I think that while there's still room for humans in the loop humans have been De-prioritized and now Israel china and russia are all developing autonomous weapons platforms that need no human Interaction and if this is based on facial recognition targeting systems I find this that next level of problematic in which that you don't even have to push nobody has to push a button anymore And I hope that you didn't get mixed up as somebody else at least in the u.s. Military chain of command There's quite robust. I mean it's not perfect and they they're really bad at what they do But um with the civilian casualties that are caused around it But there's now actively by nation states working to create systems that further bring humans out of the loop Um, I I definitely agree, but I think that if we are able to institutionalize I'm kind of in a social contract way. What we expect technology To how do we expect it to be accountable? Not just to those who make it or deploy it But to the people that it inadvertently captures along the way We're just talking about different levels of the of the question where I'm talking about the design phase where there's still a lot of Possibility for accountability and I understand also that in the user phase that it becomes more opaque Well, and I think that the the development phase is where really probably we can make the largest amount of impact Once it's in the wild it's very hard to recall these things But if we think more about something idea of a data ethnographer, why why does facebook not you know when they're are you think they're wrong I know maybe things are dictated by the policy teams or the global standards teams But what about yeah the data investigator to what like how? 
And as we're combining these databases, or building them on top of each other, maybe the columns fit, okay, fine. But what does it mean if, for example, you were to take New York City stop-and-frisk data and pull it in? Kate Crawford and others have talked a bit about this. If you were just to rely on something like New York City crime statistics, you might see: oh wow, actually, young men of color are exponentially more likely to have a weapon or drugs on them. But if you don't weight for stop-and-frisk, this law that gave police officers the ability to go up to anybody suspicious and pat them down, then we're not properly weighting for the way that cultural biases have replicated into digital form. So I think it's important that we start thinking: okay, what does it mean to be a data philosopher, a data ethnographer, a data interrogator? And to me this is where we need more diversity, more humanities and other fields brought into STEM fields, in order to make it much more rounded. If I can ask a second one: my second question, or nuance, would be that a lot of these examples are very American, right? But in Europe there are currently stronger protections as to, for example, what counts as a private piece of data and what counts as an identifiable piece of information. So perhaps the good or hopeful part is that, at least in Europe, something like FindFace could not happen; that would not be okay. So that might be helpful.
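The weighting point above can be made concrete with a toy calculation. All numbers here are invented for illustration: if one group is stopped ten times as often, raw counts of "stops that found contraband" exaggerate that group's rate unless you divide by how often each group was stopped.

```python
def hit_rate(found, stops):
    """Contraband found per stop: the rate that raw counts obscure."""
    return found / stops

# Invented numbers for illustration only.
stops = {"group_a": 1000, "group_b": 100}   # group_a is stopped 10x as often
found = {"group_a": 100,  "group_b": 10}    # raw counts: 100 vs 10

# Raw counts make group_a look 10x more likely to carry contraband...
print(found["group_a"] / found["group_b"])            # → 10.0

# ...but per-stop hit rates are identical once you weight by stops.
print(hit_rate(found["group_a"], stops["group_a"]))   # → 0.1
print(hit_rate(found["group_b"], stops["group_b"]))   # → 0.1
```

Feed the raw counts into a model without this normalization and the stopping policy's bias becomes the model's "finding", which is exactly the replication the talk warns about.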
I mean, I think it will be interesting to see the final form of the GDPR. Even with the right to an explanation, I think the way this comes out in case law and in court rulings is going to be quite interesting and consequential. A union like Europe, which is a major economic market for these companies, really has to be a leader here. The US has no political capital; there's this revolving door between Washington and Silicon Valley, and the corporate capture is really quite startling. Other places, China and India, are now becoming surveillance states. So Europe is kind of, you know: help us, EU, you're our only hope. But things like the right to explanation, once we get into the devil in the details: to what degree is there a judge, or a panel, or the European Court of Human Rights involved? What is the test case? So I guess I'm now advocating for strategic litigation. We need to start thinking about the ways that we're going to use the impending GDPR especially, but other things too, taking cases from other case law. How are we, as activists and as actors, going to frame these issues? I think the EFF and Access Now and some other organizations are working on these things at the larger level, in their respective fields. But for me, I think we now need to be thinking about the next two, three, five years of strategic litigation, targeted advocacy, strategic lobbying. How are we going to push this over, and make sure of the right to explanation,
which I think is particularly practical, or applicable, here: if decisions are made about you using only your face, in an automated way, maybe that gives somebody standing to bring a case. There are other things in it too, but I find that now is the time we need to start talking about how to use the progressive policies that Europe is rolling out in a way that ensures, and actually kind of corners, technology companies: if they're going to roll this out in Europe, maybe it's easier, maybe it gives them cover, to roll it out with protections globally. What I'm very hopeful about is that I'm starting to see that this field, this market, well, they have to use what they've got. For example, two weeks ago there was some news about people who got physical mail, you know, through their letterbox, about a skin condition they had only ever looked up on the internet. The connection was that there are now companies that will connect the anonymous online browser with the real person: who they are, where they live. And they are now sending those mails. So the point there is that people will start to see this invisible thing, because they will start to get targeted quite rigorously, and then hopefully they will be awoken. My personal hope is that as people come across this, there will be a growing market for privacy and anti-tracking technologies that, you know, protect them in some way, or that this debate will come in the next ten years. The first example similar to that I remember was Target, the supermarket, kind of a household-goods store in the US. It was one of the first companies to run a pretty comprehensive data analysis on its customers, trying to find patterns. And a large case around this was when somebody got coupons for baby diapers who did not even know they were pregnant. But Target had run the diagnostics
on a big-data scale, to be able to identify who was shopping, who was in these trends. And so it's only when it enters into our world that we go: wait, how do they know something I don't know? And I think this is, again, part of this social contract. We have our original instances that were captured, and we have who we are now, and up to right now I think we've been seeing these two points, me-now and me-then, as the whole relationship. But there is the ability to extrapolate and forecast information from these relationships, to be able to predict what we're going to do tomorrow. This is something where I hope there will be some sensational stories, because otherwise I think for enough people it will pass under the radar. But I think this really is what it comes down to: all this information is not just about what we're doing; it gives them the power to predict what we're going to do tomorrow. And not just predict: by quantifying our identity, as user or non-user, subscriber or non-subscriber, it puts us in a box, which ultimately enables them to put hindrances and burdens in our path to influence our agency. And this, for me, is really one of the things: we are slowly ceding our individual agency, our personal sovereignty, to corporations and other interests when we don't even know they were involved in our decision-making process. Thank you. Thank you very much. As you said, you have another talk later on that goes, I think, a bit more into the algorithm side of these things. So if you are interested, it's at 925, at the other stage, not here; look it up in the schedule. So please, one more thanks for our speaker, Matthew Stender!