Stage B. It is my great and abiding pleasure to introduce Dr. Katherine Flick from the Centre for Computing and Social Responsibility at De Montfort University in the lovely United Kingdom. Ladies and gentlemen and everybody else, please welcome Katherine.

Thank you, my dear friend here. We've been friends for a long time, so it's all good. Hi! I have to apologise a little for the state of my slides, which are a bit rushed, because I've been fighting fires most of today trying to get the stages actually moving, so please accept my apologies.

I'm basically here with two hats on: I'm a technology ethicist, and I'm on the Committee on Professional Ethics of the ACM. Some of you may be ACM members (don't worry, I won't get you to put your hands up or anything), but some of you may know what the ACM is: the Association for Computing Machinery. It's the largest professional organisation for computing, for anyone who does anything with computing, basically: computer scientists, games designers, game art people, even financial services people. It runs the whole gamut of tech.

Before we start: if you have questions, I probably won't be able to get to them all, but I will be able to look at them and possibly write something about them later. So I have a Slido: open up your phones, go to slido.com, and the join code is emfcode. You can put a dilemma or a question in there, and if it's an interesting dilemma it may get turned into a generic-style case study for the code of ethics (we'll obviously anonymise it and all that sort of thing). Or come and talk to me afterwards, or if I get time I'll go through the best ones.
Okay, so the code of ethics. Like I said, the ACM is a really big computing organisation, and it had a code of ethics. Before 1992 it was more of a code of conduct, which basically said "these are things you shouldn't be doing", and then in 1992 they actually sat down and wrote a proper code of ethics. They said, "oh, we'll update it frequently", and then they didn't. Then the internet came along, AI came along (well, machine learning as we currently know it), and a whole bunch of other things came along, and it got to 2018 and they said, yeah, we probably really do need to update it now.

So over the last two years I've been part of that process: I was on the steering committee for updating the code of ethics. It was a huge participative experience where we got loads of people to comment on drafts, so it's pretty much built from the ground up, but based on the original code, so there are some similar things, some different things, and some new things. I'm not going to expect you to have read the code; what I'm going to do is take you through a few things that might be useful to look at if you're an ACM member. If you are an ACM member, one of the conditions of your membership is that you abide by this code, so you probably really want to listen to this. You also probably got a bunch of emails from us saying "please help", so thank you if you did help us.

So why do we need something like this? It's really interesting: I've been a technology ethicist for a really long time now, 10, 15 years, longer than I would like to think, and up until about a year ago I was the one knocking on people's doors saying, "you need to think about ethics."
Then about a year ago something happened: Cambridge Analytica, and a bunch of AI stuff that people got worried about. I started getting all these invitations to come and talk, particularly at big industry AI conventions, and now it seems like the floodgates have opened. So it's really good for me, I guess, although I would actually really like to be out of a job at some point, because then we'll have solved all these problems. Basically there seems to be a lot of call for ethics right now, because people are realising that technology can have some significant social harm attached to it.

So one of the key things we really wanted to put into the new code of ethics was a focus on the fact that we should be using our skills as technology creators, software developers, innovators of any kind that has anything to do with tech, for the public good and for the social good. There's no point in making tech for tech's sake if it doesn't actually have some benefit to somebody somewhere. And then it's also about weighing the benefits against the harms, which I'll get into in a bit.

The key thing was the first sentence of the preamble we wrote: "Computing professionals' actions change the world. To act responsibly, they should reflect upon the wider impacts of their work, consistently supporting the public good." The code, it goes on, "expresses the conscience of the profession", and we claim that mostly because it has been a ground-up activity. We really wanted to make sure this wasn't just a bunch of people coming down from ivory academic towers and preaching the words to the people; we wanted everybody to be involved, we wanted to practise what we preach, and we wanted this to actually be relevant to people doing things in the real world.
So the very first principle we have is that a computing professional should contribute to society and to human well-being, acknowledging that all people are stakeholders in computing. It's no longer acceptable, if you're creating things, to do it in a bubble, especially if it's going to have some sort of social impact. You need to be out there making sure you're doing it with the public good in mind, and collaborating with stakeholders: anyone who might be impacted by the technology you're creating.

A couple of major changes happened between 1992 and now. The internet showed up, and we had a whole bunch of awakenings; I think the previous talk is a really good example of how things have shifted over the past 20, 30, however many years. We're now much more aware of women's role in computing and computer science, and while there's still a long way to go before there's equal treatment of women in the computing area, there's a lot of work being done, which is fantastic. So there's a much broader description of discrimination and what that entails. Before, the code didn't have anything about, say, harassment, particularly sexual harassment; that's now specifically in the code. There are also a whole lot more aspects of discrimination that are covered: we used to just have race, gender, and something else, but a whole bunch of other aspects are now covered.
There were two big things on intellectual property rights. The old code basically said "you must abide by intellectual property rights", which I'm sure many of you here would be horrified to think was a requirement, and me too. So I was given the homework, as part of this, of redrafting the intellectual property section, which turned into "respect the work required to produce new ideas, inventions, creative works, and computing artifacts". That puts the emphasis on respecting the decisions that creators make, within reason, and if it's ethically justifiable to break these things, you've got an out in there as well. So there's a way around that now.

"Design and implement systems that are robustly and usably secure": back in 1992 there was very little security apart from physical security, so now we are requiring security by design. That's something most people hopefully do anyway, but it's something you are now actually required to do by the code of ethics.

3.6: "Use care when modifying or retiring systems". This was certainly not on anyone's radar back in 1992, but think of Microsoft retiring certain operating systems, and various other large-scale things; Google Reader comes to mind as one that really annoyed me. Basically, if people are using your software, you have to have a really good reason to retire it, and you then also need to help shepherd them onto something that will replace the thing you're retiring.

3.7: "Recognize and take special care of systems that become integrated into the infrastructure of society". Facebook, right? If you're Twitter, Facebook, anything: if you're creating something that goes on to become not just that thing you did in your garage with a couple of your mates but a massive piece of infrastructure, you need to be really careful about how you handle that, and there's some
guidance in the code of ethics that helps you do that now.

So how do I actually use this practically? The idea is that it's supposed to be a way for you to reflect on what you're doing; it's not a step-by-step "how to do things ethically". There is no one-size-fits-all rule for any of this stuff. Ethics is not a set of rules; it's a way of getting you to think about what it is you're doing. There are no easy ways out in ethics: you need to actually sit down and think about it. So I'm going to take you through a few examples, slightly adapted from a video games conference I went to recently, which I've tried to make a bit more general: just a couple of ways you might think through these issues.

If you're creating technology for vulnerable people (children, old people, anyone with disabilities that might make them be considered vulnerable), there are a whole bunch of things the code will help you think through. For example, there's a whole section on avoidance of harm: what exactly harm is, and how you can make sure that the technology you create doesn't inflict it on people. There's a little complication to the harm thing, which I'm going to get to in a bit. Obviously you need to be honest and trustworthy, particularly with vulnerable people, because sometimes there are reasons why they're not able to make very good decisions for themselves. I've got a picture up here of revenue generated by premium children's games in the mobile game industry, because that's one area where children are not necessarily able to consent to, say, paying money for games, and you need to think about that when you're developing it.

Next: foster public awareness and understanding of computing, related technologies, and
their consequences: that's principle 2.7. Basically, it asks you to make sure that people can understand what it is you're creating. And yes, for the machine learning people out there, I know this is complicated, but you need to have a go. If you actually struggle with that, I do a lot of this sort of thing, so come and talk to me at some point this weekend and I'm more than happy to help; I can explain Bitcoin in 15 seconds for the local radio station, so I'm quite good at simplifying things.

2.9: "Design and implement systems that are robustly and usably secure". Basically security by design: you've got to protect people. You can't just throw stuff out there now that hasn't actually got security designed into it.

Another one: data analytics. Discrimination, obviously, is huge. People are worried about the bias of algorithms, and the fact that a lot of the data that goes into training these systems is biased to start with. How do you remove bias? This is an ongoing question that machine learning people and data analytics people particularly need to think about. It's not that there's a solution in the code, but it basically says you need to think about this, and you have to have really good reasons behind the decisions you make about what you include, what you exclude, how you anonymise, and all that sort of thing.

"Respect privacy" is hopefully self-evident. On confidentiality: if you've got confidential information, you need to be careful with how you treat it, and that ties back into data analytics, because some of the information you might be dealing with might be confidential.

"High quality in the processes and products of professional work": this has a section that basically says, if you don't understand what your analytics is doing, if you don't understand how your machine learning algorithm is making decisions, then
maybe that's not something you should be doing. I know this is a really complicated area and I don't want to oversimplify it, but you need to really think about what you're actually doing with that algorithm, how it might work, and what information might be the key things used for decision making. That's part of quality in testing. I've got a picture of the Tay bot, which some of you might remember: it was basically thrown out on the internet and became a very horrible person very quickly. This principle would say that you need to test things better, and if you're not able to test them sufficiently beforehand, you need to monitor them while they're out in the wild doing their thing, and pull them back quickly if they're doing terrible things. (Oh, my laptop's just turned off.)

"Know and respect existing rules pertaining to professional work": stay within the law where you can. There are exceptions, because we're worried about fascist regimes and things like that, which is something that in 1992 they weren't so worried about, but we are now. So if the rules are unethical, there are ways of challenging the rules, and there's some guidance in the code for you to do that. And like I said before, this should all be done with the public good as the central concern.

I'm going to skip over diversity, mostly just for time, because I want to get through the rest of this stuff. But you see what I'm doing: I'm using aspects of the code to get questions up in your heads about what it is that you're doing, to challenge what it is that you do. Some of that might be uncomfortable, and that's totally fine, but it doesn't mean you can ignore it. It means that it needs to be something that you need to
start thinking about integrating into how you do your work: that sort of reflection, that reflection time. I'm going to talk about that in a sec.

So, some frequently asked questions we get at code central, which is ethics.acm.org. "What if my boss thinks codes of ethics are for losers?" In other words, what if your boss asks you to do something that is against the code of ethics? If you're an ACM member, you can actually hold the code up and wave it at them, and if they say "well, screw you", there are actually ways of going through whistleblowing and so on. The ACM has to be very careful about how we promote that aspect of the code, because although it's been used in legal decisions and such in the States, it's obviously very difficult for an organisation to support people all around the world. We're able to give people guidance, but ultimately, unfortunately, the rules of the land are the rules of the land, and you have to understand that you might be breaking those in some ways.

Another question we get: "What if I'm in the military or security? Does that mean the code of ethics isn't for me, because I'm obviously going to be harming people, and the public good may not be the centre of my concern?" We have been very careful to word the code so that if you are in the security or military domains, there are ways of thinking through the code that can keep you in this process. For example, the "avoid harm" principle: we changed that wording so many times. You hear about "do no harm", right, like the Hippocratic oath. We don't say that specifically, because it's about ethically justifiable harm, and sometimes when you're in security there may be some ethically justifiable attacks you might need to make against a system or something
like that, which might otherwise be considered against the code of ethics. Or if you're in weapons development or something like that, there might be times where your efforts might be better spent working on how to make it so that other people aren't harmed by a specific weapon. These are all things we had to think about, and it's really, really complicated, and we're never going to please everybody. I'm a pacifist, so this stuff is really uncomfortable for me to talk about, but I can see the point: we want to keep people in these sectors thinking about this; we don't want them to just say "oh well, this isn't for me". And as far as I'm concerned, if the code gets someone thinking about this sort of thing, and then thinking, "well, actually, maybe military work or bad security isn't for me", then that's a good thing. But these are very fine lines that are very difficult to walk along; come and see me at the bar after I've had a few drinks and I'll tell you more.

"How is this code different from other codes?" Some of you may know the IEEE software engineering code. This one is different because, firstly, it's newer, and also it's much more general, and it's an aspirational code. The software engineering code is much more a list of don'ts, whereas the ACM code is very much aimed at inspiring people, students included, to do things well, as opposed to slapping them on the wrist if they don't.

Speaking of that: "What if I break the code?" If you're not an ACM member, nothing, unless you break the law, in which case that's a different thing. If you are an ACM member, we can discipline you, which sounds scarier than it is. Also,
basically, it's very complicated, but we have a much better system now: if you think someone who might be an ACM member has broken the code of ethics, the Committee on Professional Ethics will actually take it on as a case and deal with it. There is a range of disciplinary actions we can take, including stopping you from coming to our conferences, and that's probably about the worst one you'd get from the ACM, unfortunately, as far as I'm concerned. But anyway: if you break the code and you're an ACM member, that's not good.

All right. Previously I talked about integrating this sort of thing into your ways of working, so how can you actually do that practically? There's another thing we do in Europe called responsible innovation; this is my other hat, my academic hat, now on. Responsible innovation is basically the idea of creating technology with and for society. It's ethically aligned: the idea is to get people thinking about ethics and to integrate it into the ways they work, with the innovations they make. The AREA framework is the one we mostly use in the UK, and in parts of Europe as well. Basically, it asks you to anticipate what the potential impacts of your technology or innovation might be on society; to reflect on the ethical and social issues that might raise (the code of ethics is a really good tool for that); to engage with relevant stakeholders to help identify potential issues and mitigate them, and not just at the testing stage but at the very beginning of the innovation cycle; and then to act, by putting methods in place to ensure issues are resolved. That means things like having good HR policies and processes, good testing and QA, that sort of thing. It's about basically codifying a
lot of the good practice you probably do already into methods that you follow every single time, and hopefully those are responsible ones.

So, the benefits of behaving ethically. (Oh, this slide is left over from my video games talk, which is great; I'll leave that one there.) We've been doing some work with companies: I've been working with cyber security companies in the UK for the last two years now, getting them thinking about responsible innovation and ethics in what they do, and they've told me about the other benefits they feel they've been getting from this. Basically, it's all about trust. They have better reception from the public, so they have a better reputation, which builds trust. They have employee satisfaction, which builds trust within the workforce, because people like to work for ethical companies. They have better quality innovations, because people are stimulated to work and do things better. And you get a wider audience than just nerd bros. Sorry to all the nerd bros out there, but you are actually a minority, and there's a much broader audience out there for all sorts of technology than just the power users. (Sorry, I'm not equating power users with nerd bros; they're different. Not doing that.) And you're doing the right thing, so you can sleep at night, which is nice as well.

So, more on ethics and technology: there's a bunch of links here. If you just google "ACM code of ethics" you can find the ACM code of ethics. Tier and I do a podcast on video games and ethics, if you're interested in that; Tier's been doing a fabulous job here on Stage B all day, so thank you. The responsible innovation stuff I talked about is available at ORBIT RRI, which is the main centre for the UK's responsible innovation information. You can find me on Twitter, the ACM ethics stuff is on Twitter, and we've done Reddit stuff too. I mean, we're trying
to really get the word out. And if you'd like a sticker that says ACM Ethics, like that one, come down to the green room and I'll give you one, because I was stupid and forgot to bring them up.

All right, so: got a dilemma? Have we got time, Tier? How much time have we got? We've got some time, I think. Okay, good. I'm going to refresh my Slido and see if anyone's put in a dilemma. Oh, here we go, people have said stuff, fantastic.

"We're at the peak of hype for tech ethics. When we get out of this phase and it's less hip again, what is the biggest thing that will have changed in the world?" I think what will probably happen is that tech companies will have much broader stakeholder work: they'll work much more with the general public, in ways that are inclusive not just of white, middle-class people who can afford to go to a one-day workshop and try out a new piece of technology, but of people of lower socioeconomic status, people who may only use the technology in emergency situations, people who are much more removed from the direct usage of a technology. I think that will really improve a lot, and that will be the biggest thing to have changed. I'm not quite sure if that's what you meant by the biggest thing; there probably will also be deaths of some big technology companies, in some ways. But anyway.

Oh yes, I wanted to mention Google: the Google employees who have been pushing back recently against the Chinese censorship work used the new ACM code of ethics as a basis to push back against their employer. I actually wrote a piece in The Conversation about Google in China and how the ACM code of ethics thinks that's a bad idea, and they used that, which is really nice, so I was quite happy about that.

All right. Anonymous asked, and you're all
anonymous, that's great: "I just completed a computer science degree. Ethics seemed included by necessity rather than due to understanding its importance. What can we do to change this attitude?" Employ ethicists to teach the ethics courses, not just whoever in the computer science department has spare time, or once mentioned they were interested in ethics and is now lumped with all this teaching that isn't computer science. I know that in some departments ethics is included because you need it to get BCS accredited in the UK: the British Computer Society requires a certain amount of ethics in your degree programme if you want to be accredited by them. And I've known certain departments, which I will not name, that basically see it as something that takes time away from teaching students tech, and think that's a bad thing, so it carries a negative weight even at the top level. So it's about changing attitudes. As with the cyber security companies I've been working with, the only way to get companies to take on an ethical culture is to change the minds of the people at the top; otherwise, unless you've got a really big workforce who are all aligned with being ethical, which rarely happens, it's difficult to push back against a CEO or a department head who doesn't want to do this sort of thing.

Good, they're coming in quick and fast. Do let me know when I'm out of time. Two minutes, okay. "What should Facebook do to act responsibly about their role in society, specifically in relation to the political advertising data they have internally?" Specifically in relation to that, they should probably delete it. I mean, I think they're really doing a lot of soul-searching at the moment, which is a good thing. I
still wouldn't have a Facebook account, though. Yes, I wrote about that too; I wrote a thing about informed consent on that, and that was very interesting. So, what should they do? Let's talk about what Facebook didn't do. What they didn't do was establish a method for reliably communicating the consent aspects of the data they were collecting: they weren't very good at communicating what data they were going to collect and how they were going to use it. They also did a very poor job of managing the political side of that. There's a really good Radiolab episode, if anyone listens to the Radiolab podcast, on Facebook and how they do their content moderation, for example, and the fact that they have all these rules about what can and can't be on Facebook. The problem with ethics, like I said before, is that it's not a how-to: you can't step through and say, "step one, do this; step two, do that". It's the same when you have to decide whether or not to allow something on your platform: it usually comes with a lot of context, and when you're a worldwide company, you have a lot of worldwide context, and it gets really, really complicated, because in China you can't say things that you can say in the UK or America or the Netherlands or wherever, and everything gets very, very complicated. I think they handled that very poorly, and in some ways it was kind of inevitable. But the way they can solve this problem now is probably, to be honest, to really scale back their operations, certainly on the political side, and really focus on going back to what they actually wanted to do, which was to
be a way for people to communicate with their friends and family, but then also (and this is walking a fine line) to be very careful about what sorts of content those communications actually carry. That needs to be something that's agreed with the community as well; they need to build it up with certain groups. It's all very, very complicated; come and see me in the bar, that would be a good one.

One more, or are we good? One more, all right. "Laws are very slow to change and adapt compared to evolving technology. What can we do in the meantime to counter the groups unwilling to follow the code?" This is a really good one. It's a classic policy vacuum, and it's something technology ethics has been dealing with for many, many years. What we can do is be good examples: we can show the benefits of doing things ethically. These cyber security companies, for example, who've been doing this, are now being held up as pinnacles of good tech. We can also vote with our wallets: don't get a Facebook account, don't sign up to the latest thing just because it's there; actually have a look at what it is and what they want to do. And also, educate your friends and family who may not be as tech-savvy as you are; that's certainly a big thing you can do. Help them make informed decisions, because at the moment they're not really able to do that, because they're not being informed very well.

Okay, thank you very much. I can do more of these in the bar afterwards. Well, actually, I think I'm going to emcee the next talk, so...