So in terms of this session, when you leave this room, what we would like you to have is a bit of a language to talk about responsible tech, so that you can decide, as an individual or as an organisation, whether this is a movement you would like to join. We'd like to give you a bit of a language, a few examples, and then ask you: where are you, and what do you want your part in this to be? Yeah? So let's get started.

Okay, so this is our agenda. We're going to dive into some of the concepts that Martha mentioned around what responsible tech is. We're going to show you our approach to it, and how, by thinking in a systems way, we can make it really normal. We're going to offer you a collaborative group exercise to practise the thinking around responsible tech, show you our approach to building this field and making sure we collaborate across the sector, and then share next steps. Yeah?

Okay, so Martha gave you a few statistics, and we're not going to present them all again here, but it would still be interesting to see, by a show of hands, how many of you feel the internet has made your life better as individuals. Okay, so we have a good ninety-something percent. And who feels it has made your life worse? Okay, so two people sitting somewhere right in the middle. Now, from the survey we conducted, which was representative of the UK population, 50% said it had made their life a lot better, some said a little better, and a really small percentage actually said it had made their life worse. Next question: do you feel the internet has had a positive impact on society? Can we have a show of hands on that too? Yes, and let's talk about why, why would we say that?
Well, because people who had previously been excluded from access to communication channels, like the news or political office, now have a lot more ways to get their message out. On the negative side, though, it's also lowered the cost for negative messages and negative intentions to be spread. So the people who are benefiting are benefiting differently than they did before, but people are also being harmed in ways that they weren't before.

Yeah, I agree. And I think that's a much more complex answer than what we got in our data, because as you can see, the data actually shows a really big gap: 50% said their life is better because of tech, but only 6% said it had a really positive impact on society. And this is not only about the gap. When we dived deeper, we found it turns on trade-offs, the trade-offs of technology. People say that they love the internet, but when we asked about trade-offs, for example between innovation and public services, people found those trade-offs to be unacceptable. Just to make that a bit clearer: I love to buy on Amazon, right? It's cheap and it gets to my house. But if that meant all the shops in my local neighbourhood would close, I might think again. So there are trade-offs. And people definitely feel overwhelmed by the power and potential of the internet, and we only expect that to grow. Today we have Facebook as the big technology everybody talks about, but think about what will happen as machine learning, AI, robotics and Bitcoin grow. And this is why we really emphasise the point: now is the time for responsible technology. Now is the time to think about it, now is the time to define it, now is the time to start researching it, putting metrics around it and so on, not like with other technologies, where we end up looking back and thinking, oh, what could we have done differently?
Which is still very important, but this is a complementary approach. So what is responsible tech? Martha touched upon that. Responsible technology considers the social impact it creates and seeks to understand and minimise its potential unintended consequences. The exercise we will do will be exactly around that, and you're going to feel that it's not easy, but it's equally important. Martha also touched on the different values you could name, and these are not exclusive. Other organisations have been doing amazing work in this field for almost two years, and they have different definitions. It's a starting point, something we'd like to break down and see how we can bring not only into tech, but also into regulation, into research and so on.

Diving a little deeper than Martha went, we came up with three core concepts that sit underneath and support responsible technology. The first is context, which is really looking beyond the individual user and considering the greater context and environments that a technology operates within. That means considering the impact it can have on institutions, and, going back to trade-offs, actually thinking through and capturing what those are. The next one is contribution. This is looking at the value that gets exchanged within a technology system. Value in this case can mean formal labour, as in the money you pay a team to develop technology. It can mean informal labour, as when people enter information into a CAPTCHA that then feeds into a machine learning system. Or it could be your information: if I'm exchanging an email address for access to a service, then that's value being exchanged both ways.
So that's the first piece of it, but the second piece is then sharing what those value exchanges are in a very transparent and easily understandable way, so that the user actually has the ability and power to make an informed decision about whether or not they agree to and accept those trade-offs and those exchanges of value when they use that service. And then the last is continuity, which is really around ensuring best practice throughout the building, creation, maintenance and support of your technology. That's inclusive design, safety and security, doing all of your patches: all of those things that should be ubiquitous and yet aren't quite.

Before we dive deeper into what Doteveryone does, I do want to mention that we do not live in a bubble. This is a civic tech conference, and we've heard a lot about the work of civic tech, but there are other organisations doing amazing work around responsible and ethical tech, as you can see from the logos behind me. I'll just throw out a few examples. DeepMind has a great ethics and society research programme around AI. Omidyar just launched their Tech and Society Solutions Lab, and they build and scale solutions for unintended consequences of the internet. There's the new Ada Lovelace Institute; they also do research into the different definitions of ethics within tech. So the ecosystem is starting to build. It's very new, but it's great work, and we at Doteveryone want to contribute with our own methodology and in our own ways. Doteveryone, as Martha touched on as well, was founded in 2015 by Martha. We're a London-based think tank, and our current mission is to champion responsible technology for the good of everyone.
We do that through researching how technology impacts society; using that research to build different prototypes and tests so that we can show what good looks like; and then collaborating and building networks, which is why we are here today, and exactly what I touched on just now and will come back to a little later. As Martha mentioned, we've been doing a lot of work in the last year, and what we've really discovered is that if we want to make a difference, then we really have to look at the big picture. And that entails taking a systems approach. This ecosystem is very, very complex, and the impacts that technology can have on all of these different areas are deeply interlinked and connected. So we feel it's really important that we work in all of these different areas in order to make change cohesively and across the board. Right now, we're focused on providing knowledge and capability to all of these different parts of the system in order to make responsible technology normal.

Okay, so let's dive into some of the data and the research that we do. Our starting point is very much people. It's not users, and it's not customers; it's about what people think and feel about the internet. Our biggest piece of research is a report called the Digital Attitudes Report. We launched the first part a couple of months ago, and next week we're actually launching the other parts. The first part was about attitudes and feelings towards the internet; the next will be about how people understand the internet, and whether they understand it at all. So here are a few interesting statistics. Only a third of people are aware that data they have not actively chosen to share is being collected about them. For example, if your friends tag you in a photo, people don't understand that this can be shared. A fourth have no idea how internet companies make their money.
So we all use Facebook and WhatsApp and LinkedIn all the time, yet people don't know how those companies generate any kind of revenue. This is from the new report; we have a specific section on online news and information. Just for example, 62% don't realise that the news and information they see online can depend on the people they are connected to on social media. So again, people don't really understand how these things work. Only 31% think most news websites are trustworthy. And this is a good point to say that if you have questions, just ask them. Yeah, this is from the second part of the report; it's out next week and it's going to be on our website.

Where do we get these stats from? Who are these people? It was a nationally representative survey across the UK: 2,500 people were surveyed, and we also ran interviews and focus groups. How did we recruit people? We took on an external research company who are experts in running national surveys. All the data can be found in the appendix of the report, so feel free to dig into it.

And it's interesting who people hold accountable for this. It's not the tech companies themselves: people say it's government that should be ensuring companies are held to account, more than customers, staff or society. I find this extremely interesting, because at a time when a lot of political institutions are quite fragile, when it comes to technology, people still want government to hold big companies to account. So we say one of the first steps in making responsible tech the norm is to help people actually understand what the tech is, what its consequences are, and how they use it.

So then that feeds into the government piece. Based on the research, we came up with a set of policy recommendations for the government.
So the first is that we really need government to invest in new forms of public engagement and education that allow people to interact with and understand these really, really complex issues in a way that hasn't quite been done before, traditionally. We also need to do a bit more research and develop shared understandings and standards for understandability and transparency; as we heard from the keynote yesterday, this can be really complex if even the words we're using to describe these things are hard to agree on. And we also need to set up an independent body that's strong and able to regulate as a whole, so that companies are accountable for the different problems that people encounter on the internet. But in order to do all of those things, we really need our government leaders to actually understand technology. And so, as Martha mentioned, we're doing a lot of work within the UK government in order to help them upskill, but I think this is something that could very much be used in a lot of other countries across the world.

Just to emphasise the point about an independent body, because I think that's interesting: when we conducted the interviews, people said, you know, in the UK, if I have a problem as a consumer, say I went to buy clothes, or even some kind of medicine, I know where to go. If I have a problem on the internet, who do I go to? Who helps me? So that really emphasised that point.

So, the internet has an impact not only on people and consumers but also on communities. And even though communities now communicate in different ways, as a community we should also have the power to strengthen and protect what matters that is not digital, and to come up with community approaches to problems that the internet is causing. This is an example of a project we did in London with an organisation called Citizens Advice.
And basically we tried to target the problem of price discrimination, which you might know better as price personalisation. Basically, if you and I buy the same product online at the same time, we'll probably see different prices, and if this is not because of simple supply and demand, it's because we are different genders, come from different places, and have completely different search histories, which it's fair to say is discriminatory by nature. So then we thought, okay, putting our collective community power together, what can we do about it? We designed a prototype (Facebook might steal it now): basically a Chrome extension that you can actively choose to install, which means that when you go to buy a product, you see what price you get, you see what other people who chose to share their prices were offered, and you can compare. That collective power enables you to make decisions about your purchasing on the internet. So this is how, as a community, by sharing information, we can tackle some of the issues that the internet is causing. So yes, we want to join the dots between social issues and technology, we want to expose them, and we want to bring not only tech solutions to community challenges, but community solutions to tech challenges.

Which brings us to the work that we're doing within the technology industry. In order for technology to actually work with and around communities, companies need to understand their responsibilities and the different frameworks that they can operate within. We've been doing work in this area for just about a year now. We started by running a number of workshops with people within the industry to come up with the different aspects of what responsible technology is. We then took those different aspects and built a prototype programme where we brought in a number of startups as partners.
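The price-sharing idea described above can be sketched very simply. This is a toy illustration of the concept, not Doteveryone's actual prototype: the class, method and product names here are all made up for the example, and a real extension would of course handle consent, identity and storage very differently.

```python
from statistics import median

class PriceWatch:
    """Toy model of the crowd-sourced price-comparison idea: people who
    opt in share the price they were quoted for a product, and anyone
    can then compare their own quote against the shared pool."""

    def __init__(self):
        # product id -> list of prices volunteered by opted-in users
        self.reports = {}

    def share_price(self, product_id, price):
        """Record a price that a user actively chose to share."""
        self.reports.setdefault(product_id, []).append(price)

    def compare(self, product_id, my_price):
        """Compare my quote to the crowd's median; None if no data yet."""
        pool = self.reports.get(product_id)
        if not pool:
            return None
        mid = median(pool)
        return {
            "your_price": my_price,
            "crowd_median": mid,
            "premium": round(my_price - mid, 2),  # how much more I'm paying
        }

# Four opted-in shoppers share what they were quoted for the same product.
watch = PriceWatch()
for p in (9.99, 10.49, 9.99, 12.99):
    watch.share_price("headphones-123", p)

# The shopper quoted 12.99 learns they are paying above the crowd median.
print(watch.compare("headphones-123", 12.99))
```

The point of the design is exactly what the prototype relied on: no single user can see that they are being personalised against, but a pool of voluntarily shared prices makes the discrimination visible.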
Most of these startups already had a focus on operating ethically and having a positive impact on society, but we asked them a number of very in-depth questions about their business practices and the different obstacles they faced and opportunities they had around operating more responsibly. And at the end of that, we went back and interviewed all of our partners about responsible technology and their thoughts on it, and we put together a really short video, which is actually pretty good:

"...and not only the customers, but also the people that they interact with as well." "Doing good is good for business, so we should embrace that, not just because of the business and commercial outcomes, but because it's good for everyone, socially and environmentally." "I think being trustworthy is not just limited to tech; it's a way of seeing business, a way of acting, a way of being considerate to everything that is around your business, basically, from your customers to your employees to shareholders to the community in the broader sense. That is being trustworthy, and being able to be accountable for your actions." "You get one chance to build a relationship with all the stakeholders, and if you mess that up, then you might as well not be in business." "We acknowledge that you can't really draw a hard line as to what's good and what's bad. So we are engaging in a journey, and we are trying to find partners that can help us scrutinise what we are doing and provide a framework for us, an even richer ethical approach and ethical framework for our business." "Nobody gives you that; there's no path or roadmap laid out before you. You have to find your own way through it, and the fact that the world changes so fast makes it really hard to keep up." "Every time you bring a new stakeholder in, they might want you to do something that benefits them but doesn't benefit some of your other customers and stakeholders."
"Because we don't know where we will be in six months; in a product that's evolving a lot, we are considering so many different new products for our portfolio, new things to do, new ways of doing stuff, so when you are an early-stage startup it's really hard to do some of these exercises." "I keep seeing this everywhere: more and more people are talking about doing good as good business. The future will be driven by people who buy because it's ethical." "There's a lot to be done getting people to think about these things: what do you sign away when you sign up for a social network online? What happens to your data? Do you know what happens to your data, and are you happy with that?" "This will become ever more important as a way to differentiate yourself and your business model, both to your workforce, the community around you, and your customers."

These startups range from ones looking to be funded, to already funded and looking to launch, to ones competing for customers in very hard markets, like a new insurance provider. So there's a lot of interplay and complexity that goes into these systems, but this is just a really small example of the huge number of different startups that have approached us in the last several months that are looking to do the same thing and are looking for guidance around how they can be better and be responsible.
That was our first cohort. We've now taken all of the input from that cohort and a lot of the things we learned, refined the different aspects of responsible technology, and folded them into the bigger definitions in those three concepts we shared with you before. We're now creating a comprehensive system that interplays with and supports each piece of it, and that provides tools which can be embedded within an actual sprint cycle in a business and will help you determine whether or not you're being responsible.

The first piece of that is simply the information you need around what these concepts are and how you can implement them, whether you're pitching it to your business leaders in order to adopt the practice, or you're a lab or an incubator teaching the startups in your cohorts how to implement it. The next is an automated self-assessment tool that asks you guided questions around what you're doing within each of those core concepts and produces a business dashboard that can be implemented in any business and reviewed by tech leadership. We also have thought exercises around each of the concepts that help you think through the consequences and impacts of your technology; the one for context that we're making is actually very similar to a board game, where it will literally be the developers sitting around a room playing a game that helps them tease out how their tech can influence different areas. And the last is a resource guide that will contain a number of best practices, example policies, and things that people can leverage and put in place. So we're looking to embed that system within businesses at the product and development level, and then complement it at the C-suite level with the leadership guide we're creating for government leaders. Now we can get to the fun part: we're going to do our exercise very shortly, in maybe
two groups, I believe. But let me give one more example before we drill down. You're probably wondering why there's a horse in this end-of-the-19th-century picture. A couple of months ago the Economist (I don't know if you saw it, but you should) did a big report on autonomous vehicles, with a whole section on the foreseen and unforeseen consequences of this new technology. I always think that when the Economist does something, it means it's the beginning of mainstreaming, so maybe that's true of AVs too. And I think this example is a really good framework for us to think about these things.

The example goes like this. In the 19th century, The Times of London ran this huge story saying that one of the biggest social problems of the time was horse manure, and predicted that by 1940 the entire city of London would just be filled with it. And this was not only a smelly problem, it was a social problem, because it caused a lot of disease, and that smell, and pollution, and so on. So when cars came around, they were seen as this amazing innovation to solve the problem, and in many ways they did, because the streets no longer stank, at least not of manure (maybe of other things). Cars also had amazing social impacts: they gave us more freedom to go places, they changed the way we shop, they gave rise to the shopping mall. But they also had unintended consequences, which are very clear today: congestion, pollution and accidents. To this day, 1.25 million people die each year in car accidents, which is a huge problem that AVs now say they're going to solve. The statistics suggest that when AVs are implemented, road deaths in the US will drop dramatically. So this is about saving lives: amazing social impact. But it will also have other social impacts. It will again probably change the way we shop and the way we travel. If I need to go get a haircut, maybe I can take care of it on the way to
work. And think about the data these technologies will collect and share about you, and about me. Once you start getting creative about how these technologies work, suddenly Facebook doesn't seem such a big thing. Just saying. Another interesting example is around dating: why would I spend my evening with a stranger, when I have this technology, an app that knows my preferences much better, and I can just fit my date in on the way to work? So these are the kinds of social impacts AVs might generate. And what about the unintended consequences? One that I think is really interesting is organ transplants: most donated organs today come from young people who die in car accidents. No car accidents, no organs. So that's a big one. A smaller but still very interesting one is the prediction that cigarette consumption will fall, because most cigarettes are actually bought in petrol stations, and we won't have petrol stations. Again, this is only what we can think of now. So I think this is a very interesting example of how shifts over 150 years can completely change our society. The article concludes, and we agree, that we should not be naive: we shouldn't just say how great it is that fewer people will die, but actually think about what the unintended consequences are, and how we shape them before they shape us.

Okay, so that was a quick run-through, but I think you get the point by now. Actually, one more thing, and I do want to be quick: even if you're building tech for good, your tech can still have unintended consequences. Just because you have all the best intentions in the world doesn't mean you're exempt from having an impact on society that might be negative, whether intended or not.

And so now to the thought exercise. This is a real tech startup in London. They've developed this technology; it's being prototyped and developed now, so it hasn't scaled yet, and you don't need to worry too much
about the actual technology and understanding all of it. But essentially, the way this technology works is that you lock into a virtual reality system and, with your mind, control a drone. The uses they envision (and they call themselves tech for good) are that this is going to save a lot of lives. In a police situation, if there's somebody in a house and you don't know what's inside, you can send the drone in instead of a human being, so it reduces risk. If there's a wildfire, you don't need to send in firefighters to fight the blaze; you can send in the drone. If someone is lost at sea in a small boat, you can stand on the shore and send the drone out to find them, probably much more effectively than sending out boats or helicopters. So they have all kinds of uses for this technology that they think can really save lives.

But here's the task. As we mentioned, we're taking a systems approach, and there are different parts of the system. So first, as a group, choose which part of the system you would actually like to be. Then we're going to ask you to come up with what the different impacts of this technology could actually be: not just its intended uses, but its unintended ones. Then, within your chosen role, what would you do to reduce or minimise those impacts? And then we'll just have you share what you've come up with. We've printed out more information: this is an article about the company (I think it was in Forbes); we're not making this up, this is actual technology. These printouts are for you if you want them, and we also have in-depth instructions for each role. To say more about the roles: you could be a community; you could be the government, either a local government like the city of London, or national, or the UN; or civil society,
that's essentially what we are, or you could be a human rights organisation. Step two doesn't really change depending on who you are, but three and four do, and then I'll come back and bring this all together. Oh yeah, there's a lot on the printout; that's the trickier part. I don't know if there's something on the other side. Let's just assume it's a super drone; don't worry about the details.

What do you mean, choose a role? Choose a perspective: through what lens do you want to think about this, as an individual, a community, the government? Do you fancy being the government? You all want to be the prime minister in the end? Okay. So, part one: identify some of the potential impacts, which is obviously the most difficult one, and narrow them down to the top two. You can also think of practical impacts: as an individual, how would you best use this tech? Just throw out everything you have; it doesn't really matter which ones. Actually, because you come from both tech and government, it could be interesting to get the different angles.

If I were an individual, a second use would just be for fun: controlling a drone with your mind sounds like a fun thing to be able to do. I think there's a big gaming angle. Or cool pictures. Or for crime: if you were an individual who was a bit morally flexible, you could use it for scanning a place to see whether you want to burgle it or not, for example. As a woman, if I had a drone above me I'd feel much safer walking the streets late at night, for example if I felt vulnerable to someone. Though I think being stalked by a drone sounds quite daunting too; a bit too much Black Mirror. And it could be a fun thing for rich people: it's intended to be used by the police, but what happens if private hands get hold of it? It could create
or deepen existing inequality, since it will be an expensive piece of tech, right? Yeah, though the nearest existing thing is delivery drones, and even if it's expensive now, think twenty years ahead; think about scale. We can also think about what our local government could do with it. There's a podcast about a town in the US that used drones, or rather a plane, to chase car robberies: they were basically scanning the town 24 hours a day from the air, so when a robbery happened they could actually trace the car the robbers escaped in. It reduced crime quite a lot, but it comes with privacy costs for civilians. Yeah, but that's good, let's put it all down, because privacy is a huge issue here. There could also be bad actors for this tech: you could lure the police into thinking there's nothing wrong in an area because the drone recorded nothing wrong. Oh wow, what's going on in that mind? That was really interesting. Well, in general, when you rely on tech to give you a signal about the world, you tend to be overconfident. The thing that comes to mind is the study where Facebook tried to mark stories as not being true, and it made people more likely to trust the stories that weren't marked, even though those could still be false; Facebook just didn't know.

Okay, next we're going to ask you, within your chosen role, what would you do to mitigate the risk? So if you're in government, you would do something very different. There are just a few paragraphs there to help you envision it. I think we should be the government, because we'd have a lot of power, and the things we'll come up with will be hilarious. National government, then; national is probably the hardest, also
because it's all trade-offs, yeah. So I guess one of the other questions would be: as a national government, how would you best use this technology? Would you buy it? In what scenarios would you use it? Would you do a proper evaluation of the ways it could be misused? This tech could be used for good, but are we a democracy? We're definitely assuming a democracy. That is the next question. If we were the whole government, not just one department, there's a really horrible, obvious use: the defence department, where they could just fight a war virtually instead of having a real one. Virtual warfare. Okay guys, sorry, but this is a bit too much.