We're going to get rolling here because I've got about 60 slides to get through in 30 minutes, so let's have some fun, all right. So as a kid, when I was around eight or nine years old, my dad was out cleaning the garage, and I remember him calling me down and pulling out this old box. It had a very distinct smell that I can still smell. And in it was his collection of comic books, and these were original, original comic books. As an eight- or nine-year-old kid, I just went crazy, right? I had this treasure trove of comics from my dad's childhood. And then I got into comic books myself and started reading the more modern ones: the X-Men, Superman, Spider-Man. And even nowadays, with Marvel coming back and being the big monstrosity it is now, I came to the realization that we are superheroes. It's something that we all strive toward, something we all want to be. These superheroes are kind of the personification of perfection for us.

And so at Drupal Camp London I had the amazing opportunity to keynote, and my talk there was all about us as Drupal developers being superheroes: what that meant, and how to empower ourselves as superheroes. And this is a great representation of it, because Drupal does some really cool stuff, right? We power some of the world's largest brands. My fun quote on this one is that it takes longer for the page to load than for the car to go 60 miles an hour. So either that's really good car technology, or we need to put some caching in place somewhere. We power, and this is a project I was personally involved with, these massive worldwide events, and people from all over the globe are touched by them. We're transacting literally billions upon billions upon billions of dollars through our ecosystems, through our projects.
And so what I came up with is that the way we build Drupal, and the way we operate as technologists, looks to a lot of people like Superman flying. It's magic. We do magic in a box and something appears on the screen. And I don't know about you, but I've tried to explain what I do to family members and extended friends, and it always comes back to: you do some magic and something happens on the screen.

This became real for me when I was in Istanbul, in Turkey, on my way back from a Drupal event. I stopped by a cafe to smoke some hookah with the locals and ended up striking up a conversation with a few guys who were in the silver trade, in commerce. And they said, well, I've been trying to build an e-commerce site for two or three years. How long does it take you to build an e-commerce site? And I was like, I can have one up in a couple of hours, right? With Drupal Commerce and Commerce Kickstart and everything that Ryan and his group are doing, we can do that really fast. And they go, well, what about hosting? Isn't hosting always down? I go, no, I've got friends that run 99.99%-uptime hosting; you just push and it works, right? And their minds were exploding: how do you do what you do? And I came to the realization that I'm doing magic. It's the same as a magician to the untrained eye. You don't understand it. You can't believe what's going on.

Now, at the same time, with this great power we have a duty and a responsibility that comes along with it. So this is a quote from Batman, and for all the comic nerds, and the non-comic nerds, the superhero stuff is going to subside here in a bit. But this is what we're up against: some men aren't looking for anything logical, like money. They can't be bought, bullied, reasoned, or negotiated with. Some men just want to watch the world burn. And as technologists, we're up against that.
And we have the power to make that difference. But we also have the power to be the people who make the world burn. So, in the phrase of Uncle Ben, with great power comes great responsibility. And so this is going to be us discussing the ethics of privacy and technology and what that means to us as a group and as a project. Now, throughout this talk, we're going to ask a lot of questions and raise a lot of concepts. I will be the first one to say that I don't have all the answers, so don't look to me to be a moral or ethical authority. I will be the rabble-rouser, just pushing out the questions and poking the bear, if you will.

And I can't talk about superheroes without talking about my wife and my kids. I always think about: what is the web that I want to build for them? And if you take a step back, that's the type of web we should be building. We're not building the web for ourselves. We're building the web, the technology, for the next generation. And, I'm getting all choked up right now, but there was a talk in Serbia last year by Morten Rand-Hendriksen from the WordPress community. Go watch it. I have never seen a question-and-answer period after a talk so filled with tears, more therapy in 15 minutes of Q&A than I've ever seen, than at that session. But it brought up some really core principles of what it means to build. As we build and create, we also have the ability to manipulate, right? And we've seen this happen in our society. We're living through it currently. Unfortunately, we provide these technology platforms and we do these cool things, but we do it in a way that at times can bring unintended consequences. And so you have folks like Alex Jones out there who leverage these technology platforms to spread hate, to spread misinformation, to watch the world burn, if you will.
And so Facebook stood up and said, okay, we're going to ban Alex Jones; we're not going to let him on the platform anymore. They've recently done that with white supremacism as well, saying we're not going to allow that on our platform. It's an altruistic goal and a stance that I commend. But the issue is that Twitter didn't. Twitter said, no, we're going to allow the platform to be what it is, the technology to be what it is, right? And that was quickly reversed once the PR hammer hit them and they realized that taking that open stance probably wasn't a good thing, because the people they were defending were not worth defending.

This even happened in a fellow CMS project. WordPress.com, which is powered by, obviously, WordPress, updated its policies to shut down the blogs of Sandy Hook deniers, which got scooped up in this whole thing with Alex Jones. They rewrote their rules to say that you can't use the images or likenesses of minors without the express permission of those minors' parents. That way they were able to selectively take down speech that was hateful while not banning or hindering free speech on the platform.

And so we have to look at this and say: nobody builds a Twitter expecting an Alex Jones to occur. In 2006, or whenever they were building it, nobody said, we're going to build a platform that fake news is going to be distributed on, a platform where anyone can say any hateful thing and bully kids into committing suicide. Nobody does that, right? But that's what ends up happening. And at the same time, nobody builds a Facebook expecting that a Cambridge Analytica is going to come along, scrape it all, and use it for nefarious purposes. I intrinsically think the tech community does have a baseline of morality that says this shouldn't happen. But it does.
And so what we run into is the classic Latin phrase, and I'm going to butcher it completely, I apologize: quis custodiet ipsos custodes? Who watches the watchmen, right? We're the technologists, we're building all this stuff. Who's watching us? Who's making sure that what we're building is actually doing good?

What we've come to realize is that big data carries with it a massive responsibility. We're creating platforms for worldwide brands that are getting millions upon billions of page views, and we're collecting the data. We're bringing it in, and with that we have some incredible, incredible power. Now, this is an older Economist article from two years ago, and it argues that the world's most valuable resource is no longer oil, it's data, and that data is now the most powerful economic driver. I would wholly agree with this. The amount of data that traffics through just the brands you can see here is mind-boggling, absolutely mind-boggling. And what we're looking at now, with a lot of the regulations and policies being put in place, is that for far too long there was an open field for this data collection. There was an open field for what and how you could collect.

I like to use the example of Strava. Anyone know Strava? My fitness people out there. I don't know it well, because I don't use it, but if you have a watch or a Fitbit or something like that, it's how you track your runs, your bikes, your swims, whatever you do. It's gathering data on everyone in the entire world working out. Pretty damn cool, right? And so what they decided was: hey, we've got this wealth of geo data, let's map it. Let's create an interactive, sleek, sexy map where you can go and look at all the data on the platform. And so they did. This is pretty damn cool.
Now, here's Seattle. I'm a native here, but if I'm visiting from out of state, I can go and see what runs people are doing. These are probably the safer corridors to run down, these are probably the bike paths I want to try out, these are all the places I want to go. That's pretty cool, right?

Now, what they didn't realize when they mapped all of their data, blanket across the entire world, was what would show up in certain parts of the globe, and I apologize for the contrast here, but this is a shot of the Middle East. People started looking at the Middle East and saying, wow, this is pretty cool, there are actually people running in the middle of Afghanistan. And then what they found was, wow, they're running in very odd patterns for being in the middle of the desert. So they started overlaying this with the Google Maps imagery that was also out there. Again, a ton of data being sourced into one place. And they realized they were mapping secret military bases in the Middle East, because soldiers who couldn't go outside the base, they'd get shot, were running laps around the perimeter, and Strava unknowingly put all of these secret bases on its map. It caused a huge concern, and the military finally said, hey, stop wearing those Fitbits when you run around our bases, because you're giving data to somebody you don't know.

Now, did Strava set out to build something to take down the US government and show where all these bases are? No. A bunch of marketing nerds got together and said, hey, let's do something fun, right? And they did. And so what we have to think about is: what are the unintended consequences of what we build? We're building all these really cool experiences. We're out there doing the Stravas of the world.
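The mechanism behind the Strava story can be sketched in a few lines: even "anonymous" GPS fixes, once aggregated, light up any route people repeat. This is a hypothetical illustration with made-up coordinates, not Strava's actual pipeline; the grid size and threshold are assumptions.

```python
# Illustrative sketch (hypothetical data): aggregating "anonymous" GPS points
# can still reveal a sensitive location, the way Strava's heatmap exposed
# base perimeters. We bucket points into a coarse grid and flag hot cells.
from collections import Counter

def hot_cells(points, cell_size=0.01, threshold=100):
    """Count GPS fixes per grid cell; cells at or over the threshold are 'hot'."""
    counts = Counter(
        (round(lat / cell_size), round(lon / cell_size)) for lat, lon in points
    )
    return {cell for cell, n in counts.items() if n >= threshold}

# Simulate soldiers jogging the same perimeter loop day after day:
perimeter = [(34.500, 66.200), (34.500, 66.210), (34.510, 66.210), (34.510, 66.200)]
points = perimeter * 200          # 200 laps of "anonymous" fixes
points += [(35.0, 67.0)] * 3      # a few one-off runs elsewhere

print(hot_cells(points))  # the four perimeter cells light up; the stray runs don't
```

No single point here identifies anyone, but the repetition does, which is exactly the unintended consequence the aggregate map created.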
But what are the unintended consequences of our actions? And again, going back to it, I don't think everyone's inherently bad, and I don't think Strava was out to do bad things.

The other thing we're starting to do with big data is mine it, right? We're starting to look through it for trends. One of the really nice things Facebook has started to do is pop up alerts that say, hey, some friends are worried about you, they think you might be suicidal, or, our algorithm thinks you need help. And in a nice, non-provocative way it asks: do you need help? We'd like to get you some help. That's a pretty cool tool, right? I don't think anyone here could find fault with Facebook using its data to help prevent some suicides.

Now, on the flip side of that, you had Target a few years back, and this is a classic privacy case. What they did was look at all of their data, and whether you knew it or not, every time you swiped a credit card and did a transaction, that transaction was given an ID and was anonymously mined. Or not so anonymously mined. And what they found were trends among women in the early stages of pregnancy: they would buy certain items. They would go out and buy prenatal vitamins, or maybe a different size of jeans than they'd bought the month before, right? So Target could look at these purchases and say, based on that, we've given this woman a high probability of being pregnant; let's start pushing some ads to her. And so they started sending out coupons saying, congratulations on your new baby, we'd like to give you some coupons to keep you healthy. Now, the example with this is that a father got these ads in the mail, and he got really upset, because they were addressed to his teenage daughter.
And so he went to Target and started yelling at the manager, saying, how dare you send my daughter coupons for pregnancy, she's only 15 years old. Well, a few weeks later he came back in and apologized. He said, I'm sorry I didn't mean to yell at you like that. My daughter is pregnant, and we're dealing with that now. So Target knew that this teenage girl was pregnant before her father did. Again: marketing nerds going out and saying, hey, let's do some really cool stuff with this data, not realizing the unintended impact it can have on people's lives. And one of the folks in charge of this even said, we're very conservative about compliance with all the privacy laws. But even if you're following the laws, you can still do something that makes people queasy, right? And that's what we deal with: from a policy standpoint we stand up and say, yes, this is GDPR, we're going to do good things, we're going to put a privacy policy in place. But even within those frameworks, you can still do stuff that just doesn't fit.

And so this brings up the 80-20 paradox. We all do the 80-20: let's build for the 80 and then ship it, right? Because if it works for 80% of the people, it's great. If it works for 80% of the browsers, it's fine. Let's get it out there. Go, go, go. But what happens to the 20% of people who are impacted by this? Are you building something that's going to marginalize that 20%, right? An example: everyone that's on Facebook, every once in a while you get these pop-ups, like, five years ago, here's a post that everyone liked. And that's really fun. I get to see baby pictures of my kids and trips I went on with friends, and that's all real fun. But what happens when that notification is about your son's suicide? And you re-trigger something. That's not good.
Now, Facebook has gone back and put some algorithms in place to try to cull that out. But again: unintended consequences of good things they were trying to build. So we have to think about not just shipping for the 80%, but what the impacts are all the way down the line. Technology has the power to change millions upon millions of lives, for the better or for the worse. We have this power.

Now, growing up, I'm a third-generation tech nerd. My grandpa worked at IBM in the 50s; he worked there for 30 years. My dad said, to hell with this, I'm going to go be a teacher. He worked for IBM for 30 years; he just retired this year. I said, to hell with this, I'm going to go be a doctor. And now I'm standing in front of you. So I can tell you, DNA is a hell of a thing. But growing up, as I was learning to code from my father, he kept telling me: computers are only as smart as the programmer who programs them. Now, this was a subtle way of digging at me, saying, you built it wrong. But with that, we have the power. It's not just those technologists out there. We can pontificate at Facebook, and it's easy to point at Google, it's easy for us to say, you're doing this wrong. But then we look at what we're building. The modules we're building, the sites we're building, are they impacting millions of people? And if they are, how are they impacting those people?

And so, because of that, we have folks like the World Wide Web Foundation, who've put out kind of a manifesto that says we need a better web, and here's how we're going to get it. This is the Contract for the Web, and these are the core principles. The web was designed to bring people together and make knowledge freely available. Everyone has a role to play to ensure the web serves humanity.
By committing to the following principles, governments, companies, and citizens around the world can help protect the open web as a public good and a basic right for everyone. We build the open web. That's what we're doing, okay? Whether we like it or not, your choices and your code are impacting folks all over.

And so what they do is say: okay, governments, this is what we need you to do. Ensure that everyone can connect to the internet; connectivity is a basic right. Keep the internet available at all times; don't shut it down when it's convenient for you. And respect people's fundamental right to privacy. This is where GDPR comes in, and now here in the U.S. we have CCPA from California. Each state is trying to do its own thing, and in true U.S. bravado and government fashion, the federal government doesn't like the state governments telling it what to do, so now we're working on federal-level privacy policies as well. We're dealing with a lot of these policies and decisions being made by governments.

Then it says: as companies, we will make the internet affordable and accessible to everyone. We will respect consumers' privacy and private data. And we will develop technologies that support the best in humanity and challenge the worst. That's a pretty lofty ideal. And as citizens, we will be creators and collaborators. We will build strong communities that respect civil discourse and human dignity. Yes, we can disagree, but you don't have to go slander someone online because of it. And then, we will fight for the web. As policies and people try to stand in the way of what the open web stands for, we as people need to stand up for it. And so, as a Drupal community, we can and should do better at providing our developers and site builders with privacy tools.
I've been lucky enough to work with WordPress and Joomla and a wide variety of the CMSs out there, and I will say that we are getting our asses kicked. We are woefully behind. So we need a core privacy initiative, and I've proposed one; that's what the BoF we just had was about. We need to have this as part of core. And what we discussed there was that it's not just technology. Yes, we can write a bunch of code that says, mark these fields as private. But we also need to make this a policy change. We need to look at the modules that say, I will give you GDPR compliance. No, you won't. We need to stop that, but we can't, because we don't have those policies in place. So this is the link to the core privacy initiative; it's in the Ideas queue. I would love for you to post in there what you like about it and what you don't. We have some ideas. Again, I'm offering up questions, not trying to be the moral authority on it.

But one of the nice things is that we're not alone. As much as we like to harass and play the frenemy with the other CMSs, out of some discussions last year and then at Drupal Europe, we actually got the core privacy teams from Joomla, WordPress, Umbraco, TYPO3, and what will be the Drupal privacy initiative all together. We meet weekly to talk about how we, as representatives of almost 50% of the web, can start being proactive in our policies. How can we set a bar that allows us to say: this is what 50% of the web is doing, come along with us? Far too often we are reactive, and we need to be more proactive. Because when GDPR comes along, we all roll our eyes and go, great, another policy we have to work with, another thing we have to do. The problem is that unless we have a unified voice, a stance or a platform to stand on, we can't inform those policies. And so this group has been great.
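To make the "mark these fields as private" idea concrete, here's a minimal sketch of what such a capability could look like. This is a hypothetical illustration of the concept only, not the actual Drupal core proposal or the GDPR module's API; the field names and configuration are invented.

```python
# Hypothetical sketch: fields flagged as private get pseudonymized before
# a record leaves the system (exports, logs, analytics). Not real Drupal
# API -- just the shape of the idea under discussion.
import hashlib

PRIVATE_FIELDS = {"email", "real_name"}  # assumed per-site configuration

def scrub(record, private_fields=PRIVATE_FIELDS):
    """Replace flagged fields with stable pseudonyms; keep the rest as-is."""
    out = {}
    for field, value in record.items():
        if field in private_fields:
            # Stable hash: analytics can still correlate, identity is hidden.
            out[field] = hashlib.sha256(str(value).encode()).hexdigest()[:12]
        else:
            out[field] = value
    return out

user = {"uid": 42, "email": "ada@example.com", "real_name": "Ada", "role": "editor"}
print(scrub(user))  # uid and role survive; email and real_name become pseudonyms
```

The point of putting something like this in core, rather than in a per-regulation module, is that GDPR, CCPA, and whatever comes next can all build on the same field-level privacy flags.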
If you want to get involved, talk to me. We're on Twitter, and we have it all in a GitHub repo. We're trying to be as open and transparent with all the steps as we can. So we got through all of our slides, and we have a couple of minutes for questions. But think about the impact of your code. Think about what your code is doing, what your projects are doing. And I don't mean that in a bad way, because a lot of you are building some really cool stuff. But think about the unintended consequences, and let's go build the web of the future, right? Let's go build what the next generations deserve. So, my name is Chris. There's a microphone here; I think for the recording they'd like us to use the microphone. If you don't want to use that, just shout and I can repeat it through this microphone. So, any questions? Yeah.

Hi, I'm Erin. I'm a web developer at Rochester Institute of Technology's libraries. Two weeks ago I went to a conference and we had a very interesting talk about AI and future-proofing AI, about how to program for the future, for hypothetical technologies. I don't know if everyone's heard of the paperclip thought experiment. If anyone doesn't know, it's the one where an AI is given a directive to make as many paperclips as possible, and that's its only directive. So it buys out stocks, it destroys companies, it takes over the world; it ends with a galactic fight at the end. There's a pretty funny little HTML clicker game of the thought experiment. I guess: have you had any experimental, future-proofing kinds of ideas or thoughts? I know it's very much in the realm of what happens next.

It totally is. It's hypothetical. And I mean, when the HTML specs were being written, they weren't thinking about how it was going to be used, right? I have some personal thoughts on AI. I think it's an incredibly powerful tool.
I also think we're kind of playing with nuclear fission on it a bit.

I mean, from my perspective, from a book standpoint, the Windows Store is closing down its bookstore. So unless people know how to un-DRM their books, or somehow find DRM-free copies, they lose all of that. Archiving information is great and good, but it's also data that could be lost or manipulated in the future.

Correct, yeah. And moving into a fully digital world, and into smarter and smarter AI, again: great technology, great stuff, but it has some detriments. So what we have to do is have real conversations, real thought experiments, about what we can do as developers, because again, computers are only as smart as we are. They're not smarter than us. AIs are getting there, but they're not. So we need to be the ones who put those guardrails on. I won't say I'm the expert in that topic, but I do think it's something we have to look at. Even what we're doing on the web is future-proofing. Again, nobody builds a Twitter for an Alex Jones. What are we building in Drupal that's going to be manipulated next? That would be my response. Thank you. Right here, and then we'll go down here.

I'm Kristi. I'm a web developer. I'm wondering: when you're in a brainstorming session and people have a lot of really great ideas, nobody wants to be the Debbie Downer. My brain starts going a million miles an hour and I start thinking of these negative scenarios. What's a good, tactful way to bring attention to those without killing the brainstorming process?

As a general rule, I never say no in a brainstorm. I say, that's a wonderful thought, or, that's something we should look at.
But I don't think it's a bad thing to raise those. For the sake of creativity, let the brainstorm run, but don't make a decision in a brainstorm. Brainstorms are just a storm. So go do the brainstorm, then come back, because brainstorms lend themselves to groupthink, with people rolling on each other's ideas. That's awesome, that's what you want out of a brainstorm. Then take a step back and say, hey, I've got some questions about this, here are some things we haven't thought about, and maybe that will spur secondary conversations. So that would be my tack. In conversations I try not to stifle things right away, unless it's, hey, let's go build something for the Daily Stormer, in which case, that's a horrible idea, but I don't think I'd be involved in a conversation that direct. So from a brainstorming perspective, let the juices flow, but then don't be afraid to stand up and say, hey, have we thought about the privacy implications of this? It's really cool that we're handing out coupons to pregnant women; have we thought about the pregnant teens? I think it's fully within your responsibility, as the person thinking about that, to voice those concerns. And all too often, if that's not voiced, things go too far, you hit the point of no return, and then you're doing damage control instead of being proactive.

And then, real quickly: how can we contribute? I know how to contribute as a coder, but I don't know how to contribute to the think tank on this issue for Drupal specifically. How would we go about that?

I would say that everyone has the ability to contribute to the think tank. It's a tank of things, so join. But from a coding perspective, what we're looking at is taking the GDPR module, stripping out the GDPR-specific parts, using that as a baseline, and then, along with WordPress and Joomla and others, saying: this is the baseline that we're going to draw.
We're going to build to that standard. So there will be a technology portion of it, and there will also be a policy portion that needs to be updated. With those combined, we'll go out and build the GDPR module, we'll build the CCPA module, and those will kind of tack on top. But from a core perspective, there are some underlying fundamentals we have to have. So I would say you can get involved at all levels of it. Great, yeah, thank you. Thank you. Do you still have a question? You good?

Hi, my name is Mickey, and I work with Agaric Cooperative; we're web developers with Drupal and other things. My question is about privacy and ethics. Since I joined Drupal, I've been trying to help people understand that privacy and ethics are at the core of everything, bundled in with your personal power. And it's really important, as you mentioned at the start of your talk: people have to have the personal power to do any of this thinking. What I've been working towards is getting people to look at the Free Software Foundation and their four freedoms, and how the words "open source" and "free software" are two different things, simply because the ethics is removed from open source. Open source means the code has to work and run well. Free software bundles in the community and ethics and privacy and things like that. How would you approach that with the Drupal community? Because there seems to be a major disconnect; a lot of people don't really get what you're saying.

I think that's why we don't have a core initiative already. Yes. One way we can tell it to Drupal is: there's free as in beer and there's free as in freedom, right? Drupal knows beer really well, so that's a good analogy to use. You can have a product that's free gratis, and you can have one that's free libre. What is free gratis, and what is free libre?
They're both "free," but they have wildly different implications, right? Yes. I think we've got the gratis part, the free-as-in-beer part. I don't think we have the free-as-in-freedom part. It was great to hear Dries this morning talk about diversity and inclusion. Accessibility, diversity, inclusion, that all bundles in with privacy. This is part of all of that. I think the tides in technology, and in the open web in general, are moving in that direction, for us to have those conversations and not be the fringe people yelling at everyone in the center. It's good to see, even from the top, that that's happening.

I liked that, but then he still said "open source."

Right. Dries, come on. I won't get into parsing words, but I do think there's a time now for the libre part of open source, not just the gratis part. Thank you for bringing that up.

All right, we are over time. I'm actually on kind of a whirlwind right now: I'm going over to 6B, I think, to talk about encryption next. If you want to follow me over there, we'll talk a little more about privacy there as well. Thank you.