Awesome. Thank you so much. And yes, let's go rules. I don't care what y'all say. Y'all can fight me. All right, so my name is Ceci, and I'm here to talk to you about the psychology of fake news and what we can do about it as technologists. Before I get started, I wanted to give a shout-out to Keep Ruby Weird. Before I decided to become a programmer, I was kind of soul-searching, trying to figure out what programming language I should learn, and the community I joined was really important to me. So I decided to buy a ticket to the inaugural Keep Ruby Weird back in 2014, not knowing anything about Ruby. I sat right there in the front row, and I saw this guy talk about his cats, and I knew that this was a community for me. I've been into Ruby ever since. I'm still a newbie, but I'm very grateful to this community, and it's just amazing for me to be here talking to you right now. So thank you, Keep Ruby Weird. Before we start talking about fake news right now, I want to talk about the future of fake news. If you want to learn more about some of the stuff we're going to talk about today, you can go to futureoffakenews.com. That's an episode put on by Radiolab where they go into fake news a lot more in depth, and I'm just going to show you a quick clip. "This film has been modified from its original version. It has been formatted to fit the screen, and it never happened." "On the back end now of my presidency, now that it's almost completed, although there are all kinds of issues that I care about, the single most important thing I can do is play golf, because our parties have moved further and further apart and it's harder and harder to find common ground. So when I said in 2004 that there were no red states or blue states, only the United States of America, I was wrong."
So there you go. That is the future of fake news. If we take a look at that, obviously the face manipulation is not quite there yet; I think it's pretty apparent that it's not real. But what I think is really interesting is the technology behind the audio. Essentially, this comes from a piece of software called Project VoCo that hasn't been released as far as I know, but it is upcoming with Adobe Audition. What it does is take existing audio, because any person speaking for, say, a few seconds to a couple of minutes will run through every sound necessary to produce the English language. So if you have a long enough audio bite from someone, you can run it through Project VoCo and literally just type words, and you'll get an audio clip, in that person's voice, of something they didn't say. And I think that's really interesting, because then you don't necessarily have to have video of someone actually moving their mouth; you can manipulate an existing video, not show the person's mouth moving, but have the audio clip and produce something that's pretty close to something that person could say. So that's not out there in the public yet; it's still upcoming. Again, if you want to learn more about this technology, go to futureoffakenews.com. So I think that's interesting, but let's talk about the present. Let's talk about how we got to this world we live in right now, with fake news coming into our sphere.
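The core idea behind that kind of audio editing can be sketched, very loosely, as concatenative synthesis: slice a speaker's recording into sound units, then reassemble those units for whatever text you type. Here is a toy sketch in Ruby. Everything in it (the phoneme units, the pronunciation dictionary, the sample values) is invented for illustration; real systems like VoCo involve far more sophisticated alignment and smoothing than a lookup table.

```ruby
# Toy concatenative "voice" synthesis, illustrating the idea behind
# VoCo-style audio editing. All data here is made up: real systems use
# phoneme alignment, unit selection, and smoothing, not a lookup table.

# Pretend we've already sliced a speaker's recording into phoneme units,
# each a short array of audio samples (hypothetical data).
PHONEME_UNITS = {
  "HH" => [0.1, 0.2], "EH" => [0.3, 0.4], "L" => [0.5], "OW" => [0.6, 0.7]
}

# A tiny pronunciation dictionary (hypothetical; real systems use
# something like CMUdict).
PRONUNCIATIONS = { "hello" => %w[HH EH L OW] }

# Turn typed text into "audio" by concatenating the speaker's own units.
def synthesize(text)
  text.downcase.split.flat_map do |word|
    phonemes = PRONUNCIATIONS.fetch(word) { raise "no pronunciation for #{word}" }
    phonemes.flat_map { |p| PHONEME_UNITS.fetch(p) }
  end
end

samples = synthesize("hello")
# => [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7]
```

The point of the sketch is only that once you have enough recorded units from a speaker, "typing words in their voice" is reassembly, not recording.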
A lot of people tend to blame the social media bubble that platforms like Facebook have created, so let's go through that bubble. First of all, we have things like selective feeds. Essentially, this means that a platform like Facebook, and I think we all know this, tracks the things you click on, the things you like, the things you comment on, and tries to show you content that fits that bill. Obviously, from a product perspective this makes sense, because they want you to continue to engage with the platform, right? But what ends up happening is that people are only receiving one type of information. And if you do happen to see a piece of information that you disagree with and don't really want to see anymore, it's very easy to block, so you can go ahead and do that. Then we also have link previews within things like Facebook, where you can see a headline, a blurb, and a photo, and you can get the gist of whatever the article is trying to say. So what ends up happening is that people no longer click on articles. Especially if you're consuming all of your information in a feed, you're probably just skimming through. You might like something, but if you're actually going to click through, it really needs to grab your attention. So this gives way to clickbait. It's a real problem when you have to write headlines in a way that really gets people's attention. So I'm going to share with you ten ways that you can write effective headlines, and number nine is going to shock you. What ends up happening as a result of things like clickbait is that we get really targeted content, and news outlets are beginning to write their content in a way they know their audience will react to, whether positively or negatively, because if you react to it in the moment, that's what's going to get you to click.
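That feedback loop (track engagement, then rank matching content higher) can be sketched in a few lines of Ruby. This is a hypothetical, drastically simplified model for illustration; real feed ranking is of course far more complex than counting topic clicks.

```ruby
# Minimal sketch of a "selective feed": rank posts by how well their
# topics overlap with topics the user has already engaged with.
# All data structures here are hypothetical simplifications.

Post = Struct.new(:headline, :topics)

def rank_feed(posts, engagement_counts)
  # Score each post by the user's past clicks/likes on its topics,
  # so the feed keeps surfacing more of what the user already likes.
  posts.sort_by do |post|
    -post.topics.sum { |t| engagement_counts.fetch(t, 0) }
  end
end

posts = [
  Post.new("Kale recipes you'll love", ["health"]),
  Post.new("Why your side is right",   ["politics"]),
]
clicks = { "politics" => 12, "health" => 1 }

rank_feed(posts, clicks).first.headline
# => "Why your side is right" (the topic that got clicks before wins)
```

Nothing here is malicious on its own; it is exactly the engagement optimization the product wants. The bubble is the side effect.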
So this is why coverage is no longer really objective. It's really more, "What will our audience want to hear about this particular thing that's going to get them to actually interact with this story?" For example, here is the same event covered by two established media outlets, from two completely different perspectives. So who do you believe? You're probably going to believe whichever outlet already conforms to your worldview, and I think this makes it really hard to parse what's happening around us. Now again, these are both established news sources. Fake news outlets also use this approach. So here you have two news outlets: one is Liberal Society and the other is Conservative 101. Guess what? They're both owned by the same person. Notice that I said person, not people or company, because it's so easy now for someone to just get a couple of WordPress instances up on AWS and spin up two sites. You throw a theme on there, get some logo, and you just write the same story with two different slants. And notice how both of these headlines end: one is "are you glad" and the other is "prepared to be infuriated." And again, this was not actual news. This was a rumor that was going around; it wasn't even anything substantiated. But you have to cover even the smallest whisper, try to give it some sort of slant, throw it on some website, and then people will click on it. Just so you get an idea, NPR did a story on people who write fake news and maintain fake news websites, and the figure going around was that during the peak of the 2016 election cycle, websites like these could make anywhere between $10,000 and $30,000 a month. So yeah, it's big business. Now, fake news has been around for a long time. It really is nothing new, especially when you think about things like clickbait.
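The "one story, two slants" trick above is mechanically trivial, which is part of the point. A toy Ruby sketch, with invented site names and framings:

```ruby
# Toy illustration of the "one story, two slants" trick: the same
# underlying claim, wrapped in opposite framings for two audiences.
# The sites and framing templates are invented for illustration.

STORY = "Senator announces new policy"

SLANTS = {
  "liberal-society.example"  => ->(s) { "#{s} -- are you glad?" },
  "conservative-101.example" => ->(s) { "#{s} -- prepare to be infuriated" },
}

# One story in, a headline per site out.
def headlines_for(story)
  SLANTS.transform_values { |slant| slant.call(story) }
end

headlines_for(STORY)
# one rumor, two opposite emotional framings, two "outlets"
```

Same input, opposite emotional charge; the only real work is hosting two themes.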
It's nothing more than yellow journalism, which, by the way, was a term coined in the 1890s. So it's been around for over a century; this type of news coverage just works. So let's talk more about what might get people to believe things even if they are distorted or just plain untrue. Let's take a look into the psychology of this so-called bubble. We're going to look at three principles that can kind of explain this, and I hope y'all don't fall asleep. The first thing is system one and system two thinking. This is from Thinking, Fast and Slow. Essentially, we have two ways of thinking and using our brain power. System one is fast and intuitive. It's the things you don't have to think about to do. You don't have to actively think about driving: if you drive home and you've driven home a million times, you're just kind of on autopilot. You just do it. You're not really actively thinking about it; it's kind of in the back of your mind. That's system one thinking, and beliefs are system one thinking. This is why it's hard to get people to change their beliefs: they're very much an intrinsic part of them. Then you have system two thinking, which is more analytical; it's your reasoning. So when you are presented with things that challenge your worldview, it's not the challenge to your worldview itself that's negative; it's that it is literally more challenging to think about those things, because you have to use your analytical mind. This is why it's hard to process information that differs from our own beliefs or worldview, or that challenges them. It's also why it's just easier to consume information that you already tend to agree with, that already conforms to your own beliefs. Now, the next part comes from a book called The Knowledge Illusion, which is kind of fascinating, and it talks about how much we don't know.
Essentially, what the authors of The Knowledge Illusion say about how much we don't know is that if we had to know every single detail about everything we use in our lives on a day-to-day basis, we just wouldn't be able to function, because it's too much information, just as Ben was saying in his talk. The example they like to use is this: if everyone insisted on mastering the principles of metalworking before picking up a knife, the Bronze Age wouldn't have amounted to much. To bring that into the now, it's kind of like if you needed to know absolutely everything there is to know about a car, not just driving it, but everything about the motor and the tires and the radio. Imagine knowing intimately every detail of every single part that goes into a car, on top of also knowing how to drive. It's just too much information; you wouldn't be able to drive, your brain would just explode, right? So your brain does a really good job here: all you need is to know enough to do the thing, and your brain will sort of fool you and give you the comfort that it's okay not to know all those other things, so you can actually do the thing you need to do. I'm also going to talk about the bike study. Essentially, this study asked people to draw a bike, but also to rank their own knowledge of bikes. They would be asked, "How well do you know bikes, on a scale from one to ten, ten being the highest?" And people would say, "I know bikes, you know, I'd give myself maybe an eight or nine." Then they would say, "Okay, draw the bike," and people would be like, "Whoa, I actually can't do this." And you start thinking, just like in that other drawing that Ben was showing. If anybody is watching this on the recording, go and watch "Learning to See" from this same conference, because it's essentially the same thing: people realize it's your perception of something, and
you only remember as much as you need. Some people drew the tires really big, because that's what they remembered; some people always drew the seat, because they know they've got to sit on it. After people attempted to draw the bike and realized they really couldn't, they would be asked again, "How well would you say you know bikes, on a scale from one to ten, ten being the highest?" And people would say, "Oh, well, I guess it's more of a, you know, six." So typically what would happen is that people would experience their own lack of knowledge and become aware that they didn't know something. But you don't know what you're not aware of until you experience that dip. So we are wired to be okay with not knowing, and in fact we tend to think that we know more than we actually do, and this is because beliefs are not actually rooted in deep understanding; you just need to know a little bit. But what's even more interesting is that if a belief is shared in a group, that belief is reinforced. We only need to know very little about things, and if we believe something in a group with other people, there's a sense of belonging, and it also reinforces that belief. So again, this is another reason why it's really hard to convince people otherwise when they're challenged with something that differs from what they already believe or from their worldview. Now, I think Clay Johnson distilled a lot of this really well in The Information Diet. So what I want you to do right now is close your eyes and think about your most favorite, delicious food that is bad for you but you love it. For me, that would be a bacon cheeseburger and some cheesy fries. Now I want you to think about food that's good for you, that you do consume every now and then because you know it's good for you, but you don't really like it. For me, that would be kale. So hold on to that. You
can open your eyes now. Just know what your foods are, what your likes and dislikes are, and let's think about this. In The Information Diet, Clay Johnson says information that you agree with, and therefore find stimulating, is like eating that delicious food. For me, it'd be like eating a delicious cheeseburger. It's that pleasurable; it lights up the same parts of your brain that light up when you eat food you find delicious. And it's the same thing for information you find challenging: it's like eating that thing you don't really like to eat, even if it's good for you. This is why, even when people are presented with facts, if those facts challenge a belief they have, it's hard for them to literally digest them. And if you think about system one and system two thinking, this is because information that challenges their beliefs requires them to use their system two thinking, their analytical mind. So to try to wrap all of this up, TL;DR: deep beliefs don't require deep understanding, and they are actually deepened if they are experienced in a group. We're also not really likely to seek out challenging information, or information that doesn't conform to our worldview, because again, it requires system two thinking, so it's a little harder for us to process. And if you think about knowing what we don't know, we may just not know what we don't know, so we're not going to go seek it out. And it's just easier to consume information we agree with, just like junk food is yummy. So that's really how the bubble is made. So, is it tech's fault? A lot of people have said, you know, Facebook needs to take accountability for its role in spreading fake news, etc. And I like to say that it's not necessarily tech's fault, because as you can see, we're already wired to behave this way. But I do think it is a little bit our fault, just because we created
these tools that reinforce the behavior. So here's another quote I like, by Alan Kay: "The internet was done so well that most people think of it as a natural resource like the Pacific Ocean, rather than something that was man-made. When was the last time a technology with a scale like that was so error-free?" And we have moved to a point in our lives with the internet where we literally went from 1998's "don't get in strangers' cars, don't meet people from the internet" to now: literally summon strangers from the internet and get in their car. I mean, in the span of about 20 years, we've done a complete 180. And if you think about it, people who think the internet is like a natural resource are not really going to stop and question what they're reading. Also, think about the internet as a stopgap for our shallow knowledge: we now know there's a lot we don't know, even though we might feel like our knowledge is deep; we don't know until we experience that dip. However, we have access to the internet, so the internet can tell us what we don't know. But the internet can tell us a lot of things that may or may not be right. So what can we do as technologists? I think there are human solutions and tech solutions, and I'm going to talk about human solutions first. First of all, I feel that media and tech literacy need to become way more important in our society, and it's not just about lobbying for schools to teach media literacy to our kids. It also starts with us as technologists educating the people around us who are not in our industry, because people need to know that, hey, anybody can literally go and get a domain name. Let's take usatoday.com, or whatever. If they're smart, they probably bought different variations of "usatoday," but if someone's not really paying attention, they might buy "usatoday" with a 5 for the s, and if someone's not paying attention and they try to make it look like USA Today, anybody can do that. Anybody can make a mock of a website, and people need to know how that's done, how easy it is. I feel like as technologists we can educate the people around us who are not in this industry, to let them know, hey, you really should verify URLs and make sure they're correct, little things like that. A lot of that just starts with us in our own lives. There are also a ton of organizations, nonprofit or otherwise, that offer programs for teaching kids how to program, and I think that's really important too. Now, whenever we look at that clip we saw in the beginning, people typically say, well, people are smart enough to figure out that it's fake. And here's the thing: we work in technology. We know what we can do, so it's probably easier for us. We are in a bubble in and of ourselves, because it's easier for us to see the seams. But imagine if someone shares a clip kind of like what I showed you today, a video that's fake, with fake audio and fake video, and shares it within their social sphere. Well, if the message in that video already reflects my own beliefs, and it was shared by someone I know who shares those beliefs, do you really think that person's going to say, "That's fake"? That's challenging a belief they have, so they would probably be more likely to buy into it. We just don't know, and we can't hide behind the excuse of "people are smart enough to know it's fake," especially when we don't really provide media literacy classes to people. We really can't just say people will know it's fake, because they probably won't. Now, from a tech perspective, I want to talk real quick about some tech solutions. There's this thing called the Perspective API that's really cool. It uses machine learning, and it hooks
up into your commenting system and gives people feedback on what they're writing. So if someone's writing a comment that might be kind of toxic, the Perspective API will provide some feedback, like, "Hey, you might want to reconsider what you're writing." Kind of like Clippy, but I guess a nicer Clippy. There's also, of all places, BuzzFeed, which had a feature called Outside Your Bubble: essentially, at the bottom of their articles they would show things like, "What are other people on the internet saying about this topic?" I think that's interesting, because it might give people a chance to actually step out of their own bubble. There's also something along these lines done by the Wall Street Journal, called Blue Feed, Red Feed. It's essentially a curated list, around different topics, of what a blue feed looks like and what a red feed looks like, so that people can go and see differing views on a specific topic. I think this is really neat; however, as we learned today, people aren't always going to seek out information they disagree with. So I think this is on the right track, but people may not go out and seek something like this on their own. And then the Knight Foundation is actually looking for projects that tackle this very thing: combating fake news. So I would say, if there's anybody here, or watching this talk, who has an idea about how to fight fake news, go and find an opportunity like this, because there are people out there providing grants to people with ideas. Go and make it happen, because you can do it. Lastly, I think we need to think more about the psychology and behavior surrounding the stuff we build, because as an industry we talk a lot about best practices, we talk a lot about the nuts and bolts of what we build, techniques, how to test, etc. We talk a lot about the nitty-gritty, but we don't always talk about psychology, and we don't
always talk about ethics. And I think our industry is no longer the baby it used to be. We've been around long enough that we need to start talking about these things as an industry if we really want to fight things like fake news in tech. And lastly, along those lines, I feel like part of the reason we don't always get to talk about things like the psychology or the ethics of what we build is that we're so concerned with disrupting, as opposed to actual change. And I feel like that really needs to change in our industry, because disruption is fast: I can poke you, and I just disrupted you, but did I get you to actually stand up? Change, actual change, takes time. I think we need to start moving toward an idea where we're working to fix actual issues that are going to take time, and we're going to do it, and we're going to do it right, as opposed to "we're just going to go and disrupt this thing right now." Otherwise, we're not really going to get to tackle the important issues we need to tackle as a society. So that's it for me. Thank you very much. If you want to read my slides and look at some of the links I have, you can go to sessie.co/fake-news-psych and read more. Thank you.