So I'd like to start by telling the story of two stick figures named Jack and Daniel. Jack and Daniel are both writers, but the difference between them is that Jack lives in the year 1966, while Daniel lives in the year 2016. So let's take a look at how different life is for Jack and Daniel as two writers separated by a span of 50 years. Jack, if he wants to be published, first has to find himself an editor or an agent. This editor or agent will decide if his stuff is good enough to pitch to a publisher, and then the publisher decides if his material is good enough to publish to a mass audience. So you can see it takes Jack several steps to reach an audience in 1966. Now let's take a look at Daniel. All he has to do is get on the internet, start a blog, and start publishing, and just like that, he can reach an audience. This is something Jack never could have done 50 years ago. Of course, the concept that I'm talking about here is the democratization of publishing. Today, anyone can create a website and publish content. Anyone can reach a mass audience. And I'm sure all of you are familiar with this concept, because that's why we're here. Democratization of publishing is the mission of WordPress, and no one can dispute the great things it affords us. It grants a voice and power to everyday people. But you all know that, so that's not why I'm up here today. Rather, I'm here to look at the flip side of all this. Is democratization always a good thing? Today, everybody has the power to be heard, but does that mean that everyone should be heard? My name is Dennis Hong. I'm a Jack of Blog Trades at Automattic, and in this talk we're going to look at the dark side of democratization. I will be talking about the objectively bad things that can happen as a result of democratized publishing, and then we'll discuss potential solutions. 
Because, to paraphrase a well-known saying, the first step in curing the dark side of democratization is admitting there's a dark side to democratization. Also, for your reference, I've created a page to go along with this talk, so if you go to darksideofdemocratization.com, you can get links to all the citations and references I make here, as well as additional resources that you might be interested in. So feel free to check that out. Okay, let's dive right in and talk about problems now. These are the four main problems I will be addressing here: fueled by emotions, shallow browsing, mountains of misinformation, and connecting extremists. Now, to be clear, these are not the only four problems with democratized publishing. These are just the main ones I'm going to be focusing on for the sake of keeping this to a half-hour talk. All right, diving right in: fueled by emotions. So have you ever come across a piece of writing on the Internet, or a video, or even just a meme, that really riled you up? Something that made you sit there and think to yourself, wow, this is an outrage, we have to do something about this? And so you do, by getting on social media and sharing said piece of content so all of your friends can get riled up, too. Well, as it turns out, this is a pretty key component of democratized publishing. Going back to Daniel, our author from 2016, what I glossed over previously is that he can reach a mass audience, but that doesn't necessarily mean the audience will be listening. There are so many blogs and personal websites on the net today, it's just an overwhelming sea of voices. For this reason, the key to being heard on the Internet today is to go viral. You publish a piece of content that people enjoy, and they enjoy it so much that they decide to share it with their friends, who enjoy it so much that they share it with their friends, and so on and so forth, and eventually this piece of content you've created has spread all over the Internet. 
This is how you reach a mass audience in 2016. To study the idea of virality, a group of researchers looked at 7,000 articles from the New York Times over a three-month period to see which ones were the most widely shared. And this is what they found. Content that elicits an emotional response tends to be more widely shared. Content that produces greater emotional arousal, meaning that it gets our heart racing, is more likely to go viral. And then finally, anger-inducing content is more likely to go viral than sadness-inducing content because it creates more emotional arousal. So this was just one study. At this point, several other empirical studies have been done by both scientists and marketers, and they all come to the same general consensus. The more emotional a piece of content makes us, the more likely we are to share it, and the more likely it will go viral. So now, if we think about this, the very nature of virality means that thoughtful but unemotional content will tend to sit unshared on the Internet. These are the articles that may require the most analysis and consideration, but because they don't generate an emotional response, they rarely spread. Meanwhile, an article that really rouses us up, gets our blood boiling, is going to be shared and shared and shared and shared. As long as it generates a strong emotional response, it will spread. For this reason, we can see that a democratized Internet is going to promote emotionality over logic and reason. So can this be a problem? Absolutely. Especially if we combine it with the next two problems. For several years now, scientists have wondered whether the Internet, and the way it presents information to us, can affect our actual brain structures. The sheer amount of information on the Internet can be overwhelming. The latest stat that we have for WordPress.com is 65 million published posts per month. 
Before the Internet, we were used to reading books and other long-form writing that requires deep focus to get through. But that's not how information works on the Internet. Because there are so many websites out there, our brains have learned to scan and skim. So basically, we browse through whatever sites we visit on a daily basis, and then, based on split-second decisions, we decide what we want to click on and read. And even then, we may not end up reading very far, because there are so many other sites out there, all just a click away, sitting there tempting us. For this reason, scientists now believe that screen time, not just the Internet, but screen time in general, has led to an improvement in our visual skills with a concurrent decline in critical thinking. So basically, our brains have gotten better at picking out information quickly and efficiently, but at the same time they're losing the ability for deep analysis of said information. And if we take this phenomenon to its logical and most pathetic extreme, this is the kind of stuff we see. On April 1st, 2014, National Public Radio shared this link on social media. What has become of our brains? "Why Doesn't America Read Anymore?" was the title. "In an age of readily available information and countless ways to get it, we seem to be losing touch with our powers of comprehension." Sounds a lot like what I was just talking about, right? Well, if you actually clicked on the link, this is what you saw: "Congratulations, genuine readers, and happy April Fools' Day. We sometimes get the sense that some people are commenting on NPR stories that they haven't actually read. If you are reading this, please like this post and do not comment on it. Then let's see what people have to say about this, quote, unquote, story. Best wishes and have an enjoyable day. Your friends at NPR." I bet you can imagine what's going to happen, right? Sure enough, the comments came rolling in. "Excuse me, I read every single day." 
"There are several websites that I never miss," blah, blah, blah, blah. All over the Internet, people were commenting about how much reading they do, when they obviously did not in this case. As you can see, then, this is a perfect example of how the sheer amount of content on the Internet encourages skimming over in-depth reading, if any reading at all. And of course, content creators know this. After all, if a writer publishes a long, thoughtful article, but people get bored and nobody reads it, then she has to try something different. So she decides to publish something simpler, something she knows will get people excited. And sure enough, people do get excited. Of course, this only encourages her to keep publishing simpler and simpler and simpler content, and it becomes a race to the bottom. So effectively, we respond to easily digestible content, and this only encourages publishers to keep producing easily digestible content. And this now takes us to the next problem, which helps to explain why there are mountains of misinformation on a democratized Internet. So at this point, I'd like to make a quick side note, which I refer to as the giant trumpeting elephant in the room. I first started researching this topic over four years ago, and it was never a political issue. The problems with democratization are universal and not confined to any one single group. For some reason, however, since, well, like November 8th, many of the problems I've been talking about here have been all over the news in a very partisan manner. So I'd like to emphasize that I'm choosing to stay politically neutral in this talk. However, you are more than welcome to draw your own conclusions and connect what I talk about to what's been going on in American politics over the past year and a half. It is entirely up to you. I'm staying neutral. And with that, let's start this section by comparing traditional media and democratized media. 
Traditional media is filtered and curated. If any random person wants to be published, they have to submit the material to the media, and then the media reviews this and decides if it's good enough to be published. Ergo, the media acts as a filter to make sure only quality content gets through to the masses. Now, granted, the traditional media does occasionally mess up, and every now and then, misleading content does sneak through. And yes, there are certain publishers that are relatively low on the journalistic integrity scale. I get that. However, in traditional media, this tends to be the exception, not the norm. Now, let's compare this to democratized media, which is neither filtered nor curated. Today, anyone can start a website and publish anything they want. The barriers to publishing have been removed, but the flip side to that is now there is no filter, so any type of content, whether good or bad, however we want to define good and bad, can find its place on a democratized web. Before the internet, we got information from books, magazines, newspapers, television, all curated and filtered content. Today, anyone can go on WordPress or any other miscellaneous platform we don't care about, and they can start publishing their opinions, whether or not their opinions have any factual or intellectual merit whatsoever. So now, let's tie these last three sections together and see what happens. On any given day, we typical internet users will browse a bunch of different websites clicking on random links that strike our interest. If a link we click on happens to be a long drawn-out article, there is a good chance we won't even finish it. On the other hand, if a link we click on happens to enrage us or shock us or terrify us, then we are much more likely to share it without always pausing to consider the topic or if it's even true or not. As a result, all we ever end up noticing online is misleading content specifically designed to manipulate our emotions. 
To be clear, this is not the only content on the internet, it's just the stuff that gets the most attention. And by the way, I'm not going to make another reference to our current political climate, but I am going to give you an example that has been fairly timely for the past few years now. And the example I'm going to give you is the anti-vaccination movement. Now, hopefully at this point, all of you are aware of the sheer weight of the science behind vaccinations. If not, feel free to tweet at me or send me a message and I'll be happy to chat with you all about it. So even today, after literally everything about the claim that vaccines cause autism has been thoroughly debunked, anti-vaccination beliefs still persist. And part of the problem is that there are so many anti-vaccination websites out there. I googled "vaccines cause autism" and I got over 800,000 results. Now, the good news is, if you take a look at the results, you'll see just as many, if not more, scientifically sound websites explaining that vaccines don't cause autism and providing all the science behind that. So that's good news, right? Well, there is still one very big problem with that. The fact that illegitimate sites exist side by side with legitimate sites gives them false equivalence. False equivalence is a logical fallacy where two sides of an argument are treated as equal and opposing, but on closer inspection, it's revealed that one side is based in evidence, in scientific evidence, in logic and reason, and the other side is based on completely made-up information. In a false equivalence, somebody will say, well, we have to look at both sides equally, even though there is a huge disparity in the quality and the quantity of evidence on either side. Effectively, it gives credibility to a bogus idea, and that's why just the fact that there are so many anti-vaccination websites sitting right alongside pro-vaccination websites continues to be a problem. 
As you can see, democratization allows misinformation to persist. Okay, we'll come back to this when we talk about solutions, but for now, let's move on to the last problem: connecting extremists. First off, humans have an innate need for community. That is to say, we instinctively seek out other people who share our values and our beliefs, and we band together and form tribes as a way to cooperate and fight off common enemies. Everything from politics (liberals versus conservatives) to technology (iPhone users versus Android users) to everyday life (dog people versus cat people, people who hang their toilet paper in an overhand manner versus people who hang their toilet paper in this freakish, crazy, abnormal, underhand manner). Thank you. I'm partisan when it comes to toilet paper, not politics. It's all rooted in our innate desire to belong to a group. For ancient humans, this behavior was vital for survival, but today, it still provides us with a sense of purpose and comfort. Oh, and by the way, I find the evolution of human tribalism to be a fascinating topic, because it explains so much about modern human behavior. If this sounds like something that you might be interested in, I have left a few links at darksideofdemocratization.com for further reading, if you'd like. Before the internet, geography was a limiting factor in our ability to find like-minded others. If your beliefs were very, very different from those of most of the people around you, you could go your entire life and never meet somebody else who shared your beliefs. On the internet, everybody can find like-minded others. It doesn't matter how wildly unpopular or even extremist your beliefs might be; chances are, someone else out there on the internet has those same beliefs, and there is a website out there where you can glom onto those beliefs together. As you can see, then, a democratized internet allows extremists to find each other. It also enables recruitment. 
So before the internet, someone like Osama bin Laden never could have written an article for a national newspaper and tried to find recruits. He can't just write an article saying, well, the Taliban is awesome, you all should join, right? His extremist views never would have been published. The internet, however, allowed bin Laden to craft his messages exactly as he saw fit, and he was able to deliver this message to anyone in the world who had internet access, and that allowed him to recruit from anywhere. And that's why the internet of today is full of sites like these: white supremacist sites, 9/11 conspiracy sites, Holocaust denial sites. Every fringe or extremist belief you can think of, it's out there on the internet, just waiting for new members to join. As you can see, then, even though extremists represent a sliver of the human population, they are able to connect online and amplify their message, and this can pose a very tangible danger. All right, so that wraps it up for this section on problems. Phew, right? Just to recap really quickly: a democratized publishing environment promotes emotionality over logic and reason. The sheer amount of content encourages skimming over in-depth reading. Democratization allows misinformation to persist. And democratization allows extremists to find each other. Again, these are not the only four problems. If you'd like to read up on more problems because you're a sucker for punishment, go to darksideofdemocratization.com and you'll see some other issues that I address as well. But let's talk about solutions now: playing the game, combating misinformation, and to silence or not to silence. Let's dive right in. So if we ever come across a piece of content on the internet that really riles us up, makes us angry or makes us shocked or makes us terrified, the first thing we have to do is take a step back and evaluate our emotional response. 
Make sure we're being stoic Yoda, not disembodied Luke here, and decide if our emotional response is worth it, if this is something worth getting riled up about. If it is, then by all means, we should share away. But if not, then we probably shouldn't share it after all. As content consumers, awareness that we're being manipulated is our greatest tool. Okay, what if we're content producers? Well, for that, let's start with a metaphor, using cake. So let's say we have a slice of cake that is literally just a chunk of plain baked white flour, nothing else. Obviously, that's going to be kind of boring and bland, and few people are going to want to eat this cake. On the other hand, what if we have a slice of cake that is just layer upon layer upon layer upon layer of frosting? This cake might look really pretty, but nobody actually wants to eat pure frosting. No, what we want is a perfect balance of thin, delicate layers of sugary sweet frosting over thick, rich, substantial layers of moist cake. That's how we bake a perfect cake. Internet publishing is the same. We have to catch people's attention with a flashy title, verbal frosting, if you will, something to get people to click on the link. And then once they do, then we start doling out heaping ladlefuls of deep, insightful information. There's nothing wrong with a little bit of fluff every now and then, as long as the information we're presenting is accurate and ultimately valuable. By the way, I do just want to apologize right now to all the cat lovers out there if I'm in any way implying that cat photos are superficial or manipulative. I swear it's just a metaphor, because come on, we all know that cat photos are the awesomeness. So moving on, I'm now going to give you five incredible reasons the next section of this talk will blow your mind. You'll be astounded when you hear what I have to say at the end of this presentation. Right? This is how we play the Internet publishing game. Okay, moving on for reals now. 
Aside from checking our emotions, before we share any content online, we also have to make sure that information is valid. Now, the good news is there are now a ton of different tools we can use to check misinformation and also identify questionable websites. So if you go to darksideofdemocratization.com, you'll see links to several browser extensions and resources that you can try out. Okay, then what about helping others? What happens if we see someone else posting false or misleading content? What do we do then? Is the solution as simple as offering them concrete evidence to the contrary? The answer is no. A few years ago, a group of scientists identified a phenomenon that they dubbed the backfire effect. And this is how it works. If a person with strong pre-existing beliefs is shown evidence that their beliefs are wrong, whether that evidence be in the form of a news article, a scientific paper, or any other reputable source of information, this person's wrong beliefs become even stronger. Facts not only fail to correct misinformation, they somehow strengthen and reinforce it. And furthermore, when confronted with overwhelming evidence that they are wrong, people will often engage in a behavior called flying from facts. And here, instead of attempting to dispute the validity of specific pieces of information, they'll simply reframe the entire issue in an untestable way so they can sidestep the facts altogether. And here's what that looks like. "Vaccines are dangerous. They cause autism in kids," says the anti-vaxxer. "Actually, there is zero scientific evidence that they cause autism, and any purported studies to the contrary have been thoroughly debunked at this point," says the pretentiously voiced scientist, who follows up with a detailed list of peer-reviewed studies. "Well, of course, that's because it's all part of a massive cover-up by pharmaceutical companies." 
As you can see, when confronted with indisputable evidence that they are wrong, the anti-vaxxer retreats to an argument that can't be verified or tested scientifically, so we can't prove them wrong. Now, just a quick show of hands. Who has ever had a discussion that kind of went like this? It doesn't have to be vaccines or any... cool. Good for you. Good for you. Me too. And by the way, I'm staying neutral, of course. But if you want to take that last quote and replace "pharmaceutical companies" with "mainstream media," maybe that might sound kind of familiar too. As you can see, combating bad information is not a simple matter of presenting good information. As studies have shown, good information alone is nowhere near enough to convince someone that they hold incorrect beliefs. Does that kind of blow your mind just a little bit? So, okay, what do we do? How do we combat bad information? Well, it comes down to having a lot of patience and empathy for those whom we perceive to be misguided. We need to be willing to engage them individually, and often privately, with understanding and tolerance, and not an ounce of shaming or hostility. We can't be combative. It's less "you're wrong, I'm right," and more "I see your point of view and I understand your concerns. Have you thought about it from this perspective?" That's how we change minds. It's a slow, gradual process. And that's why I put that offer out there sincerely: if any of you are concerned about vaccines, send me a message, and I'll be happy to discuss it with you. Okay. Finally: to silence or not to silence. If there are websites out there that pose a tangible danger to people, that could be considered hate speech or breeding grounds for violent behavior, what do we do about them? Do we take measures to prevent extremism on the Internet? Well, first, let's take a look at a couple of solutions that have been tried. 
First off, Google, Twitter, and Facebook have attempted to target teenagers who might be at risk of being radicalized. They have algorithms that track a teenager's online behavior, and if they use any red-flag terms, like "Sharia" or "Mujahideen" (which refers to a person involved in a jihad), these teens will be directed to anti-radicalization websites or shown videos on how to resist being tempted by radicals. And then there's also the most extreme option: to simply block any websites that are seen as dangerous. This is something that China is infamous for, and there was a talk just yesterday about this. This is also something that's all over the news right now, not just in regards to extremism, but in regards to misinformation in general. But then, doesn't this sort of run against everything that democratization stands for? The reality is, there are no simple solutions. So, what I'm going to do now is open this up for a group discussion and do a Q&A at the same time, because I don't have all the answers, I will admit. I don't think anybody does. So, we'll do a group discussion for about 10, 12 minutes, and then I'll wrap everything up and leave you with some closing thoughts. So, if you have any questions, comments, suggestions, ideas, about anything that I've talked about so far, we'll do the Q&A right now. Oh, I'm not done yet! Hello. "So, what are your thoughts on satire websites, which have a right to exist and are great and brighten up our lives, but which a lot of people fall victim to, for the reasons you discussed, by not paying attention and following through? Do you agree with how Facebook is handling it, and what is your view of those types of sites?" First off, I absolutely support satire websites. The Onion is one of my favorite websites to read, and you're right, it is very easily mistaken for real, especially when there are sites out there that are a little bit more subtle with their satire. 
In my opinion, if you mistake an Onion article for real, then maybe you need to do a little bit more reading. But they're the outlier at this point. There are a lot of websites that are different now, and satire is fine as long as it is very clear. The problem isn't with very cut-and-dried websites like the Onion. There are other websites out there that purport to be satire, but they do it in a subtle enough way that you really have to wonder whether they're doing it on purpose to try to mislead people and get clicks and views, or if they're actually being satirical. As a... I don't know if you saw, but I write comedy in my spare time. And as a comedy writer, I can tell you that subtle satire is bad satire. That's not good satire. But we can't dictate that. We can't be like, "This isn't good enough, so we're not going to slap the satire label on it." So I think it's the gray areas that are going to be problematic. And yes, I support what Facebook is doing. I think that is a good first step, but we have to figure out the nuances of it. "I don't know. I think it's a bit unfair that someone can easily spread misinformation and have emotional responses share it so heavily, with so many other people, so quickly, yet the response to correcting these misperceptions is a one-on-one, patient conversation that will take a lot of time and energy, if it's successful at all." Absolutely. "So part of my question was, I know Upworthy had a bit of success for a while using emotions. Are you aware of anyone else who is doing this for misinformation websites?" So, I forgot I was supposed to restate the question. The question is: Upworthy has had some success sharing stories that are not based on emotions. Am I aware of other sites? I absolutely am aware of them. There are a lot of sites out there that attempt to disseminate good-quality information. But even Upworthy, like the two example joke titles earlier, those are Upworthy articles. 
So that's where the "playing the game" part comes in. I think it's absolutely okay to do that, because you have to be competitive. You need to get people to click on it, and then once you do that, you cross your fingers and hope that they will keep reading. So yes, I absolutely do believe that there are a lot of articles out there, a lot of sites out there, that give it an honest shot, and I am way supportive of them. And I think it has to be a concerted effort, that we as a society decide that these are quality sites, these are the ones we should be reading, and everything else is good for entertainment, it's good for a cheap thrill, so to speak, but this is what we need to focus on. Does that address your question? "How are you doing?" Good. How are you? "This is William Jackson. I teach educational technology, and I wanted to get an idea from your experience and your knowledge: how do you continue to engage young people when you talk about misinformation? That there's actually misinformation out there, and they have to be careful in what they're reading and comprehending? Because we've had this discussion several times in my class about misinformation, how to address it, what to do about it. So, from your ideas and experience, what could I talk to them about?" Funny you should ask. The question is: from my experience, how do I address people, how to encourage people, basically, to be cautious of misinformation. That's kind of a paraphrase. 
So before I became an Automattician, I was a high school teacher, a high school science teacher, at a continuation school, which is the school for kids who have been kicked out of school. And this is where I learned a lot of my teaching techniques, because these kids were not the most motivated kids, we'll just say that. And so to me, it becomes: you have to empower people to make their own decisions. You can't dictate. You can't tell people, this side is bad, this side is bad, this side is good. You have to give people the tools. You say, okay, here are some tools, this is the type of thinking you need to engage in to make these critical decisions. You know, read through this site. Does it use very inflammatory language? Is it very hyperbolic? If it is, then you might need to be careful about that. Or does it provide links to a bunch of different sources? And so on. So to me, my answer would be: empower your users, your customers, your students, whatever population you're working with. Encourage them to do their own work, and that's how you get them to take on that scientific mindset, so to speak, of being skeptical and always being cautious, rather than somebody telling them this is right and this is wrong. "Thank you. Hey Dennis, my name is Kevin, I'm a developer. You mentioned the importance of emotionality and its connection to credibility, and as a developer, a lot of the writing we do is technical and straightforward and to the point. An example would be how to write your own shortcode, and we're trying to help other developers who might be searching for how to write their own shortcode. So how can we bring emotionality into otherwise technical or instructional writing?" For more esoteric fields, I don't think you have to. 
As an ex-scientist, if I'm looking for a specific study, I don't want fluff. I'm looking for very specific information, and if I find it, great. I just want to see the facts, I just want to see the data. And so I think for technical fields, where people are already going to be looking for something specific, it's not something you have to worry about. What we're talking about is popular culture and popular news, stuff that everyday people, quote, unquote, consume, and that's where you do have to be competitive. So as a developer, if that's your focus, I really don't think that's something you have to worry about, because people will find it, right? "My name is Josh, and I'm a developer. There are two comments I wanted to quickly make. One thing that I found helpful, as weird as it sounds, is, on occasion when misinformation is spread and it gets me fired up, like, I can't believe that this stupid information is out there, is to not bring attention to it, because sometimes you actually give it free press when you give it attention. The other thing that I found is, sometimes, depending on the type of misinformation, the most extreme forms, like, say, anti-Semitism with Holocaust denial, if, like on Facebook, the person crosses the line, then go ahead and report them at that point. And then sometimes that might make them think twice and maybe taper back some of their rhetoric." Yep. So, sorry, I keep forgetting to do this. The question, or the comment, was how to deal with misinformation. And yeah, I agree. If it's very extreme, if they cross that very clear line, then absolutely, there are tools to handle that. So I think the bigger issue is with that gray area, kind of like what I was talking about previously. Yes, I agree that when it comes to spreading misinformation, any attention is good attention. Even if you say, holy crap, this is awful, what kind of horrible person are you, you're still giving them a click. We do have to be tempered with it. And for me personally, my solution depends on how often somebody does it, and 
if I see somebody doing it all the time, I may message them privately and just kind of drop them a link to something they can read to verify it. But yeah, you do want to minimize that and not call attention to the bad stuff. I absolutely agree. Thank you.

Hi, my name is Lauren, and I'm a librarian, so we've been fighting this fight for a long time. We call it information literacy, and if you've ever heard of the Pacific Northwest Tree Octopus, you can see some of the things we've done to try to teach at least kids and teenagers and college students how to evaluate information. But with the budget cuts we've all gone through, a lot of school libraries and librarians are being eliminated, which takes that information literacy out of the equation. In just the last two weeks, I've seen a lot of college libraries putting together LibGuides and infographics to try to spread that and talk about how to evaluate information, how to determine whether it's based in fact in a post-truth world. I think this session could have happened at a library conference, and seeing the two communities work together to spread how to evaluate information would be a really good thing. I think we have a lot in common, and we could really do that.

Absolutely. I think the comment was: how can we spread information literacy? For librarians at public schools, high schools, and colleges, it's a shrinking field because of budget cuts. Yes, I absolutely agree, and I think this crosses many fields. It's not just tech; it's not just education. It's really everything, because humans are social animals. We've always learned by spreading information from one person to another, and we have to figure out how to keep doing that. Okay, due to time constraints, I think I'll take one more question. Sorry about that. If you want to come and chat with me afterwards, though, I'll be happy to.

Well, thank you for addressing this topic. You're welcome. This is obviously something that we need to see a solution to, but
watching this last election, and talking about being politically dispassionate, for me it was very instructive, particularly on election night, to be switching between networks. I would literally watch Fox, and then I would watch CNN, two diametrically opposed views of the election, and I learned a ton by doing that. Often, when an adjective is used, it indicates bias, and I take a look at bias in myself and the way I perceive things. That would actually be one of the concerns I'd have with filters: who's doing the filtering, and what is their bias? So taking different points of view and bringing them together, being dispassionate, and taking the emotion out of it from a personal point of view really seems to be the best way to approach these things. Obviously, you have to be careful when there are true extremists involved.

I agree 100%, and I think what you addressed is this idea of filter bubbles: if we're only reading specific sites, then that is going to limit our viewpoint. So I absolutely encourage everybody, and I know it's really hard sometimes: pick out a site that is opposite of your own beliefs and really read it. Sit down and read it with an open mind, and evaluate it. You don't read it for the sake of saying, "This is wrong, this is wrong." Read it to understand: why would somebody reading this buy into it? Why would somebody believe this? Why would this matter to somebody? I think that's what we're missing a lot of right now with those echo chambers: how to connect with somebody who has a very different belief from us.

And with that, let's just recap really quickly everything we talked about. As content consumers, awareness and conscientiousness are our best tools. As content producers, it's okay to play the publishing game. Correcting misinformation requires patience and empathy. And finally, there are no simple answers for dealing with extremism. And
with that, I'm going to leave you with a quote by Sheizaf Rafaeli, who is the director of the Center for Internet Research at the University of Haifa: "If we get into a moral panic and do too much too soon, we might regret it. We need to worry about hate and hate speech, but the sorts of expressions we see online are not necessarily a cause for legislation, nor are they a cause for litigation. Open culture is the very nature of the innovation that is the internet, and you cannot achieve openness by closing doors." This quote really resonated with me. It represents the spirit of democratization while acknowledging that we have to be careful. The democratization of publishing does have a dark side, but that does not in any way imply that it's a bad thing or that it needs to be suppressed. It just means that we have to be careful and cognizant of these unintended side effects, and that's why I'm up here today. Hopefully, I've brought a little more awareness to these issues that we, as supporters of democratization, will need to be thinking about for years to come. To paraphrase another well-known saying: the price of democratization is eternal vigilance. This is far from the end, everyone. Thank you.