We are so happy to have partnered once again with the League of Women Voters, for our eighth year, for their speaker series. It's a wonderful collaboration, and we appreciate their very good programming here at the library. Kate Rader is going to come up and introduce our panel, and once again, thanks for being here. Good evening, I'm Kate Rader with the League of Women Voters. Welcome to you in the room and to those watching from home. This is the last of our series this year, our eighth year of collaborating with the library on these subjects of civic interest. Keep watch; we'll let you know what next year's theme and topics are. Tonight the topic is mis- and disinformation in our election process and how we can identify and resist it. Here are tonight's presenters. Dave Gram was a reporter for more than 30 years with the Vermont bureau of the Associated Press, with later stints at VTDigger, and he is host of The Dave Gram Show on WDEV. He wrote the "Fair Game" column in Seven Days from January through May 2021. M.E. Kabay is emeritus professor of computer information systems at Norwich University and an operations management consultant. He offers security assessment, development, and improvement, and practical advice for novices and families on protecting themselves and their children against harm on the internet. Sky Barsh is the chief executive officer of VTDigger. Prior to joining the Digger in 2023, she worked in senior roles at The Nation, Vermont Life, and Civic News Company, one of the country's largest nonprofit newsrooms. She began her career as a reporter for the Times Argus and Burlington Free Press. And now I'll turn it over to Sky, who will lead the rest of the proceedings. Thank you, Catherine. Hi everybody, welcome. Thanks for turning out for such an important topic in an important year.
I'm really excited to have this conversation. We're going to talk through a few different concepts and how those relate to today, and we're going to open it up for questions eventually. But if there's anything pressing that you want to jump in with, feel free to raise your hand; it'd be great to have some engagement in our conversation. So we want to start out with a question: who remembers when Sarah Palin said, "I can see Russia from my house"? Okay, our first mis- or disinformation. It turns out she did not actually say that, and we all think that she did because Tina Fey had the caricature of her on Saturday Night Live. I didn't know that; Dave shared it in our pre-call. And what did I do? I went to the website where people go when they're looking to verify or debunk something. Snopes, exactly. Bad idea. Bad idea, Snopes? Interesting. They're advised by the CIA, so on any foreign policy things they're kind of skeptical. Huh? Yeah, I might not only go to Snopes, I might also go to PolitiFact, which is run by the same organization that runs the St. Petersburg Times in Florida, a very well-respected newspaper. Well, the St. Petersburg Times, do you have any evidence that they're tied to the CIA in any way? Not that I know of, no. So that's why you go to more than one source, more than one checkpoint like that. You're probably going to want two, maybe three, as a routine thing that you do when you see something that looks like it might be disinformation. We can get to this a little bit later in the program, but this is how you defend yourself against disinformation: by essentially being skeptical, checking it out, and remembering also that occasionally the impetus to check for disinformation is disinformation in and of itself. For instance, it's very much disputed, just for one example, whether there was collusion between the Russian state and the Trump campaign in the 2016 election.
Just for an example, very much disputed. The former president calls that, quote unquote, "the Russia hoax." And so there he is basically saying that that was all disinformation that prompted the Mueller investigation, et cetera, et cetera. And so if there are individual parts of that, you probably would have to break it into pieces somewhat and check them out on multiple media sources. Are there media sources you trust? There's a wide range of sources of information out there. So go and see what they've reported, the ones you think are better than the one where you're seeing it in this instance. And then go ahead and check some of the fact-check sites. The Washington Post has a very energetic fact-check desk. As I mentioned, the St. Petersburg Times has the PolitiFact program that they run, and then Snopes is one that a lot of people rely on. I guess you don't, but you're probably never going to find a single fact-checking source that you think is perfect. Let's talk about what we're talking about. We'll back up to what we mean by misinformation and disinformation. And Mich, I think you had some good definitions for the terms, because they have some nuance and they're different. These terms even have abbreviations in military parlance: we speak of misinformation and disinformation, and also PSYOPs. Misinformation refers to mistaken information. People who are spreading mistakes don't understand that what they are distributing is false. There are people who genuinely believe, I'm not making this up, that the earth is a disk and that the sun somehow travels around a disk floating in space. These are called flat-earth people. That is likely to be misinformation. Well, it could explain how Sarah Palin could see Russia from her house, though, if the earth were in fact flat. The people who are spreading misinformation are simply mistaken. They really believe what they are saying. It gets worse, though.
There are deliberate lies. We're familiar with disinformation from what we also call propaganda. These are deliberate statements meant to cause bias or disruption, often of political processes. Unfortunately, once disinformation has entered what we often call the information ecosphere, the world of shared knowledge, there are innocent, credulous people who will spread the disinformation. In some sense, you could argue that from their point of view, one would describe their actions as spreading misinformation. The originators deliberately lied. They knew what they were saying was false. They put it out for specific reasons, or possibly for fun, and I'll give you an example. But the difficulty is that these spheres can merge. And we end up with psychological operations, PSYOPs, which deliberately manipulate a credulous audience, people who are willing to believe things without checking. And the psychological operations are designed intentionally to shift, usually, mass opinions: political, legislative, whatever you wish to call it. That kind of operation can be highly disruptive, indeed even destructive. When we no longer share a common base of knowledge rooted in reality, we can be subject to demagoguery, to shifting political norms, to the acceptance of extremism as legitimate political positions. Those are just some basics. One other quick note about something you and I were discussing earlier. At some point, we will probably want to talk about control over information. There are two concepts that are important for us, new vocabulary terms. One is called intermediation. For example, in our discussion tonight we follow your instructions; ma'am, you're in charge. We're not going to tell jokes about, I don't know, naked grandmothers. It's not appropriate. We have intermediation. We are controlling the content of our communications. The opposite is called disintermediation.
And disintermediation means uncontrolled distribution of information, comments, opinions, and the like. There is confusion in the United States due to the First Amendment of the United States Constitution, which explicitly bars control by government agencies over the content of speech by citizens. Let me be absolutely clear. The First Amendment does not apply, legally (I'm not a lawyer, this is not legal advice), does not apply to individuals' control of communications, such as, for example, what people can post on your Facebook page as comments. You are in charge. It's your Facebook page. This is a really important point. If you all remember the Hunter Biden laptop story that broke in the New York Post about three weeks to a month before the 2020 election, Twitter then decided it wasn't going to distribute it on that social media site, and I think Facebook did the same. And there have been a lot of complaints by conservatives since then that this was effectively censorship. And the point I've tried to convince people of is that it might have been censorship if somehow the government had intervened and told Twitter and Facebook they were not allowed to carry this information. But they made individual decisions, corporate decisions, that they were not going to carry it. To me, it's like this: I've sent a couple of proposed op-ed columns to The New York Times in my past, neither of which got published. And The New York Times had every right to say, thanks anyway, Dave. And that's what they did. And the government, under the First Amendment, has no role to step in and tell The New York Times, you've got to run this guy's column, any more than the government has a role to step in and tell Twitter, you've got to allow this New York Post story on your site. Should we take questions now, or do you want to wait? I'm just curious if you are well informed about the Twitter files. Well, let's get back to that.
I would love to try to keep this focused on the election year for now. I'm curious if you can share a little bit about what the motivations would be behind somebody sharing misinformation or disinformation. I understand misinformation isn't as motivated, but what's behind it? Who is doing the misinformation and disinformation sharing? Is it government? Is it media? The biggest source in the election context, I would think, is political parties and individual candidates' campaigns, who might want to put out information that's damaging to their opponents. Just in the last month or so, there's been this scandal that's broken on the national political scene, in which there have been allegations that the Bidens were involved in this company called Burisma over in Ukraine. There are tons of stories you can read about this online. Each of them allegedly took a $5 million bribe, and this all came, mainly, from this one whistleblower who was talking to the FBI. And the FBI turned around last month and charged this person, indicted this person, on charges of lying to the FBI, which is a crime. And this person had been, according to the recent stories, in touch with Russian intelligence. So there seems to be a history and a pattern here, to some extent, of the Russian state at least trying to meddle in our elections. There is, of course, a long history of the United States trying to meddle in the elections of other countries, so we're not necessarily claiming any moral superiority on that score. But clearly they have an interest in trying to influence our election outcomes. And so they have this story that the Bidens were taking bribes from this energy company and its officials in Ukraine. Now there is an indictment of this person named Alexander Smirnov, who allegedly was lying to the FBI about this stuff.
So there's a case, and I think, if this story the way it's unfolded is correct, take it for what you want to take it for, but if it's correct, then I think it's a great example of disinformation morphing into what I call motivated misinformation. And how that happens is: the Russian state comes out and says, we're going to put out this story about the Bidens being corrupt. And the Republicans in Congress, Jim Jordan and Matt Gaetz and all these other folks, say, okay, the FBI has this one whistleblower. By the way, the FBI is cautioning us and saying, we're not sure how reliable this person is. But this one whistleblower is making these allegations about the Bidens, and therefore, because they're damaging to our political opponents, boy, they must be true. And so they've been holding congressional hearings, trumpeting these charges and talking about, quote unquote, "the Biden crime family," et cetera. And this is the way our government, in the Congress anyway, is conducting itself lately; you can think that's great or not so great. Can I jump in with a question for you? When we think about public relations and sharing from official sources, there was just recently a heavily photoshopped image of Kate Middleton and her family. Is that misinformation or disinformation, or is that just a bad Photoshop job? It's disinformation. It was a deliberate modification of an accurate photograph. Is it important? Well, if you look at the original photographs, it's not very different. In fact, it puzzles many people why anyone would bother with it. However, it became involved in a burgeoning system of wild claims: that she was dead, that she was being divorced. That kind of misinformation (unless people knew it was wrong, in which case it was disinformation) was a result of credulity. People were willing to believe whatever was exciting. Let us briefly mention a little bit about human psychology.
How interested is this audience going to be if I say A, B, C, D, E? Not at all. It's boring. It's so familiar and so devoid of anything unusual that people are simply not going to pay attention. What we find consistently is that contradictions of expectations result in increased mental activity. People become interested. If I said, well, actually this alphabet stuff comes from a planet orbiting a star 83 billion light years away and was created before the Earth was formed, isn't that interesting? Well, look, people may send me to a mental asylum, but they're going to have increased activity, at least while this nonsense is being uttered. Misinformation and disinformation can attract increased attention. The second problem we face in psychology is called confirmation bias. Confirmation bias is a well-established observation about human psychology. When people are informed of what they already believe, it increases their likelihood of accepting the information. On the contrary, when people are confronted with violations of long-held beliefs, the tendency, not always, but the tendency will be to reject the information. We have some appalling examples being broadcast today of people of a particular political party who are being confronted with factual information that contradicts what these followers have been told to believe, and they flatly reject reality regardless of the strength of the evidence. These psychological issues are being exploited by those who deliberately engage in what are called PSYOPs, psychological operations, based in part on disinformation designed to strengthen the existing beliefs of those who are already biased or to shift some of the people who haven't yet decided. Very helpful. Dave, you mentioned something about the national election, but I'm wondering if, from your years with the Associated Press, you could talk about any times you experienced mis- or disinformation in Vermont.
I remember there was one incident when a candidate for the U.S. Senate, I think it was in the year 2000, put out some stuff that was just contrary to his record on... And so I wrote a story about this. Basically, what a reporter often ends up doing is saying, here's what the politician is saying, here's what the record shows, and you try to keep up this sort of objectivity practice in which you invite the reader to reach their own conclusion. And that was what I did with this story. I can't remember much about the details now, and I should have done a little better advance work so that I'd be able to relay more of them. But I remember the candidate, who lost, called me up after the election and was pretty bitter and basically said I cost him the election. And I said, actually, I think you cost yourself the election. All I did was report what you were saying and what the record showed, and the two contradicted one another, and that's the way the cookie crumbled in that case. Then there was another incident. I don't even know if this really rises to the level of disinformation as much as it was just an outright lie, but I got a tip that there was an incident between here and St. Johnsbury on Route 2, where a governor was traveling from this area over to St. Johnsbury and pulled up behind an allegedly too-slow motorist. The governor was late to give some talk over in St. Johnsbury, and his state police driver was trying to step on it, and I guess this was a section of the road where passing wasn't really possible. So they started honking and flashing their lights at this motorist in front of them, trying to get the person to pull over. And the governor allegedly had a hat that said GOVERNOR on it and was holding it up to the windshield, trying to show the motorist; I guess the person was supposed to read this hat in the rearview mirror or something.
Anyway, I thought it was kind of funny and goofy and weird, and I started chasing it as a possible story. One of the things I did was call the governor's press secretary, and I said, you know, I want to get the official line on this, and he said: didn't happen. Completely made up. Not true at all. Okay. A few years later, after the governor had died, I ran into this press secretary somewhere in downtown Montpelier; actually, I think it was at the Thrush Tavern. And he volunteered that he had lied to me. So I guess that was a piece of disinformation directed at me, and it was successful in terms of keeping the story out of the papers. But that was a case of disinformation. And I think there have been other examples of people who haven't been completely truthful or have just kind of shaded things a bit. But frankly, I've always been amazed, and I used to say this when I was covering the legislature for years, that there's a very high degree of honesty in Vermont. Compared to other states I've read about and am familiar with, and I grew up in Massachusetts, certainly compared to the chicanery on Beacon Hill, our state house is kind of pure as the driven snow. So I don't know, maybe it's a lot of luck or whatever. Mich, you mentioned confirmation bias. Can you talk about how social media plays a role in that? Very much so. Let's take the example of Facebook. There are algorithms, that is, computer-based rules, which monitor keywords. And this will maybe bring us into artificial intelligence a little bit later. They monitor keywords in what is being posted by individuals, and they'll also scan photographs, anything that seems to be germane or interesting to an individual user. The algorithms then either covertly increase the frequency of what has been viewed as popular, or sometimes they very openly put up a notice that says, would you like to see more posts like this one? The latter is perfectly acceptable.
They're asking a question. You get a choice. However, the modification algorithms that increase the frequency of what has been viewed as positive and decrease what has been viewed as negative by the individual user of that page run the risk of increasingly distorted access to information. The problem becomes that with inadequate intermediation, incorrect information or extreme views, which are not particularly repressed unless they are hateful or advocate punishment, death, persecution, and so on, mean that existing biases will be reinforced by the content of an individual's access to other people's postings. So they will see more of what is consistent with existing views, feelings, and the like. Conversely, they will be exposed to fewer challenging posts, regardless of accuracy. That's part of the confirmation bias that is inherent in what we are seeing in social media. One quick note: social media are increasingly using, a topic we'll probably discuss in a little while, artificial intelligence techniques to identify potentially illegal content. It is not permitted in the United States, and certainly not in Europe, to post pornographic photographs of children or photographs of abuse on Facebook or other social media. In the United States, that's because there are explicit federal regulations which make it a federal felony to make, distribute, or store what is defined as child pornography. A first violation can bring up to five years in federal prison and up to $250,000 in penalties, and a second and subsequent violation of that law can result in doubling the maximum fine and doubling the maximum period of incarceration. Let me quickly repeat: I am not a lawyer, this is not legal advice; for legal advice, consult an attorney specializing in the area. However, I did teach cyber law for almost 20 years, very carefully making a distinction between offering legal opinions and reporting on legal cases and law.
So that's a legitimate description. You can understand that the algorithms sometimes make a mistake. I posted a legitimate article from the Guardian Weekly, the Guardian from England, about photographs of slavery, which included a picture of slaves in chains standing naked in front of their prison. And the algorithm not only removed my post, it punished me with 10 days of not being able to post. That was a legitimate article, but the algorithms didn't read the article; using artificial intelligence photo recognition, they spotted naked bodies in the picture, and the algorithm immediately went into defensive mode. Is this terrible? Well, I'll tell you, ever since then, I have been meticulous about posting pictures associated with articles. If an article has interesting material and a legitimate discussion of anti-racism and so on, I'll check to make sure that the picture is not going to spark the algorithms into what is called a false positive and cause interruptions. I should explain that my Facebook page is full of political resistance articles: resistance to fascism, anti-racism (I'm a life member of the NAACP and proud of it), feminism (I'm a proud member of the National Organization for Women). It also has funny cartoons about cats and dogs, and pictures and zoological information about insects and octopuses and lions and tigers and so on. That's not going to cause any trouble, but you get the idea. The intermediation can make mistakes: it can allow disinformation, or it can have false positives and block legitimate communications.
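The engagement-driven ranking described above can be sketched in a few lines of Python. This is a toy model with hypothetical post data and scores, not any platform's actual algorithm: posts whose topics match what a user has already engaged with are boosted, so the feed narrows toward existing beliefs over time.

```python
# Toy illustration of engagement-driven feed ranking (hypothetical data;
# not any real platform's algorithm). Posts whose topics match what the
# user already engaged with rank higher, reinforcing confirmation bias.

def rank_feed(posts, engagement_history):
    """Sort posts so topics the user engaged with before come first."""
    def score(post):
        # Sum the user's past engagement counts for this post's topics.
        return sum(engagement_history.get(t, 0) for t in post["topics"])
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": 1, "topics": ["cats"]},
    {"id": 2, "topics": ["election", "fraud-claims"]},
    {"id": 3, "topics": ["fact-check", "election"]},
]
# A user who has mostly clicked on fraud-claims content...
history = {"fraud-claims": 9, "election": 3, "fact-check": 1}
ranked = rank_feed(posts, history)
print([p["id"] for p in ranked])  # → [2, 3, 1]
```

The "would you like to see more posts like this one?" prompt mentioned above is the transparent version of the same mechanism; the covert version simply applies a ranking like this without asking.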
When I think about confirmation bias and the algorithms showing us what we already like and not showing us what we don't like, it's a tragedy: we have access to more information than we've ever had, and it's putting us in the narrowest box. Wouldn't it be great if it were the other way around? Newspapers have done the same. Well, as Professor Kabay just mentioned to me, the newspapers have done the same. I think that was supposed to be sotto voce, but it was true enough. And by the way, speaking of disinformation and newspapers, as somebody who had a career in journalism, I have to issue a bit of a confession here, which is that newspapers and media companies have been known to promulgate disinformation with a profit motive in mind. I'm thinking here in particular of the era of what was called yellow journalism, in the late 1890s, when there were newspapers in New York that were coming up with more and more outlandish stories and trying to attract readers by reporting stuff that was very poorly checked out, let's put it that way. It was sort of like they got into this kind of "can you top this?" thing. A couple of them would even put the most outrageous story in a box on their front page done up in a sort of yellow tint, and hence the term yellow journalism, if you ever wondered where that came from. I think the same phenomenon, frankly, was in play much more recently. You may remember a lawsuit brought by a company called Dominion Voting Systems against the Fox News Company. The lawsuit resulted in a three-quarters-of-a-billion-dollar settlement, so these charges were given a lot of credibility via that large settlement.
The allegation from Dominion was that Fox had actually lied about the 2020 election being stolen, had invited guests on to make this charge, and had spent a couple of months after the 2020 election really mucking up the public's view of what had happened in that election, all of it in favor, of course, of the idea that Donald Trump had actually won and had been cheated out of his victory. Dominion was dragged through the mud on Fox News: according to the allegations Fox was airing, Dominion had messed with its own computer algorithms while running voting systems in different places around the country, and supposedly people effectively cheated by having access to those computer systems. The lawsuit happens, and discovery, which is where the opposing side can get a lot of information from you that you may not really want to share, produced a whole bunch of internal emails and texts between very well known anchors and staff people within Fox News saying, effectively: we know this is all malarkey, as Joe Biden might call it, this idea that the election was stolen, but our viewers really want to hear it, and we're going to lose viewers if we don't feed them these messages every night. And so we're going to keep talking about how the election was stolen, whether we think it was or not. That was the basic internal messaging being shared among Fox News staff during that period. And there was another case, if you ask me, of a media company saying, we're going to put garbage out there because we think we have a financial interest in doing that. So whether it's yellow journalism trying to boost your circulation or Fox News trying to keep their viewership high. I'll just quickly make a plug for nonprofit journalism, because we don't have shareholders. VTDigger is nonprofit. So is the Associated Press, by the way. Just want to throw that in there.
I just thought of one quick example that I've been personally involved in for over a year and a half. I have been on a, I don't think I should call it a crusade, I've been on a mission to force Facebook to stop putting advertisements for counterfeit U.S. stamps on its pages. In the last year I collected over 550 images of advertisements for fake stamps. It's illegal not only to make counterfeit stamps, it's illegal to use them. So when you see an advertisement for 100 Forever stamps for $19.98, by definition they're fake, because the United States Postal Service does not allow discounts on U.S. stamps. So I've reported these to the United States Postal Inspection Service. I sent letters to the lawyers at Facebook warning them that they are violating federal law and that I've already reported their behavior to the USPIS. Whether it's just me or it's actually working, I don't know; I can't tell. But the frequency of those ads has dropped by an order of magnitude. Where I would see two a day, I might see one in two weeks. That's disinformation for profit, because Facebook was being paid by the criminals, most of them in China it turns out, who manufacture the counterfeit stamps. And by the way, the counterfeit stamps don't work. The United States Postal Service has a straightforward, systematic mechanism for identifying fake stamps, and it will not only reject delivery of the mail with the fake stamp, it can pursue legal proceedings against the victims who actually paid money for counterfeit stamps. That's disinformation for profit. So we know that there's a lot of, frankly, garbage on social media, and in an election year, two questions for both of you. How do you recognize it when you see someone posting it? How do you approach it in a way that's going to be productive, that doesn't come across as partisan? How do we all take some action toward quelling and slowing down disinformation and misinformation?
I have a very straightforward rule, and then I'll turn it over to the panel. The more horrifying and exciting the information you are being presented with, the more carefully you should investigate independently, using sources that you have come to trust, looking for multiple analyses and comments. The horrific example that comes to mind: the claim that Democrats were murdering babies and drinking their blood. As a Jew, it instantly provoked a memory of the blood libel, which has been circulating for more than a thousand years. You see a story like that? Oh, by the way, they were supposedly doing this in the basement of a pizza place in Washington, D.C. There was no basement in that particular pizza place, and the entire story was disinformation. That was Hillary, right? Oh, yes, et cetera. So the worse it is, the more careful we should be. Hillary was slicing a pepperoni or something. Well, she was allegedly involved with this pizza place that had a basement in which children were being murdered, et cetera. That's a pretty outrageous thing to allege anybody doing, and it seems as though you'd really want to check it out before believing it. Sir, you have a question? I was just thinking back to the First World War, when the British did almost the same thing, showing pictures of German soldiers with babies on their bayonets. Yeah. It's been going on quite a while. It's a pretty common kind of allegation that's designed to... As propaganda. Yeah. Can I bring it to the very present? I'm going to name some sources; check me. The Grayzone, Mondoweiss, The Electronic Intifada, and Haaretz all debunked the stories that we were hearing about October 7th, about the rapes, about babies being murdered. There was a whole series of stories. The New York Times ran a report that fits the description of what you were saying. I didn't come prepared to speak about this, but I want to bring it right to the present.
Those stories, which were trying to convince the American public that what Israel is doing, decimating Gaza, is fine, are documented to be false, starting on October 8th, by people who were there. I'm not saying it wasn't a horrible thing. October 7th was awful. But there was a whole series of exaggerated stories that were debunked by people who were at that concert, and by Israeli generals who talked about their particular protocol, which tells them to shoot everybody, even if they might hit some of their own. And so some of the most horrific stories, including that one by the New York Times, there were three journalists, one who's very well known and two who are less known; track my sources. I'm not being as articulate as I would be if I had notes. I know The Intercept had a piece on this. The Intercept, yes, they did the best job, particularly on the story that the Times ran in December about rapes that were alleged to have occurred on October 7th. There was one case in particular that was very heavily featured in that story, in which the family of the alleged rape victim later claimed that she in fact had not been raped. All I can really do is tell you what the various parties are saying about this. But I do think that a lot of questions have been raised about some of that initial reporting. And that is a particularly fraught time, obviously, when an attack like that has just happened. Emotions are running not just high but off the charts, frankly, and people are eager to demonize their enemies. And so the temptation is almost too great to resist in terms of exaggerating and perhaps even fabricating, et cetera. I appreciate that you have some of the details, and I just want to echo that.
I'm trying to echo and enhance what was said by giving actual specific sources. You mentioned The Intercept; Jeremy Scahill, and a couple of people from The Grayzone, from all the sources I've mentioned, were reporting on this early on, immediately. And then maybe a month ago, Jeremy Scahill took all of the information which he had corroborated and published the most comprehensive analysis of that whole story as a summary for The Intercept. So I appreciate you bringing that up. This is a time when journalists are especially called, I think, to be hyper-careful. If you're writing stories on October 8th, 9th, 10th, 11th, and weeks afterward, you need to be really, really solid with your facts. And unfortunately, the sort of backstory that came out about what happened at the time was that they had a main international writer who had been with the paper a long time and had covered a number of war zones. He apparently was spending a lot of his time in the office, in the bureau there, and had a couple of essentially stringers who were out on the scene in southern Israel, where these incidents were alleged to have occurred. It later emerged that one of the reporters on the scene had actually been a member of the Israel Defense Forces and had liked a tweet that said some pretty outrageous things about Gaza. I think the tweet said something about, let's turn Gaza into a slaughterhouse. So clearly showing a great deal of partisanship. And if you're the New York Times editor who's managing that story, one thing you do not want to do is to hire any reporters who are that partisan. It's very important to be able to demonstrate your objectivity. Can I ask a question? What can we do about all this? I think we can go on and on about disinformation and misinformation. It's important to know how to bring people together.
I watched something last night on Zoom where someone was talking about how to get your point across, and they were saying: patriotism. Put the flag up there. In this country we've become so right and left, but we all believe in patriotism, and putting the flag up there and talking about it can maybe bring some people together. I have other ideas of things that can bring people together and would love to hear yours. Well, I think that we are in an interesting period of human evolution right now where, in the blink of an eye, I mean, it's only really been 30 years or so since the internet became a thing, all of a sudden we have immense amounts of information at our fingertips that we didn't have when most of us were kids. You can Google just about anything and find out a lot of information about it, some of it more reliable than the rest. And I think we are in the shock of breaking through to a new level of evolution, and we haven't really learned how to manage these issues. Now you have a lot of information; some of it's garbage, some of it's right on. How do you sort that out? I think our challenge in the coming years will be to try to separate the wheat from the chaff. It's not going to be easy, but I think we all understand now that that's a challenge we have to tackle. Do you see AI, artificial intelligence, playing a role in determining what's true and what's not? Do you think it's going to make things better or worse? Worse. Let me explain why. There have been some very simple experiments in simulating human speech and communication. Way back in the 1970s there was a famous program called ELIZA. And this was back in the days, I remind you, I've been a programmer since 1965, so I remember this, when we communicated with computers by typing on a keyboard. There would be this greenish screen and the letters would flow up.
ELISA was a computer program of enormous simplicity. I'll tell you how it worked but I'll tell you what it did. It pretended to be a sympathetic psychotherapist. So it would start the discussion usually with the same question, how do you feel today? And the person would type and say, well I feel depressed. The algorithm was very simple. It had a list of adjectives. And if it saw depressed, it would say dollar string one. That's the adjective it identified. And it would say, why do you feel dollar string one? Regardless of what you would say. Depressed, funny, good, whatever it was. It would put that in the question, why do you feel dollar string one? And people would type away and they'd answer. The algorithm would scan for adjectives and verbs. And it would put together a simple comment. Always asking for more. So why do you, or how is it and so on. The irony is that the people who were being experimented on did not know they were talking to a computer program. They thought Eliza was a real person. Many of them developed very strong positive feelings towards Eliza. Some would say, oh I'd love to meet Eliza. She is so nice and so, now that was in the 1970s with a dollar string one consisting of a list of words. We have gone way beyond that folks. Artificial intelligence techniques of today use something extremely dangerous. I'm a computer scientist. I taught computer science for many, many years, 40 years. They are using correlations of words and sentence fragments and phrases, correlations, associations. Oh, if this phrase, this phrase has been said and it's followed in 2% of the cases by that phrase and 12% by this phrase and they will construct grammatically acceptable following rules of grammar which were programmed in. They will generate synthetic paragraphs. Try chat GPT. You'll see what happens. Synthetic paragraphs which sound grammatically correct and they may even address the information. 
The problem is they do not have what computer scientists call a world model. ELIZA had no world model. It didn't, quote, know anything about the reality of psychotherapy, about the reality of human feelings. All it did was construct sentences. There was, I'll get to you, there was no world model. There isn't one in today's AI. Today's AIs have been found repeatedly to generate what are now being called hallucinations. That is, they look like wonderful, sensible, interesting paragraphs, and they include total nonsense in some cases. Those are the hallucinations. Why? Because there's no set of predictors to indicate that what's being said doesn't make sense in the world. They're generating paragraphs based on what they found in billions of texts, many of them used without consideration of copyright, by the way, billions of texts that have been scanned for phrases. That includes science fiction and horror stories, but the system doesn't know that it's a horror story. All it knows is that these words followed each other, these sentences followed each other. If you ask it for a paragraph explaining, say, why it is that we believe disinformation, you might get a paragraph that includes a sentence somewhere saying that the aliens, the extraterrestrials, have been feeding us disinformation. What? AI may not save us. So be careful. I want to go back to your question, ma'am: what do we do about it? I really want to get to this, because it also morphs directly from what you were just saying. What do we do about this? I'll just interject a little story. I once took a philosophy class in a field called epistemology: the theory of knowledge. Part of that field talks about how each of us has what's called a doxastic system, meaning the pre-existing information and thoughts and knowledge that we have in our minds when we go about the process of taking in new information.
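The phrase-correlation idea can be made concrete with a toy bigram model in Python. This is a deliberately crude sketch, and the corpus is invented for illustration; real systems use neural networks trained on billions of texts. But the core point survives the simplification: the generator chains words purely by what followed what, with no world model, so its output can be grammatical while saying nothing true.

```python
import random
from collections import defaultdict

# Invented toy corpus (an assumption for this sketch)
corpus = ("we believe disinformation because the story is exciting "
          "and the aliens have been feeding us disinformation "
          "because the story spreads").split()

# Record which words follow which: the "correlations" described above
follows = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    follows[a].append(b)

def generate(start: str, length: int = 8, seed: int = 1) -> str:
    """Chain words by observed succession only; nothing checks for truth."""
    rng = random.Random(seed)
    words = [start]
    while len(words) < length and follows[words[-1]]:
        words.append(rng.choice(follows[words[-1]]))
    return " ".join(words)

print(generate("the"))  # fluent-looking, but unconstrained by reality
```

Every adjacent pair in the output really did occur in the training text, which is exactly why the result reads plausibly even when the whole is nonsense.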
And the problem, I think, was just identified by Professor Kabay with AIs: an AI doesn't really have a doxastic system. It doesn't have a pre-existing set of experiences and past reading that gives it a world, what was the phrase? A world model. Each of us has a world model, and it's the source of our ability to think critically. And this is the most important thing we need to deploy as we proceed through this glut of information, which is to say, we absolutely have to check everything we see, first against our own world model, and second against our own research skills. Research is really very fluid these days, when you have online search engines like Google and you can immediately check a weird media report against eight other media outlets to see if anybody else is reporting it. That's a very early step in the process. A second thing you want to do is some of your own research in the Encyclopaedia Britannica, which I find to be reasonably reliable. When it's a discussion of language, maybe it's the Oxford English Dictionary. You find sources that you like which deploy facts. So when it's a question of "I can see Russia from my house," first check it out. Well, it turns out she didn't really say that, as you mentioned; Sarah Palin didn't say that, Tina Fey said that. But then I did a little googling today, and I determined that even if she had said it, it would be complete disinformation, because the farthest you can see, according to its elevation, from the top of the Hubbard Park Tower is about 30 miles, and it's about 670 miles from Wasilla, Alaska, where Sarah Palin's house was, to the nearest Russian landmass. So it's just physically impossible for Sarah Palin or anybody to see Russia from Wasilla, Alaska. You think like that, and you can sometimes go down these little rabbit holes, but it's so quick on Google to debunk stuff that you can quickly figure out, I'm not going to trust this thing that I just saw.
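The geometry behind that debunking is easy to check yourself. Here is a minimal sketch in Python using the standard horizon approximation d ≈ √(2Rh) (ignoring atmospheric refraction); the 670-mile figure is taken from the discussion above, and the 1,000-foot observer height is an illustrative assumption.

```python
import math

EARTH_RADIUS_MI = 3959.0  # mean Earth radius in miles

def horizon_miles(height_ft: float) -> float:
    """Distance to the horizon for an observer height_ft above sea level."""
    height_mi = height_ft / 5280.0
    return math.sqrt(2 * EARTH_RADIUS_MI * height_mi)

# Even from 1,000 feet up, the horizon is only about 39 miles away
print(round(horizon_miles(1000)))  # → 39

# Height needed to see 670 miles: invert d = sqrt(2*R*h) to h = d^2 / (2R)
height_needed_ft = 670**2 / (2 * EARTH_RADIUS_MI) * 5280
print(round(height_needed_ft))  # roughly 300,000 feet -- the edge of space
```

So the claim fails on pure geometry, before any question of eyesight or weather.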
But that kind of process is very important for any of us who is looking at information that is at all suspect, and it's actually, I think, a duty for each of us to do that kind of work before we share or repost anything. As citizens, we all should take it on as a duty not to repost, not to share, unless we have really checked the thing. And I actually have a rule that says: if I agree with it, if it seems to support my point of view, I check harder. That's more important, because you're trying to defeat your own confirmation bias. First of all, thank you for this important discussion. To me it's one of the most vexing and challenging issues of our times. And I heard something about critical thinking, which is one of the remedies that you're talking about, particularly around young people and social media, which is a big piece of it. I guess the lawyer in me, and not only because I'm a lawyer, goes back to common law and constitutional law as a framework for remedies. We were brought up to call a fraud a fraud. Should we have a government fact-check office? Should we have certain news outlets that don't meet the standard required for journalists? Should we have a methodology to enforce when there are knowing statements that cause harm? The old common law test is duty, breach, causation, harm. If you tell people to drink bleach to get rid of their COVID, you're directly impacting many hundreds of lives.
I don't know how many people actually drank bleach, but the actual statement itself, we all know in this case who it was, is evil, and it's knowing evil. I'm sorry, that's an abstract word, it's not a legal one. I'm Jewish too, on my mother's side; we went through this fascist authoritarianism, where somehow we could allow opinions to supplant what humans know. Knowledge is largely known. I do think you need to take the profit motive out of journalism; I think that's important, probably in every area. I'm not trying to say I have all the answers; I'm just asking, why are we so timid to call a fraud a fraud? Why don't we use the tools of government to enforce more robustly, more quickly? I understand proof is hard; I've been in many large trials, and discovery can take years. But at the end of the day, if it's clear, and I'm not going to get into misinformation for a moment, but disinformation, that's fraud, and if it causes harm, it should be punished. That's why it's so important, and that's why the Dominion case is a very important case. There was a huge award in that case, and that was because they found what was like the proverbial smoking gun, right? They found that people knew, and yet deliberately did something to harm people. Those are crimes. I agree that the civil law process can be very important, and obviously, with tobacco, with opiates, with Dominion, the Dominion lawsuit was a great use of the civil court system. I am very, very reluctant, though, when I hear anything about government stepping in, regulating or restricting what a company like what used to be Twitter, now X, can do, or what a company like Facebook can do, because those companies, in my estimation, are just new-technology versions of the Washington Post and the New York Times. They are privately managed, privately owned, stockholder-owned corporations which operate separately from the government. Can I just jump in quickly? I took a year of constitutional law, and under First Amendment
freedoms we have restrictions around certain types of speech: commercial speech, obscenity, incitement, sorry, you wanted to give me the fourth. And they are very, very carefully balanced in the judicial system, which is slow-moving. Because, and I am not disagreeing with you at all, maybe it is not the government's role per se, but we already have court cases and other kinds of mechanisms to limit speech and to hold people accountable for their speech. So in this discussion today, which I am thrilled to have, and I want to be able to jump in: we were talking about these elections. These elections are using disinformation very knowingly to sway. Maybe it is gestalt-think and mass-think, and that is a whole other thing you touched on, but I would really like to get at the culprits directly. What are we so timid about? Do we have to be an Adam Schiff to bring it up? Who is willing to stand up and speak out and say, this person is lying, and we can show that they are lying, and people should hear that over and over? And I guess I am being, well, I am not a deluded person, but I am hopeful yet for humanity. We just went through the Holocaust, really, my whole family; that is not long ago. We have young people; that is why this critical thinking is so immediate. Did anyone else hear the NPR piece on the laws about AI that the Germans and the French recently passed? Those governments, the French and the Germans, passed laws recently, I think yesterday. I know about it. My worry is this: let's say that you are Galileo, and you come up with a theory that, no, actually the earth revolves around the sun instead of vice versa, and the government comes in and says, that is disinformation. Remember, that is what the government and the church did back then. Galileo came up with this new theory, and he spent the rest of his life under house arrest, effectively wearing an ankle bracelet. I think, as a historian, that kind of smudges things together. Really deep respect for you, Dave, but I think that was a unique situation,
a unique time that we had. But I do want to give credence to what you are saying about letting the powers that be just say, well, that is BS. I think there is a way. Human knowledge is pretty vast right now; there is a lot known in science, a lot that we actually know as a humanity. You were talking about social media; I don't know about all of you, but I was a card-catalog person. I mean, I could go and find out things about the whole world pretty quickly, right? I had to do it alphabetically, on paper, and I'd write it down. But I just think, anyway, I'm just at this point. Your point that we all have a duty ourselves to question the information, I like that, but I just think there is a role for oversight here. There is a role through the legal process, which allows for the presentation of evidence; it allows for rational evaluation of correctness, of truth, of intention even. And I would much rather that the courts of law, which are less, not entirely, but less subject to political bias, notice I said less subject, not impervious, I'd rather see the ACLU, the National Organization for Women, which I belong to, and the NAACP, which I belong to, all of these organizations using the legal system to bring to bear the consequences of deliberate disinformation. That makes sense to me. Having a government-based, politically biased regulatory agency? Not keen. We see that in totalitarian regimes all over the world, and it's a disaster. You can take the consumer fraud bureau, you can take any number of cases where parts of the executive branch are specifically charged with oversight alongside the judiciary, which has its own review powers, and arguably the legislature, when it finds out things are not being adhered to according to its intent. That's why there are three branches. So I'm trying to agree with everything; this is important. But why only holster one solution? Pardon that metaphor, or toss it if it's wrong. Why
go down a singular-solution track when there are multiple possible routes? I'd like to offer some resources. We have a Federal Communications Commission, which has been disempowered politically; until recently this could have been in its purview. We also have an extremely corrupt system. I want to bring up a legal case that probably no one in this room, I hope someone in this room, has heard about. Has anybody heard of the legal case of Missouri versus Biden? Yes, though I don't know much about it. It was a group, and I'm going to bring up the Twitter files again, and I'm going to identify myself: I've been working on a critical media literacy project for about 20-odd years, so I'm not just a random person in the crowd. I want to be fair. And I wrote a whole bit of testimony for one of the Vermont committees yesterday, talking about rhetoric, omissions, constructing narratives, how to understand that just because the public has a general concept of what a narrative is doesn't make it accurate, and throwing in something about how public relations professionals work. I just wrote that yesterday. I don't want to talk about that, but I do want to be honest about where I'm coming from. Let me ask you, give me your expertise, and maybe Theo also: should Twitter have been required to run the New York Post story, to allow full Twitter treatment of the New York Post's reporting? I can't answer that, but I'll tell you about the Twitter files that were released when the person who runs
Tesla bought it, and he released all of the undercurrent of emails and distributed them to a small team of independent investigative journalists. So you're going to have some more specific language than what I have, and that's great, because I'm kind of a generalist and I need notes for specific words. The bottom line about the Twitter files was that multiple US government agencies told Twitter what they could and could not publish. They were the ones who said, don't run the Biden story. And there is an equivalent of that in the report about Facebook. I don't think they said, don't run the Biden story; I think what they said is, this smells like Russian disinfo, however they put it. My point is that we have Homeland Security, the FBI, the CIA, and maybe more agencies; those are the three I remember. The government has been funding and doing massive censorship, and it's related to this story. I don't think that actually happened. Let's talk about Missouri versus Biden. There were, I don't remember how many, medical professionals who felt their medical information was being suppressed and censored, and they sued the Biden administration. Somebody more legally adept can say exactly how they did it; the state of Missouri supported them in suing the Biden administration, because the Biden administration was censoring public health information. And they won the case, and it's up for appeal. So, you're googling, and I just want to say, Google is a very large corporation; you can do a search for the 10 best search engines and come up with 9 more alternatives. The point is, the truth, the truth is filtered through the lens of our expectations. Thank you. And there's a huge problem of lack of trust. Your truth, discovered by the law or by an independent journalist, may not be believed, and it's likely to be rejected. Belief systems are not fact systems. So it's fine to talk about strategies, and what we should do, and being skeptical, but when the truth is presented, in this country we have a
phenomenon of mistrust, and we are not addressing that here, but it's a tremendous problem. Maybe we need a ministry of truth. A ministry of truth? A ministry of truth. I hope we can get around a ministry of truth; I just, I really, that doesn't sit right with me at all. So, do you remember when you were in college? I would have been expelled if I had plagiarized once in my college, if I plagiarized once and it was found out. Our standards have gone way down. We have primary sources for information in this world. We're not supposed to be using secondary and tertiary sources to confirm our facts, right? That's why we go to individuals about their own life experience if they're a witness; that's why we go to the chemistry of something to determine its makeup, and on and on. So I don't believe we're in a post-fact universe. I just think that people pretend we're in a post-fact universe, and they use it to foster distrust. I want to encourage you. I was a professor at Norwich University, the Military College of Vermont, for five years, and I personally arranged the expulsion of multiple students for plagiarism. I checked every single essay for plagiarism. Today, we are having a significant problem, not so much with essays, which can still be plagiarism-checked; we're having a problem with exams, because students who are permitted to use their smartphones or their laptop computers in class are posting the exam question to ChatGPT, or some other tool, and then copying and pasting rubbish into the answers. I've been following this across the United States and the world: more and more institutions are forbidding the use of electronic equipment during exams, and, oh my goodness, they're going back to handwritten answers, which is causing shock across the student world. I applaud you. That basic rule is what we all grew up with. I understand we've just gotten so off the rails, and I do think we need to put resources toward this sort of thing.
I don't care where it resides, frankly, but there has to be some system. Opinions and beliefs, those are wonderful. I don't mind different opinions, but not different facts. It sounds like we're all in agreement that it's a problem, and we all have skin in the game, and we have a responsibility to chip away at it, because no one thing is going to solve it. There's no silver bullet, and it's not going to get better on its own. AI is not going to save us. But I believe in critical thinking, and I love that advice that you gave, Dave, about researching and checking something before you share it. I think if everybody did that. It's the same thing I heard in a newsroom years and years ago from some salty old city editor who said, if your mother says she loves you, check it out. And don't check just one source. I was going to say, I felt like that was a good note to end on. So can I just echo a theme from today: check multiple sources, and especially independent, verifiable, fact-based ones. Because if anybody remembers Judith Miller at the Times and weapons of mass destruction and the aluminum tubes, the story I brought up shows this is not a new thing for the New York Times. And as much as we want to rely on the New York Times and the Washington Post, even the UK Guardian, though I'd say they're about ten points better, there is a huge number of reliable independent sources. All the people who got fired from mainstream corporate media because they wanted to talk about war and peace, or corporate incursion into corrupt government, or whatever, they now are publishing on Substack, they're doing video on Rumble, and there are other outlets, and they are very credible, fact-based investigative journalists who want to tell us what our government, and what foundation-based and corporate-based media, don't want us to know. So thank you for letting me say that. Thank you, both of you, very, very much. This was a super fascinating day.
And it concerns such an important issue. Can we thank Skye? Yes. Thank you. Thank you. And thank you to our video crew. Yes. Thank you for being here. Nice to see you. Nice to see you. Thank you. This was fun.