Hello, everybody, and welcome. It is 1:01 PM and time for the very first episode of Vision, a show peering into the trends, ideas, and disruptions affecting the future of our democracy. My name is Sam Gill. I'm the chief program officer and senior vice president at the Knight Foundation. And I am delighted to be guiding you on this incredible journey, during an incredible and unprecedented time, into some of the most vexing issues. Over the next few weeks, we're going to be focusing on a very particular disruption, and that is the infodemic that has been accompanying the coronavirus pandemic. And we're going to be talking to a number of experts, thinkers, and innovators about how we can, to use the vernacular of the moment, flatten the curve on this infodemic. I am delighted to introduce our very first guest for our first episode, Jevin West. Jevin is an associate professor in the Information School at the University of Washington. He has been all over the news fighting misinformation about COVID-19. Wired called him one of the professors calling BS. This is an FCC-approved show, so we'll be calling BS on COVID misinfo. I am absolutely delighted to welcome Jevin West from the University of Washington. Jevin, great to see you. How are you? Thanks, Sam. I mean, what an amazing honor to be the first guest on this show. I think you are muted. Oh, am I muted? Because I can hear my... This is a classic episode one of a COVID-era webcast. No, folks can hear you. OK, good. All right, this is a true first episode, as you say. All right, I can hear you now. OK, perfect. Episode one foible. Well, it's very good to have you here. 265 people with us live right now. I assume dozens of them are digital propagandists from Russia and China, eager to see their work celebrated. And we will not disappoint them. Jevin, I actually don't want to start by talking about COVID, because I think in so many ways, you were really built for this moment.
This is a question that you've been thinking about for a long time. Tell us a little bit about the work that you've been doing on the type of issue that we're dealing with, which is the way in which misinformation or misinterpretation of scientific and technical information is so easy in our era. Yeah, I mean, I'll start with some of my colleagues, just because I think this is a team effort for sure. And when I look to my colleagues in our new center here, which you supported, and we're very, very thankful for that; the Knight Foundation has made this happen. When I look at Kate Starbird or Emma Spiro, who are colleagues that sit next to me virtually and also physically in office space, they've been thinking about this for over a decade. Not just misinformation, but misinformation during crisis events. There's something special about crisis events that really allows misinformation and disinformation to explode. So they've been thinking about this basically their entire careers, and I get to sit next to them and draw on all the work they've done that set them up for this particular crisis. And then me personally: my colleague Carl Bergstrom and I have been thinking a lot about the new world in which we live, both in our professional lives in science, which is the one thing I care a lot about, and also in our personal lives, with information flooding into our brains at rates that are just way too much for one human brain. We wanted to figure out if we could teach the public, and ourselves actually, how to become better information consumers when everything is coming at us in many different forms. It comes in the form of data; of course, people all talk about fake news and all the different ways in which media can be distorted. But it's generally just trying to improve all of us as information consumers. I guess one thing I wanna ask, though, is: what is it that's changed?
I mean, technical information has existed for a long time. For a long time it was probably scrutinized more closely, in a less democratic setting, in the hands of experts like you. But is what's changed something about the kinds of information decisions we're making, or is what's changed the means of dissemination? Is this an information problem or an internet problem, the problem of BS? You know, I think it's both. And so let me give you a specific example. There's a lot of talk about synthetic media, deep fakes. And I've been talking to students and people in our research group about this because we're gonna have a workshop on it pretty soon. A lot of people that have never heard of deep fakes assume that's a new thing. It's actually not a new thing. A lot of the technology and the ability to manipulate videos and audio files and any form of information has been around. If you spent enough time, you know, in the early 2000s, you could have created some of the deep fakes that we're seeing. It just took a long time. The difference is that it's almost been commoditized. We found out literally yesterday that it costs about $2 an hour to pay someone to create a deep fake for you, and it takes them about five or six hours. Just for our audience here: a deep fake is a video that is doctored so that it looks like, say, Barack Obama is saying whatever you want him to say, but it's been created by someone in a laboratory, effectively, on a computer. That's right. Thanks for the clarification, Sam. And actually, we're gonna do a survey to see how many people are familiar with those, because I think it will be an issue in the election. But with that particular example, we've had the technology. It's been around; it's just that it's now cheaper to put together, almost anyone can put it together, and the actual seamlessness of the videos and the audio files is better.
And so I see that more broadly in the information landscape. We've had propagandists, we've had opportunists, we've had people that were running what we would now call disinformation campaigns; that's been around forever too. What's changed is the volume of information, the velocity. I'm almost sounding like I'm giving a pitch deck for a Silicon Valley company, but it's the speed at which things are changing, and also the ways in which we can automate misinformation. In the time that we're gonna have this discussion, I could write a bunch of scripts and deploy, you know, a thousand bots to say whatever I want. I mean, not quite; it would take a little longer than that. But that's... I wanna come back to that idea, because you joked for a minute that you're almost giving a pitch deck, and I think an important question here is: is the technology actually working the way it's supposed to? And is the problem that in enabling a high-scale, high-volume network with no friction, it's made us more vulnerable? I wanna come back to that question. Also, at those rates, I feel like I could be using this technology to better effect. You're a great guest, but I'm just thinking, what kinds of guests could we have? That's right, we could afford those rates. We've invested most of it in the microphone. But before we go to the next question, I do just wanna let our audience know we've now got over 300 people, which is fantastic. You can send us questions. Use the Q&A in your Zoom panel. You can tweet with the hashtag #KnightLive. You can use Facebook Live. We've got people who are watching the questions; they'll be sending them to me, and we'll make sure to get some of your questions answered as well. So, Jevin, we've been talking big picture. Now let's talk about COVID.
You know, what is the infodemic that we're hearing so much about in relation to COVID-19? What is it, and where is it coming from? Well, to link to the last thing that we were talking about, I don't think the infodemic is anything new with COVID. We've seen this kind of situation with other crisis events, but of course the volume of chatter online, and just everywhere in the world, is much higher for COVID. So the amount of misinformation being created is a big enough deal that an international organization like the World Health Organization says: whoa, whoa, whoa. As damaging as the virus itself can be, the misinformation around it can also be quite damaging. And so we need to really invest a lot of effort into countering this, working with the tech world, working with researchers, working with governments, because it's to the point where it's affecting people's health behavior. I mean, there is the example of the person that drank fish tank cleaner because they thought hydroxychloroquine was an effective treatment. So people are changing their behavior based on things they're getting, and it's also affecting public policy. I don't wanna overuse the metaphor by equating misinformation with viruses or infections, but it is something that can be almost as damaging as the biological virus itself. So let's break it down a little bit. What are some of the most common categories of misinformation that you're seeing in relation to COVID, and how do they appear? Most of us probably think we aren't running into this; too many of us clearly are. So what might I see on Facebook or on Twitter? What are some of the common things that are coming up? Well, first of all, I want to note this.
I think most of the rumors that start, the ones that we track from their origin all the way to cascades on Twitter and Facebook, are generally honest-to-goodness ideas that people wanna share with their friends and family. People's health is at stake, and so they care a lot about it. But there are those kinds of conspiracies, from 5G weakening your immune system and being at the root of the coronavirus, to COVID-19 being a bioweapon engineered in various labs, to the many elixirs that have been promised as fixes. When we track this at the population level, at the level of 500 million tweets, we see common patterns in prescriptive advice on what to do and in claims about the origins of the virus. I think the ones that are probably the most damaging are the ones that claim to be the new treatment for COVID. Those are the ones that scare me the most. So what are you doing about it? I, like I hope many of our viewers (we sent it out to everyone), read the Wired article and thought, I know that guy. But can one person really start to rein in this spread of misinformation? We can talk about what we need to do in the future, but what do we need to do right now to limit what I hear you describing: the potential for actual fatalities, maybe not as the main cause, but as an indirect cause, from people getting incorrect information that leads to bad choices? For sure, and there's no way that I or my colleagues can do all that. My colleague Carl Bergstrom has really been on the front line, because he spent 10 years of his life studying infectious diseases, and now he's spent the last five or ten years also thinking about misinformation. So I look to my colleagues as people who truly are on the front lines. But what can we do, everyone on this call, all 323 people on the call right now?
I think the best thing that we can do is be good information consumers ourselves: do a little bit of sourcing, be that fact checker, be that journalist, talk to our librarians and our teachers. In our center, and it's a big team effort, there are so many people, and I wish I had time to talk about all the great things they're doing, I think what we can do is provide resources for those who are truly on the front lines, whether it's the medical officials talking about the virus itself, or teachers, librarians, and journalists. We can throw laws at it, and we can even throw some automation and technology at it, but I just think the best thing we can do is be a little more careful at this time. Also, and it sounds so simplistic, it's trying to slow down a little. We're in this fast lane now, and science is being pulled into the fast lane too, and science is not comfortable in the social media fast lane, because it makes a lot of mistakes that in turn can affect... But regular people never make mistakes in the social media fast lane. I appreciate you drawing that distinction. Let me push back on you, though. In a general sense, I'm sympathetic to the idea that we need to be smart, reasonable information consumers, and that maybe requires some new skills in the 21st century. But I come from the school that says a really effective regulatory regime has good laws, good private-actor participation, and good behavior. Auto safety is a good example of this. You've got speed limits and some basic rules about safety. You've got auto manufacturers making safety equipment available. And then you and I need to put our seatbelts on, and we need to get trained, and we need to commit to following the rules.
So in a general sense I'm sympathetic, but I kind of wanna play devil's advocate, because earlier you pointed out that a critical source of misinformation here is well-intentioned people asking questions that turn into assertions. And they're asking those questions on technology platforms that, as you jokingly pointed out, were basically pitched on the idea that they take an enormous amount of friction out of communication, that they allow an idea to spread to a lot of people really quickly, to build communities of affinity around people who are like-minded, who might have the same vulnerabilities, the same fears, the same aspirations. I think it's interesting that treatments are a huge category of misinformation. That's a hopeful category of misinformation: you're trying to get better, you're trying to protect yourself and your family. So is the lesson here that we need to become more social media literate, or is the lesson here that there is a limit to what individual media literacy is gonna do in a time of crisis? That's a good pushback, because sometimes I am a little Pollyannaish about what media literacy can do. It's still the thing I'm putting so much of my effort towards, but there may be a limit to what media literacy can do, because we're all vulnerable. As we say in the center, people are vulnerable from K through 99. And that may just be a limit of human cognition. Given that limit, maybe the only thing we can do is change some infrastructure, and maybe we can change some norms. You used seatbelts as an example. There was a time, I remember when I was a kid, when we didn't wear seatbelts. I was sitting in the back of a Volkswagen Bug, back when you could actually sit in the trunk, and you didn't wear a seatbelt. But pretty soon it became almost embarrassing
if you sat in someone's car and you didn't put your seatbelt on. And so maybe we can get to the point where it's embarrassing, I don't wanna go so far as to say shameful, but socially awkward, to brag about sending something that you didn't vet. I mean, even thinking bigger: we have these international groups that get together and agree on things that just can't be done in wars. Well, maybe purposely disinforming entire societies should be treated as something, maybe not at the level of biological warfare, but getting up there. Well, that's an interesting idea. I think I hear you referring to some of the signs we're seeing that state actors, if not originating some of this content, are amplifying it. And am I correct in hearing you say that we should think about that as potentially being conduct that violates international norms? Yes, I would love to see that. And I know that some will say, well, we can't even get international leaders to talk about things like nuclear proliferation and biological warfare; how are we gonna get them to talk about this? But I think it is up there in terms of the damage that can be done to people and to societies. So, in my perfect world, I would say: when things calm down, let's have a discussion among leaders about disinformation, the purposeful kind, misinformation that's intended to deceive, to make people confused and have no trust in any system except their own brain. Now, I wanna ask you another regulatory question at this nexus, but about a more direct regulator. We're getting a lot of questions about whether the big platforms are doing better than in the past at directing people toward quality information.
And I wanna actually put a bit of a spin on what I think is a really important question, which is: directing toward what? We may enter the COVID-19 crisis totally unprepared for what a truly global pandemic looks like in an interconnected society, but we don't enter it bereft of institutions. We've got institutions that we've taken generations to build in fields like public health, in journalism, in fields meant to inform people, in basic government and services. And so many of these companies have struggled with this question of editorial influence. So one version of the question is: are they doing a better job of directing people to good information? And does that actually mean directing people toward institutions that properly have authority over information? That's one question. But the second question is: what counts as an effective information institution in this era? And there's a really good example of this that I know you've thought about. A tech executive wrote an article on Medium called The Hammer and the Dance that I think now has something like 40 million views. And there's a lot of debate about this. Some people have said he's done a really good job of curating information from authoritative sources and making it accessible. I was one of the people who received the article, and it was an eye-opener for me at a moment when we were just getting our minds around this. And then other people have said, well, wait a minute, a really smart guy with an MBA is not an information intermediary in a public health crisis, and that's not the kind of democratization of information we want. We've built institutions.
We should be directing people to what the local public health authority is saying, to what the governor is saying, to what the CDC is saying. That is sensible editorial judgment in a world in which we've built institutions. How do you think about that regulatory question? Where should these companies be directing people, and are they doing a better job than they have in the past? Oh, it's a perfect question, because we've been looking into this just over the last several weeks with my colleague Franzi and her student Christine. This question of whether tech companies are doing enough right now, what they're doing, and whether it's being effective is something I'm super interested in. A lot of people are concerned that one of the victims of COVID-19 will be privacy, and various other things that we care about as well. But one of the things that we've seen consistently across the platforms, and they may even be doing it in concert, possibly, is that they provide these banners: if you're searching for coronavirus, boom, go to the World Health Organization, go to the CDC. This is brand new. And I do think it's a good step in the right direction, pushing individuals to these institutions, not requiring them. It just says, hey, by the way, if you're searching for this, here are some good sources of information. Now, before COVID-19, none of these big tech companies were really doing this. There were actually a couple; I think Pinterest, interestingly, was doing some interesting work in this space. But the idea that you would start directing users to institutions is relatively new. Whether it stays after COVID is an interesting question. But we looked at this; we did an initial survey, which is going into peer review actually next week, and we've written a Medium post about it, where we looked at the banners and wanted to know whether they actually change behavior. Long story short: mixed results.
Some people actually do find it useful and not offensive. Some people say it didn't do anything for them. So there's a lot more work to be done. So that's interesting all by itself. Are some people offended by that? Yes, yes, yes. You can look at some of the responses, and some say, you know, I can figure this out myself. I don't need that, so I'm not going to change my behavior. I don't need the platforms doing this for me. How do you think about that? Because again, it's so easy to talk about other people. I think I'm a reasonably educated person. The Tomas Pueyo article was really influential for me. I did take the extra step, after slogging through it, of saying, I've never heard of this person, and that probably is not a step everyone takes. But I read the article, and I'm sure I quoted the conventional wisdom out of it. Where's the paternalism line? What do we make of people who say, I can figure this out on my own? Were those people saying the same thing when they were reading a newspaper article 20 years ago? Is there something different about our sense of independence about information on technology, versus sources we used to think of as authoritative? That's an interesting question you're bringing up. I hadn't really thought about it in that way: over time, in this new information environment in which we live, are there more attitudes of independence? Because we're all sifting through more. We're the filterers of information more than we were 20 or 30 years ago, because it's our friends doing the filtering, it's algorithms to some degree, and it's us doing a lot of the filtering. So we've been put in this new position, and does it make us more confident, less confident? It's an interesting thing. But let me give you a specific example of where this is playing out in public health locally.
So I had some roommates in college that ended up becoming dentists, and so, for whatever reason, I'm tangentially involved in their conversations. Compared to them, you're leading the most exciting life possible. That's true. No offense to dentists; I love my dentist. I see my dentist every six months without fail. Right. But right before we went into these major lockdowns, dentists, at least in the state of Washington, were one of the last groups to have to shut down. And they were passing around, in some of their email conversations, a post about how bad the spread could possibly be. And this post had received, like the Medium post you mentioned, probably millions of views. So I looked into it, and it turns out it wasn't an epidemiologist. It wasn't a public health official. It was a marketer in Silicon Valley who knew something about data and so could do some work with modeling. We're seeing a lot of this, what we call armchair epidemiology. And this armchair epidemiology... oh, I'm hearing some background noise here. Are you getting that, Sam? Yeah, I'm getting a little background noise. That's a ghost in the machine. Okay, well, we'll keep talking and hopefully it'll go away. So anyway, the idea is that there are a lot of people who are now adjacent experts, even though they're not truly epidemiologists. And so we're looking into that at a large scale. Emma Spiro, Kate Starbird, and I are looking at the role that these less institutional actors, individuals who weren't in the traditional institutions for doing this kind of work, are playing in pushing conversations. So let me... some really interesting questions are coming through.
I've got a few more minutes left. One question that just does not get asked enough is: are there any populations that are either being acutely targeted by misinformation and disinformation, or that are uniquely vulnerable? And actually, I think one of the important insights of the last few years, as we've gotten beyond the kind of outrage and anxiety over the state-sponsored misinformation we saw in 2016, is a much more nuanced view of the way that African-American and Black communities in particular were targeted through some of those techniques. Are there any signs, in the COVID infodemic, of similarly segmented vulnerabilities? It's a really good question. I don't consider myself an expert on that particular area, but I have colleagues who do some of this work; again, I keep referring to my colleagues. Kate Starbird has done some really interesting work, for example, on the Black Lives Matter movement online. There was the Blue Lives Matter movement and the Black Lives Matter movement, both active online. And one of the things that she found is that many of these foreign adversary accounts, the ones that were labeled by Twitter and released to the research community, appeared almost equally in both groups. And so whether it's happening on the COVID side, I think, remains to be seen, but it wouldn't surprise me if it were the case. One area in which we are seeing this is around some of the anti-vaccination groups that are online. We just hired a postdoc who is gonna be joining us, Coco from the University of Texas at Austin; she just finished her PhD. She's spent a lot of time working in these anti-vaccination communities, and we've had some tangential conversations over email and in various other forms.
And I think there's a likelihood that there are groups there that are being targeted, but I think it's a great question. I hope others are investigating it. And like I said, it wouldn't surprise me if it's the case. And what about, we're getting some requests for this, maybe a few top resources for folks who do wanna know where they can turn, to see who's tracking misinformation, to get high-quality curriculum on media literacy? I hear you may have developed some materials on this, but can you give folks maybe three resources to consider? And we'll send them out after the show, folks. Absolutely. So first of all, I would say the Knight Foundation network that you're forming, that you have formed, Sam, from all the centers and other projects across the country doing this work. They're really digging deep, and they're the leaders on misinformation research and media literacy. We have some real experts in Washington. Mike Caulfield is someone that I would turn to if you're, let's say, a teacher or a librarian looking for media literacy resources. And of course our center, the Center for an Informed Public at the University of Washington, is gonna be putting up a page, hopefully within the next couple of weeks, probably more like a month if I say a couple of weeks, with a bunch of resources. You can also contact me personally. If there's a specific question you have about media literacy or anything in this space, I can at least point you to the right expert, maybe the world expert, because this network is smaller than you think right now, and I'd be happy to direct any of those people with those questions. And yeah, you're doing a misinformation day for librarians, educators, and others, virtually, right? Can you quickly give people the date and where to go to sign up?
Yeah, so the idea is something we started a year or so ago, where we bring hundreds of high school students from rural and urban areas to campus. We spend an entire day, and make it fun of course with cookies and treats, getting students to focus on media literacy, data reasoning, and misinformation in general. And we bring in the high school teachers and librarians too. We can't do that now, so we're doing it virtually. On May 26th, we're gonna do a session for librarians and teachers that they can take to their students. And then when we go back to the in-person world, next year we're gonna do one with hopefully 500 to 1,000 students on our campus, and many other campuses are gonna join us. Where do people sign up for that? They can go to our website at cip.uw.edu, and you'll find contact information there. Again, that's cip.uw.edu, and you'll find the resources. You can also just contact me directly and I can set you up, and you can join us in that conversation if you want. We'll also help you do this on your own campus or in your area; we'll give you all the template materials to run it as well. The Russian cyber intelligence agencies have a misinformation day every day, actually. You got it; they need a new brand. So last question, last question before we let you go and let some other folks go. This is not the first misinformation campaign. It's not the first digital misinformation crisis; it's not really a campaign, it's a series of campaigns. What's the next infodemic? Oh, that's such a good question, Sam. I would say the elections are a big one. The census was one we were worried about until COVID came along, and it's certainly still a worry, but I don't know if it's at the same scale. I think we're gonna see very soon plans that have probably been put in place, or been designed, for several years now to attack our elections to some degree.
And this is not a partisan issue. When we look online, we actually see it on both sides of the political spectrum. But this is not just about our elections either. We've been tracking some of these things internationally, and we find that deep fakes all of a sudden arise two days before an election. Brazil was an example of this, in the São Paulo mayoral race. So I think elections will definitely be a big one. And anytime we have one of these global events... this will certainly not be the last one. I think this is endemic; I think this is the new reality. You and I have talked about this, and my colleague Paul Chung and I have talked about this: if our resilience paradigm for the future doesn't include information resilience, I think we've missed the reality, because the genie is obviously not going back in the bottle, and we don't want it to. We're having this conversation today because this technology is incredible and accessible and brings people together from across the world. You're coming to us from what seems to be a very authentic international beach background, and that would only be made possible by the internet. So before we let people go, I wanna tell them about other places they can see you, because you are everywhere. First of all, for night owls: tonight at 7 p.m. Pacific, Jevin will be on a special panel on climate change misinformation, if you wanna learn more about that "Chinese hoax." And I looked it up; the easiest place to sign up for this is on Twitter at climatetap, climate TAP, and you can register for tickets. The center that Jevin leads, the University of Washington Center for an Informed Public, will be doing a series of webinars and town halls on misinformation. You can go to cip.uw.edu to see the lineup.
And on August 4th, Jevin is out with a new book. I'm not gonna say the full title, because again, this is an FCC-approved webinar; it's called Calling BS: The Art of Skepticism in a Data-Driven World. You can pre-order it now on Amazon. It was $13 on Kindle a week ago, but post-COVID it's $10,000 to buy it. I want you to know they're investigating that. We are also not done talking about the infodemic here at the Knight Foundation. Next week, Thursday, April 30th, at 1 p.m., we'll have Joan Donovan from Harvard, one of the most incisive voices on this problem. On May 7th, we'll have Safiya Noble from UCLA, a wonderful opportunity to go deeper on the question of how different communities, and particularly marginalized communities, are affected differently by the infodemic. Then on May 14th we'll have Renée DiResta from Stanford, one of the leading experts on the way state actors are driving misinformation around COVID-19. So many of the topics that we touched on today we're gonna talk about more deeply. The recording of this episode is going to be available tomorrow; you just have to go to kf.org/vision. You can email us at vision@kf.org. We would love your thoughts and your complaints. We'd love to see what you think is misinformation and what can be learned from it. Send us tweets, examples, articles, pictures, videos. You can also visit us on Instagram at vision.kf. Lastly, please stay on for 30 seconds and take a quick online poll. It's just two questions, and we really want your feedback. Before you take the poll, think about the part of this you enjoyed the most, and then tell us what you thought of the show. Other than that, really, Jevin, thank you so much for joining us, and everyone out there, please stay safe and please be well. Thanks so much, Sam, and thanks for all your efforts more generally on this problem. I really appreciate it. Thanks, Jevin. Thanks, everyone. Have a great day.