I'm Lizzie O'Leary. I'm the host of the Slate podcast What Next: TBD, where we talk about tech and power and the future. And this is the latest edition of Future Tense's My Favorite Movie, where we talk to policymakers, scientists, and thought leaders about the legacy of their favorite films, or at least the ones they picked for this occasion. Our guest tonight is Ellen Weintraub, a commissioner of the U.S. Federal Election Commission. She has served there since 2002, and she chaired the commission for the third time in 2019. Prior to her appointment, she was counsel to the political law group of the law firm Perkins Coie and counsel to the House Ethics Committee. And I guess I want to start, Ellen, by welcoming you. It's very nice to see you, even if we're doing this online. Well, thank you, Lizzie. It's a pleasure to be here. It would have been a little bit more fun in a movie theater, but it's nice to see you. I want to ask you why you picked The Social Network. What was behind that choice? Well, I was asked to pick a movie about technology and society, and a lot of those movies are a little bit dystopian, and I really didn't want to go in that direction. And of course, The Social Network has all that great snappy Aaron Sorkin dialogue, so you've got to love that. But really, it's because I think that Facebook has come to take on such an important role in our society and in politics. And since money and politics is what I do, I wanted some aspect of the movie that brought me back to what I think about all the time, which is the role of money in politics and of platforms that put out political messages in our society and in our elections. Is there anything that you noticed when you were rewatching this movie that you didn't see when you first saw it in the theater or at home, where you thought, oh, that's especially relevant now, or that's not how I think about Mark Zuckerberg now?
Well, of course, he was younger then. And I recognize that it's not a documentary, so yes, there may be some liberties that were taken. But I think that it's really interesting to realize, A, how recent Facebook is, and B, to focus again on its origins, on what they were trying to do at the beginning. And I think this part, even though it's not a documentary, is probably right: they wanted to create something cool. And indeed, at the beginning, Zuckerberg didn't want to have any ads on the platform at all, because he thought that would take away the cool factor. And now it's become all about the ads. It's become a platform whose algorithms and whole business model are built around keeping you on the platform, keeping you clicking away so that you can see more of their ads and they make more money. So I think it has really changed a lot from where it started. The other big difference, I think, well, maybe not a difference, but something that I think is worth commenting on, is that it started out as a place to form connections. And I think that it still thinks of itself as a place that promotes a more connected world. But it's got another side to it that we have seen in more recent years: it has become a platform where people also promote divisiveness and polarization, and hate speech shows up and spreads on Facebook, as well as on other social media platforms. But it is the biggest. So I think there's this whole other side to it that was not envisioned when they first started it, and The Social Network is the origin story. And it's something that I think they have failed to adequately grapple with. One of the reasons I like talking to you about this is because you've obviously spent a lot of time thinking about this, especially about Facebook and political ads. And you said this thing in January that stuck out to me.
You said that their plan not to limit targeting of political ads, and I'm quoting here, "suggests the company has no idea how seriously it is hurting democracy." And I wonder, I guess, what you think Facebook thinks it is doing? Well, that's a really good question. I don't think it is really immoral. It's a business, and it has a business model. It's trying to be big and powerful, and trying to be the platform that everybody goes to for all sorts of information. And it wants to be in the middle of things. It wants to be the essential place to go for information, I think. And I don't think it has grappled, as I said, with the polarization that's been promoted there, with the divisiveness. I mean, it was a platform for Russian interference in our election in 2016. And that took them totally by surprise. I mean, the Russians were buying ads in rubles and nobody noticed. They didn't have any system set up to check for that. They do now; they've got that piece now. But they were so unaware of how the platform could be abused that they didn't even have a system in place to check for that. It's interesting to me that you at the FEC are interested in this. When I think about the vast universe of regulatory bodies in Washington and all the different kinds of things that Facebook does, I guess I wonder, who should be overseeing them in D.C.? Well, I think there are a number of agencies that have a piece of this: the FEC, the Federal Election Commission; the Federal Trade Commission, the FTC; and also the Federal Communications Commission, the FCC. So if you don't pronounce things very precisely, you're going to mix up which alphabet soup you're referring to. But each of these agencies has some jurisdiction that overlaps with what Facebook does. Is it a good thing that it's splintered jurisdictionally among different agencies? Or would it be helpful to have one agency looking at things? Does that risk regulatory capture?
Well, right now the problem is not regulatory capture. Right now the problem is that it slips between the cracks of each of the three agencies and nobody captures it. And there's just not a lot of law. I mean, the Communications Act hasn't been updated since 1996. The federal election law has not been updated since 2002. And the last time the FEC wrote a regulation that was relevant to the internet was 2006. That is a lifetime ago on the internet. Our rules are out of date, and I think every agency's rules are, as they pertain to the internet. In some ways, it would be hard to keep up. You could write a rule, and probably somebody would do something the next day that would make it obsolete, because things move so quickly online. But to have rules that are a decade out of date, that were written at a time when it was an entirely different world online, when political advertising really didn't happen online very much, I think really all of our rules need updating, and legislation needs updating. And in fact, Facebook doesn't contest this. Facebook has endorsed the Honest Ads Act, for example, which would bring regulation of political advertising online into the same world as broadcast advertising. And I think that would be a very good thing. I want to just let viewers know that you're probably hearing sirens going past my apartment in New York. I am in Brooklyn. Unfortunately, we do get a lot of sirens right now. I wanted to ask you about misinformation and disinformation. You and I talked on the phone yesterday, and you made this distinction between things that are confusing people and things that are purposely set out to do that. Do you think Facebook has made strides in combating those things since, say, what we learned about the 2016 election? Have they made strides? Yes. Have they solved the problem? No. Now, just this week, they announced with great fanfare their new oversight board.
In some ways, Facebook operates like its own country. There are not a lot of laws; in fact, U.S. laws are more notable for what they exempt in their regulation of the internet than for what they actually cover. But Facebook seems to be adopting some of the trappings of its own little nation-state. So it set up its own version of a Supreme Court with this oversight board. And it has attracted some incredibly impressive people. So hats off to them for reaching out and finding some really top-notch people, and I have total respect for all those folks. However, 20 people to take on what Facebook says is a billion posts a day? They're just going to be nibbling at the edges. They're not going to be able to grapple with a very large amount of the information, misinformation, and disinformation that gets posted. It's my understanding that they're going to start by addressing the concerns of people about content that has been taken down. And then later, they are going to move on to looking at complaints about content that should be taken down, at least in some people's view. And that's where we are in the disinformation and misinformation category: the stuff that either should be taken down, or identified, or downgraded in the algorithms so that not as many people see it. And it doesn't sound like the board is even going to get to those issues until at least later this year. And as I said, they're only going to be able to address a small fraction of them. And it's also not clear how the board's decisions are going to interact with other policy decisions that Facebook has made.
So, for example, they've made a decision that they're not going to abandon micro-targeting of political ads, which I think is a problem, because it's like whispering in a million ears at the same time. You don't get the opportunity for counter-speech, for people to raise arguments on the other side, because people on the other side never get the same message; it is micro-targeted only to the people who are most likely to be susceptible to those arguments or to agree with them. And the distinction between misinformation and disinformation, generally, is that misinformation comes about when people are spreading information that's untrue but they don't realize it's untrue. Disinformation comes from malicious actors, and they are out there. There are foreign malicious actors. There are domestic malicious actors who are trying to spread disinformation. And this can be very damaging if it's about, for example, COVID-19 and its health risks. But I think there is going to be an inherent overlap even of health information with the election, because this is going to be the number one issue in the election. So if there is misinformation or disinformation about what is happening, what has happened, what the government has done, what the government hasn't done, and not just at the federal level, because many governors and even mayors have been involved in responses to this, that is going to be the number one election issue. So if there is bad information out there, it is going to bleed over into the electoral context. How do you at the FEC take a look at that, then? Because you're right, there is already misinformation, perhaps disinformation, about COVID-19 out there. Some things have been taken down. But we are moving closer and closer to whatever our election is going to look like in the fall, and who knows how we're going to vote at this point.
Do you envision mechanisms through which the FEC can say, hey, wait a minute, that's not okay? Well, we're not involved in taking down content; that's not what we do. Are you thinking about a policy framework, I guess? When I proposed that Facebook not do micro-targeting of its political ads, I proposed that as a policy that Facebook could adopt for itself, in an effort to correct some of the harms that it has already caused in our democracy, and not as a government regulation. I think, frankly, a government regulation of that sort could be very problematic under the First Amendment. So I'm not proposing it as legislation or as regulation. But I do think that Facebook has to take some responsibility for the content that is put out on its platform, because it doesn't function as a neutral pipeline. It has algorithms that promote some information and downgrade other information. And its whole business model is to keep people online, to keep them engaged. And what it seems to have discovered is that getting people riled up, making people angry, making people outraged, is the best way to keep them on the platform and keep them clicking away and looking at their ads. So that's what their algorithms promote. And I think that has really had a deleterious impact on our society and on public debate. People fall into these information silos where they're only getting information that reaffirms their own biases. And it really undermines our ability to come together and form consensus and figure out what our problems are and then how to approach them, because we can't agree on what the basic problems are. If there are people who are just living in a world where they believe as true things that are not true, then there's no way for us as a country to come together, to find common ground, and to solve the very difficult and challenging problems that we are facing. Do you think they listen to you? They say they do, but I'm not sure.
I'm not convinced. Well, I see employees of Facebook from time to time, often at conferences, back in the day when we had conferences where people showed up in person. And they assured me that they read my op-eds and that they were taking my point of view very seriously, but then the policies that they come out with don't seem to take my views into account very strongly. So I'm not sure if they're listening. If you could make one change now, get them to listen to one thing, would it be on political ads? Well, obviously political ads are my thing, right? So if there was anything they should listen to me about, I would think it would be political ads. But I also think that they really ought to think about the role that they are playing in our public debate and whether it is a constructive one. Because, as I said, I think they started out with a very high-minded goal of bringing people together and helping people form connections and find their community online, even if they were not living in a place where people in their philosophical or friend communities were right near them, so that they could find people they were in sync with in other places. But as I said, it has a very dark side to it and has also contributed to polarization and divisiveness, and in some parts of the world to much worse. It's been a platform for hate speech, and what went on with the Rohingya, and the messages that were being promoted about them on Facebook and on other social platforms, is really horrible. One of the things that I'm struck by, just thinking about this moment in time, is that it felt like pre-COVID, there was some bipartisan momentum in Washington to look at not just Facebook, but Amazon, Google, big tech in general, whether it's you all looking at political speech, whether it's the FCC, whether it's Josh Hawley in Congress.
And I wonder what happens now with all of that, with the investigations that were going forward and with that sort of energy. Is it lost? Might it come back at some point? Or are we in this moment where we're all treading water as a world and there's just no way to know? Well, I think there is no way to deny that we are in the middle of an international emergency that is infecting the entire world. I said infecting, which is interesting; I meant to say affecting the entire world. And that's going to be the number one priority, and it should be. But I do think that at a time like this, when people are spending even more time online and are missing out on in-person conversations and on, you know, hanging out around the water cooler and exchanging ideas there, it is more important than ever that the platforms not play a role in spreading misinformation and in pulling us further apart. And these issues are not going to go away. We may all be obsessed right now, rightfully so, with the health implications and the economic implications of COVID-19. But these issues are still there, and I don't think people are going to forget about them. So we may not be dealing with them this week, but we will deal with them at some point. What worries you about this election cycle? Well, the number one thing that worries me about this election cycle now is something that a few months ago I wasn't worried about at all, and that is whether people will be able to vote safely, securely, accessibly, and fairly; whether all of the states and localities that are administering elections are going to have the resources they need to provide people with expanded opportunities to vote, in time and in place, so they won't be crunched together. And let's face it, whether states want to encourage this or not, more and more people are going to be requesting absentee ballots. They are going to want the opportunity to vote from home and to do that safely.
And states and localities are going to have to ramp up their efforts to do this, because some parts of the country are very used to voting by mail and to having a predominantly vote-by-mail election; in other parts of the country, not so much. And they're going to need new equipment, and that equipment is going to have to be ordered now or they won't be able to get it in time. It won't be built; they won't be able to get enough envelopes in time; they won't be able to get the machinery they need in time. And this is expensive. The Brennan Center in New York has estimated that it is going to cost between two and four billion dollars for all of the states to get the equipment they need in order to conduct a safe and secure vote-by-mail operation, one that would have enough ballots available so that everybody who wanted to vote by mail could do so safely and securely. I don't think anybody wants to see a repeat of what happened in the Wisconsin election, where people didn't get their ballots in time and had to line up around the block, hundreds of people in line waiting for hours in their masks and their gloves, but still putting themselves at risk in order to cast their vote. And no one should be forced into that choice of having to risk their health and the health of their loved ones in order to cast their vote, which is our sacred civic responsibility and right. So that is my number one concern. There are bills in Congress that are unfortunately snarled up right now. They don't look like they're moving. Is there a role, or could there be a role, I guess, for some of these big tech platforms to play in disseminating that information to voters? Not necessarily voting online, but we have seen some kind of fascinating quasi-government partnerships over the past couple of months.
California has worked very closely with Google, for example, and Apple to get some of their data on the coronavirus. And I wonder, would there be a role for even a Facebook to be able to say, hey, voter in whatever state, here's how you do it? And is that something you all would even be comfortable with? Well, I would personally be very comfortable with that. And I think that actually a lot of the platforms, Facebook included, have tried to get information out in past elections about here's how you go about voting. And that kind of informational role, it would seem to me, would be a natural for them and would be really useful to states and localities, because one of the things they're going to have to do, particularly in parts of the country where vote by mail is not something a lot of voters have done before, is make sure everybody understands how it works. But if they really want to step up and do something for their democracy: at a time of economic upheaval, when many, many companies are hurting, the tech companies are doing great. They're making lots of money. And they have benefited from our democracy, from all the protections that our laws and our Constitution have given them. If they really want to step up, they could just avoid the snarl-up in Washington and say, we'll buy the equipment. We'll help the states and localities not only with information, but also with money. We will help ensure that they have what they need to make sure that everybody can vote. They've got the money. They could do it. There aren't a lot of folks who could, but they could. I want to ask you a somewhat personal question. You are not known for being shy with your opinions. What has that been like, as a woman in Washington who just says what she thinks? Well, you know, my favorite part of this job is that I don't really have a boss that I have to answer to. The only boss that I have is the American people.
And my only concern when I get up to do my job every day is what would be the best policies for the American people. And that is a really privileged place to be, to be able to be solely dedicated to the good of the country. And in particular, as a woman, for me not to take advantage of the platform that this gives me, and of the opportunity to speak out and to try to push policy in the direction that I think would be better for the country, would be a really missed opportunity. And so I try to take advantage of it. And I hope that I am a good role model for other young women out there, like yourself, and women throughout the country: that they should always take advantage of their platforms and speak up and express themselves. And, you know, nobody ever changed history by being shy and sitting in the corner, right? Fair enough. I have a couple of audience questions; I'm looking down at my phone to read them. This one I think is sort of interesting: Do you think the movie, and let's posit again that it is a movie, is a suitable indictment of how social media, or Facebook, has changed our lives? Well, I don't think the movie is an indictment of that. I think the movie emphasizes the more positive sides of how the platforms have changed people's lives. I mean, it's not a particularly flattering portrayal of Zuckerberg, but there's the notion of people making friends online and finding connections. And it started out in a college dorm room. It's not really surprising that, starting out in a college dorm room with the goal of helping college students connect with each other, maybe nobody thought through what the ramifications would be when it became an international information platform, a place where billions of people go to find the news and to learn about what's going on in their world.
So I think what's striking about watching the movie is how much things have changed and how the platform has changed in what it does. And yet there are some characters who were there at the beginning and are still very much a part of our collective understanding of things. Here's another question; I'm going to let you go with this one. Who is your favorite villain in the film? Is there one? And, I guess, is there someone you empathize with more than anybody else? You know, it's not a film that really has a lot of very sympathetic characters in it, right? I mean, in the movie you might feel bad for Eduardo, his friend whom he basically betrays, if you can trust the movie. But in real life, I think, yes, he had to sue Zuckerberg to do it, but he came out of that pretty well. And I believe that he's living a pretty good life as a result of his initial investment in Facebook. So things worked out okay for him. I guess we're supposed to not like the Winklevoss twins, but if you believe the way it's shown in the movie, it does kind of sound like they had the idea in the first place. And Zuckerberg thought he could do it better, and he may well have been right. I mean, if the Winklevoss twins had done their platform, for all we know it would have gone the way of MySpace, and we wouldn't be sitting here talking about it. Yes. And, you know, would the world be a better place? Does the world really need Facebook? I don't know. I mean, one aspect of Facebook that has troubled me, and this feeds into why I'm concerned about the micro-targeting, is the amount of information that it collects on every single user. And nobody really thinks about that when they're using the platform, nobody thinks about how, every time they're out there looking for information, they're sharing things and liking things and overlooking other things.
Facebook is just sucking up all that data about us and using it to sell ads to people, so that advertisers know they can find the folks who are most interested in their products. And, you know, if you're selling soap or shoes, maybe that's not so terrible. But when you're selling politics and politicians, it makes me a little bit nervous. I was struck when we talked on the phone last night by two things. Number one, that you use Facebook. I do. But not professionally; I don't have a professional social media account on Facebook. I use Twitter as my professional social media. But also you said, well, what if they tried a subscription model? What if they did something different from basing their profits on ad sales? Now, I have yet to see, in my history as a business reporter, any large company walking away from something that brings it a lot of money. But what would that even look like? Well, part of my concern about Facebook is that as it has grown, it has kind of gobbled up a lot of potential competitors. So we don't see a lot of companies trying to set themselves up in opposition to Facebook and trying to draw off its customer base with a different model. And I think that because they are so ubiquitous now, it would be very difficult for another company to come along and do that. But I know, and I know a number of my friends feel the same way, that I would be more than happy to pay a subscription to a company that would allow me to share vacation photos and see baby pictures with my friends, and not have the company constantly be sucking up data from me and trying to sell me things. I'd be willing to pay to avoid all of that. It is really quite an invasion of privacy, I think.
Just to get back to the oversight board, which we talked about a little bit: they announced 20 names this week, and I spent some time with Kate Klonick, a law professor who has really thought through what a Facebook Supreme Court would look like. You're a lawyer. Is there enough of an impetus, I guess, for the company to do what this framework says? And what would you have to see in order to say, oh, actually they're taking this thing seriously? Well, as I said, one concern is that no 20 people, or 40 people when it gets to its full size, no matter how brilliant, are going to be able to deal with the amount of information, the number of posts, that goes up on Facebook on a daily basis. So I think they're going to be overwhelmed pretty quickly, and it will be a hard job for them to select which issues they're going to grapple with. And we'll have to see how long it takes them to make decisions once they decide to pick up a particular issue. And then we'll have to see whether Zuckerberg is willing, as he says he will, to accept their views and adapt to them. But it's not clear to me, for example, what happens if there's a policy decision that Facebook has made and the board comes up with an individual decision that conflicts with that policy. One thing that jumps to mind: Facebook does a lot of fact-checking, but it has decided that it is not going to fact-check candidate ads. Suppose somebody challenges a candidate ad, and the board were to come to the conclusion that the ad was completely misleading and should be taken down. What is Zuckerberg going to do with that, should it come to pass? Is he going to say, oh, I'm now going to abandon the policy that I have so strongly advocated for, because the board tells me they have a different point of view? Or is he going to say, well, no, that's a policy decision, so it's out of the board's purview?
So that's one example that occurs to me because of what I do. But my sense is that this is not the only issue where there could be a real conflict between what Zuckerberg and the business entity have decided to do and what this oversight board might recommend. And it'll be really interesting to see how that gets negotiated out. Why do you think they're still so interested in political ads, given that, by all reporting, they are a very small slice of revenue? I think they want to be the go-to source for information of all kinds. I think whenever there is an important discussion, they want to be in the middle of it, and they want people to come to Facebook for information on it. And I don't think they want to give up that space, lest somebody else come along and become a competitor, or people just wouldn't feel like they needed Facebook for absolutely everything. And yet, at least in the U.S., as you mentioned when we were first talking about the movie, Facebook was conceived to be cool. And maybe it's not the place where the cool, edgy conversation is taking place now. Is the attempt to keep doing political ads an attempt to stay relevant and to stay cool? I guess that's what I'm trying to figure out. I don't know if political ads make you cool. Maybe not. I know that when I first joined Facebook, I joined because I had teenagers and they wanted to be on Facebook. And I wanted to know what they were doing on Facebook. I was concerned about what kind of information my children were sharing online, and with whom. So the condition of them getting Facebook accounts was that they had to friend their parents and allow us to see what they were posting. I'm sure they were delighted. Yes, well, they were. They had nothing to hide, so it was not a problem. So I joined in order to keep tabs on them, originally. But at this point, they don't think Facebook is cool anymore.
My now young adult children no longer think that Facebook is at all cool, and their generation doesn't really think that Facebook is cool. They use other platforms, some of which, like Instagram, are owned by Facebook. But the roles have actually switched: now I think they stay on Facebook to keep tabs on me, instead of me staying on Facebook to keep tabs on them. And it's not their favorite platform. And I think that Facebook has a real uncool problem with the younger generation. If you were, and this is maybe a fun thought exercise, maybe not, I don't know, if you were to direct a movie, if you got to do the Ellen Weintraub version of The Social Network, now 10 years later, what are we talking about? Is it a political thriller? Is it something very different? What would you focus on? Well, for one thing, there would be a lot more women in the cast. Not a lot of women in this movie. Not a lot of women in this movie, and likely for a reason. I mean, the tech community is well known for not being a particularly woman-friendly place, particularly at the beginning, but I think even today. So that's where I'd start: I'd bring in more women. One of the things that we've seen around the world recently is that some of the countries that are dealing best with our health crisis are the ones that are run by women. So maybe we should have more women running tech companies. In my version, it would be a woman who creates the big hot new platform. And I think it'd be really interesting to see how a woman-centered platform would be different from the way the boys put theirs together. Well, that is an interesting thing about the oversight board: it seems, at least so far, that they've got a lot of women, they've got people of color, and people from all over the world; they actually did take that piece of feedback. Yes, from all over the world.
And this is, again, something that I think is interesting: how we might view something in the US, maybe "gee, should there be a government regulation here," looks very different when you are in a country with, say, a hostile government, or any government that has a history of obscuring and suppressing speech. And I guess I wonder, do you think that what you see from the oversight board in that respect is progress? Well, as I said, I think they got a terrific group of people together on their oversight board. I really can't fault them at all on that. But I do think that you raise an important point: when we look at disinformation and misinformation here in the United States, we have to be careful not to generalize what happens here across the globe, because different countries need different regulatory environments. And one very troubling development that we have seen in the last few months is authoritarian governments taking advantage of the health crisis to consolidate power and, while claiming to be going after disinformation, really going after dissent and trying to quash it. And we do not want to go down that road. It's a very dangerous place to be. And as concerned as I am about the propagation of disinformation, particularly when it comes from abroad and tries to affect our elections, I'm equally concerned with the risk to dissent and free speech and people's rights across the globe to speak truth to power and to criticize their government if they don't think it's doing a good job on COVID-19 or on anything else. This is one of those proverbial thought exercise questions. So let's say that you got a chance to sit down to dinner, or in this case, not dinner, a socially distanced Zoom or what have you, with Mark Zuckerberg. Where would you start?
You know, I think one of Facebook's problems is that it's not a very humble place. And I don't know how to inculcate a sense of humility, but I think they could really use that. I think that there are all these whiz kids, starting with Zuckerberg, who have become very successful and very rich, and they think they know it all. And I bet that you, like me, have been in plenty of rooms where there was some guy who just thought he was the smartest guy in the room and didn't have to listen to anybody else, right? Right? No, I bet you've been there. I know I have. And I was reading in the Wall Street Journal that Zuckerberg just recently held a big meeting at his palatial estate in Hawaii, which sounds really nice. And he came back from there, and people thought that he was moving towards a more inclusive model. And instead, there was upheaval on the board of directors: people who were critical of him left, and more allies seemed to join the board. So there's this interesting dichotomy between what's going on on the board of directors as opposed to what's going on on this oversight board, where they legitimately have people who have been very critical of Facebook. Not everybody on that board, but some of them have been. And Zuckerberg is quoted in the Wall Street Journal as saying that he thinks his problem is that he's been worried about offending people, and he wants to be clear and direct with people; he wants to be understood, and he doesn't need to be liked. And if he thinks his problem is that he's been too polite and mealy-mouthed so far, I think he really doesn't understand how people feel about Facebook. All right, you can get to all of this in the sequel. The Ellen Weintraub-directed sequel. Yes, with the new woman-centered platform. Fair enough. We are pretty much running out of time. So I am just going to give you a chance.
Any final thoughts, anything I didn't ask you that you think is important for people who are watching this to know? Well, I guess I would say: be very careful about what information you are obtaining and sharing on the Internet. I do think that Facebook has done a better job, not a perfect job, but a better job than it did before, at trying to get some information out there as to where the information is coming from, to help people find out more about its sources. And people really need to do that, particularly when you read something that just pushes all your buttons. Stop and take a breath before you share it, and don't just assume it's true because you want it to be true. Make sure that it's coming from a reputable place. Not everything that we want to be true is in fact true, and truth matters. And you know what else really matters? Voting. So my final word to everybody is: figure out a safe way to do it. But please, please do your civic duty and help your country be more responsive to you by taking part in it, whether that's getting out there and voting or staying home and voting. But do vote. Ellen, thank you so much for talking with me. It's been a pleasure. I'm just going to thank everybody who helped put this together. So that's Future Tense, which is the partnership between Slate, Arizona State, and New America. And just a reminder that we have these social distancing socials, that's what these are called, every Tuesday and Thursday; you can find them on the New America events page and at Slate Live. And thank you very much, Ellen. Well, thank you. And I hope someday that you and I can actually go to the movies together in person. Okay. Bye. Bye.