Hi, everyone. I think we can get going. So, it's like a classroom again. I'm Anne-Marie Slaughter. I'm the CEO of New America. And welcome, all of you in the room and all of you online, to our event, Section 230 and the Public Interest: Proceed with Caution. New America is delighted to host this with our partner, the Wikimedia Foundation. And we are really happy to have all of you here for a very important discussion. We're going to be looking at Section 230 of the Communications Decency Act, known in these circles just as Section 230, and particularly its impact on the public interest internet and how potential reforms to Section 230 will affect the public interest internet. The Open Technology Institute works at the intersection of technology and policy to ensure that every community has equitable access to digital technology and to its benefits. We take a multidisciplinary approach to promoting universal access to an open, secure, and interoperable internet. OTI is part of New America's new technology and democracy cluster, a group of our tech programs that are together committed to developing and governing technological tools in ways that serve the public interest and democracy and reduce inequality. And at the heart of all of that is corporate accountability. Section 230, passed in 1996 courtesy in part of our keynote speaker, Senator Wyden, was really passed as a foundational law to help the internet grow. It is a critical liability protection that allows any internet service the freedom to choose whether and how to deal with user-generated content, all that content that many of us put on the internet. That allows sites big and small to moderate content, to develop ways of moderating content, without fear of litigation. There's a heated debate, as all of you know, about Section 230. Many insist that big tech companies use Section 230 to shield themselves from liability for lots and lots of harmful content. 
As you might guess from "Proceed with Caution," both OTI and New America and the Wikimedia Foundation counsel caution in experimenting with Section 230 and in thinking through all of the unintended consequences and potential downsides that may be less evident. So, in particular today, we're going to be looking at the impact of Section 230 on the public interest internet: on everything from Wikipedia, which I still check many times a week without question, to local libraries and small public interest organizations, and the ways in which all of us benefit from Section 230 or are impacted by Section 230. So, it gives me great pleasure with that introduction to introduce Senator Ron Wyden of Oregon as our keynote speaker, one of the authors of Section 230 along with former Representative Chris Cox of California. Senator Wyden is a senior member of the Senate Select Committee on Intelligence and the top Democrat on the Senate Finance Committee, two very important areas of the Senate. And he's very well known for his work advocating for a free and open internet, in addition to advocating for user data and privacy protections. Also joining us today is Ashley Gold, who is going to moderate the Q&A part of the conversation. Ashley Gold is a technology policy reporter with Axios, and she has experience reporting for The Information, Politico Pro, and the BBC's Washington bureau. So, thanks to all of you for coming, and please join me in welcoming Senator Wyden. Thank you very much, and it's great to be with all of you, and I'm going to make this a filibuster-free zone, okay? And we're going to talk a bit about 230, obviously, and let me start it this way. More than a quarter century of empowering Americans to make their own choices online and take responsibility for those choices: that is what Section 230 is about in a sentence. 
And over the years, there have been opinion articles written that have said that our law, for example, created a trillion dollars' worth of wealth in the private economy. I'll let people debate that, but what I'll tell you is what I'm most proud of. What I'm most proud of is what Wikimedia has done with Section 230. That's what I cite when people want to know what 230 is all about. So, first of all, what is 230? A lot of people, I think, sometimes think it's some kind of fuel substitute or something like that; they're going to order some 230. It's a law that Chris Cox and I wrote in 1996. And one core part of the law, the so-called 26 words that created the internet, really stands for one simple proposition: the individual who created a piece of content online is the person responsible for it. And back then, when we were having a big debate about the future of the internet, I said, let's get down to this personal responsibility concept, and that's what it's about. Another part of the law broadens the First Amendment's protections and allows websites to take down posts that they don't want. Here we're talking about stuff like hate speech, violent content, that kind of thing. And you can take it down and elevate other posts. Together these provisions allow online services to host and moderate content without fear of being under a deluge of lawsuits. Unfortunately, a lot of the debate is basically on whether this is just a big windfall for big social media companies. I want to emphasize that our goal was not anything like big-guy protection. It was about users. We wanted to make sure, first, that they could speak online and access interesting content. And second, the startups and small sites that want to compete with the incumbents, whether that's going up against Big Cable or Big Tech, everybody from Wikipedia to public library services and knitting message boards: we wanted them to be in a position to be competitive. 
So for all of us users, Section 230 is what allows sites to host controversial speech. That is the speech that we really care about, the speech that propels progress and supports democracy. Platforms could make a lot of money posting inoffensive clickbait that lines their pockets with revenue while forgetting about the kind of speech that challenges power and uncovers the truth. The controversial speech is essential. We know all too well that individuals and corporations with power are happy to use legal structures and legal systems to silence whistleblowers, dissenters, and activists. Without 230, platforms would be happy to get rid of important speech; it's just not that important to their bottom line. For example, look at how Republican politicians and states across the country are trying to shut down online conversations about abortion and reproductive health. Section 230 is the first line of defense against those repressive laws. Or look at how they're trying to shut down access to gender-affirming care and access to information about gender identity, including by banning books and teaching on the subject. Section 230 helps ensure that critical information gets online and is accessible for those who need it most. And look at the Me Too movement or the police accountability movements in 2020. I don't think any Me Too post accusing powerful people of wrongdoing would even be allowed on a moderated platform without Section 230. Nor would Black journalists have been able to use Twitter to call out their own management on coverage of police violence. It's been clear for several years that MAGA Republicans and their Supreme Court justices, Alito and Thomas, want to take a sledgehammer to 230. And it's not to help consumers or create a healthier online environment. They just want to get rid of 230 to force companies to carry harmful content, because that promotes their political agenda. 
And to obscure their motivations, they falsely claim that getting rid of 230 would solve everything from the opioid epidemic to sex trafficking to bias against conservatives online. My comment? They ought to be careful what they wish for. When Congress passed SESTA-FOSTA over my objection, we all said that sex trafficking was a horrible scourge. I don't take a backseat to anybody in talking about how evil these people are. They came out with this laudable goal of stopping sex trafficking online. And I went to the floor of the Senate and I said, it's not going to work, and it's going to cause a lot of collateral damage. Five years later, that misguided law has ended up doing nothing, nothing, to protect victims or bring sex traffickers to justice. When was the last time you saw a politician hold a news conference to talk about SESTA-FOSTA? I can't find any of them. Instead, what SESTA-FOSTA did is drive sex work to the dark web and dark alleys and, by all accounts, increase violence against those individuals. Meanwhile, the threat of lawsuits has led sites to take down content that has nothing to do with sex trafficking. So I would just tell you, if you want a preview of a world without 230, it is SESTA-FOSTA. Stopping online conversations won't solve the problems politicians claim they will, but without 230 and the First Amendment, it will be harder for people without power, without clout, without political action committees, and for marginalized voices to call out wrongdoing by the powerful. And it'll certainly be easier for government to set the terms of public debate. That's what the MAGA Republicans want to do in Florida and Texas, where they passed laws to force the platforms to carry content that drives out vulnerable speakers, who are disproportionately the targets of that content. 
MAGA Republicans want that kind of content because it allows them to drive their points of view, and those laws are going to have their day in the highest court this fall, and I hope the justices see them as plainly violating the First Amendment and being preempted by 230. So, if 230 were repealed tomorrow, there would be immense pressure on websites to quickly take down content that offends people with power, and anything else outside of the comfortable and the mainstream. I don't believe Americans want that kind of world. I sincerely hope that no Democrats play into the hands of right-wing culture warriors and help them use Section 230 to force sites to give voice to hate and radicalism. Now, Section 230, as I said, protects startups and smaller sites. If Section 230 were repealed tomorrow, there wouldn't be any Bluesky challenging Twitter. These upstart federated services depend on 230 to protect their users, who create and manage their own content moderation programs within the small communities that comprise this whole decentralized service effort. And without Bluesky, where else could I mock my staff for not having invites yet? So, Section 230, and I'm one of the co-authors along with Chris Cox, isn't perfect. We're always looking at ways to make the internet a better place for users. You know, my family has said, when Ron talks about the internet, all he's really going to tell us about is users, users, users. Well, I'm sorry to be repetitive, but I think that's the bedrock principle we ought to be all about. You know, a lot of people who want to repeal Section 230 say it's a get-out-of-jail-free card. I just point everybody to the brief Chris Cox and I submitted to the Supreme Court in Gonzalez. We took care to highlight a number of cases where courts decided companies were not protected by Section 230, like Lemmon v. Snap, where a family sued over Snapchat's speed filter. The company created the filter. It's not user-generated content. 
It's not protected by 230. It also doesn't protect Amazon when it delivers defective products, or websites that provide illegal online home rentals. If you write an algorithm that only shows housing ads to people of a certain race, which Justice Sotomayor was concerned about, the courts have correctly decided there is no protection, none, from 230. So I'm sure the court read what Chris Cox and I wrote. I'm just absolutely convinced they read every word and were so persuaded by our brief that they decided not to rewrite the statute. So I say, you're welcome. We're making light of this, of course. I know many folks there spent nights drafting briefs, and we thank them. The outcome in Gonzalez highlights another point. Many of the claims that commentators and my fellow legislators bring up as being blocked by 230 often wouldn't go far on their own. The court found that the claims in Gonzalez, for example, couldn't give rise to liability in their own right. No 230 needed. Where courts find that 230 doesn't bar a claim, the parties often end up at the same destination: case dismissed. Courts might say that the platform's actions were too far removed from the harm, or that a claim's elements weren't met, or dismiss it for some other reason. But without 230, that only happens many months and tens of thousands of dollars later. Truly ruinous. So, two final points. Any time I get the opportunity to talk about why content moderation and 230 are important, I also want to be clear about big tech. Google, Meta, Twitter, Microsoft, all of them have to do a better job of protecting users on their sites. It's a horror what's happening with Twitter now, having divested so much, particularly from trust and safety. Second, to my mind, the first place to start in holding companies accountable is to pass a strong federal privacy law. That's to attack the business model so many of the big tech companies depend on. If you take away the incentives to hoover up users' personal information, you make it much harder to target them. 
It'll provide protection for people, and you take away the incentives to target them. Particularly, we don't want people targeted with objectionable content, and we want to make sure that we're protecting kids and teens. I also support more aggressive antitrust enforcement and the Open App Markets Act to create more competition in our markets and make it easier for more companies and new speech forums to grow. I said I'd give you two more points. That'll be it. Your questions will be especially welcome. I think Ashley is asking today. So thank you again to Wikimedia. You all, every single day, day in and day out, are making a big difference. Back to my favorite word: users. Thanks, everybody. Where are we? Oh, okay, great. My name is Ashley Gold. I'm a tech policy reporter with Axios. I was just telling the senator backstage that the first time he and I ever had a conversation about Section 230 was in 2017. So it's been a while. This topic has been percolating. It gets a little more relevant every year. So since you mentioned the Supreme Court, let's start there. Were you at all surprised by the result in Gonzalez? Was that what you were expecting to happen? Well, I have to tell you, I didn't expect the court to come out 9-0. Look, I mentioned the fact that, you know, we filed a brief. We're not saying the court was, you know, sitting there watching it. But also, while we won 9-0, I'm not convinced we're past the challenges. Certainly those cases, you know, that deal with, you know, free speech seem to me to be First Amendment violations, but with this court, who knows? And with the Supreme Court, I mean, Justice Clarence Thomas talked about wanting to amend Section 230. They did not do it, given the chance here. So now that it's been punted back to Congress, again, what should Congress do, if anything? Or is your ideal scenario that we keep going status quo, the law is what it is, and we just keep hoping for the best? 
Well, I'm always open to new ideas. I sketched out a couple of principles first. I wanted to make sure we protected moderation. That was key, and I wanted to make sure we protected the First Amendment. I mean, look at Bluesky right now. If you really want to take on Facebook, Bluesky is out there because of Section 230 and the protection for users, and because the structure of, you know, our law green-lights startups and competition. You know, Ashley, I want to take just a minute on this. The big guys have got enough power to take care of themselves. They can buy all the content and all the coverage that they could possibly want. My constituency has always been the users, the smart startups, the people who wouldn't have a chance to get out on this, you know, internet playing field without Section 230. So I'm always open to new ideas. I gather there's going to be a hearing in the Judiciary Committee about Section 230. I hope they'll have some people who share our views. Let me be diplomatic and put it that way. Have you seen any ideas from your colleagues in Congress over the years that were good Section 230 reform ideas? Well, there's always interest in civil rights issues. I feel very strongly about protecting those kinds of priorities, but mostly it's about people on the right who are trying to get more far-right voices out there. I hope my colleagues in the Democratic Party, who are absolutely right to be concerned about hate and bullying and the like, don't throw in with those far-right voices, because we could end up with a lot more speech in this country being silenced, both on and offline, when we're already seeing book banning and the like. So let's talk about the topic du jour, AI. Does that come up these days? A little bit. A little bit. It's still 230 more. 230 is definitely still more important. So you talk about colleagues from the Democratic Party teaming up with colleagues on the right. 
We have a prime example of that with Senators Josh Hawley and Richard Blumenthal putting out a bill last week that would strip 230 protections from AI-generated works. We don't even know exactly what that means or what that would look like in practice, so that would be pretty spooky for a company like OpenAI or Meta that is working on these generative products. I'm sure they wouldn't support a bill like that. What do you think of that effort? Well, our 230 law was about hosting. That was the core principle. It's not about generating content. And so I'm already on record as saying that when you're talking about ChatGPT, you know, for example, which is being integrated into popular digital services and the like, it shouldn't be protected by Section 230. Now, on the Hawley-Blumenthal bill, I've already said that I'm not for protecting generative content, but I think we ought to wait a little bit and think through what the implications are before we go out writing bills. Also, I think there needs to be some effort to define generative AI. As it stands, the bill could be read to include search engines, which I've said before should be seen as distinct from generative AI chatbots. If you withhold Section 230 immunity from search engines, I think that'd be a disaster for all of us who care about users and speech and the like. So I don't see a big rush to move here. Congress already has a to-do list on the technology front of things we know are real problems, and you've already heard me say, right up at the top of it, we ought to pass a major federal privacy law. I don't know if you saw, but I was able to get declassified information from the Director of National Intelligence with respect to all the commercially available information. Read that analysis, I'll tell you, because we've all been working in these tech precincts. 
That report basically outlines how the government is in a position to collect almost as much information as people were talking about back in the John Poindexter days of Total Information Awareness. So I just want to say the top priority ought to be passing a federal privacy law and also passing my Algorithmic Accountability Act, because that would make automated decision systems more accountable. So you're not entirely opposed to the idea that companies that have products like ChatGPT and Bard should be held responsible for the content that those generative AI products are spitting out? Look, the 230 bill was to protect hosting. It was not to protect generative content. And my colleague and friend Chris Cox agrees: if you're talking about something which even in part is generating new content, no protection. Okay, very interesting. So a lot of the conversation around Section 230 has revolved around children's safety online. There's this idea that Section 230 gives big tech companies an excuse to not tackle that problem of abuse on their platforms. Is having those conversations in the same vein the right thing to do? I mean, are solutions to children's safety online to be found in 230, or do you feel it should be an entirely separate conversation? Well, let's talk about kids. I think, for example, the Durbin legislation would be a big mistake. I think kids would be less safe in that kind of situation because of the damage done to encryption. I think my colleague Senator Markey's bill, I think it's called the COPPA bill, takes the right kind of approach. It doesn't devastate 230, but it protects kids. So, look, my wife and I are older parents. She's in New York City, so people always come in and blame her for stuff I'm doing. That's so cool. She's so much cooler than you. I had no idea. Amazing. 
So, you know, the point really is, all of us, we have twins that are 15 and a little one that's 10. We look at all this stuff and say we've got to find ways to get this kind of filth out of the sight lines of our kids, which is why the big tech companies ought to get off their tush and get serious about moderating. I mean, I don't carry any brief, folks, for big tech. My brief is for the startups and the competitors and the marginalized voices. The big tech people can take care of themselves. You look at my career in public service: I've always been on the side of the people who don't have the power and the clout, and that's still true. So, you've mentioned a couple of times, you know, you're not carrying the torch for big tech, but ultimately it's big tech that really, really, you know, benefits from these laws. Wikimedia and smaller startups benefit from these laws as well, but it gives people who are against 230 such an easy talking point to say, hey, this is a get-out-of-jail-free card for big tech. When you're having those sort of honest discussions with your colleagues on the Hill and they say that, what do you say? I tell them about SESTA-FOSTA. The vote was 98 to 2 on SESTA-FOSTA, and everybody said, oh, Ron, how can you do this? Your political career, you've been so valuable, will be over. And I said, you're going to see that this law is a mess. It's a mess. The bad guys went to the dark web. We got more violence against these vulnerable, you know, people on the streets. So, Ashley, the way I start the conversation is: has anybody had a press conference recently on SESTA-FOSTA? Anybody out there talking about how great it was? You want to see the future of all these kinds of bills? That's what it is. So let me just finish on one point about big tech. Big tech is for 230, mostly in theory. They liked 230 back before they were big. You know, they basically liked the opportunity to grow and hire engineers early on because they didn't have to hire attorneys. 
But now that they're big, now that they're big, they've got teams of lawyers, they sit back, they relax, and mostly what they're interested in, folks, is stopping competition. Do you know what happened on SESTA-FOSTA? Facebook, for example, was all interested in what we were doing. I remember when Mark Zuckerberg came to my office for the first time, told me how great it was. At the end, when they were getting flak on so many things, Facebook basically pulled up the drawbridge and said, we got what we needed out of this thing. We got a chance to lead and corner the market. Let's keep everybody else out. And they supported SESTA-FOSTA, which two people in the Senate voted against, and I think I've made the point. I have one more question for you, and then I'm going to let the audience do a few questions. There's been a lot of talk about AI lately and sort of this opportunity lawmakers have to set foundational laws for AI, and I hear a lot of lawmakers saying, let's not make the same mistake we made with Web 1.0 and social media by passing something like Section 230. We can't make that mistake again. What do you think when people say that? Well, I tell them I'm always open to new ideas, but let's not miss facts. I always tell them, for example, the New York Times once printed a very large article. I went to school on a basketball scholarship. 
The article made me out to be seven feet tall, and it said Ron Wyden and Chris Cox, the authors of the law that empowers hate speech. We don't do this very often, but we got in touch with them, and we said, you do know that the First Amendment makes more than 97% of the speech we're talking about acceptable in our country? I don't hear anybody talking about getting rid of the First Amendment. And the New York Times, to their great credit, printed a very long retraction saying that most of the hateful, vile speech is not due to Section 230. So I realize, Ashley, your point is a very good one. It takes more time to lay out my side, because we all hate this slime and bullying and just trash that's there. But do you really want to cut off the voices of Me Too and Black Lives Matter and Bluesky? I'm open to new ideas, but I think we ought to be real careful when we're talking about making sure the voices of people who don't have power and clout, you know, get heard. That's been my case, that's been my brief, since we began. Okay, thank you very much. I've been told we're actually out of time, so I'm sorry we don't have time for audience questions. To be continued, everybody. Thanks for being here. So, good afternoon, everyone. Thank you for staying and hanging in with us. That was a great discussion. My name is Lilian Coral. I'm the head of technology and democracy programs at New America and also the senior director of the Open Technology Institute. As Anne-Marie started to share a little bit earlier, here at OTI we focus on what internet platforms can do to strike a balance between preserving freedom of expression online and ensuring safety, including through our work on the Santa Clara Principles. Whether or not platforms are run by for-profit entities like Facebook, trust and safety is critical. Legislatively, we believe that the most productive way to think about this public debate is to zoom out from the text of the law and to focus on the broader context of 
online content moderation. Instead of proposing reforms to Section 230, we should be working off of widely accepted policy principles on content moderation and algorithmic accountability, including, as Senator Wyden alluded to, the Algorithmic Accountability Act of 2022, which OTI has supported and the senator has so greatly championed. As he mentioned, this year the Supreme Court considered two landmark cases indirectly challenging Section 230, and it's clear that the court's rulings sidestepped defining the scope of Section 230's reach, sending the challenge back to lawmakers, who must balance the need for greater platform accountability with the freedom to publish, organize, curate, and share content online. On this next panel, we're going to hear from a set of public interest organizations, and they'll discuss what's at stake for them in the Section 230 reforms and why lawmakers must safeguard our democracy and avoid any hasty or overly blunt changes to the statute. So please join me in welcoming our panelists. First, our co-host and New America alum Rebecca MacKinnon, the Vice President of Global Advocacy at the Wikimedia Foundation; feel free to start coming up. Peter Routhier, Policy Counsel at the Internet Archive. Andrew Lih, digital media strategist and author. And Katherine Klosek from the Association of Research Libraries, the Director of Information Policy and Federal Relations. And lastly, to moderate the discussion, Rebecca Kern from Politico. Thank you. Everyone was introduced, so I can just say everyone's names again and quickly describe your bios. At the end we have Rebecca MacKinnon, who's also the co-host today and is Vice President of Global Advocacy at the Wikimedia Foundation; Peter Routhier, Policy Counsel at the Internet Archive; Andrew Lih, digital media strategist, author, and Wikimedian at large (you may need to explain that role to all of us); and Katherine Klosek, Information Policy and Federal Relations Director at the Association of 
Research Libraries. I'm a tech policy reporter at Politico and have the honor of moderating the panel. I cover Section 230 and watched the Supreme Court cases very closely, as you all did. I wanted to get your positions on the law itself and where you see it's been beneficial in your area of work, and we can go down the line. Sure. Well, first of all, thank you so much to New America for hosting this event today, in what I think is a critical year for the law. As Senator Wyden said, Wikimedia would not exist without Section 230, and that is absolutely true. And just to clarify, the Wikimedia Foundation, which I work for, is the technical and legal host of volunteer-run platforms. So on Wikipedia, of which Andrew is an active editor, all the content is written and edited and uploaded by volunteers, with rules that are set by volunteers and enforced by volunteers. The foundation technically hosts the servers that enable Wikipedia to be accessed by people around the world, and we also house the lawyers and the policy teams that, of course, deal with many government requests and demands, deal with compliance around the world, and so on. And so, to come back to our position: Section 230 makes it possible for the foundation to run the servers, to host Wikipedia and other volunteer-run platforms, without having to intervene for fear that anything that people might write on a particular encyclopedia article would result in us getting sued. You couldn't actually enable and empower a community to self-govern on platforms that they are building and governing without Section 230 protections, without what is technically, in legal lingo, intermediary liability protection. It protects us from the liability, but similarly, it also protects the moderators. Wikipedia may be a site that anyone can edit, but it's not a free-for-all zone. It's not a state of nature. It has rules and governance around what is well-sourced content, and Andrew will talk more 
about that. But Section 230 also protects people like Andrew, who actually enforce the rules. So if somebody is posting something that is against the rules for what a well-sourced fact is about a particular topic, he can delete the content without being afraid of being sued by whoever posted it in the first place. So it protects both the communities and the platform itself, to enable communities to build an information environment that serves their needs and interests, in this case an educational encyclopedia in many, many languages that spans the world. And Andrew, of course, can talk about more examples. He's not an employee of the foundation, but he and several hundred thousand other people are what make Wikipedia and the related projects what they are today. Do you want to jump in, Andrew, and share your role? Yeah, I guess it might be useful early on to tell you a little bit about it, though you may infer it after hearing more. It's great to hear Senator Wyden on stage directly link 230 and what I call the Wikipedia revolution in the book that I wrote back in 2009. The reason why a lot of folks may not appreciate this is that when Section 230 was proposed and became the law of the land, that was a really important time in the development of the internet, in the 1990s. And it's the reason why the U.S. is still a leader in entrepreneurship when it comes to user-generated content. Look at all the major platforms online: you talk about Facebook, Twitter, and Reddit, and Wikipedia. They were talking at the time about the user-generated revolution, Web 2.0, the read-write web, and the United States was a leader in it. It's also what allowed the Internet Archive to proliferate and to do the things that they do as well. So, something that you may not know: as Rebecca said, there's a Wikimedia Foundation, and they have a big budget that they manage from the donations that come in, but well north of 99% of all the content on Wikipedia is done by volunteers, by 
design because you do not want a nonprofit responsible for all that content and it's the many many thousands of volunteers over the last 20 some years that allowed Wikipedia to become the multilingual number one reference site in the entire world so you may not know this because you probably experienced Wikipedia in only one or two or maybe three languages Wikipedia is available in over 200 plus languages around the world it's the number one reference site in almost every language in the world curiously not Korean but almost every language in the world it is the number one reference site of it's kind and that's really an amazing accomplishment that could only really happen with 230 as the climate for this so just one last thing I'll mention we can get into some discussion we do have this kind of black swan kind of scenario we play in our community every few years because we do have to look out for this right Wikipedia started as a project in the United States originally out of San Diego California is now an international project volunteers are all over the world but the servers are hosted in the United States it's a U.S. nonprofit those servers being in Virginia across the water there and in Texas and other places in the U.S. 
is really important because we have those 230 protections so we do this black swan scenario where we say if we had either by choice or by necessity we had to move Wikipedia to some other place where what other places could compare to the environment that we have in the United States in terms of section 230 protections legal climate around public domain copyright anything else and every year we do this we cannot think of another country that has the same type of environment as the United States not even close 230 is so uniquely special to Wikipedia's existence and its health and its impact that we've never found another place that is even close to this and we've thought about places like Iceland and other strange jurisdictions we couldn't come close and just think about we can get in this later just how foundational Wikipedia is not just for your day-to-day work for students around the world but also for training AI it is the main corpus that all these AI systems are trained on so as Senator Wyden said would you rather have inoffensive clickbait as the foundation for AI or would you rather have their world's number one reference site contributed by volunteers all around the world that is an amazing resource as that foundation for generative AI and the reason why AI is so hot now is because we have a source like Wikipedia that is so well built over two decades by this volunteer community with 230 as its basis exactly and Peter and Catherine you both kind of have more of the research and library backgrounds so could you kind of briefly just share how 230 is beneficial in each of your fields let's start Peter sure and I think it's important to or it's helpful to take our minds back as Andrew was doing to the 90s to the earlier days of the internet and there were a lot of interesting new ideas and new innovative ideas one of which was Wikipedia that section 230 enabled and the internet archive I think was one of those and it was like many of the other ones looked like a 
crazy idea right and so one of the crazy ideas that the internet archive had was let's just make a preservation copy of the web right this is the way back machine which is what most of you probably think about if you've heard of the internet archive let's just make a copy and we'll just have a public record copy to preserve for the future of how the web has evolved and what's been put there and you know this was considered kind of wacky at the time and now it's like a core function of the internet's core internet infrastructure and the thing one of the key things that enabled that to happen with section 230 when we carry out our preservation mission our preservation function we're not doing something that when people talk about section 230 reform they tend to be interested in right we're not doing social media we're not doing promotion we're not doing surveillance advertising we're executing on a core preservation function and we're only able to do that and focus on that because we have the protection of section 230 otherwise we'd have to get involved in the sort of content moderation decisions that a number of these laws are seen to impose on social media company shore but frankly the rest of the internet too and so that was part of the core protection that section 230 provided allowed the kind of innovation to flourish that led to Wikipedia that led to the internet archive and that hopefully can lead to new things in the future too as long as we preserve a free and open internet. 
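The Wayback Machine preservation function Peter describes is publicly queryable through the Internet Archive's availability API. As a rough sketch (the `https://archive.org/wayback/available` endpoint is the documented service, but the helper names here are illustrative and the exact JSON field names are worth verifying against the current docs), a lookup could be built like this:

```python
from urllib.parse import urlencode

WAYBACK_API = "https://archive.org/wayback/available"

def availability_query(url, timestamp=None):
    """Build a Wayback Machine availability query for a page URL.

    `timestamp` is an optional YYYYMMDD string; when given, the API
    returns the archived snapshot closest to that date.
    """
    params = {"url": url}
    if timestamp:
        params["timestamp"] = timestamp
    return f"{WAYBACK_API}?{urlencode(params)}"

def closest_snapshot(response_json):
    """Pull the closest archived snapshot out of the API's JSON reply,
    or None if the page has never been captured."""
    snap = response_json.get("archived_snapshots", {}).get("closest")
    if snap and snap.get("available"):
        return snap["url"], snap["timestamp"]
    return None
```

A caller would fetch `availability_query("example.com")` with any HTTP client and pass the decoded JSON to `closest_snapshot`.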
I'm representing the Association of Research Libraries, ARL, as was mentioned, so I really want to thank New America, OTI, and Wikimedia for inviting libraries to be part of this conversation. Our association is a trade association representing 127 research libraries in the US, and our member libraries serve communities of researchers, academic communities, patrons of research libraries that also function as public libraries, like New York Public and Boston Public, as well as users of federal libraries. So you can imagine the breadth of online activities that our members host, and all of these are allowed and facilitated by Section 230. Just to name a few examples you might be familiar with: libraries collect and digitize collections, often ones that are meaningful to particular communities, and a lot of times libraries invite those communities to engage with those collections through crowdsourcing programs or initiatives. That is inviting third-party, user-generated content, and because of Section 230 we don't have to be concerned about liability if a nefarious actor were to upload content that's illegal, malicious, harmful, and so on. Libraries also provide internet access; that's a basic and foundational example. Whether it's walking into a public library to access a machine, or network access through an institution of higher education, that's another function that Section 230 protects as well.
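Picking back up on the Wikipedia model described a moment ago: every volunteer edit is part of a public revision history, and that history is machine-readable through the MediaWiki Action API as well as the on-page History tab. As a hedged sketch (the `action=query&prop=revisions` endpoint and parameters follow MediaWiki's public API documentation, but the helper names are illustrative and the exact response fields should be double-checked), a page's recent revisions could be requested like this:

```python
from urllib.parse import urlencode

API_ENDPOINT = "https://en.wikipedia.org/w/api.php"

def revision_history_query(title, limit=5):
    """Build a MediaWiki Action API URL listing a page's recent revisions."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "timestamp|user|comment",  # who edited, when, and why
        "rvlimit": limit,
        "format": "json",
    }
    return f"{API_ENDPOINT}?{urlencode(params)}"

def summarize_revisions(response_json):
    """Flatten the API's nested reply into (timestamp, user, comment) tuples."""
    revisions = []
    for page in response_json["query"]["pages"].values():
        for rev in page.get("revisions", []):
            revisions.append((rev["timestamp"], rev["user"], rev.get("comment", "")))
    return revisions
```

A caller would fetch `revision_history_query("Section 230")` with any HTTP client and hand the decoded JSON to `summarize_revisions`; this is the same history readers see under a page's History tab.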
Academic libraries in particular also operate public access repositories, repositories of digital scholarly works, so researchers, students, and faculty can upload journal articles, conference recaps, oral histories, all kinds of scholarly digital works. Because of Section 230, libraries don't have to use filters or humans to pre-screen that content, or over-remove content, which would interfere with the research ecosystem and research functionality. So because of all these functions that Section 230 has allowed libraries to host and foster, our position has been that we're against repeal or sunsetting of Section 230, and we've cautioned against reform. In conversations with congressional staff and others, we've asked that libraries have a seat at the table in conversations about Section 230 reform, or more broadly about online harm. With Section 230 in particular, services provided by libraries and educational institutions are named in the statutory definition of "interactive computer service," so that's one reason we definitely have a stake in being at the table for conversations about reform. But we also have a broader interest in preserving the liability protections of Section 230, and that's maintaining the functions I mentioned earlier, and the way we've seen Section 230, since the beginning, foster intellectual activities, cultural heritage activities, political discourse, information building, knowledge sharing: all the things that libraries and the Wikimedia community care about.

Right, and thank you all for sharing your positions; it helps us know where you stand. After hearing Senator Wyden speak, he's obviously very much in your camp, cautioning against major changes to 230 and alluding to SESTA/FOSTA and some of the downfalls of that law going into effect. But we still see Congress trying to reform the law. We've heard Senator Lindsey Graham talk about a sunsetting provision: if it passes, five years later 230 would go away. That's one of the more extreme viewpoints. Then we have other, more narrow carve-outs, one of which would require transparency reporting from platforms; that's from Senator Chris Coons, and it's bipartisan, Senator Bill Cassidy signed on, and it was just reintroduced this year. So we have a big spectrum in Congress, and I'm not necessarily optimistic Congress will pass anything on 230 reform, but for the sake of argument, where do you all stand on potential reforms? Are there other bills out there you're tracking where you see an opportunity to provide more information to users? I wonder what you think about transparency requirements that could be a basis for researcher access to some of these large platforms, where things are not transparent and researchers do not currently have access; there's no law requiring such access. What do you think of that proposal, for example?

Well, this is where I think it matters that a lot of the reforms are focused on Big Tech. I often find myself in conversations where people are saying "the platforms," and it's like, you know, Wikipedia is a platform, and I'm like, oh yeah, right. People just aren't thinking about platforms beyond Big Tech when making proposals, and we're not around the table as much. And we have nothing against transparency. Every edit ever made on Wikipedia is public, right? You can go to the history tab on any Wikipedia page and see every edit ever made, see all the debates about all the edits, see what the rules for that page were and how they were enforced. We publish transparency reports. The machine learning team works with the community to publish what are called model cards on the machine learning tools that are used. So things are very public and transparent; we have no problem with transparency, and that in itself is not a concern. We work with researchers, we do human rights impact assessments, we do human rights due diligence. We do a lot of things that the laws might potentially require. I think part of the problem is that when the laws get crafted, they're thinking about particular types of content moderation models, certain types of business models that they're particularly concerned about given the harms, and then the text of the laws tends to assume that things work in a certain way and puts in requirements that only work for certain models and don't work as well for others. So that's where being at the table matters: making sure there aren't unintended consequences that actually make it harder to just continue on with our model as it is. Wikipedia itself has been designated in the European Union as a very large online platform under the Digital Services Act, and of course we will comply; we are working on compliance. That includes making some further improvements to the transparency reporting we're already doing and tweaking our terms of use. We do have terms of use that the Foundation sets to make sure we're compliant with law, even while most of the content rules are set by volunteers. We have to do our own risk assessment, and we have to be subject to audit; we're working through all of that. In principle, this all makes sense. We do believe we should be publicly accountable. We've always believed we're accountable to our community and our stakeholders, and we're happy to work with policymakers on incentivizing that accountability for ourselves and others. There are a lot of open questions just in terms of the specifics of the requirements and whether we'll be able to actually implement them with the resources we have, given that we're a nonprofit, and that's an open conversation we have with the regulators. But the Digital Services Act in Europe is a real test for how these kinds of transparency and impact assessment requirements may or may not work, so it might be worth observing a little before jumping into something.

Another thing I'd point out, and Senator Wyden talked about this too: there is an order in which it would make sense to do things. With all the tools in the toolbox, a privacy law is the first step. It doesn't address all the harms, but it is the first step toward addressing the most pernicious harms that are part of the targeted advertising business model, which is not part of the business model of any of those of us represented here. So why aren't we going for that first? Why aren't we going for other tools in the toolbox that will not affect public interest platforms in the same way? It's like with a recipe: there's a reason you do certain things before others if you want the whole thing to make any sense, and this is the same kind of thing. So we don't quite understand why people are jumping to Section 230 first, if their intention is actually to serve the public interest and not some other purpose.

When we talked ahead of the panel, you said a comprehensive privacy law would actually address some of these concerns before even touching 230.

I think, just to say, the Platform Accountability and Transparency Act, which has been reintroduced, in particular the concept of allowing researchers access to platform data to inform legislation, policy changes, and proposals, is something we could get behind. A lot of the Section 230 proposals we've seen aren't necessarily calibrated to what they purport to address or solve. But yes, I think a comprehensive federal privacy law that limited or restricted what platforms can do with user data would go a long way toward addressing some of these challenges and online harms. Our association has engaged with staffers on the American Data Privacy and Protection Act that was introduced in the last Congress, and we'll likely see it again in this Congress, although I don't know if it'll need to be updated with considerations for AI; we'll see. That law in particular would limit or restrict what platforms can do with user data and give users more control over their data and their information. We heard the Senator talk about civil rights, and the ADPPA had rules against discriminatory practices, so we think all of that would really go a long way toward addressing some of the online harms we're concerned about.

Right. Peter, are there any bills you're tracking?

Yeah, we try to keep up; there are dozens and dozens of them out there, and I think that's kind of part of our point. The Wikimedia Foundation and the Internet Archive are 501(c)(3) public charities, right? And the baseline question with these bills is: what are the compliance obligations, does it make sense to impose those in these areas, and does removing Section 230 protection make sense as the way to impose those obligations on platforms? So we look at a lot of these bills, and there are all kinds of good ideas. Lots of them are good ideas about regulating social media companies, which we're not. A lot of them are good ideas about fixing what are perceived as competition problems, which would probably be a good idea, but we're a public charity; we don't really compete at all. They're talking about businesses, and we don't have a business model; we don't sell anything. So there are lots of good ideas, but the question is whether they're directed toward the right place. And I think, like Rebecca said, policymakers think about "the platforms," as if the bill will just fix the platforms, and then they write a bill that fundamentally restructures how you can participate online. It tends to have the effect Senator Wyden was talking about earlier, which is that in the end it sort of entrenches these big platforms and makes it harder for the next Wikipedia,
the next Internet Archive, and makes it harder for a research library to operate and perform its mission and function online.

And the concern on the Hill, probably rightfully so, is all focused, as was discussed earlier, on kids' online safety. Senator Durbin, chair of the Senate Judiciary Committee, and ranking member Lindsey Graham have discussed holding a Section 230 hearing in the coming weeks, and it's likely going to discuss the EARN IT Act and the STOP CSAM Act. STOP CSAM was just introduced this year by Durbin, and it would allow victims of online child sexual abuse to have their day in court, to sue these platforms. We are talking about large platforms, yes; that's what these bills are intended to go after. But I wanted to ask what you think the unintended consequences might be. It's obviously a noble and concerning issue; the Wall Street Journal reported a lot recently about Instagram's algorithm being used to connect networks of profiles, so it's not theoretical, it is happening, and it's being taken advantage of on some of these larger platforms. So on this issue of CSAM content: do you think that may be an opportunity to open up 230 again, to hold platforms liable for hosting such content? And secondarily, where do you think this could hit you, maybe in an unexpected manner?

Hosting CSAM is illegal; it remains illegal. We take measures to remove it and keep it off our platforms, and we're proactive about it. You don't need a new bill for that to be the case. I think the Senator pointed to the problems of SESTA/FOSTA and its unintended consequences, and we, along with many other civil society organizations and nonprofits, oppose the EARN IT Act because we believe it's potentially catastrophic, both for free speech and for privacy, weakening encryption, et cetera. And it's kind of like: yes, there's some ugly stuff on the internet that people should be responsible for dealing with; that's undeniable. But it's kind of like, okay, there's a cockroach on my kitchen counter, so I'm going to blowtorch the whole kitchen and destroy my children's food and everything in the refrigerator and everything else. We really need to get beyond "okay, I have to do something and show that I'm doing something" and think more holistically, not just about how to address the problem at hand, but about what kind of internet environment we want to create for our communities, for the most vulnerable people who are trying to use the internet, who are saying controversial things that powerful people don't like, things that might be edgy, that some people might find offensive, but that they have every right to say. How do we ensure that all perspectives are protected in that way, and not just the majority viewpoint of what content is safe for children? There's wide debate about that right now, just in terms of basic questions about sexuality and so on. So we need to be very careful about where we take things. Things often sound like a good idea on their face, but as soon as you start digging into the details, you realize how a bill can get weaponized in ways that will really not make anyone safer, except for powerful people with lots of money.

Does anyone else want to weigh in on those bills?

I think that's really well said, and I'm going to borrow the cockroach analogy going forward. I'll just add that this is an area where we encourage members of Congress to talk to affected communities and to try to come up with a solution that is more calibrated to the problem they're trying to address. Libraries are also against child sexual abuse material; we also want to make sure the internet is a safe place for kids. But we push back on EARN IT because we don't think it's the solution.

To switch gears: we obviously saw the Supreme Court punt on ruling on 230, and we can get into that, but I think everyone here has the same position of being pleased that the law is still intact as it was. But we do have two cases in the wings, from NetChoice and CCIA, that could be taken up this fall. They're challenging the constitutionality of two laws in Texas and Florida. Those laws are not in effect right now, but they would force large platforms, companies like Twitter, Facebook, Instagram, to carry all political viewpoints. I wanted to get your views: if these cases were to be taken up by the highest court, and we have a sense that they're asking the Solicitor General to weigh in, and it does seem like these may be cases that Justices Alito and Thomas may want to weigh in on, how would an upholding of these laws, allowing them to go into effect, affect your communities?

Well, if I could jump in and maybe hand it off to Andrew real quick, and I know everybody else has views too: it would be devastating for Wikipedia and the Wikimedia platforms, because Wikipedia is not a free-for-all; it's not a place for people's political views. Encyclopedia articles require shared agreements about what constitutes well-sourced content, about what constitutes reliable sources. That means certain sources that come from a particularly extreme viewpoint, to either side, are not considered reliable by the editor community. So if the Texas law is upheld, that opens up editors to all kinds of lawsuits just for taking down content that's coming from conspiracy theory websites, content that the person who posted it thinks is their political speech. But Andrew, please, you have examples of situations where you all have been editing content that maybe the person who posted it thinks should stay.

I just find it curious that we've tossed out the fairness doctrine for broadcast, yet we're looking at this as a new thing on the horizon, where clearly, if you look at airwaves, the pathways for broadcast radio and television are clearly public goods, but we don't have a fairness doctrine anymore; when I was growing up there was one. So this is kind of an interesting, quote-unquote, move to try. But as Rebecca said, in our community we do have this concept. If we look at the one policy that makes Wikipedia work, it's what we call the neutral point of view. That's kind of the prime directive in Wikipedia. The only way you can get thousands and thousands of editors rowing in the same direction, regardless of whether they're conservative, liberal, young, old, whatever, is to have this objectivity ideal: we want to try to cover everything in Wikipedia in a way such that all sides can agree, and the proportionality of these viewpoints is rooted in reliable sourcing and verifiability. So if the law you're talking about had verifiability and reliable sourcing baked into it, maybe; but I don't think it does. It talks about the political spectrum and political opinions. So what Rebecca said is absolutely right: it would be at odds with the things that not only Wikipedia stands for, but academic research of any kind, findings of fact, science. That's something that would be really bad. And I'd love to hear the library insight; I don't think it's a surprise what librarians would think, but I'd love to hear it.

Well, it's interesting, because I think our concern with the federal proposals we've talked about is that the increased risk of liability will potentially lead to censorship and restriction of free expression, whether through over-removing content or throttling it from being uploaded to begin with. These bills in particular are more targeted at social media platforms, so they might not affect libraries and research libraries directly, but the concern is absolutely the proliferation of misinformation, hate speech, and harmful content that we're likely to see if these laws are somehow upheld as constitutional. Libraries are repeatedly found to be a trusted source of reliable information, so I think our concern might be more indirect, but it's very much about the internet ecosystem and the proliferation of misinformation under these laws.

How would this affect the Wayback Machine and the Internet Archive as a whole?

Oh geez, I hope I don't find out. You know, it's interesting. Starting with Gonzalez, everybody was excited: oh, they're going to change 230, they're going to pull it back. And then it turns out they can just decide, as a matter of substantive law, that actually there's no liability there, which some people saw as, oh, they backed away from 230, they decided they didn't want to do anything after all. I think it's actually pretty instructive. 230 is a liability shield; that's all it does. If you get rid of 230, what are the underlying causes of action that we want to put forward, that we want to hold people and companies and platforms responsible for? Nobody really has a very good answer to that question; well, many people don't, is what I'll say. So this idea that repealing Section 230 is going to do something structural to the internet: what it's actually going to do is leave it to the courts to decide, in a bunch of individual cases, a bunch of decisions about liability that maybe, in the years and years to come, will have a structural impact on the internet. But that's going to take years, and there's going to be this whole process in getting there, and I don't think that's what people actually think they want when they say they want to repeal Section 230. I think they want something else. The Texas and Florida laws are an interesting indication of what some people want when they talk about repealing Section 230, about changing the rules of the internet, or about platforms; it's something like this. And when you hear what they say, it's interesting, and to get specifically to your point about the Internet Archive: when Texas passed the law, there were some statements from Texas, and then also in the federal courts of appeal, saying, oh well, this law is going to apply
to the three dominant platforms it's going to apply only to the members of NetChoice and that's just not what the law says at all it doesn't say that it says something like 50 million monthly active users which is a pretty small universe but it doesn't say this is just about Facebook and Twitter but that's what the policy makers tend to have in mind when they pass laws like this and so I think for us would we meet the threshold of that law I don't know but the point is the laws are written in this way that they're going to have this broad accountability and you know in this case are almost impossible to comply with frankly and aren't going to lead to the result that people want and would it lead to you having maybe removed past websites that you're hosting or past versions I mean I think again like whether this law would apply to us or not is a question for hopefully I was going to say another day but hopefully not another day but I think it's absolutely the case that if you're pass a broad level that just imposes moderation decisions on platforms where at large you're going to impose moderation decisions on organizations and websites that have for example a preservation function like we do in like many libraries and others in the library community and I don't think that's what you want I don't think you want to moderate the past away or to change what was said in the past but that's going to be the outcome of some of these laws if you're not careful about scoping many years on from the beginning of this day people still aren't being as careful as they might be in that respect and we're seeing senators already start to attempt to regulate AI and put forth legislation saying last week that Ashley alluded to from Senators Hawley and Blumenthal, a unique pairing about saying generative AI content is not covered by section 230 that you would be held liable what are your thoughts on that even Justice Gorsuch put the question before the court it wasn't answered in their 
rulings but I am curious where all of you stand we heard where Senator Wyden stands do you want to Andrew have you had to moderate anything from generative AI yet or do you know how to trace it even yet? we've seen a lot of influx of generative content and most of it I would say if I were to be daring most of it's been in good faith to say that hey if I can help write an article in Wikipedia with AI or if I can generate an image are volunteers using it? yes they are but then we have the big question of copyright status and things like that but I thought Senator Wyden's comment saying that 230 is about hosting quite interesting it was good to hear it from the OG230 guy on what his views are in terms of the contemporary issues about AI so it's about hosting and not generative AI itself and that's kind of interesting so I think that's similar to what's going on in Wikipedia right now the debate is not that there's anything against AI it's about what are we hosting and are we sure that what is being hosted in Wikipedia is clean in terms of the copyright issues and is clean in terms of the legality with existing laws and it meets the rules that volunteers have set for well-sourced factual neutral content especially this is one of the things AI can mean many things in fact there are those who argue that the term artificial intelligence is kind of meaningless because it covers so many different things and of course algorithms too are different things and so within Wikimedia and the Wikimedia platforms of Wikipedia in particular there is machine learning that is used to help detect spam to help detect malicious edits to help detect bots but it's controlled by humans it's actually used as a tool to empower the human moderators to catch bad stuff and improve and defend the quality of Wikipedia pages at scale in a way they couldn't without these machine learning tools and that's very different from algorithms or AI that is amplifying content that is targeting people based on 
data that is collected in profiles that are created about so it's kind of a technology that has this overall umbrella label that's the same but it's being used in very different ways so again we have to be careful about what we're regulating and not making blanket overly blunt kind of statements or efforts to regulate so you could use one of the problems with trying to regulate generative AI too broadly is that as Senator Wyden mentions it's increasingly just used as a part of search right so of course there's search on Wikipedia so you can find things right and so we wanted to use some of that functionality that way or use large language models trained on ourselves to help identify disinformation patterns or false information or something like that right so humans can use these tools actually to really advance the public interest to advance free knowledge to detect problems as long as you're using it correctly it all depends on to what end you're using the technology not necessarily all instances of the technology in some blanket way so again we need to be very careful we do believe in being responsible accountable and transparent you know we have a lot of technical volunteers who work with us on the moderation tools we're very open our engineering teams are very open with the community about what is being deployed and the source code and you know people debate it and go through it at great length and certainly we have that continued commitment to transparency both kind of with our volunteers as well as the general public on what's happening technically on the projects and we're doing impact assessment we're doing risk assessment both to fulfill European legal requirements and because we were doing it anyway because that's the responsible thing to do so certainly it's not to say that just because we're nonprofit or just because we're wikipedia we think we're just intrinsically good on all things and people shouldn't scrutinize what we're doing we're not saying 
that at all. But we're also saying: don't legislate in a way that prevents innovation that actually supports fact-checking and a better distribution of high-quality, well-sourced content.

Peter and Katherine, are you encountering generative AI in your fields yet, and where do you land if it were to be regulated and you would have to be held liable for it?

I think that bill in particular is perhaps premature; we don't know where this technology will go in the future, but I think it's a good idea to proceed carefully with generative AI, as the Senator said. In the Gonzalez case, ARL joined a brief authored by the Electronic Frontier Foundation where we laid out all the ways that search and the use of algorithms are central to the way the internet functions. So if searching for a journal article or the latest popular ebook were swept up in a bill that restricts search or opens up liability for the use of generative AI, for instance, that would be a huge concern, for all the reasons that Rebecca has named.

How about you, Peter?

So we use some machine learning techniques to do things like OCR the books that we scan, and a variety of other things like that, and that's what AI used to mean until six months ago. Now it means this amazing future and all these amazing generative tools that have come out. So there is that scoping problem again: what are we talking about here? I think we're probably not trying to talk about OCR, but there are a lot of really good, useful tools that we've used to help build our library that are based on the same technologies. So there's that piece of it. The other thing I would say is, if you look at this panel, we're talking about public interest organizations, right? We're talking about library repositories and libraries and their participation in the internet ecosystem, we're talking about Wikipedia and the role of the Wikimedia Foundation, and we're talking about the Internet Archive. I think the question is: do you think the internet is better because Wikipedia is on there? Do you think the internet is better because there are public interest organizations participating in the internet ecosystem? Do you think it would be better if it were a closed system run only by these very large for-profit companies? I think that's the question that comes up in these platform regulation discussions, at least when you have people like us on the panel, and I think we're going to have that question, or are already having it, again with generative AI. There are proposals that say, well, you're going to need to have a license, or you're going to need to do a certain thing, or this is going to be limited to a very specific group of people. Many of them are not only going to threaten the open source community; they're going to threaten the existence of the Wikimedias of the AI world and the public interest organizations that are otherwise likely to arise and participate in this ecosystem if we let them.

I want to let everyone know we are opening up to questions from the audience, in person or online, if anyone has any. Okay, this gentleman has a question.

I would like to hear your opinions on some of the right-to-delete laws. Considering you're all archives, and you thrive on access to information, present, future, and past, what do you think about some of the proposals out of the EU, and then in the states, requiring data to be deleted or scrubbed after a certain point? How is that going to impact your operations, and how do you see those laws moving?

Some of those provisions we saw in the version of the ADPPA that was introduced in the last Congress, and the conversations that we've had with congressional staffers on that bill have been about distinguishing data in library collections from patron data. So it's still a question, but I think we're hoping that a privacy bill would include important provisions, but in a way that accounts for libraries as a covered entity, because to your point, right, there are archives, there are yearbooks, there are newspapers, all of that stuff. So I think we're not sure how it would play out, but we have made that point to staffers, and they're very interested in addressing it.

Yeah, the right to be forgotten in the European Union in particular is being abused with frivolous lawsuits, and that's a cautionary tale that people here in the United States should be concerned about. The line between who is a public figure, and what is public interest information versus what is private: there are different views of that, and it's continuing to be problematic.

Any other questions? I have a few; I'll keep going, but if anyone has any, feel free to raise your hand. So I wanted to go down the line and ask: Congress is now very concerned about AI, we've talked about that, but what are some lessons learned from 230 and the efforts to reform it? SESTA, which some would say was not a success, but it did become law, the one time 230 has been changed. What are some lessons learned that you would tell lawmakers as they now approach AI and regulating it, based on your experience interacting with lawmakers since 230 was passed, in trying to embrace a new technology and, from different perspectives, not stifle innovation but also ensure safety and that vulnerable communities are protected at the same time?

Senator Wyden named particular implications of SESTA/FOSTA, including bad actors going to the dark web and violence moving from online to the streets, essentially, and advocates told Congress that that was going to happen. So I think my lesson, or takeaway, is to listen to those affected communities. With AI, that's a broad range of folks, but I think: hold hearings, invite witnesses, go on site visits, just listen to the folks that are using and affected by the technology, and with an eye
toward not stifling or throttling it.

I think, as I mentioned before, 230 was brought in in the 1990s and allowed for this proliferation of dot-coms, user-generated content, Web 2.0. It was really kind of a carrier wave for all these things; it made them happen, but it also gave America an edge in this area. I know it's hard to keep focus on that, but it is true that Wikipedia could not have survived, or couldn't have arisen, without this environment that we had in the United States. I think similarly, what's going on with AI, why it's accelerating and innovating so quickly, is because we have 230: people can experiment, post what they've done online, get feedback, and iterate and iterate. So it's all a matter of, as Senator Wyden said, users trading information and allowing this innovation to happen. And you couldn't have that if you had just the big companies controlling the portals, and you couldn't spin up a website on a minute's notice and start posting your content there and having these channels for sharing that information. So I think it's definitely very important for the innovation happening, especially in the United States.

Yeah, I would just say very quickly: if you look at that early stage of innovation on the internet that we talked about before, and that you were just talking about, back when it was sort of exciting, right, back in the late 90s and early 2000s, if you can remember, it was sort of exciting. There were all these cool things happening all the time, all these new services. Wow, you have a map now. Wow, there's an encyclopedia that people are just writing. It was this very interesting, exciting time when big things happened, and I think it needed the space to do that, and that's what 230 did. I think that would be a lesson for AI. Look, for example, at what they've been doing in the European Union on AI. They said, AI is the next thing, so we need to start doing some regulation, and they worked for a really long time on the AI Act, and they did good work. They identified a bunch of specific harms; they were very careful about what they were doing. And then six months ago all these new generative models came out, and they said, oh wait, we've got to change this. It just so happened that the bill was still open; if this had happened six months later, they would have had a whole bill passed and would have had to start all over again, or something like that. I think that's part of what you're referring to when you talk about the US system versus some other potential system of regulation: what 230 did in the US was allow this space for innovation and experimentation to happen bottom-up, and I think we probably need some space for that in the new frontiers too.

I think one lesson is that we didn't pass a privacy law a long time ago, and we are now suffering the consequences. Let's pass the darn thing now, before we suffer a whole set of new consequences that are going to be, and the Senator alluded to this, very intermingled with state-level surveillance, and not just by the state of the country you happen to be living in, but by nation-states from all over the world that are seeking to track people across borders. Definitely, protecting people's data, protecting the way it's going to be used and abused, not just by commercial entities but by state actors, is absolutely vital to protecting vulnerable communities. But to go back to one other thing Katherine was saying, about everybody who was warning about FOSTA, and lo and behold they were right: a lot of civil society, civil liberties, and human rights groups have been calling on governments, both here in the United States and elsewhere, for quite some time to do a human rights impact assessment and a civil liberties impact assessment before passing a law that affects how the internet works. Really red-team it: think through what's going to go wrong here, just like you do environmental impact assessments and things like that. It actually needs to be thought through: the most vulnerable communities, faced with the most bad-faith actors out there, how are those two things going to play out when such-and-such a law gets passed? If that had been done with FOSTA, I think it would have made a difference; people were already warning about what was going to happen, and that needs to be taken seriously.

And relatedly, there's this mentality, I think, that you can fix the internet like you fix your television or your refrigerator, like you fix an appliance. It's kind of like saying, I'm going to fix crime in Washington, DC. Sure, you could fix crime in Washington, DC and have zero crime; that would be Pyongyang, the capital of North Korea. There's a reason why we don't want to be the capital of North Korea. It's about governance. It's about making sure that, yes, you have rules, because you don't want to be in a state of nature; you want to be able to walk outside in broad daylight as a woman, which you can't do without police and without rules. But the rules and the enforcement need to interact well with the community and need to make sure that people's civil liberties and economic rights and other things are all taken into account. You don't just say, okay, I'm going to pass this thing, stop-and-frisk, and that's going to solve crime in Washington, DC, problem fixed. But people seem to have that kind of let's-fix-the-internet mentality when it comes to legislation, and that just doesn't make any sense.

Yeah, right. And we do have an audience question from online. This person is asking: isn't another reason to think carefully about Section 230 reforms its profoundly global impact, which you've all been alluding to? These are global companies, and a lot of how they act here is how they try to act abroad, but we've seen some censorship in India and other countries. So to what extent do you think 230 has been able to
be globalized, and also where do you think it butts up against regional restrictions?

Yeah, just to give an example of how the Internet Archive and Wikipedia and other projects fit together: as we discussed, Wikipedia would not exist without Section 230, and Wikipedia is accessible in hundreds of languages around the world. One really specific example is Hong Kong, where a bunch of newspapers have been shut down and their websites have been taken off the internet, and so on. Enter the Internet Archive. The Apple Daily, whose publisher is now in jail in Hong Kong, went offline, but the Internet Archive has archived its articles. So there are a bunch of articles on Wikipedia that are based on sources whose websites no longer exist, because they got shut down by the government, but you can still access that content, and it is still linked on Wikipedia pages, thanks to the Internet Archive. And Wikipedia is able to exist with that information thanks to Section 230. The people editing these pages are all over the world, of course, including in the United States and elsewhere, and somebody in the United States might sue an editor editing something about Hong Kong because they don't like their political speech. It could be weaponized in ways that nobody's thinking of right now, based on the political views of people in relation to the politics of other countries. So again, it has a big impact globally.

Yeah, great. I think my last question would be for each of you: what is your one ask of Congress right now, with 230 reform and as they're imagining legislating on AI? What would you advise them if they were to ask for your community's input?

I think inviting us to be witnesses at the upcoming hearings would be great, truly. Like I keep saying, hearing from affected communities, and that's one way of doing it. We've also signed big-tent letters laying out a lot of these points that have been entered into the record, so I think, just, the more advocacy we can do, the better. And, yeah, I think that's such a great question about the globalness of these issues, and privacy, and everything we've talked about, because I do think there are others. ARL has talked to other national library associations and other countries about the vision for a safe and inclusive internet and whether there are things we can come together on. One thing: the UN has said that internet access is a human right, and libraries provide internet access, so what is our obligation there? And revisiting net neutrality principles and rules, or other potential solutions. So I think it's listening to affected communities and broadening the scope of, or understanding of, the problem to be solved, and again working toward those solutions. And it's not just one problem, right? There's child sexual abuse material; there are all kinds of issues online. But I think, again, working on calibrating solutions to the problems, listening to affected communities, and inviting us to the table would all be really beneficial, I think, to everybody.

Yeah, I'll take some of my time just to amplify what she said. I lived outside the US for six years, teaching overseas, and you realize, looking around, that a lot of governments have a lot of respect for scholarship, academics, libraries, museums. Unfortunately the United States is not the same, and it's kind of surprising, in that they don't bring in a lot of folks who are experts or scholars, or memory institutions, or social science researchers. As Rebecca said, it's amazing how much Congress does not embrace that level of expertise in a lot of its deliberations. So that is something I definitely agree with you on: bring us to the table. I'd just emphasize how much Wikipedia would not exist without the Internet Archive and libraries. We talked about how Wikipedia relies on verifiable information. That chain of verification is rooted in what libraries provide as the base material and preserve for the long term, and in the Internet Archive, for when sites go away, link rot, as we call it, or when sites get shut down: something as far away as Apple Daily, but also things like the newspapers in Colorado, right? The Rocky Mountain News and all those folks, Pulitzer Prize-winning organizations, are just gone, and only the Internet Archive and these folks have a copy. We would have to delete hundreds of thousands of articles in Wikipedia today if we didn't have these two types of organizations working with us. So for Congress to appreciate that we are an ecosystem that is so crucial to not only global knowledge but American competitiveness: if you just want to pander to American competitiveness, keep 230 around so that we're still in that leadership position.

Yeah, I'll just keep piling on, I guess. I think the market is amazing, and the ability of markets to solve real problems and build real things is incredible. But there are things where there's just no market incentive for them to be created, or to exist, or to continue to exist, and we keep talking about preservation, and obviously there's a reason why I keep talking about that, but that's an important area. We preserve a lot of books, millions and millions of books, and there's just no market incentive to preserve them once they're no longer commercially viable. There just isn't one. And if we can digitize them and preserve them for the long haul, that's an important public good that helps a lot. Many of them are cited in Wikipedia articles; many of them are really important to researchers across all different spectrums. And there's just not a good market mechanism for solving that problem, so I think it's important. What I would say is just echoing what the two of you have said
which is: we should keep that in mind when we create new rules of the road, whether for AI or anything else, so that there is a space for public interest organizations like these to fulfill their mission-driven roles, which are most often the roles that markets won't fill.

Just finally, the ask to Congress is: every time you draft something, bring us all in and ask the question, what will this mean for Wikipedia? What will this mean for the Internet Archive? How will it affect libraries? Game it through with us.

Yeah, great. Well, thank you all; this was a really interesting panel, and thanks for all the online and audience questions. I want to hand it over to the other Rebecca for closing statements as the co-host of the event. Thank you for the opportunity.

Thanks so much. I'm going to be real quick, because I've said most things already, but thanks so much to New America for hosting this event today. I used to work here, so it feels like a real homecoming, and I'm very grateful for that. Just to re-emphasize: in this age of ChatGPT and generative AI, facts, and the people who report, curate, preserve, share, and archive facts, are more vulnerable than ever to attack by those who would rather perpetuate other narratives, and more valuable than ever, both economically and as far as an open and democratic society is concerned. Without Wikipedia, without independent journalism, independent researchers, archives, libraries, public interest technology, open data repositories, without this ecosystem, call it the digital commons, call it digital public infrastructure, call it what you will, we're nothing but a big swamp of hallucinating large language models. Or that's what we could become, or something manipulated by those with the most power and money. So we need to make sure that the law is preserving and protecting not just the models but the people who do this work, who are more vulnerable to attack now than ever, and not just in this country; in some other countries they are seriously vulnerable. It's one of the things that keeps me up at night: how do we protect Wikipedians all over the world who are trying to get information out about Ukraine, websites in Russia or wherever that are going offline and being preserved by the Internet Archive, research that has no home anywhere except in libraries that are funded and protected against the political vagaries of other regimes? It's so important for the future of the world, and I really hope that our political leaders and lawmakers will do the right thing. Thank you.

Thank you so much.