Hello, everyone, and welcome. I'm April Glaser, an activist with the Electronic Frontier Foundation. We're located in San Francisco, California, and we work to make sure that your rights go with you when you go online. Today is April 4th, and for us at EFF, that means it's 404 Day. It's a day we're using to call attention to the long-standing problem of internet censorship in public libraries and schools. We've partnered with the Center for Civic Media at MIT and the National Coalition Against Censorship to organize this digital teach-in, where we'll discuss internet filters that are used on public computers in libraries and schools across the country to block online content and ban websites. People can join the conversation live and ask questions that I'll present to the panel in our question-and-answer session at the end of the teach-in, using the hashtag #404day. We're joined by Deborah Caldwell-Stone, a lawyer and the Deputy Director of the American Library Association's Office for Intellectual Freedom; Sarah Houghton, the Director for the San Rafael Public Library in Northern California, who has also blogged as the Librarian in Black for over a decade; and last but certainly not least, we have Chris Peterson, a research affiliate at the Center for Civic Media at the MIT Media Lab. He's currently working on the Mapping Information Access project. Deborah's going to start us off. Hi, Deborah. Go ahead and start. Yes. Good afternoon, everyone, and thank you for inviting me to participate today. Let's begin by remembering that in the United States, the internet is a place for free speech. In 1997, via a remarkable 9-0 unanimous decision in Reno v. ACLU, the Supreme Court declared that the First Amendment applies to the internet without any limitations. Yet today, internet-filtering software, a tool specifically designed to censor content, is one of the primary means of managing access to the internet in public libraries and public schools across the United States. 
In fact, the filters used in many of our public libraries and schools are the same filters used by repressive regimes in China, Iran, and elsewhere to suppress dissent and limit access to information. Now, we know from court cases here in the United States and complaints made by library users that due to the use of internet filters, libraries and schools are denying access to a broad range of First Amendment-protected speech that includes controversial topics and social media tools. For example, one school blocked access to sites promoting LGBT rights, sites without any sexual content at all, on the grounds that such sites were indecent or unsuitable for minors. Now, in fairness, much of this filtering is the result of Congress's adoption of the Children's Internet Protection Act, or CIPA. That's the law that requires schools and libraries to use internet filters in order to receive federal funding for internet access. Now, congressional efforts to address concerns about children's access to obscene content online began almost as soon as the internet was open to the public. In 1996, Congress passed the Communications Decency Act, or CDA, as its first attempt to impose regulations on internet indecency. The law was a broad ban that criminalized any publication of what it called patently offensive material, and within the year, the Supreme Court struck down that law in the Reno decision. Then Congress tried again, passing the Child Online Protection Act, or COPA, which banned commercial publication of harmful-to-minors material on the internet. It too was struck down on First Amendment grounds, because it effectively barred adults from reading online content that was perfectly legal for adults to view, but might be deemed objectionable for minors. Now, the Supreme Court decisions overturning the CDA and COPA established two important principles. 
First, that the First Amendment applies without restriction to all material published on or provided through the internet, and second, that direct restrictions on speech intended to protect children from obscene content are unlikely to be found constitutional if the laws operate to restrict adult access to constitutionally protected speech. Now, the Children's Internet Protection Act was proposed in the wake of those Supreme Court decisions and responds to them. Unlike the CDA and COPA, CIPA does not put the government in the business of directly regulating speech in violation of the First Amendment. Instead, it places conditions on schools' and libraries' use of federal funds intended to subsidize internet connections, thereby requiring schools and libraries to take on the duty of regulating library users' internet access. In practical terms, libraries and schools that use federal funds to help pay for internet access must install and employ filtering software on their computers and certify to the government that they have a technology protection measure in place that protects against access to certain pictures or visual images on the internet. Now, CIPA only applies to those libraries that receive federal funds for internet access. No library or school is required to accept these funds, and a library can choose to forego these internet subsidies and retain local control of its internet use policies. Now, many libraries have done this, but all too often, libraries and school districts absolutely need these federal funds to provide internet access to their communities and, as a result, have installed filters on their computers. Now, under CIPA, filtering software must do one particular task, and that is to block access to visual images that are either obscene, child pornography, or harmful to minors. 
An image is deemed obscene if it depicts patently offensive or hardcore sexual content that's designed to appeal to prurient interest and lacks serious literary, artistic, political, or scientific value. Now, only a very small amount of material ever meets this test, because most material has some literary value or artistic value, and thus is protected by the First Amendment. Then the law bars accessing child pornography, and this is usually an image of an actual child engaged in a sex act. The law criminalizes these images because of the abuse of the child forced to make the image, but this does not apply to text-based material or anime or images of adults pretending to be children, unless a court of law has determined those images to be obscene. And finally, we come to harmful-to-minors images, and this is the term to describe images that have sexual themes or may even be sexually explicit, that adults actually have a right to access under the First Amendment, but that lack any serious literary, artistic, political, or scientific value for minors. This determination is made by a court of law and is done in the context of what might be harmful to the oldest of minors, that is, a minor aged 17. The measurement is never what might be harmful to a five-year-old, for example. So if you read the Children's Internet Protection Act very carefully, you'll find that the law never employs the word pornography or requires libraries to block porn, a word that really has no meaning under the law. The plain language of the law only requires libraries receiving federal funds for internet access to block visual images that fall under the three narrow categories of unprotected speech that by long tradition have been illegal even under the First Amendment. It was illegal to distribute or own obscenity or child pornography long before CIPA became law, and CIPA has not changed that situation. 
So CIPA does not require or authorize blocking access to narratives or other text-based material, blocking access to constitutionally protected material, blocking access to controversial viewpoints or subjects, blocking access to social media platforms or search tools, or tracking or monitoring users' web surfing habits. So contrary to what some pro-filtering advocates say, CIPA should not be seen as a license to filter constitutionally protected materials for adults. Nor should the law be used to deny young people access to controversial views, social media tools like Facebook, or otherwise deny them access to suitable websites that are erroneously blocked by the internet filter. Such overblocking has become so bad that the FCC has included cautionary language in its last set of rules issued on CIPA that tells schools and libraries that sites like Facebook should never be regarded as harmful to minors and should never be blocked in the first instance by filters. Now, the need to consider the First Amendment when devising filtering policies for public libraries and public schools is confirmed by the Supreme Court's opinion that upheld the Children's Internet Protection Act in 2003. While that plurality opinion held that the First Amendment does not prohibit Congress from requiring public libraries and public schools to use filtering software as a condition of receiving federal funds for internet access, it did place an important qualification on its decision. CIPA could be upheld as long as adults could request that filters be disabled, without needing to explain that request to disable the filters. Allowing adults to disable the filter or unblock websites functions as a First Amendment safety valve and acknowledges that the use of filtering software to deny library users access to constitutionally protected speech on the internet can and does violate the First Amendment. So how has this whole situation affected library users and students? 
Drawing on some extensive research, and on presentations and discussions held during a national symposium on CIPA held in July of 2013, a study conducted by the American Library Association this year identified a huge overreach in the implementation of CIPA, far beyond the requirements and the intent of the law. This overreach stems from misinterpretation of the law, different perceptions of how to filter in libraries, and the limitations of internet filtering software itself. The net result is over-filtering that blocks access to many legitimate educational online resources while often failing to block the images that are proscribed by law, that is, under-blocking. Over-filtering limits access to information and learning opportunities for both children and adults, and it has the unfortunate result of impacting those who can benefit most from public internet access: the 60 million Americans without access to either a home broadband connection or a smartphone, who are required to access the internet through public libraries and schools and are subject to CIPA's restrictions. All of this highlights the need to re-examine how libraries use internet filters and to recommit to providing internet access in a manner that's consistent with the First Amendment and the library profession's own commitment to the core values of intellectual freedom and equity of access. Of course, for libraries that need to accept E-Rate funding for internet access, the challenge is to comply with CIPA while at the same time fulfilling the library's mission to provide content and not suppress it, and to increase access and not restrict it. But it is entirely possible for libraries and schools to meet this challenge and to re-open the door to the internet by adopting First Amendment-friendly filtering policies under CIPA that minimize the use of filters and ensure that the filter is not blocking protected speech. April? Yes, thank you so much, Deborah. That was fantastic. 
And so next, Sarah is going to talk about her experience keeping filters from being installed in the library system where she was working in San Jose. Yes, thank you. I am currently the director for the San Rafael Public Library. We do not filter our internet access here at the library, but my previous job was with the San Jose Public Library as their digital futures manager. And I was hired in 2007, almost immediately after a directive was issued by a city council member for the library to examine filtering. He had been approached by the Values Advocacy Council to try to get filters implemented in San Jose's libraries. And so the library was asked to conduct an extensive, and I do mean extensive, survey, evaluation, and response about both the efficacy and the cost effectiveness of filtering in order to obtain the goal, which was E-Rate funding for the library. The result was San Jose did not filter and still does not filter their access either. This was a huge team effort of the staff down at San Jose, and it took a number of months to conclude our research, which was done in late 2007 and early 2008. The outcomes were basically that the effectiveness of the four filters we tested at filtering out text-based content was approximately 80% accurate. And for anything that was non-text, so images, video, multimedia, RSS, the accuracy there was under 50%. So you literally could flip a coin and get better accuracy ratings than we were getting with the filtering software. And we tested several different software packages, from the cheap to the expensive. And while the results did vary from package to package and configuration to configuration, the accuracy results were pretty consistent. What was interesting to me was that our survey results matched up completely with all of the other survey results that had been done over the years and since: basically an 80% accuracy rating for text content, but less than 50% for images or video. 
And if you'll remember, Deborah was just saying that CIPA's requirement is only that we filter out images that are obscene or harmful to minors. And if your image success rating is less than 50%, is this really an effective tool to do what you are saying the tool is doing? The other outcome that we found from that was that it would actually cost the library dramatically more to implement filters than we would ever get in E-Rate funding. And I think that's a question that too few libraries and their political bodies are asking. So we found that to get about $40,000 in E-Rate funding, we would have to spend somewhere between $140,000 and $400,000 to implement and maintain filters. And that was not a winning scenario. So the whole point of the suggestion was that we were going to be getting free money from E-Rate by putting filters in place, and we disproved that as well. The San Jose study is up on the web on San Jose's website. You can find it pretty easily. And it really does a good job of outlining pretty much all of the factors that you should be considering if you're looking at filtering your library, or hopefully not filtering. Now, this was back in 2008, but since then, I've been conducting my own personal research on filters on a pretty much ongoing basis, because now I'm obsessed with and interested in how they work, how to break them, and how they don't work. And I have not seen anything really change over the last six years. There's been no technological innovation that blew everybody away. There's no new product that trumps all of the rest of the competition. The accuracy results I get doing tests today are the exact same as the accuracy results I got testing six years ago. So I don't think that the technology has gotten better. I don't think that it's magically absorbed some kind of sentient AI that can distinguish between what's obscene and harmful and what isn't. 
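Sarah's cost-benefit arithmetic can be sketched in a few lines. The dollar figures are the rough San Jose estimates she cites ($40,000 in E-Rate subsidies versus $140,000 to $400,000 in filtering costs); the function name and structure are just for illustration, not part of the actual study.

```python
# Hypothetical sketch of the San Jose cost-benefit calculation Sarah
# describes: E-Rate subsidy gained versus filter implementation cost.
E_RATE_SUBSIDY = 40_000      # approximate annual E-Rate funding at stake
FILTER_COST_LOW = 140_000    # low-end estimate to implement and maintain filters
FILTER_COST_HIGH = 400_000   # high-end estimate

def net_benefit(subsidy: int, filter_cost: int) -> int:
    """Money the library nets (negative = loses) by filtering to qualify."""
    return subsidy - filter_cost

best_case = net_benefit(E_RATE_SUBSIDY, FILTER_COST_LOW)    # -100,000
worst_case = net_benefit(E_RATE_SUBSIDY, FILTER_COST_HIGH)  # -360,000
print(f"best case: {best_case:+,}; worst case: {worst_case:+,}")
```

Under these numbers the library loses money in every scenario, which is the "not a winning scenario" conclusion.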
This is a technology that's using an algorithm to figure out what's obscene and harmful, and therefore it is fallible. So filtering products, like I said, they range from the cheap to the expensive, the good to the bad, very basic to extremely configurable, but most of them were designed either for home use by parents, or for ISP-level filtering, as Deborah mentioned, in countries where dictatorships are basically limiting free speech within their countries. And those are the products that libraries have to choose from, so it's not ideal. Another huge problem is that these are black-box products. We have no idea how the companies are coming up with their quote-unquote naughty lists. We have no idea what the criteria are that they use to develop these categories. This is sort of the secret sauce for their corporation, so they're not gonna share that information with their clients. Sometimes you're lucky enough that you have a company that's spot-checking results with an actual live human being, but most of the time the algorithm's results are purely what's being used for the filtering software. It's usually blocking URLs or IP addresses specifically, or looking at keyword analysis. And keyword analysis uses almost exclusively Google for search results. So one really easy way to get around filters is to use a search engine other than Google, and most likely some of the results you're seeing are not gonna be filtered, because they didn't come up in the top 10 or 100 search results when the same search was run on Google. There is some more advanced image analysis being done with filters today than there was six years ago, but it's extremely fallible and easy to break. So there is facial recognition software being used, shape analysis, pixel color analysis, but there are huge, huge problems with that. The pixel analysis looks for skin tone. So if you are of a lighter skin tone or a darker skin tone, you're not gonna get picked up as skin. 
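A minimal sketch can illustrate the per-pixel skin-tone heuristic Sarah describes and why it fails at the edges. The fixed RGB thresholds below are a common textbook rule of thumb, not any vendor's actual algorithm; the point is only that any fixed threshold misses skin tones outside its range.

```python
# Toy per-pixel "skin tone" classifier with fixed RGB thresholds,
# of the kind Sarah says pixel-color analysis relies on. The exact
# thresholds are an illustrative assumption, not a real product's rules.
def looks_like_skin(r: int, g: int, b: int) -> bool:
    """Crude skin check: reddish, moderately bright, enough color spread."""
    return (r > 95 and g > 40 and b > 20
            and r > g and r > b
            and (max(r, g, b) - min(r, g, b)) > 15)

def skin_fraction(pixels) -> float:
    """Fraction of (r, g, b) pixels the heuristic flags as skin."""
    return sum(looks_like_skin(r, g, b) for r, g, b in pixels) / len(pixels)

# A mid-tone pixel trips the rule, but very light and very dark skin
# tones fall outside the fixed thresholds and are missed entirely.
print(looks_like_skin(200, 140, 120))  # True:  flagged as "skin"
print(looks_like_skin(250, 245, 240))  # False: very light tone missed
print(looks_like_skin(60, 40, 30))     # False: dark tone missed
```

The same fixed-threshold fragility is what makes the accuracy numbers Sarah cites so poor for images.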
The facial recognition is extremely fallible; it recognizes things as faces that aren't. The shape recognition recognizes things as genitalia that aren't, and it's very, very easy to break. The big problem that we see with filters is that the better they are at blocking the quote-unquote naughty stuff, the stuff they wanna block, the more they also block constitutionally protected speech. So the more naughty stuff they block, the more totally fine, freedom-of-speech-protected stuff they're blocking as well, which is extremely frightening. The more expensive products are like that, and the more money you spend on a product, the more likely it is that you are gonna be blocking constitutionally protected speech, and therefore putting your library at risk for a lawsuit, because those are the lawsuits that happen: the ones when libraries are over-blocking. No library has been successfully sued for quote-unquote under-blocking yet. So, getting around filters: there are so many ways, with proxy servers, using Tor, using Peacefire, many, many sites set up specifically to get around filters. It's simple to do; kids know how to do it. They carry around Tor or other services on flash drives and plug those in at their school computers. I mean, it does not take a genius to figure out how to get around the filters. Other ways that you can get around them: IP referrer services are rampant out there. You can look at cached versions of sites instead of the live version and get right to it, because most filters don't block that. You can slightly misspell words in your search terms and find access to things that the filter would normally block. If you speak a language other than English, or can use translator software to quickly look up how you say a particular word in another language, you'll probably get through, because most of the filters are extremely English-centric. 
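The keyword-matching weakness Sarah describes, where misspellings and non-English terms slip through, can be shown with a toy blacklist filter. The word list and the matching logic here are invented purely for illustration; real products are more elaborate, but the failure mode is the same whenever the match is against exact English keywords.

```python
# Toy keyword-blacklist filter: blocks a query only if it contains an
# exact blacklisted word. Word list and logic are illustrative only.
BLOCKED_KEYWORDS = {"casino", "gambling"}

def is_blocked(query: str) -> bool:
    """Block only on an exact, case-insensitive keyword match."""
    return any(word in BLOCKED_KEYWORDS for word in query.lower().split())

print(is_blocked("online casino reviews"))   # True:  exact match caught
print(is_blocked("online cas1no reviews"))   # False: misspelling slips through
print(is_blocked("reseñas de casinos"))      # False: other language slips through
```

This is why both patrons evading filters and adult-content providers tuning their pages can defeat English-centric keyword analysis so easily.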
And as I said before, use a search engine other than Google, and most likely you're gonna end up getting through to a lot of stuff that the filter says it blocks. Also, interestingly enough, adult content providers have been working very hard to get around filters. So they are tailoring pages specifically to be below the threshold of the naughty criteria. They misspell words. They intentionally use poor metadata on their pages and services to get around the filters. I wanted to close just by saying that to me, as a librarian, filtering is censorship, plain and simple: it's using a technological means to prevent someone from accessing content. And while I believe that generally the motivations of people trying to get filters into libraries are well-intentioned, they are usually based on the misguided notion that filters actually do what we think they should do, which is to only block images that are harmful or obscene. And that's not what happens. They tend to not block that stuff, but they do block constitutionally protected speech. And installing filters in a library, or using them in a public library, no matter the brand of the filter or the configuration, is censorship. And as librarians, we shouldn't be censoring our public's right to see constitutionally protected information. Worse yet, most of the libraries that feel that they have to filter to get E-Rate funding are in the poorest, least educated communities, meaning that the most vulnerable among our country's population are the ones who are receiving substandard access to the internet and are not able to truly become the functional digital citizens that they need to be to survive in today's world. And so with that, I will turn it back over to April, thanks. Great, that was super interesting, Sarah, thank you. And now Chris is gonna talk about the research project that he's involved in, called Mapping Information Access. He's one of the lead researchers on that project. 
Also, if you have any questions for the group, please use the hashtag #404day on Twitter, and we're gonna take those questions to our panel after everyone's done, thank you. Thanks for having me, April, and to the EFF for co-hosting this with us. As I said, my name is Chris Peterson. I am a research affiliate at the Center for Civic Media here at MIT, and I also work with the National Coalition Against Censorship. And I am co-PI on the Mapping Information Access project, which you can find at mappinginfoaccess.org, along with my collaborators Emily Knox, a professor at the University of Illinois, Shannon Oltmann, a professor at the University of Kentucky, and Shawn Musgrave, who is projects editor at MuckRock, a company that helps facilitate citizen access to public records through FOIA and various sunshine laws with a pretty great app. This project has its roots in an earlier project that tried to figure out the landscape of banned and challenged books in public libraries and public school libraries. But we have recently been modifying the project because, as we all know, books are not the only way that we get information anymore; internet access in schools and in public libraries is a critical component, especially, as Sarah outlined, for the sizable portion of Americans who do not have reliable internet access at home and whose only access may be through whatever venue they can get at their school or at their public library. And what we realized was that, basically, the terrain of information access in these public institutions is still not well known and not well mapped. Deborah Caldwell-Stone had a great introduction today talking about the history of CIPA and the history of the Supreme Court's jurisprudence in this area. And what we've been trying to do is figure out, okay, what does this terrain actually look like? What types of sites are blocked? For what reasons, at what places? How is this manner of blocking enforced? 
Through what regimes and with what provisions? At what cost? And the way that we've been doing this is by working with MuckRock to send public records requests to public institutions, so in this case, libraries and school libraries. We had done an earlier version of this project in Massachusetts just asking about books, but we're currently doing a pilot in Alabama, because alphabetically it's first. And our project took a very simple step at the very beginning, which is we just sent letters and emails to every public school district and every public library system that we could find a record of in Alabama, a little bit over 300 institutions, and said, among other things: what websites do you block? With what systems, if any, and at what cost? And what has been most interesting to me, and what has been most surprising to me, is that the manner of response here from, as Sarah said, really well-intentioned public servants has been just as fascinating as the actual answers themselves, because what schools block, and how they block, or whether they block, turns out to be a bit of a quicksand. So we've had librarians who have responded to us saying, we don't block anything, because we've taken a position similar to the one that Sarah has advocated so compellingly at other institutions. We've had libraries saying, well, we offer filtered internet access, but it's not run by us, it's run by a separate cooperative IT department, so we can't tell you, because we don't know. We've had libraries say, well, we do this in accordance with federal legislation, but only on children's computers or computers in the children's area of the library. We've had them say, well, we do this in accordance with federal legislation on every computer in our system. We have had people say, well, we don't have computers; we operate this small print library, so it doesn't come up. 
And I think that the biggest thing that I've taken away from this project, in which we are still in the earliest stages, is that, to quote Rumsfeld, it's the unknown unknowns that are the most interesting here, right? We thought we went into this knowing that we did not know what sites were blocked, or where they were blocked, or how much it was costing. But what's really becoming apparent to us now is how uncertain the entire public records apparatus is in dealing with a complex system like this. Some of the responses we've had have just been things like, I'm sorry, I can't; we're a poorly funded library and I don't have the time or the staff to be able to answer these questions. Some libraries said, we're more than happy to make these records available to you if you come to Alabama and copy them yourselves. One response that I had that I just wanted to pull up, that I thought was just fascinating, was from one library which uses Net Nanny and blocks all of their computers because children could see them from anywhere in the room, but they also say that not all the categories are blocked for content; some are blocked in order to keep some patrons from monopolizing the computers for long periods of time. And I think what our research is revealing in its earliest stages, and I do want to stress again that this is at its earliest stages and that we hope to expand it to a much broader national scope, is that people think they know how CIPA works. And they think that CIPA specifies certain things and that the federal law does a particular thing, and the disagreement is whether you think it should do that particular thing or whether you think it shouldn't do that particular thing. Should children be able to look at porn on computers in libraries or not? But the lived reality of this is much more complex. 
The lived reality of this is that there is no standard, not only by community, in the community-standards obscenity-test sense, but just no standard across companies; there's no standard understanding of what CIPA entails, there's no standard understanding of what should be blocked or why it should be blocked. Companies charge various things, and, as Sarah said, it's not only that the algorithms are black boxes, but that IT administrators at different libraries can go ahead and say this website belongs in this category, and then they go ahead and put it into that box, wherever. And this is not the level of uncertainty with which the patrons of public institutions, in an area of free speech, should really have to be negotiating. Again, not so much because of the question that everybody thinks we're dealing with, which is this controversy over what sort of stuff people should be allowed to view in a public institution or not, as much as it is this much more residual level of who can see what information, with what resources, under what conditions. And I think that it's so important for librarians and bibliophiles and civil libertarians and supporters of access to information to realize that this isn't just about petty but perennial disagreements over what is appropriate and what is not appropriate. It's not about where that line is drawn; that is only one of the many lines that are drawn here. What it is also critically about is cases like this one, where a library blocks so many things because they have a small library, children can see the computers from anywhere, and they feel that they have to make a decision to limit people's time on Facebook, even if they're adults, even if it's not against the law, because of the lack of resources that are available to them. 
What I hope we'll get out of our research project, and what I think will be of interest to the broader community that's involved in 404 Day and like-minded initiatives, is to map out this terrain more accurately with my collaborators and to help us develop a better sense, so we can say, look, if we are going to have some kind of filtering regime, let's do it consistently, let's do it well, let's understand what we are doing, because it's only once we understand what we are doing that we can really hope to negotiate the sort of free access to information that we all deserve and to which we are all entitled. Thank you so much, Chris. That was super interesting. So now, we did get one question. We've gotten a couple of questions now from folks on Twitter, and if you have questions for us on Twitter, feel free to join us at the hashtag #404day. So we got a question here from the handle at Sean McKay Beck, and he asks, have there been studies on how many minors know how to circumvent filters? So maybe Deborah Caldwell-Stone would know about this, or Sarah. Hi, April. That's one of the issues that we have: there's been very little research, peer-reviewed research, done on how CIPA is being implemented in libraries, and that's one of the tasks we're hoping to take up in the next few years here through ALA. But we do have an unlimited number of anecdotes that we've heard from those who provide IT services to schools and public libraries that students routinely know about and share information on how to defeat filters. I can tell you that my own daughter learned how to defeat filters in her high school, where they had filtering in place, and told me that they routinely traded proxies so that they could defeat the filters in their school library. So I have personal experience with this, but we find that, as with so many things that are banned, when you deny access, those who are denied access devise means of getting around that block or that ban. 
They go out and they find a way around that block on the internet. Great, so I have a question. I would just like to add my own anecdote on top of that stack, because the first time that I ever figured out that I needed to use SSH, or figured out how proxies work, was to evade my school filters, and maybe to a certain extent, thank you, CIPA, because if my own high school hadn't had such a chaotically implemented system, I probably would have never started screwing around with the technical underpinnings of computers to the extent that I have. But I would agree that student circumvention, while anarchically productive, probably undermines the intent of CIPA as constituted. Right, great point. So I'm curious if you all have any tips for librarians that wanna start useful and non-threatening conversations about how to get internet filters turned off or get the blacklisted sites reassessed in their library systems or schools. Sarah, I thought maybe you might have tips on how to get those conversations started. Sure. For us at San Jose, and with the libraries that I've helped on this issue since then, it's really about changing the conversation, and Chris mentioned this too. The question isn't what's naughty and, for goodness sake, protect the children. We all wanna protect children, but the filters aren't doing that. The filters are creating a false sense of security, and as I said, they often cost more than you're gonna get in return in funding through E-Rate. 
And so I would encourage libraries to do what San Jose did, which is a real assessment of the staff costs, equipment costs, and network costs, including everything from training to administration, to unblocking sites one by one, to IT administration, and to change the conversation from "is it morally right or wrong" to "is it cost-effective for the library," because dollars speak in civic agencies, and also "do the filters work or not," looking at the studies that have been done and asking: is this actually doing what we think it does? And for libraries who are looking to configure their filters to be the least restrictive required under CIPA, there have been a number of articles written by Deborah and others about what to look for and how to get the best possible configuration if you are stuck in a place with filters. Deborah, do you have any comments on how to get a conversation started amongst librarians or people in public schools who would like to get the filters turned off? Okay, so we'll move on to the next question then. I'm curious what patrons should do when they encounter a 404 message, or when they encounter a website that they feel should not be filtered. Frankly, they should request that the site be unblocked if they find that they've tried to access a site there's no question they should be able to access, like Facebook or a healthcare site. We find that a lot of healthcare content is blocked simply because the images or the wording often triggers the filters, and they should make a request to their library, pursuant to the CIPA decision, to have that site unblocked. That may start a conversation with the library. When libraries become aware that this kind of blocking is impeding access to things like homework sites, often they'll start the conversation themselves.
We know that here in the Chicago area, a local suburban library removed its filters in their entirety, for all ages, when they found that the kids couldn't even get to their own high school's website because of the filter, couldn't even do their own homework. It's about having a fact-based conversation about what filters can and can't do and what their purpose is, and providing choice to users on that issue. Yeah, so I wanted to jump in and say that one great resource for trying to figure out what types of sites are blocked, and a place to report it, would be the Herdict project, H-E-R-D-I-C-T, the verdict of the herd, you can remember it that way, which is run by my friends and colleagues down the road at the Berkman Center for Internet & Society at Harvard. Ryan Budish, the project manager for Herdict, blogged about this, and I'm sure that the EFF can tweet something out about it, but Herdict is essentially trying to develop a view of the internet from the perspective of end-user computers. How Herdict is implemented, and how exactly you would report to it, is going to depend on which computer you're accessing from and what your conditions are, but definitely check out Herdict if you're interested in helping contribute to a more well-developed grassroots sense of what the internet looks like from a thousand tiny points of light. Herdict is your friend. Yeah, it's a great resource and we recommend people use it too. I'm curious if any of you have examples of discretionary filtering that you might share with us, where librarians or public schools have decided to block without reference to what would be protected by the First Amendment. Yeah, I will say that a number of the responses that come back to us do not try to situate things in the CIPA context at all, although I'm sure that if you asked, they would frame it that way; it's much more, look, we need to keep control over our library.
One response that I read earlier was from a librarian who said: look, we've got a couple of computers, we have a lot of people for whom this is their only internet access, so we put a time limit on them, and we don't only put a time limit on them, we also block social media, because if they can't get to Facebook, they're a lot less likely to use up their time limit. This is even more common in schools, but let's bracket that aside, because there are some different issues with school administration. And I'm deeply sympathetic to this, right? Library administrators have difficult jobs; they're allocating scarce resources when it comes to their computational infrastructure. But it is worth taking seriously the question of: well, if you are a public library, and if you're publicly funded, and if part of your mandate is to provide access to information, then can you, as an institution, really justify your decision to block certain sites as a method of enforcing a form of control and patron management, as opposed to because the law, however well or poorly formed, requires it? And again, that's a deeply difficult question, and I have the deepest sympathy for actual practitioners who are running libraries. But I think that's one thing our research is revealing: that filters are used, not infrequently, as a form of enforcing a particular set of controls over patrons, sometimes using CIPA as a pretext and sometimes not even that. I have to agree with this. We find that the filtering mandate encouraged by CIPA has actually legitimized filtering and censorship as a means of resolving controversies over content in the library, particularly sexually themed content or content that's disfavored by the government. It seems to elevate the interest in protecting children over all other values, including freedom and civil liberties.
And these shifting norms, these proposals to manage issues like hacking, copyright infringement, cybercrime, and more, are being addressed by policies that simply deny users access to technology. And so we're deeply concerned about this. We are developing a set of proposals at ALA to increase awareness about alternatives to over-filtering and the kinds of policies that can be used to enable access in the best of circumstances, and we're working on a toolkit. We will be releasing a white paper on CIPA and internet filtering at the end of the month that will address a lot of these issues. And as I mentioned earlier, we really want to conduct more research on how filtering is impacting individuals' access, so that we can build a factual case for better means of managing internet access in libraries and protect everyone's ability to access information on the internet. Great, thank you so much, Deborah and Chris, for those comments. We have a question now from Twitter, someone who chimed in with the hashtag #404day. Actually, it's a comment, from @Devin Pratter. He says: I honestly think the best way of dealing with this is to educate society, not block. I was wondering if the panelists had any comments on that. Sarah, I'll start with you. I can speak in my capacity as a library director: I've had a few parents and other concerned citizens ask why we don't filter, and they're coming at it from the perspective that we should be filtering. As soon as I tell them how filters work, what they block, what they don't block, how there are no guarantees at all with this software, how it's really hit or miss, and how expensive it is to implement, they're always very receptive to our stance that no, we're not going to filter. And so I've found that kind of grassroots education, person by person as the topic comes up, has worked really, really well for me.
As much as we try to get the word out about filters and censorship and what works and what doesn't, until it's actually relevant in someone's life, they may not be paying attention to the topic. So this kind of one-on-one approach seems to be what works best from what I've seen; educating people about what we are actually talking about does fundamentally change the conversation. And I think it makes it a lot easier to find middle ground when you're all starting from the same facts. Great. So we only have a few minutes left, so I'm going to ask everyone to give us some closing remarks about why they think it's important that we continue to call attention to filtering and banned websites in public libraries and public schools, and whether they see any hope for reform of CIPA; maybe Deborah can answer that best. I'll start with Deborah. Unfortunately, I think CIPA is going to continue to be the law of the land for the time being, given the current makeup of Congress, but libraries can respond to that with improved policies and procedures that look to enabling access rather than denying access to material on the internet. This may mean developing new policies that address bad behavior rather than criminalizing content: for example, developing methods for making sure that everyone has a great experience in the library, and just generally improving education. Sarah mentioned this, but it's just so important: when you provide people with the tools and information they need to manage their own internet access, and to understand how this all works, you can often change minds and change how the internet is managed in libraries. And so we're hoping to continue this whole program of getting the word out to libraries as a whole about over-filtering, how to address it, and how to develop policies to address it, and to try to encourage libraries to recommit to enabling access rather than denying it.
Great. And Chris, do you have any closing remarks that you'd like people to take away with them after this teach-in? Yeah, I guess my biggest closing remark is that I wish CIPA were our biggest problem, which is to say, CIPA isn't great, and I'm not a fan of really any filtering regime, but in the course of my research, what I've come to be much more troubled by is the radical inconsistency with which filtering is actually enforced, which, again, sometimes uses CIPA as a pretext and sometimes does not. If our worst problem were a standard, minimally CIPA-compliant filter in every library and school system in the country, I still wouldn't be a fan of it, but I think it would be a lot better than the status quo. What I hope people will do, whether you're librarians, bibliophiles, civil libertarians, whatever your role is here, is take some time to interrogate your local institution and say: listen, you do hard work. What can we do to help support you in making whatever filtering regime you have, if you have one, as narrow as possible, as narrowly tailored as possible? And if you've gotten to the narrowly tailored point, what can we do to convince you, as Sarah has done so well, to do away with it? I think it's going to take hard, persistent, empathetic, understanding work to do that, and that's the sort of work I hope we'll all be able to accomplish going forward. And Sarah, do you have any last thoughts that you'd like people to leave with? I think that for librarians working in libraries, or community members who are patrons of libraries, where there is filtering, you need to speak up and let people know what's being blocked and what's not working for you as a user. And I would happily offer my services to any individual who's trying to get filters out of their library: I'll arm you with all the data I have and give you whatever tools I can to help you on your journey.
That's one small contribution I can make, and I'm happy to help anybody. Great. Well, I'd like to thank all three of you for joining us, and I encourage everyone who's watching this teach-in to visit eff.org, where we've curated a number of blog posts from people across the country who have shared their thoughts on internet filtering and blocked websites. I hope you all learned a lot. Thank you so much for joining us.