This upcoming panel on data and democracy, with the much-hyped Census 2020 presentation as well, is moderated by Susan McGregor, an assistant professor at the Columbia Journalism School and assistant director of the Tow Center for Digital Journalism. I will give brief highlights from her bio, and it will make complete sense why she is moderating. There she helps supervise the dual-degree program in journalism and computer science. She teaches data journalism, information visualization, and algorithms and ethics, and her work is on information security, privacy, and novel news distribution methods. With that, I will ask Susan to join us and introduce our next panel of speakers. Hi, everyone. Our first presentation is going to explore what we mean by data and what we mean by democracy, and some of the inherent values and assumptions that are built into those terms, and really offer a clarifying overview, opening those terms up for discussion and examination. Our second presentation, and I'll introduce each of the speakers before they come up, is going to look at the effort that is involved in attempting to govern existing automated decision-making systems, and what the real cost of that is. Very often they're presented as a way to streamline decision-making, to decrease cost, to increase efficiency, but the attempt to govern them effectively actually generates many, many layers of additional work to make that happen. And then our third speaker is coming from a great set of projects that's looking at what it means to actually bring participation and governance into the development of these systems. 
So rather than trying to impose governance after the fact, what might it look like if we actually engaged communities in making choices and determining the governance rules for how automated decision-making systems affect them? So it is my great honor to present our first speaker. Renée Sieber is a professor of geography and environment at McGill University in Montreal. She's affiliated with the McGill School of Computer Science and their Digital Humanities Working Group, as well as the Global Environmental and Climate Change Centre in Quebec. She works at the intersection of social theory and computer code and specializes in the use of information technology by marginalized communities, community-based organizations, and social movement groups. She also works on public participation in GIS, I'm sure everyone here knows GIS, geographic information systems, and on the participatory GeoWeb and its relationship to the environmental movement, the development of e-commerce tools, and their use in marginalized communities. So with that. So I'm going to talk about civic participation when data and algorithms make the decisions. I'm not a big fan of the concept of data, first of all, because data isn't power. Power is power. It really is the treatment of data: whose treatment is valued and whose treatment is not valued. So I'm going to talk about civic participation here, but really the question is, what is the role of human beings in automated decision-making, or in automated democracy? That being said, humans are used a lot in automated decision-making and in the artificial intelligence and algorithms that form its basis. But they serve as contributors of data or proofers of data, think CAPTCHA, for example. So they're validating the data. Can it be more than that? Can it be counter-mapping? Can it be counter-AI and counter-algorithms? I think it can be. We have a lot of work ahead of us. 
So I want to go back to the beginning. I'm best known for public participation in GIS. Thanks, Susan. So this is grassroots groups advocating for social change, and this was a long time ago, in the mid-90s, before GIS was easy to use. It's important to say, first of all, that it's deliberative, whereas most of the data we're talking about today is the passive repurposing of ubiquitous and pervasive surveillance. We need to bring back this notion that people contribute data deliberately and know the purpose for which the data is used. Secondly, a lot of the work I've done is activist. It's normative, hence the counter-mapping. It's not descriptive. I think it is important to decide which side of the line we're on. Are we on the side of goodness or are we on the side of evil? Are we merely describing and reacting to what's going on, or are we change agents ourselves? And that, for those of us who are in academia, is really challenging. But I think the time has come when we have to try to put a stop to some things, and we have to empower where we can. A lot of what I do is mapping, but for me, mapping is more than just the visual. Sometimes you have to reach deep into the software architecture. This is an ongoing project I've had with the Cree people in northern Quebec. This is a geospatial ontology. I won't go into the details, but when I talk about Indigenous people and semantic arrangements, the way that we conceptualize space, you probably expect to see the visual, or, if you see something structural like this arrangement of features, you probably expect me to describe things about rivers or the land or hunting. But we have to reach even further, into what are called upper-level computational ontologies, where, for example, there is no universal way to describe space. 
We have to accept that even though Web 3.0, the semantic web, assumes that everything is interoperable, maybe we have to fight interoperability, or at least challenge it. Second, in geographic information science we have historically fixed geographic features, like mountains, in space and time, and the Cree people pointed out that geographic features cannot exist without time and, more importantly, that space and time have agency. So with the Cree people, time is actually speaking to them. When we say that there is such a thing as a living past, that is not merely metaphorical or rhetorical. It actually means that time is telling the future where to go. And that really disturbs our notion of, no, no, no, that's spiritual, we'll just put that over there and get to the really hard technical stuff. This is my work, actually with Columbia and the NASA Goddard Institute for Space Studies, on climate models. These students are working with actual climate models. This is just to push the point that people look at AI as a black box and say, oh, there's no way that marginalized people could ever do this because, frankly, people in government can't do it. They outsource all this stuff. So how would it even be possible? Do you just react to, say, the risk assessment measures that are big in Canada right now? No. If you can get school kids to run actual climate models, and yeah, it breaks all the time, but they still run them, then why can't we build counter-AIs? So that's what I'm doing right now. I'm starting with natural language processing and feature detection. I think it's possible. But I want to conclude my part by saying that when we think about the role of technology and data in digital urbanisms and in democracy, we tend to focus down here on the civil society challenges. 
And that's really important, because issues of digital literacy and digital divides and deficits of resources, and how you even define the breadth and depth of participation, persist no matter what technology you throw at them. So it's not really a technological problem, even though we keep thinking the next technology, just around the corner, is going to fix all the messy democracy that we have. That's not true, and we should fight that idea, even though that puts me in a terrible position, because I'm actually advocating for the technology. We also have the issue of institutional norms, like the idea that more data is always better than less data, even if it's biased data. Why is that? Why should we accept that? Or that quantification is unfortunate but necessary, and, like I say, just wait for version 2.0 and we'll fix all that: we will grab all that qualitative small data and magically transform it into quantitative big data. That's one of the big challenges with natural language processing and sentiment analysis, this belief that we've figured all that out. And also, as an aside, data allows us to replicate a consensus model of dispute resolution. There are huge value differences in a democracy, and I don't think we should ignore them and assume that we can do speech-to-text conversion of public consultation meetings in real time, this is actually happening in Montreal, where they've said, oh, we can convert this, run natural language processing on it, and figure out what people want. No. Listen to the people who are in the room with you. You don't need to layer technology on top of it. And lastly, there are structural epistemologies that the data is confounding. I call the first one the triumph of liberalism: data has allowed incredible individuation of the world. 
It's both a way to be biased against you personally and to customize your life to make it really convenient. That makes us incredibly isolated, but it also makes it very difficult to collectively organize. And I think that we need to fight, I'm not talking about neoliberalism, I'm talking about fighting liberalism. Lastly, this session is about data and democracy, and digital data affords a kind of technocracy. It is important to remember that technocracy is a form of democracy. So it is quite easy for us to say, well, our lives are so complicated and complex, why don't we just hand it to the experts, this disembodied data treatment, and that will make our lives smoother, so we can do other things, like spend more time on social media. I think that we have to find ways to engage with and extract joy from the messiness and the dissent of democracy in the face of this quantification. That's it, thanks. Thank you so much. And now, to share with us insight about the complexity of governing some of these automated systems: Greta Byrum reimagines the way that we design, build, control, and govern communication systems. She's the co-director of the Digital Equity Laboratory at the New School in New York City, where she builds digital justice through applied research, community projects, and policy strategy. She also founded the Resilient Communities program at the New America Foundation, where she brought training, tools, and equipment for storm-hardened mesh Wi-Fi to five neighborhoods in New York City's flood zones. Byrum was a 2017 Harvard Loeb Fellow and currently serves on the boards of the New Harmony Earth Sanctuary and the Metropolitan New York Library Council. Thank you, Susan, and thank you so much to Lila and the organizers. It's great to be here. 
I am a GSAPP grad, and back when I was at GSAPP about a decade ago, I used to talk a lot about how communications technology requires infrastructure, and GSAPP would tell me, that's not urban planning, go over to the journalism school, you're not talking about planning. And I was like, but it's infrastructure. So I'm going to talk about infrastructure today: data infrastructure, but also literal infrastructure. The title of my talk is that the future is already here, it's just poorly managed. I was, of course, thinking of the old line from, I think it's William Gibson, who said the future is already here, it's just unevenly distributed, and I think both are true. So I think it's an open question: when we really think about what's happening now as the development of a new kind of infrastructure, is it possible to build democracy into it? I'm going to ask a lot of questions, and I'm also going to talk about the census, so get ready. Okay, so starting off, this is what the future of infrastructure looks like. Does anybody know what this is? Just shout it out. 5G. I like to call it the Fyre Festival of infrastructure, because it's all hype and no plumbing, and I mean that quite literally. So 5G is very high-bandwidth, short-range communications infrastructure, and it involves the massive build-out of all these small cells, those rectangles are small cells, which have to be extremely dense. So it really only works in urban areas, not in rural areas, and it's really resource-intensive, so you're not going to see a lot of 5G development in poorer areas either. It also depends on having really good underground pipes, the wireline data infrastructure. So in places that have been digitally redlined, which is where the industry has not invested in the underlying infrastructure, you won't have 5G networks; you'll have substandard infrastructure in those areas. 
So that's 5G. It basically deepens and makes material a kind of inequity that exists already. And really, if you look at redlining maps from the 1950s and then you look at where digital redlining is happening now, it's the same exact map. We also know about 5G that it feeds surveillance capitalism. It's a wide net, basically a dragnet of all kinds of data: license plate readers, facial recognition systems, sensors, beacons, a constant hose of data being processed at the edge and sent through the 5G systems. "Happily," that will be concentrated in affluent, well-resourced areas. Well, it's not happy, but at least we're not doubling down on some of the surveillance that is already overbuilt in marginalized and poorer areas. So yeah, 5G and the future of data-driven infrastructure is not just one system. It's a combination of overlapping systems, and we can't really think about each of them in isolation; it's an overlapping pattern of data gathering. And by the way, when Sidewalk Labs says they're not going to use the data they're gathering for advertising, they are using it for a new company called Replica, which will actually automate the job of urban planners. So look that up. Replica just spun out in September and raised $11 million to automate planning jobs, and that's what they're going to use the data they're gathering in Toronto for. So I just want to say this is basically the back end of 5G. When you have this kind of wide net of data gathering happening in urban areas, this is what it looks like on the other side, which is the cybernetic city. You have essentially a hose of data. And by the way, by 2020 we'll have produced something like 44 zettabytes of data, the equivalent of 44 trillion gigabytes. So just imagine that volume of data. What do you do with it all? 
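The unit conversion behind that zettabyte figure is simple arithmetic; here is a quick back-of-the-envelope check in Python, assuming decimal SI units (the figure itself is the speaker's estimate, not something this sketch verifies):

```python
# Sanity check: 44 zettabytes expressed in gigabytes, decimal SI units.
ZETTABYTE = 10**21  # bytes
GIGABYTE = 10**9    # bytes

total_bytes = 44 * ZETTABYTE
total_gigabytes = total_bytes // GIGABYTE

# One zettabyte is a trillion gigabytes, so 44 ZB is 44 trillion GB.
print(f"{total_gigabytes:,} GB")  # 44,000,000,000,000 GB
```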
And I think we have almost reached the point where, because we're generating so much data, we have to find ways to do things with it. So in a way we're creating a cycle of demand for data-processing products, which in the end are algorithms, or automated decision-making systems. The cybernetic city, and this is a concept I'll attribute to Rob Goodspeed, draws on cybernetics, the science of communication and control in an organized system. The theory is that if you collect enough data, you have something like an engine: a control loop with sensors that are measuring all the time, and actuators, and you're constantly using all the data you're gathering to tune the system. So the idea is that if we do that with our social systems and our culture, we should be able to tune our culture based on this constant influx of data that we're pulling in over 5G. It creates this imperative where we think, okay, we need to do something with all this data, we need to organize our systems better. And then we get into predictive analytics and automated decision-making systems and decisions about what to do with the hose of data. And then we get things like this, the futuristic cybernetic control room of the future city. And I love this example; I mean, the Fyre Festival analogy is a joke, but it's also real. In Rio, they built a control center like this that was supposed to control for flooding that was happening in informal settlements on the hillsides. They were taking all kinds of sensor data and trying to fine-tune everything from their central control center. But what they didn't do was deal with the plumbing that was being built in those informal settlements. So they continued to have mudslides despite having this very sanitized control infrastructure in the city. 
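The sense-compare-actuate loop described above can be sketched in a few lines. This is a toy illustration of the cybernetics concept only, not any real city system; the gain and step count are made-up values:

```python
# Toy cybernetic control loop: a sensor reads the system's state, the
# controller compares it to a target, and an actuator applies a
# proportional correction. Purely illustrative assumptions throughout.

def control_loop(state: float, target: float,
                 gain: float = 0.5, steps: int = 20) -> float:
    """Repeatedly sense, compare, and actuate; return the settled state."""
    for _ in range(steps):
        error = target - state   # what the sensors report vs. the goal
        state += gain * error    # the actuator nudges the system
    return state

# Starting from 0, the loop settles very close to the target of 10.
print(round(control_loop(0.0, 10.0), 3))
```

The point of the sketch is the structure, not the numbers: the loop only "tunes" what its sensors can measure, which is exactly why the Rio control room could not fix plumbing it had no data about.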
So yeah, we have this strange feedback loop that's emerging almost by virtue of the fact that we're building this kind of infrastructure. And we also have industry, the telecommunications industry and the technology industry, seeing a return on investment from surveillance: surveillance capitalism, a term I'll attribute to Shoshana Zuboff, although she also attributes it to many other people. So this control-center mentality brings us to a point where we're heading headlong into a future we're not sure we really want. But what do we do about it? And this is where I think the question of governance comes in, and where we get into this problem of: do we regulate? Do we advocate? Do we organize? What's the answer here? And I think a lot of it is that we don't even realize, in some cases, what's happening. So here we are: census. Census 2020, everybody. It's the most boring topic that ever became a huge part of my life for a year. But the census is incredibly important. I'm sure, as planners, you are all aware that the census will decide how district lines are drawn after 2020. It'll decide how we're represented in government at every level, and it'll also decide the fate of $800 billion in federal funding that flows through the states and the municipalities. And it's going to be online for the first time in 2020, so get ready. And this reaction, which is just somebody's comment in the New York Times, is really typical of everybody I've talked to: wait, what? The census can be online? Seriously, people think the citizenship question is the biggest thing, but it's going to be online, and we all know the terrible things that happen online. What are you talking about? And I do return over and over to these words of Donald Rumsfeld: there are the known unknowns and the unknown unknowns. 
And in dealing with the Census Bureau for the last year, there are many unknown unknowns. They don't like to answer questions. So the census will be online for the first time: 80% of households will be asked to respond online. That's the plan. And mind you, 30% of households in the U.S. don't have internet at home. So that's where we are. In addition, the census is integrating machine learning systems for non-response follow-up that will be integrated with GIS and administrative data. And instead of paper forms, they're using what they call "device as a service," which is iPhones being used by canvassers. So what's going to happen is somebody's going to knock on people's doors with an iPhone and say, I'm here from the government to ask you some questions. So that's what's happening. And by the way, the citizenship question supposedly will not appear, and I say supposedly because they still A/B tested it in June. But what's happening instead is that federal agencies are going to share data directly with the Census Bureau, including the Department of Homeland Security, and supposedly they're going to share sensitive information like alien registration numbers. And we still have a digital divide. That's the other thing that comes up with the census: again, 80% of households will be asked to fill it out online first. There will be an automated voice response system, so people will also be able to call a number, but 80% will be asked to fill it in online. This is just an example of what the digital divide looks like. It usually looks like a doughnut hole. So you see those kind of purplish areas. This is actually census data, which is one of the tragedies: we'll lose the best data we have on internet adoption along with going digital. 
Those purplish areas are areas where 40 to 60% of households do not have home internet of any kind. And they tend to be the poorer urban areas in addition to the rural areas, which are what you usually hear about. So what we're doing at the Digital Equity Lab is working with New York State's library system. We've developed a curriculum for them to help train both frontline staff and IT staff to prepare for the census. It's sort of the only thing that we could think to do, but that curriculum will be ready, I think, in early November, so I can follow up on that. So that's census; I hope I didn't bum anybody out too much. But yeah, I can answer lots of questions about that. Meanwhile, we have stories coming out basically every day. I'm just going to name a few things that have been really alarming in the last few weeks. Number one, HUD has a rulemaking being considered right now which would protect algorithmic decision-making when it comes to housing, make it impossible for people to sue based on the discriminatory impact of housing decisions, and protect the IP of the technology companies that are producing those algorithms, so they would be completely black box. That's being decided right now; basically, HUD has announced their intention to do that. Also, things like this tweet from Daniel Rivero, a local reporter who looked at the proceedings, like a thousand pages' worth, that were going to go before a commission in the city of Miami, and discovered that there was a 30-year contract in there on light poles that would have cameras, license plate readers, and flood sensors, with data they could sell to anybody. So that's public data that they could just sell. And they had to postpone the vote because this guy tweeted about it, but it's still being considered. A couple other things. 
Amazon Ring, if you guys are watching that: they have 400 partnerships with police departments right now, and Amazon is coaching police departments on how to talk people into giving up their Ring data. And public funds are being used to pay Amazon for installation of this equipment. Meanwhile, there's a fight in New York State right now over facial recognition technology that's already been introduced by private landlords in the city. So these are all things that are happening right this second. So what are we going to do about all of this? Oh yeah, I forgot about this great example where the New York Police Department was using a facial recognition system. They had some grainy video of a guy who stole beer from a bodega, and the police officer who was looking at the footage said, I can't really see the guy's face, but he looks like Woody Harrelson. So they fed Woody Harrelson's image into the facial recognition system, generated a new composite image, and then arrested somebody based on that composite. That's in a report from the Georgetown Center on Privacy and Technology, if you want to check that out. So yeah, what do we do? In 2018, the New York City Council passed a law setting up, at first it was going to be a commission, that would review automated decision-making systems, or algorithms, used by any city department or agency. And then they decided they weren't ready for that, so instead it became a task force that was going to develop a plan for how to do that. That task force has been convening for about a year, and the report is due out in November. But what has happened is that the process has turned into a bleakly bureaucratic one. 
And by that I mean, you know, there are amazing, wonderful leaders on the task force: leaders in data science and algorithmic fairness, civic leaders, great people from around the city. But there was no coherence or clear directive in the law that the city council passed. And so they spent a really long time just trying to figure out what an automated decision-making system even is. How do you define it? The law was written so broadly that a calculator would have qualified as an automated decision-making system. And when you have hundreds of ADSs already being rolled out by city departments, just the sheer work of saying this one qualifies as something that needs review and this one doesn't is already a problem. Then you have the additional problem that many city agencies and departments were not very happy to give up information about the systems they were using; in particular, the New York Police Department was not. And the last problem they had was what Albert Fox Cahn of S.T.O.P., a surveillance watchdog group in New York, calls the salience gap: when you try to hold a public process around something like ADSs and you say, we're going to have a meeting about ADSs, people say, you what? You're going to do what now? So people, lawmakers in particular, have a hard time understanding what they're deciding about. So we'll see what happens with the task force's recommendations next month. There's also a group convened right now by the NAACP of New York and the Legal Aid Fund, which is going to hold a convening in early December where they're going to actually hold a DiscoTech, help people understand ADSs, and then have a kind of organizing push around it. So what can civil society do, if regulation is really hard? I just love these folks. This is the Connecticut Four. Does anybody know who the Connecticut Four are? 
Raise your hand if you do. They're my heroes. No? Okay, too bad. Now you know: the Connecticut Four were a group of librarians who were served with a demand for patrons' borrowing records under the Patriot Act, along with a gag order. And they fought back, took it all the way up through the courts, and won. So they are amazing, and they're librarians, and one of the reasons they protected people's data so strongly was the ALA code of ethics. So where can we find ethical codes, professional codes like this, where people feel empowered based on a professional code that they have sworn to uphold, to protect people's privacy and security? We also need multi-level policy advocacy, because one thing we've seen is that these kinds of decisions around ADSs tend to be made in court, and so we'll often have overlapping jurisdictions, or we'll have a decision made in a court where state policy is actually different. In the wonderful cases where people have come together and organized for facial recognition bans, like in Oakland, San Francisco, and Somerville, that doesn't stop ICE from using facial recognition in those jurisdictions, because those are municipal bans, not federal. And then something like the state's housing policy: even if a judge in Brooklyn says this landlord is not allowed to use a facial recognition system in that building, the state may still have a different policy. So we really need to understand that there's a jurisdictional problem in regulation and start to do advocacy on every level. We also need to do awareness building and education, DiscoTechs everywhere, and look to codes and standards where they exist. And then I want to shout out some of the work we did at New America with the Resilient Communities project, where we developed a kit people can use to build their own infrastructure. So this is kind of like the mini version of a wireless network. 
We've now rolled these kits out all over New York City, and you can mesh them together to do a ground-up, DIY network. And the thing that's awesome about this is that people get hands-on, and it demystifies networks for them. Whether it's something that's going to work at scale or cover the whole city is not really the question. The question is: how do people relate to technology? How do they feel about it when it's in their hands and they feel like, oh, I can make this, I'm a producer? And I wear two hats, I should have mentioned. One is co-director of the Digital Equity Lab at the New School, with Maya Wiley. The other hat I wear is director of something called Community Tech New York. We are the sister project of the Detroit Community Technology Project, and together we are something called the Community Technology Collective. And we are trying to build a movement while we are building the networks. This is from some work that we did last month in rural Tennessee, building a new network. And I'm really excited that Janna Skates is going to get up next and talk about the Equitable Internet Initiative, of which we're just a new piece. She'll talk about the history and what they're up to now with the Detroit Community Technology Project. So I wanted to leave you all with a little bit of hope that we can build something different, and also a note of caution that we don't have to rush headlong into this future that's rushing towards us. So thank you. And so, as Greta mentioned, next up we have Janna Skates, who is the director of programming for the Equitable Internet Initiative at the Detroit Community Technology Project. She has a background in program management, marketing, public relations, and communications. 
She works with anchor organizations in three Detroit neighborhoods, Islandview, Southwest, and the North End, seeding community technology programming, including DCTP's Digital Stewards training program, local expansion, outreach strategies, partnerships, program implementation, evaluation, and internet adoption. So Janna is going to share with us a vision of an alternative approach: how we might design these systems in a participatory way, and what that alternate future might look like. Well, thank you, Susan. I'm really grateful to be here with all of you today. Really appreciate the shout-out, Dr. Benjamin and Greta. Yeah, so today I just want to share with you some of the work that we do, give you some context, talk a little bit about the history, and then tell you why we do this work and why we feel that it's important. So the Detroit Community Technology Project: we use and develop technology that's rooted in community needs and strengthens human connections to each other and the planet. Before I jump into what the Equitable Internet Initiative is, or EII as I'm going to call it moving forward, I just want to give you some context. Some parts of Detroit are seeing lots of growth and development. But since 2015, Detroit has consistently remained at the top of the list of worst-connected cities in the country. In addition, 38% of homes have no internet connection at all, 63% of low-income homes have no in-home broadband, and 70% of the school-age children in Detroit have no internet access at home. And the backdrop to all of this is that the median household income is $26,249. So this map shows the percentages of homes in the city with no household internet subscription. Like Greta mentioned, it's kind of a doughnut-hole effect here. You'll see the blue areas; those have the highest percentages of internet access. All of the red is about 49 to 87% of homes with no internet, and the orange, 35 to 49%. 
The small orange dots that you see are the smaller pockets of internet access that exist in the city. Just think about how often you use your technology, how often you're online, and what your day might be like if you didn't have that opportunity. If you lived in one of these red areas, your life could be substantially different. You might not be able to do things like search for and apply for a job, or access important government resources and information online, like about local elections and the census, and you might not be able to fully participate in an economy that's moving online more and more. Factor into this that you come from a community that's consistently excluded or underrepresented, and that you live in a low-income community that has very few resources. And this is really why we believe that communication is a fundamental human right, why we believe in digital equity, and why we created the Equitable Internet Initiative. As part of this initiative, we work with three neighborhoods in Detroit, Southwest, Islandview and the North End, to build community-governed wireless networks, otherwise known as ISPs or internet service providers, to bring their communities online. Just a little bit of the history: we really started talking about this program in 2015 and began having conversations around it. In 2016 we solidified partnerships with Church of the Messiah on the East Side, the North End Woodward Community Coalition in the North End, and Grace in Action in Southwest Detroit to launch the program. In 2017 we completed a training for 45 Detroit residents, 15 in each neighborhood, where they learned community organizing and wireless engineering skills to build out the network infrastructure. Last year was the pilot year, where we connected about 150 homes, about 50 in each neighborhood, and this year we really focused on expansion, adoption, sustainability and resiliency.
These are the things that we prioritize: homes with no internet connection at all, or those who otherwise wouldn't be able to have a connection; homes where adults or children are participating in educational programs; senior citizens, since our Islandview neighborhood has one of the largest concentrations of senior citizens in Detroit; and those living in vulnerable areas who often experience flooding and things like water shutoffs. What makes us different from a traditional internet service provider is that we practice data justice, and we do so by training community technologists, by creating sets of principles and trainings, by building relationships, and in our policies. Really integral to this work are our digital stewards, who build out the network infrastructure. These stewards are community organizers, artists, educators, and neighborhood leaders; they live in the community in which they work; and as you can see, they are people of color, because we are challenging this idea that the face of a technologist has to be white. They also range in age from elders to teens. We train them as community technologists, and what that means is that they build technology in a way that heals relationships and helps to restore the neighborhood. Some of the ways they do this: they're really intentional about how they build, design, and expand the networks; they work with neighborhood advisory councils; and they host participatory design sessions with their communities and the people on their network. Our work is guided by a set of collaboratively designed principles. This is really how we ensure that our networks remain autonomous, and it's how we know that we're really helping those who are the most impacted. These principles guide our work, they guide how we collaborate, and they help to keep us aligned and focused.
Then training: last year our stewards went through a three-day intensive training, led by DCTP's data justice director, on digital security, privacy and consent, where they learned how to think critically about data and privacy, safety versus security and what that means for their neighborhoods, and best practices for securing their network. We also focus on building relationships. We don't make assumptions about what's best for our communities or what they need, and we don't run into these communities with a list of solutions we think will help them. We focus on collaboration, engagement, and building relationships that are authentic and transparent. Some of the ways the stewards do this: they initially canvass their neighborhoods, literally going door-to-door to talk to people and survey them about internet access; they partner with local businesses and community gathering spaces to help build out the distribution network; and they host trainings and workshops in their neighborhoods that are focused on adoption and security. The neighborhoods also build relationships with each other. When we pulled these three neighborhoods together in the beginning, they didn't really work together a whole lot, but now they share staff, in cases of steward turnover or if someone goes on leave or a vacation, and they share services and resources and collaborate on hosting trainings and events. We also create responsible policies. We don't sell or share data. We practice net neutrality, and that's written into our agreements. We're transparent about the information that we collect and why. And we have a privacy policy that our stewards actually verbally review with each new client. So while we're working within EII to create and model this new world we want to see, we still have to address the reality of the implementation of harmful technologies in our communities.
Where we are focused on building relationships, there are technologies being designed every day that inspire fear, suspicion and surveillance of each other. Where we encourage participation and engagement, companies are creating technologies that eliminate human involvement altogether. And where we practice consent, the gathering, buying and selling of data without consent has become very lucrative. These are some of the programs that have recently been rolled out in Detroit. The first is Project Green Light, a real-time police surveillance camera program that local businesses buy into with the idea that it will keep their businesses safe and reduce crime. Since its launch, it's expanded into schools, churches and affordable housing communities. Another technology is facial recognition, whose use in our communities the Board of Police Commissioners in Detroit just recently approved. The real issue here is that Detroit is still a majority-Black city, and many of the neighborhoods are not experiencing the economic growth and development that others are. These neighborhoods are seeing gentrification and the implementation of dangerous technologies that are used to surveil them, profile them and further harm their communities. And so the work of the Detroit Community Technology Project, specifically this data justice work, is led by our Data Justice Director, and we also work in coalition with other organizations like the ACLU, Georgetown University, BYP100, Fight for the Future and others to educate the community around these technologies, lead resistance to them, and call for moratoriums on facial recognition and for responsible governance.
This is really the tension that we, and I'm sure other organizations, are facing: trying to create the world we want to see while also having to navigate our way through harmful systems and infrastructures. That's why we use these two approaches: resistance within the existing systems, and on the outside, modeling and trying to create the world that we want to see. Because we believe that media and technology are really essential to helping transform a community: they create opportunities for participation, collaboration, access and engagement, particularly for those often excluded, like people of color, women and people who identify as women, and people who identify as LGBTQIA and beyond. They also help communities to realize their economic and political power. Outside of this work, in my spare time, I'm an energy healer, and so I think of the process of tearing down these harmful systems like disproving a harmful belief that's had years to percolate. These systems need to be uprooted, dismantled and reframed so that new and more participatory systems can take root. Thank you.

So I just want to thank everyone for your fantastic presentations. My mind is bursting. I think one of the themes that has been addressed here, and much of this work, is very close to my heart in the work that I do around journalism and communications. Janice, I was hoping to start with you, because I think one of the issues that's been raised is this idea of how we govern after the fact, and I'm curious about the policies that you all have developed at EII to govern the engagement with communities and the building out of that network. The part of me that wants to quantify is curious: how long did it take you to develop those policies, to come up with things that you felt were principled enough to then give to the digital stewards and to the communities and say, this is the outline?
Yeah, well, the first round of the policy agreements was kind of a, I don't want to say a mad dash, but we were just starting and we didn't know what we didn't know about how ISPs operate. We had one of our engineers put together the policies, but then we had at least eight lawyers review them, and we also had the anchor organizations each go through them, and we offered them the opportunity to give feedback on things they thought should be in there. So we had actual conversations about, for example, including net neutrality, because I think it was actually the year that we officially launched EII that net neutrality was overturned. Yeah, so the year after it was overturned. And then, like I said, we also have the stewards review the privacy policies verbally with the communities. That's really just to help build the relationship and create trust, because these are often people who haven't had internet access before, especially senior citizens, and you can't give them a 12-page document and ask them to sign it without explaining it to them. We also wanted to let them know how it works with a traditional ISP: you'd get these policies, and they might offer you internet for five or ten dollars a month, which is what Comcast and AT&T are trying to offer now in Detroit, but there are all these loopholes where they have opportunities to raise those fees and add fees into the contracts. So we specifically have the stewards do that, because we want the communities to know that we're not just trying to make money off of this project, because it's actually not, you know, super profitable. We're trying to bring access, and we're trying to stay true to that mission.

Part of the reason why I ask is because I've been following the work of the ADS task force here in New York City, and I think about how drawn out that process has become.
And one of the things I think about at this intersection of data and democracy, as somebody who actually studied computer science many, many years ago, and who works with computer scientists, most of my research is with computer scientists, is the contrast between engineering and, not everything else, but a lot of other practice methods: the desire in engineering to come up with a single solution, right? One solution that applies to everyone and everything. And so I was curious, because again, to contrast with the ADS process, which is trying to maybe create a governance or even a definitional structure. But also, Greta, in your presentation, you brought up this issue of, well, we have these very piecemeal regulations. I'm wondering how likely, or even appropriate, you think it is to strive for a single governance system, or whether we would be over-engineering at that point. Because one of the things that strikes me as really significant about the work I think you're doing with EII is that, in principle, the components of the technology are the same, but each network looks different, depending on the needs and interests of the community. And so I'm curious what you think: is there a balance to strike between an overarching regulation, which to me sounds kind of appealing, and the worry that we're flattening the ability for grassroots representation?

Wow, okay. No, it's a really interesting question, because we're working now with the Community Technology Collective, trying to build Equitable Internet Initiative projects in upstate New York and rural Tennessee, which I mentioned. And one of the challenges we're finding is that it's really difficult to take all of the policies that work in Detroit and bring them to these other contexts.
So, for example, in Tennessee we're having a really hard time finding what's called a backhaul provider, somebody that will provide a data source for the Equitable Internet Initiative project there, and we may not be able to bring over all of the EII policies, and then we'll have to have a collective retreat, probably, where we spend a lot of time just talking it out. So, yeah, it's a very time-intensive process. But I think when it comes to something like ADSs and civil liberties, we're talking about something different there. We're talking about civil rights. And in that case, we do need to have a coherent governance structure, because it's the Bill of Rights. And, you know, the nature of our democracy is that we have federal, state, and local governance, and if those things are at odds with each other, then how does anybody enforce the law? And the law right now is the Bill of Rights and the Constitution, knock on wood. So let's try to agree to stick to this, and maybe strengthen it where we can.

Thank you so much. And Renee, this makes me think of that wonderful triangle that you showed, and this idea of structural epistemologies. Is the Bill of Rights where we should start with that? If we try to think about these systems and what we are trying to preserve or replicate into the future, it strikes me that at a technical level this is not at all what these systems do. But does that seem like a sensible place to aspire to? I'm wondering, and this is really for all of you, where do we begin when we think about what we should be preserving or guaranteeing when we try to have these systems in our democracy? Or do we exclude them? Are there places where they just don't belong?

I've taught for many years with law scholars, and I've taught law students, and it's made me realize that law isn't everything. Not everything is amenable to being adjudicated as a rights issue as opposed to a norms issue.
And I think we need to start inside governmental organizations. Government employees are often our friends, and they're sometimes as frustrated as we are with the abuse and the slowness of government response. Although, given that participatory democracy is actually enshrined in governments as a result of many, many years of pushback advocating that citizens have standing, I think there's something to be said for the slowness of government, and for not adopting agile methodology as the way they approach everything. So I'm not entirely sure that it should always be rights-based, because when I teach with a law scholar, you know, half the class ends up thinking, oh, I'm going to go into law now, because law will solve everything, because everyone follows the law, right? It's like, no, they don't. And you can't legislate everything. So I think it really behooves us to think about how we convince, or co-opt, or something, government employees.

So you see government employees as potential partners in these efforts. Yeah, because there isn't the capacity in those organizations. Everything is outsourced. They're sending out RFPs for automated decision-making systems, and they have no clue what they're getting. So we can help them out. Right. I don't know, would either of you like to add to that, in terms of whether we should be starting with rights as the place for defining these systems? I mean, I sort of think, at this moment in the United States, many of us have learned the hard way over the last several years that norms go quickly in the wrong context. And so I'm wondering if you think establishing the norms is enough, or...

I think they go hand in hand. I think part of your question is, what is important to preserve? And for us, that is the relationships, that's the history, that's the culture in the neighborhood, and people's privacy.
In our Southwest network, there's a really large Hispanic population, a really large immigrant population. And so with the rollout of technologies like Project Green Light, it's not only harmful, it's actually really dangerous. We've heard stories of ICE actually sitting outside of some schools, and I had one person tell me that she was waiting for her friend to come to school one day and she just didn't show up, because ICE had apparently grabbed her. What's essentially happening is that families are being torn apart. So I think that, on the ground level, on the very basic level, is a place to start, but I also think that the rights and policy work is important.

Good. Did you want to add anything? I just, I mean, I think that's right. I think we sort of have to honor both rights and norms, and struggle with the problem of how we enforce any of that. But to Renee's point about government employees, I think one of the things that we're seeing in a lot of agencies right now, at the federal level in a lot of cases, is that career folks who have been there forever are really struggling with some of the directives that they've been given. For example, the Census Bureau has amazing staff who are doing their very best to make this 2020 census pan out. Well, it's not going to. But, you know, I'm so grateful that those people are there and that they're doing their best to create integrity in the process. So yeah, I think we're at a really strange point where we have to appeal to maybe the norms that exist in people's minds, or, you know, the historical memory of our democracy, and then also remember and honor local histories and cultures as well.

I actually recently taught a class on algorithms at the journalism school.
And I just want to encourage anyone who isn't familiar with how machine learning systems work, which are the kind of decision-making systems we're talking about here: there's a wonderful paper by a statistician named Galit Shmueli that was published in 2010, called "To Explain or to Predict?" I recommend it because I think it provides one of the most clear and compelling explanations of what automated decision-making systems do, which is essentially to replicate the world as it has been delivered to them. She uses this fantastic analogy with Van Gogh's Starry Night. You can look at the Starry Night image and ask, is this a picture of a real place, and use that image to try to infer what that village actually looked like. Or you can look at it, imagine a piece is missing, and say, I'm going to look at that painting and try to fill in the blank. And that filling in the blank is what machine learning systems do. They take the data that they are given, which is the picture of the world that has been fed to them in the form of data points, and try to replicate it. And the fact that they're really only capable of positive feedback loops, of replicating what they have been fed, I think is a really fundamental thing to consider as we think about what these systems are actually capable of doing and whether that squares with our desire for representation and democracy.

Can I shout out one more paper on that note? It's called Dirty Data, by Rashida Richardson and her co-authors at the AI Now Institute, basically examining predictive policing systems and how historically biased policing data has gone into creating predictive systems that then replicate that bias in their predictions. So that paper came out this year. A couple of years ago? Well, yeah. Maybe last year. Dirty Data, AI Now. Yes. Thank you for tying back to the morning. I'm going to leave it there, but thank you for tying back to the morning.
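That feedback-loop dynamic can be made concrete with a toy simulation. As a hedged illustration (the neighborhoods, numbers, and allocation rule below are invented for the example, and are not drawn from either paper), suppose two neighborhoods have identical true incident rates, but one starts with more recorded incidents because it was historically policed more heavily. If each year's patrols are allocated in proportion to past records, the system only ever re-observes what it already believed, and the bias in the data never corrects:

```python
# Two neighborhoods with equal true incident rates, but neighborhood A
# starts with more recorded incidents (historical over-policing).
true_rate = {"A": 0.1, "B": 0.1}     # identical underlying rates
recorded = {"A": 100.0, "B": 50.0}   # biased historical record

for year in range(10):
    total = sum(recorded.values())
    # "Predictive" allocation: patrols go where past records are highest.
    patrols = {n: 100 * recorded[n] / total for n in recorded}
    # New records scale with patrol presence, not with the (equal)
    # true rates, so the existing skew is reproduced each year.
    for n in recorded:
        recorded[n] += patrols[n] * true_rate[n]

share_A = recorded["A"] / sum(recorded.values())
print(f"A's share of recorded incidents after 10 years: {share_A:.2f}")
```

After ten simulated years, A's share of recorded incidents is still exactly two thirds, the same skew it started with, even though the underlying rates are equal: the model "fills in the blank" with its own past output.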
Part two: I will also tie back to the counting and the tracking. Just to give you a chance to answer this question for everybody in the room: given what's written in the Constitution about the decennial census, and given that I think you said that the gaps produced by non-responses will be filled in by machine learning, is the 2020 U.S. decennial census designed to be an enumeration? Is it designed to be a count?

Oh my gosh, that's such a good question. Yeah, that's the whole question. So the Constitution mandates that the census should be an actual count. That's the reason that, whereas the American Community Survey is sample-based, the decennial census is supposed to be just a straight-up count. So in order to try to answer that question, I recommend that people read the Census Bureau's operational plan. I'm just kidding. I don't actually, because I read it. Is there a quick summary? No, it's really complicated, because for the non-response follow-up system, what they actually did was collect administrative data from every municipality in the country and build essentially a very advanced mapping system. And yes, it does fill in the blanks, but the blanks it fills in are based on reasoning like: given the composition of neighborhood X in terms of households, we can impute that neighborhood Y looks like that as well. And that is how the Census Bureau does the follow-ups that will inform the count. So they can sort of say that methodologically it's a count, but the way they're designing the system that will follow up and continue to canvass neighborhoods is informed by that machine learning process. I will say they are only planning to canvass a quarter of the blocks that they canvassed in 2010, so take from that what you will about imputation. So I think it's a really complicated answer, and I think it rests on how we understand the technical systems behind this.
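To make the "neighborhood X informs neighborhood Y" idea concrete, here is a minimal sketch of donor-based imputation, the general family of technique being described: fill a gap by copying from the most similar observed record. All block names and composition numbers below are invented for illustration; this is not the Bureau's actual system:

```python
# Hypothetical household-composition profiles for blocks that were
# canvassed (observed): shares of family / single / group-quarters homes.
observed = {
    "block_X": {"family": 0.6, "single": 0.3, "group": 0.1},
    "block_Z": {"family": 0.2, "single": 0.7, "group": 0.1},
}

def most_similar(hint, observed):
    """Pick the observed block whose composition best matches a partial
    profile hint (e.g. from administrative records)."""
    def distance(profile):
        return sum(abs(profile.get(k, 0.0) - v) for k, v in hint.items())
    return min(observed, key=lambda b: distance(observed[b]))

# Administrative records suggest the uncanvassed block_Y is mostly
# families, so we "fill in the blank" from the closest observed block.
hint = {"family": 0.55}
donor = most_similar(hint, observed)
imputed_block_Y = dict(observed[donor])  # block_Y inherits block_X's profile
```

The point of the sketch is the epistemic move, not the arithmetic: block_Y's "count" is whatever the nearest canvassed block looked like, which is exactly why whether this still constitutes an enumeration is contested.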
And the final thing I'll say is that the Government Accountability Office issued 500 flags on the Census Bureau's systems, many of which have not actually been finished yet, even though we're going into the count in March. And basically, at this point, the GAO is saying the flags are only going up in number, and its major recommendation for the Census Bureau is to create a system to deal with the recommendations.

Hi, I had a question on the census. There is a switch to differential privacy as a way to protect individual-level data, which essentially means protecting against data reconstruction instead of just data re-identification. And I was wondering how that fits into the conversation about data and democracy. How it fits in with what? Into the conversation about data and democracy. Yeah, so differential privacy is the Census Bureau's new disclosure avoidance technique, replacing the data swapping used in past censuses, to prevent re-identification of de-identified data. The Census Bureau recognized that data processing systems are so advanced now that it would be quite easy to re-identify census data if they released it as they have in the past. And so they've decided to employ this technique, whereby they inject calibrated statistical noise into the counts in the official release. That's the plan for the release of the 2020 data. Of course, that also means that for any given block the published figures could be entirely wrong, and when you get into districting, that becomes a huge issue. It also means that anybody who's not a huge academic institution, or an otherwise very well-resourced institution, is only going to have data that has had this technique applied to it. So how much will we be able to actually rely on the veracity of this data? It's really a question, and one hotly debated among statisticians. I'm not one of those.
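For readers unfamiliar with the mechanics, here is a minimal sketch of the Laplace mechanism, the basic building block of differential privacy: publish the true count plus noise whose scale is calibrated to the query's sensitivity and a privacy parameter epsilon. This is a textbook illustration, not the Census Bureau's actual system, which applies noise hierarchically and post-processes the results:

```python
import math
import random

def laplace_noise(scale, rng):
    """Sample from a Laplace(0, scale) distribution via inverse CDF."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1, u) * math.log(1 - 2 * abs(u))

def dp_count(true_count, epsilon, rng):
    """Release a count with epsilon-differential privacy.
    A count has sensitivity 1: one person changes it by at most 1,
    so the noise scale is 1 / epsilon."""
    return true_count + laplace_noise(1 / epsilon, rng)

rng = random.Random(0)
noisy = dp_count(1000, epsilon=1.0, rng=rng)  # e.g. 1001.17 for this seed
```

Smaller epsilon means more noise: stronger privacy, but individual block counts drift further from the truth, which is precisely the accuracy-versus-privacy tension the statisticians are debating.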
I will give one last plug, which is that here on campus there will be an event on the census as well, led by a colleague of mine who is a journalism professor and statistician. So keep an eye out for that. Unfortunately, we have to conclude. This has been really wonderful. Thank you all so much for your presentations and participation.