Welcome to the second annual PIT-UN convening. We're grateful for the opportunity to meet virtually, and I want to thank New America for agreeing to host this event for us. I hope that we will be able to meet in person at Arizona State University next year, where we'll have our third annual convening. Today's sessions are open to PIT-UN representatives, including designees, your development and communication teams, your 2019 grantees, and our new 2020 Network Challenge grantees thus far, as well as our wonderful funders, which include the Ford Foundation, the Hewlett Foundation, the Mastercard Impact Fund and Mastercard Center for Inclusive Growth, the Patrick J. McGovern Foundation, the Raikes Foundation, Schmidt Futures, and the Siegel Family Endowment.

On tap for the day is the following; I just want to give you a sense of what to expect and of the through line for the day. We are going to revisit the 2018 definition of public interest technology, which was created by one of our first working groups. We have several members on the upcoming panel who were part of that working group, including Deirdre and David Eaves. We will then take a deep dive into the need for public interest technology across multiple sectors in our demand panel. Then we'll take a 15-minute break, so get up, shake out your legs, refresh your coffee, and come back for a highly interactive session around the field-building areas with Spitfire. You all are very familiar with these five field-building areas, so we're going to take a deep dive together as a group and exchange some of the lessons learned over the past year. And then, finally, we're going to close out our internal meetings for the day with a presentation from the team from Handshake. They're here to demonstrate how you can use Handshake as a tool to help connect your students with jobs in public interest technology.
I hope that you will return at 7 p.m. Eastern to join us for our very first public event of the convening, Centering Racial Equity in PIT, with Elizabeth Garlow, the Deputy Director of the New Practice Lab at New America. Have no fear: we will be sharing this video across our platform, so if you've missed this session, or any of the sessions, we're hoping that you'll be able to access them sometime in the future. So stick around with us for the remainder of the day. I look forward to the next two days of intense discussion. And I just want to say as well that I want to thank you all for your leadership around public interest technology on your respective campuses. I hope that we will have a set of fruitful discussions that will only improve the work on your campuses and make public interest technology a viable career pathway for your students. So thanks again, and let's get started.

Thank you so much, Andreen, and thank you for all that you and New America have done in organizing and giving life to the PIT University Network. And thanks to all the techs and all the support people who've really put in an enormous amount of effort to bring us together in these strange circumstances. I see Z1s and Z2s and Z3s, and I know that they must be doing something very helpful and productive behind the scenes. So again, I'm Dave Guston. I'm the PIT designee here at Arizona State University, and I direct ASU's School for the Future of Innovation and Society, which has issues around public interest technology central to what it does in the world. I also have a new role here at ASU, as associate vice provost for discovery engagement, which now sits within a larger structure called the Global Futures Laboratory. And it is really my privilege today to moderate a discussion that revisits the definition of public interest technology that the University Network has been working with.
We have three of our panelists here right now; maybe our fourth is going to show up. I know that she is on the other side of the planet and it's rather early her time, so maybe there are some challenges there. I know there were some challenges for me this morning, and I feel sort of breathless, so let me give us a little moment of repose, perhaps, before we dive into this. One of my many habits in trying to humanize this instrument of Zoom is, when asking people to introduce themselves, to have them share something a little personal that they would not have shared in a formal professional introduction, but that might have come up in the chit-chat before a meeting, or that we would know from the deeper dives into conversation that we are precluded from having. My personal point of introduction is that I have a black belt in taekwondo. I would also like to invite our other panelists to share that kind of personal information, and also to add a tidbit about what it was that perhaps spurred their interest in public interest technology, or their earliest memory of something cognate to public interest technology.

After the introductions by the panelists, we are going to start the conversation by revisiting the definition that, as Andreen said, David and Deirdre and their colleagues posed for us a bit ago, and look at how it's fared over the past year and a half or two years that the network has been in full operation. Then we'll talk about other kinds of cognate terms: responsible innovation, and responsible research and innovation in particular. Hopefully, Sujatha will arrive for that part of the conversation, since I had intended her to be principal in it, because responsible innovation is a framework that has been used perhaps more globally, and the University Network is potentially thinking about expanding onto a global scene.
The differences or similarities between the PIT framing and the responsible innovation framing will need to be discussed a little bit. And then I thought we might slide into discussions about PIT framings in the private sector, and how ideas like corporate social responsibility and its related terms and practices have evolved over time. And then we will open it up for a broader discussion among all of you. So, again, you've heard my professional introduction, and you've heard that I have a black belt in taekwondo. And I have been paying attention to this sort of PIT-related stuff for so long that when I was five or six years old, my favorite book was Mike Mulligan and His Steam Shovel. For those of you who are unfamiliar with it, it is a great treat for your children or grandchildren, and even for yourself: a Depression-era storybook about technological unemployment. So that was my first, adorable, interest in PIT. Now I'll turn it over to David to introduce himself, and to unmute himself, because it's 2020.

So my name is David Eaves. I'm a lecturer in public policy at the Harvard Kennedy School, where my teaching focuses on digital transformation in government; I'm particularly interested in how governments can use technology to create and provide public goods more effectively. Maybe the lesser-known fact about me is that I also have a career, or maybe a side project, as a professional mediator and negotiator, and I have a real passion for the environment. I've worked with many, many environmental groups around the world, helping them think about negotiation strategies as they pursue their public policy objectives. As for my early connection to thinking about technology, the memory I go back to is that I was dyslexic as a kid. And I remember I had a friend whose dad was involved in software, and we didn't have a computer.
But they had a computer, and it became very interesting to me, and I managed to persuade my parents to get one. I was probably the first kid who wanted to have WordPerfect and Microsoft Word on the computer, because I very quickly realized that there was something called spell check, and then ultimately grammar check. And I was probably the first kid to ever submit all of their papers printed out. I share all this because I'm fairly confident that if it had not been for spell check and grammar check, I wouldn't be here today; I would have been streamed some other way. So there's a lot of luck and a lot of privilege that allowed me to get to that place. When I look back, I think a lot of people would have said, oh, is that cheating, or is David following the system in the right way? But it was actually the only way that I was able to navigate the system, and so technology shaped my life in an important, profound way.

David, thank you for sharing that; that's wonderful.

Hi, it's a pleasure to be here today. I'm Deirdre Mulligan. I'm a professor at the School of Information, and I also have an appointment at the law school. I started my career in law, in public interest law in particular. Like David and Dave, I got an internship: I was one of the first public interest fellows at Georgetown's law school, and got a summer position at the ACLU. I was offered a spot in a bunch of different issue areas, and the one that I chose was with the Privacy and Technology Project, because it was led by Janlori Goldman, who I had been told would be a fabulous mentor, and she was.
And while I was there, I realized how male dominated the field was. Janlori, Kate Martin, and I would go to these meetings with mostly FBI and national security officials around encryption policy, and it was a sea of the men in black, right, and then us, and maybe Dorothy Denning. I realized that this was a field that needed to be diversified, but also that I wanted to be a lawyer because I thought law was a very important tool for social justice. And because I was working on encryption policy, I realized that the design of technical systems could also be a really important tool to protect values that we care about. So the reason that I moved from the law school to the School of Information, after a career at the Center for Democracy and Technology, which I helped start, is that I really care about a set of social values. I care about social justice and the public good, and both law and technical design, as well as organizational design, are necessary if we want to protect them going forward. My tidbit about myself, following on Dave Guston: I played soccer in college, and I have continued to coach and play most of my life, but having blown my ACL out three times, now I'm mostly on the sidelines.

Sorry to hear that. Okay, thank you, Deirdre. And now, Sylvester.

Good morning, good afternoon, good evening, depending on where you are in our big world. I'm excited to be here on this panel. I'm Sylvester Johnson. I serve as the executive director of the Tech for Humanity initiative at Virginia Tech, which is our effort to promote human-centered, humanistic approaches to technology. I'm also assistant vice provost for the humanities, and I direct our Center for Humanities.
An interesting factoid about me is that I ran a cafe with my spouse some years ago, a lunch and breakfast spot, so I know how to pull a mean espresso; if you ever need one of those, I can actually turn out a pretty good one for you. So I am really excited about the public interest technology initiative. My own entry into this began with my work as a humanist, a humanities scholar, running a digital humanities project that required us to develop a deep neural network algorithm to process the collection we were working with. That turned into an AI project that I was leading, and it made me realize there was an interesting connection between the way many people in humanities traditions have associated thinking and reasoning with being human, and my realization at the time that there were increasing efforts to get machines to do what looks pretty close to what's called thinking and reasoning, or making decisions. And so I realized that there were tremendous humanistic questions about this. What is the place of humans in a world where we are becoming increasingly sophisticated in our efforts to get machines to do things that we've been calling human: decision making, decision trees, and the human-machine connections of a hybrid intelligent infrastructure? From there I began to understand that so many of the institutions of power, and the decisions that affect people's lives, are being shaped by technology innovation. So, as someone who has spent a lot of time studying social systems like race and politics and empire and religion, I began to recognize that there had to be a much more robust engagement with technology from humanists in order to help address some of these larger questions that our society is facing.
And from there, one thing led to another; I began to participate in a number of different efforts, individual and collaborative, in order to garner a more robust response. So I'm really excited that our institution, Virginia Tech, is part of this PIT-UN network, and I'm looking forward to our work together to bring our best efforts to these important challenges.

Okay, thanks, Sylvester. It doesn't look like we have Sujatha coming in yet, so I think we will dive in, and I'll try to moderate a lively conversation so that this effort does not contribute to the Zoom and gloom that we've all been facing under the current circumstances, without having to resort to the mute button. Okay. So, David and Deirdre in particular, you two were involved with the current definition that the PIT University Network has been using, and I'll read its very short version for the audience; kudos to you for coming up with something incredibly concise: "Public interest technology refers to the study and application of technology expertise to advance the public interest, generate public benefits, and promote the public good." I want to give either of you the opportunity to dive into that as it was initially constructed. There are three points that underscore it; if you could just elaborate on that as a starting point for all of us, before we get into issues of what might come next and how it has performed. Deirdre, David, do one of you want to take up the slight elaboration of that?

I'd like to defer to Deirdre first.

Sure. Dave, why don't you go, since you started?

Well, for me, if I think about the kind of expanded definition, certainly what I saw as a big challenge in coming up with this definition is that this is a very, very broad area that we're trying to harness and capture as a field.
And then there's a spectrum of people that I think we're also trying to bring together. When we were constructing this, I was very concerned about how we make sure this isn't just about creating more expertise, about creating what I think are ultimately going to be interdisciplinary experts who connect the humanities and technology to think about technology plus the public interest, but also about creating people at the other end of the spectrum who are going to be effective consumers of that expertise. A large number of the people who are going to be making decisions about our society and our future are not going to be these experts; the decision makers are going to be politicians, administrators, public officials, CEOs, or nonprofit leaders. So how do they have enough knowledge to actually engage with the experts, to know whether they're getting good advice, to challenge them, and then to take that advice and run with it? So one thing we tried to do in this definition is not make it too broad, but make it inclusive enough for that spectrum, which I think really matters.

Yeah. So, it was a really fun exercise to work on this definition with the group of folks that we had at the time: Ed Felten at Princeton, Jeremy Weinstein at Stanford, Tara McGuinness at New America, David, and I. We sit in different departments; some of us, you know, I've known Ed for 20 years or something, some of us had worked together for a long time in this area, and some of us were really just meeting each other for almost the first time. And I think there were a few things that really carried us forward. One, you were talking about responsible innovation and responsible research.
And I think one of the things that we really wanted to be clear about in this definition of the public interest technology field was the orientation towards social justice. We can do responsible innovation while developing for private interests; we can do responsible research while developing something that is purely for private gain. What we viewed as really distinct here was the orientation towards public service, towards social justice, towards the public good. And I think there was an understanding that we needed a set of interdisciplinary experts who not only had expertise that, as David was saying, spanned multiple disciplines, but that we also needed to cultivate communities of interdisciplinary practice that were going to be specifically oriented towards public service, social justice, and so on. I think that's very similar to the effort that was led by the Ford Foundation and others around public interest law: lawyers are trained to be ethical, but there's also a subset of us who really want to use our tools in service of the public good, the common good. The second thing that I think is important: often people from the social sciences, from the humanities, and even from the professional schools are viewed as being in service of innovation or in service of more technical expertise. This document was really an effort to say that these are complementary expertises, and that we need people who fully embrace the fact that to be a really strong practitioner in this space, you need all of those things. And another key thing, there are many things to talk about, but another thing that David Eaves really brought to the table, which I think was really important and significant, is what I almost think of as a feminist ethics in the definition.
He really wanted to emphasize, among the kinds of skills people needed, empathy, engagement, and negotiation. And those are incredibly important if you want to do interdisciplinary work. There's a degree of modesty and humility you need to bring, and a little bit of, you know, I was the kid who was always asking, "what does that mean?" The willingness to ask questions, because people are often using terms in ways that are completely different. And if you aren't willing to be the person who asks the question, the work is never going to progress. To do that, you need people who are willing to be active listeners, and willing to answer questions; that kind of empathy and engagement is another important part. I remember when I first moved over to the School of Information and was teaching our master's students, in a required course I teach on information law and policy, students would say, oh, there aren't even any computer scientists in Congress. Well, not a lot of doctors either, but that doesn't mean they don't have the capacity to understand and craft meaningful law in this space. You cannot come in being dismissive; you need to go in and understand that your job is to actually meet people halfway. And I think cultivating people with the skills of modesty, humility, empathy, engagement, and bridge building is really essential to building a field, because none of us advances social justice alone. The kinds of people we want to bring into the field, the field we're trying to develop, have to be part of social justice coalitions; they have to work well in government agencies; they have to work well on the Hill, or wherever it is they're going to be. So I think David brought that to the table, and it's incredibly important.
I love the metaphor of meeting halfway in these interdisciplinary conversations. One of my rules of interdisciplinary collaboration, actually, is to be willing to go more than halfway, because there is also a set of power dynamics underlying these relationships, as you suggested, in the perception that the humanities and social sciences may be in service to the natural sciences. Being willing to go more than halfway, to take that extra little step so that you're not asking for everything from your potential collaborators, is a great way to get those collaborations started. Sylvester, you described briefly in your introduction part of this perspective coming from the humanities, and then succeeding in building a large project and set of teams. What do you think about the interdisciplinarity, from the perspective of the humanities, that's been built into PIT?

Yeah, thanks so much, David. I think the definition and the conceptualization articulated in this document are really brilliant, and in many ways timeless in their emphasis on being civic-minded and really emphasizing the interest of a public. I can't imagine that will ever get out of date; if it does, we're really in trouble. So the foresight, and I think the longevity, that has shaped this articulation is really wonderful. I think it's also really important to build on and draw on its focus on comprehensive approaches that move across disciplines. There are different terminologies that people use, multidisciplinarity, interdisciplinarity, and articles have been written trying to parse how those are or are not the same thing, but I think the important takeaway is that there is no single discipline, or any particular set of disciplines, that we can point to exclusively and say these alone are the disciplines people are going to come from in order to address the leadership challenge with technology.
There is no area of knowledge or expertise that is irrelevant to guiding and governing technology and addressing these challenges, and so the inclusivity at the level of knowledge and skills and methods and approaches is, I think, really one of the inspiring things about this definition of public interest technology, with an eye particularly to the humanistic disciplines and human-centered approaches. I think it's important to appreciate that one of the changes we're now seeing in the legibility of technical challenges is the easy apprehension of the fact that we need expertise from across areas, across disciplines, across sectors. That's a lot easier for people to see today than it was 20 years ago, if you look at the debates that are happening. And I think this is not so much that we're smarter than we were 20 years ago; I think the technology has just reached a place where it shapes so many aspects of our lives. Everything from what counts as money: if you have digital currency, what do you do with central banking? If you have cryptocurrency, for example, that kind of question or challenge is not necessarily an engineering question per se. It's about the very philosophical concept of what counts as money; it's about the legal apparatus and regulatory framework surrounding it. I think also, if we look at something like the impact technology is having on highly vulnerable populations, it becomes even more important for us to really invest in PIT-UN's emphasis on diversity and inclusion. The more highly vulnerable populations you include in whatever you're doing, it doesn't matter what you're doing, the more you're inclusive of people who have the most to lose when things go wrong, the better outcomes you're going to get in that guidance or governance process. That's true of anything, and it's especially true of technology. So it's not even only about disciplines.
It's also about who has the most to lose, who is most vulnerable to problems not being addressed, or to a very narrow demographic of people exclusively getting to shape or control policy or design or implementation and to make decisions about technology. There are things like the future of work; so, your example, David: you were clearly a very precocious young child, thinking so early, at the age of five, about the future of work and technology, but definitely that's an equity issue. So I think the definition is really inspiring, and I think the team that put it together did a great job of keeping it very open while also giving clear substance to it, the substance being not a set of departments or disciplines or topics, but rather the outcomes we want. We want civic-minded, public-interest-invested work to achieve the outcomes that are going to create the most equitable situation, that are going to give us the kind of society we actually want to live in. And I think this is relevant also to some of the ways that PIT-UN has in turn been shaping how public interest technology functions, because the funding has supported disciplines across our colleges and universities, so there's a lot of representation from arts and humanities, as well as law and policy, as well as engineering and education. You see that representation; it's been very intentional. And the call for funding has also been very explicit in really thinking about how we need to design the work that we do, if we're serious about the public when we say "the public." One of the points in that definition is that if you're only talking about a single demographic of the public, that's not really public interest. You have to mean everybody; it's got to be broad. So you have to be deliberate about being inclusive, and the funding calls have reflected that commitment, so that we get the outcomes that we need.
So I wanted to push each of you to think about that hard-to-define public interest, and I think Sylvester was getting there in the latter part of his remarks. The definition, the blog post, the four-pager that we're familiar with from the beginning of this conversation, acknowledges that the public interest is very difficult to define. My disciplinary background, which I didn't mention: I'm a political scientist. And this is one of the challenges that my discipline, whatever fealty I owe to it, which is very minimal, faces in thinking about the variety of ways in which people might define the public interest. I've heard some sort of proto-definitions here, but there is also a variety of ways in which people reject the public interest, or reject defining the public interest, or reject conceiving of such a thing even linguistically: perhaps there is not a public but rather a set of publics, or a mass of individuals. I think we've seen some of those issues percolate up in the context of the most recent campaign. We've seen a lot of those issues right in our faces in the context of the Black Lives Matter movement and how it erupted, in all sorts of ways, over the past year into the public and political consciousness; and if you look at the exit polls, issues of equity, one way or another, were involved in a lot of people's opinions as they went to the polls. So I'm wondering, I guess the fundamental question here is this: we're often used to thinking about science, technology, and innovation as something neutral, something from which politics has been extracted. That's the image most of the public has of science, technology, and innovation: if it gets complicated, it's more sort of a tool that can be used in one way or another, and then these issues politicize science, technology, or innovation. Yet what I take our enterprise to be trying to do is to say: no, wait a minute.
It's not that the politics is added at the back end; there is a politics, there is a set of values, intimately interwoven with science, technology, and innovation from the very start. How is it that we get that across in the context of a public interest where sometimes even speaking of the public interest is fraught?

I'm happy to start that off. And Dave, I know this is an area where you do research, and I'm sure you have lots to share too. Obviously, the questions of what problems we think are worth solving and what questions we think are worth asking are all, of course, very political in nature. But the public interest technology field that we're imagining is one that understands that everything from the problems we choose to focus on, to the decisions we make in how we model things and how we engineer them, to the assumptions we make about humans and organizations and behavior, all of those things are inherently political choices. And, like all of you, I teach. I teach a large undergrad course, well, large for me: 145 students, not large for some of you, I'm sure, called Behind the Data: Humans and Values, in our undergraduate data science curriculum. How's it going this year? Well, we talk about whether artifacts have politics, and we talk about the difference between data and capture, and then I have them explore Zoom and think about the way in which Zoom reorganizes power and authority. Like, imagine me going over in the classroom and doing this to you; that would not be cool. Or imagine deciding you can have ongoing side chats while I'm lecturing; that would not be cool. Yet here, it's all cool. Right. And so in some ways, this semester in the cloud, and I think most of us are now looking at a year in the cloud...
I think it's providing a really interesting opportunity for people to really experience the ways in which our technical choices, the designs of systems, and the things we choose to use technology for, and not, matter, because there have been very interesting acts of resistance: the resistance to the algorithm used in the UK for determining A-levels; resistance to remote proctoring, I think at many universities across the country; questions of accessibility and privacy and security as people have experienced Zoom bombing; and I won't even go into some of the privacy issues that have arisen. I think the grounding and focus on how integrated politics and technology are couldn't be more apparent than at this particular moment within the educational community, because all of a sudden technology is affecting teaching and research and even access to campus in ways that people heretofore didn't quite realize; for many of them, their blinders were on. And as we know, once systems are stable, they become background infrastructure and people don't see their politics. So we are at a moment of transition, a unique moment of possibility, where people all of a sudden realize, because the politics have been upended, that their expectations have been unsettled. I think this is an incredibly fruitful time for those of us who are trying to build a greater understanding of the deeply social and political nature of technical systems and their integration into different spheres of social life. Professor, can I toss this back to you, too?

Absolutely. Yeah, I think it's definitely important for us to consider all of those factors. And when we are thinking about the public, I think it's especially critical that we keep in mind the different levels of vulnerability.
And so one of the things that I think is helpful for articulating and conceptualizing how we need to approach the question of the public has been articulated already in that definition, and Deirdre referred to this a few times and David did as well, and that is social justice. So that raises the question: where is the injustice? Where is the inequity? I said earlier that when we think about the people who have the most to lose from things going wrong, or from a problem not being addressed, and not just avoid excluding them from the decisions, not just think about them, but actually include them in the process of figuring out the knowledge, developing the regulatory framework, or the teaching, whatever it is that we're doing, we're always going to get better outcomes, even if we don't end up with a perfect situation. And that should not be thought of in a static way, as if here are the 10 people who have the most to lose, and we can just go through all the issues and it's the same 10 people. It depends on what we're talking about. If we're dealing with healthcare decisions and the use of, it could be, data infrastructure around privacy, or getting access to data, or decision making in the diagnosis of medical presentations, for example, then we have to think about the populations with respect to that issue who might have the most to lose. If we're looking at the use of algorithms in law enforcement, things change; we know that there's huge disparity in race and policing. If we're looking at military decisions, we really need a global approach to this. So if we're talking about weaponizing AI, that's not one political party versus another; it's really different regions of the world where these weapons are going to be deployed and by whom, and which peoples then become most highly vulnerable. And all of that, I think, tells us that when we approach the question of the public, we have to recognize we're not talking about a static notion of the public.
We have to recognize the need to be dynamic and to be nimble in shifting from one focus to another. If we're really concerned about equity, and if we really do want democratic outcomes, and I mean that in the small-d sense of really equal access, equal results, and just situations, then we have to recognize that's going to depend on what issue we're talking about. And if we are using that as our parameter, to figure out, when things go wrong, who's going to be the most hurt by this, who's going to suffer the most, who's going to lose the most, who could be the most devastated, that approach is going to give us more equitable results. Great. And I just want to remind people, at least according to the counter in front of me, we've got about 15 minutes before I want to turn over to broader Q&A. David, just as a sort of exit question around the definition and how it was constructed and where it's been: do you, as one of the creators, see any soft spots, any weaknesses, as PIT has played out since the crafting of the definition? I'd like to answer that by linking it with this previous question, because it's not surprising to me that PIT has happened now, in the moment that it's happening. You know, I think about industrial policy and technology policy in the United States and globally. I read the Vannevar Bush biography last summer, or the summer before. The United States and many other places have really seen technology as a place where we just push as far and as fast forward as we possibly can. And that does two things: it encourages speed, and it encourages siloization so we can specialize to move forward as fast as we possibly can. And what's now happening in the digital context is that the technology has matured enough that it's starting to form infrastructure. You say, well, what is the public interest? That's a hard question, but I can tell you what public infrastructure is.
And it's very clear now that digital technologies are at a point where people are suddenly waking up and realizing: wait a minute, all of this stuff that we've been building is now digital infrastructure, and this is an infrastructure problem. And the way that we think about equity around an infrastructure problem, I think, is different from how we think about equity around a consumer-type problem. So if I think about the law, well, it took us 5,000 years before we really started to think about equity in the law. And when I think about physical infrastructure, the story is actually not that much better. I talk to a lot of my students about accessibility and curb cutouts, the fact that people in wheelchairs can navigate the streets, and how that innovation actually benefited people like me with kids in a stroller, there's nothing better than not waking your kid up as you're bouncing down and up the street, how that benefited a whole bunch of people, and how it took us, again, thousands and thousands of years before we started to realize that infrastructure in the physical sense had to be built for everybody, with an inclusive, social justice framework around it. And so I think right now we're at this very important tipping point where digital infrastructure is being recognized as public infrastructure. And part of our goal is to apply this lens to that infrastructure and ask these questions now, as opposed to 2,000 years from now, after it's all built and baked and it's too hard to rip out any inequities that have been even further entrenched in our society.
So that's why I think PIT-UN is happening right now: because we recognize that need. The challenge with that, and kind of returning to the definition, is, I'm doing this work trying to get faculty around the world to figure out what is the minimum knowledge around digital that public leaders need to have. And NASPAA, which is the body that accredits public policy schools in the States, has actually removed technology as a requirement for certification. And I think the reason they did that is because it became so diffuse that it was hard to tell what was digital. And I kind of laugh, because I'm like, that's the reality of our world: digital is now part of everything. And so the challenge for us is how we wrap our hands around what is now effectively everything, because every service is a digital service, everything touches digital. How do we wrap our hands around that and still have it be coherent? It's just an enormously complex challenge. I want to move on, and I'm going to come to you next, actually, Deirdre, so if you want to weave this into your next set of comments, please do, but I want to begin a transition. Deirdre, in your earlier comments about things that you taught, you referenced, at least obliquely, Langdon Winner's "Do Artifacts Have Politics?", and Winner is an incredibly important thinker for me as well, because I lodge my normative perspective in the identity that Winner drew between technology and legislation. He said that technology is not just like legislation; they're really two forms of precisely the same thing. They are things that we do collectively to create the structures through which we live our lives and pursue what we see as good in the world.
And therefore, if we have a set of underlying norms and institutions and so on that relate to how it is that we make laws, and we can refer to those roughly as democratic institutions, maybe we should have the same sorts of norms and values and institutions directed toward our construction of technology. Oh, here's Sujatha. Apologies. That's okay. We'll get to you in a second, Sujatha. And so part of what I think the conversation around the public interest, in the context of public interest technology, is about is reconfiguring how we think about the public-private divide around technology. And over the past year or two, we've seen some of that reconfiguration coming from the private sector side as well. BlackRock, the huge investing group, has now put some incredibly interesting conditions on which companies it will and will not invest in. The Business Roundtable has started talking in terms of stakeholder capitalism and not just shareholder capitalism. There are other kinds of examples coming from the private sector that go beyond corporate social responsibility, which had been, in my opinion, the sort of pale version of simply being a decent actor in the community. Deirdre, particularly perhaps from your perspective, having been attached to public interest law and now moving into public interest technology, where on the law side there's a deeply embedded value of pro bono work among private sector lawyers: what would it look like if we had pro bono work among engineering firms? Do you see the private sector meeting us halfway? There's a whole lot of questions there.
I'm going to start with Langdon Winner. In "Do Artifacts Have Politics?" he also references Lewis Mumford and this idea of inherently political technologies. And one of the things that we touched on in our definition is this one line that says that efforts to constrain the bad use of technology, or to mitigate the harmful impacts of technology, are also part of the field. One of the things that I think has been most interesting recently is the reemergence of challenges to the idea that technology is always liberating or is always about innovation. My colleague Jenna Burrell and I run the Algorithmic Fairness and Opacity Working Group at Berkeley, and we just held a multi-day conference on refusal. And this has been kind of the moment, right: we've seen companies deciding to stop selling, or stop researching, or put a halt on facial recognition technology, and we've seen communities saying no, we don't want that technology used. And I think there's this moment where all of a sudden people are thinking, and this reflects, I think, a deeper understanding of the politics of technology: it doesn't matter that they might get the matching part better; it's that introducing it into systems where we've had systematic racism is going to yield unjust results no matter how quote-unquote unbiased the technology itself might be, or could be made. And I think this understanding that there is some public interest, one that goes beyond just the innovation impulses of any individual company, should be a lens. And it's interesting, we've seen companies talking about stakeholder orientation, not just shareholder orientation, in a bit of a swing back; there's a pendulum issue here.
But we've also seen employees in particular, who've been protesting against different corporate technical efforts, very explicitly using the language of human rights; they've not been solely using the language of ethics. And I think about the rise of that human rights language and that lens through which we look at technical design, as well as use and implementation. So understanding that it's not going to be just what's baked into the system, it's going to be how we choose to deploy it in society, and that companies can't just wash their hands of it: those questions about whether your technology is going to be used in the world in a way that furthers ideas of the public good are not so easy to walk away from. And I think that's been an important development that really builds the appetite for public interest technology within the commercial sector, as well as within the public interest sector, where obviously it's already very well aligned. So I don't think it's at all surprising that, at least for us at the School of Information, some of our postdocs and PhD students are getting jobs in the ethical AI or machine learning practices within companies. And I think it's precisely because they understand that they need to be thinking about the social and political implications of technology while they are developing, or potentially choosing not to develop, or to gate, or to control, or to limit, certain functionality. So I think, one, we could have spoken more strongly to the concept of refusal as part of what is in the public interest technology field. And I also think it's one of the really interesting developments that's broadening the appetite and interest for the public interest technology field's growth. Great. So I want to welcome our fourth panelist, Sujatha Raman, who is joining from the completely other side of the planet from where I am.
And it's, I think, very early in her day, so thank you for making the effort to be with us, Sujatha. At the beginning of the program I gave folks the opportunity to introduce themselves very briefly and to share with us a tidbit about themselves personally, and perhaps a very brief introduction to how they became introduced to public interest technology, or something like it, in their life history. And from there I'll just invite you to reflect, as I think we prepared a little bit, on a topic that you've been deeply involved with, the concept of responsible innovation, and how that has been part of the global framing of these cognate issues that we're trying to deal with in PIT, and any reflections about RI and PIT that you might have. So, Sujatha. Thank you. Thanks, Dave. And thank you for this opportunity. It's very exciting for me, and I'm so sorry to be very late to the party. You would think that in this day and age it's very easy to calculate time differences, but yeah, I just had that wrong. So when Dave gave me this invitation, I didn't have to think twice about saying yes, because I've been interested in public interest issues for a long time. Growing up in India, although I was a science student, questions of social good, public good, and social justice were of deep interest to me. You know, you can be a nerd but still be interested in those questions. But in India at the time, I didn't know of things like, for example, people's science movements, participatory technology design, those sorts of things, which I think are at the root of the questions that the panel is considering today. It wasn't until I moved to the US, to Virginia Tech in particular, that I discovered that you could actually put these two pieces together.
So questions to do with science, scientific research, and technology on the one hand, and questions to do with social good and public good on the other. And so for me, I guess, one maybe missed opportunity, another route that I might have taken had I gotten a job that I applied for 25 years back, was with the Loka Institute. The Loka Institute is, I think, one of the early examples of attempts to bring communities together into exploring questions of technology futures. Dick Sclove, of course, was a student of Langdon Winner, very highly influenced by him, but very interested in moving beyond looking at questions to do with science, technology, and society simply as a set of academic questions, and actually trying to promote that in the world, if you like. And so my own life has taken me in lots of directions. I think I got very close to the Loka job, didn't quite land it. But since then I have lived in the UK, and more recently in Australia, where I've been a couple of years. And to me, one of the reasons why I'm still interested in these questions, and I'm still around and haven't thrown it all up and gone off to do something else, is the whole field of responsible innovation, which I think over the last 10 years has really gained a certain kind of impetus for bringing together a number of questions that were, I think, previously explored under labels like technology assessment and participatory technology assessment. And for me, what's distinctive about responsible innovation lies in the way in which, of course, Dave and his colleagues at Arizona State University, but also my former European colleagues, in the UK but also the Netherlands, Germany, the European Commission and so forth, have developed it.
I think the way in which it's distinctive is in learning the lessons of the history of technological developments and thinking about, going back to, I think, as you were saying, the question of what gets baked into technological systems in the first place, and how we can influence that. How can we work together with different organizations and different publics, but also, at the end of the day, work with engineers, with the people who are involved in developing technologies? How can we engage with them in a productive way, in an empathetic way, so we understand what their constraints are, but also bring in and explore some of the possibilities that we have for transforming technological systems in the public good? So, kind of redesigning. I think that issue of what gets baked into the system has been very much at the heart of notions of responsible innovation. Now, just one comment before I finish, and apologies, I have gone dark, the light has gone off, I will turn that back on. One of the ways in which this has been somewhat unfortunately understood is that, because of the label "responsible," people sometimes hear it as something to do with research integrity. And then the sense is, well, we already have principles for research integrity and so forth, beyond the sort of committing fraud and so on, and we have guidelines, we have procedures dealing with those questions. So some of the ways in which responsible innovation has been understood, not necessarily by the people who developed it but in the way it's been taken up, are somewhat narrow. But I think there are still plenty of opportunities for the kinds of questions that the public interest technology network has posed.
And many of you, I think, will have already been talking about them today. And I just wanted to give one example, if you don't mind and we haven't run out of time. One of the things that's very customary, okay, at this point I think this is weird, so I'm going to turn the light back on. I just needed to move. Okay. So I'm speaking to you from the land of the first Australians, the Ngunnawal and the Ngambri people, and it is customary in this country to acknowledge that this is the land on which we are speaking, and that the custodianship of this land by Indigenous people has made lots of things possible, including the very existence of this university. And one of the ways in which this is directly relevant to the topic of the panel is that there is an emerging movement for working not just for Aboriginal people, but with them, and sometimes through efforts by Aboriginal and Indigenous people in this country themselves, to design technologies that actually serve their interests, that address the colonial history of this country, but that also address the needs of Indigenous communities in Australia today. And I think that is very exciting, and, you know, it's going to be a hard slog, it's going to be very challenging, but that's, I think, one aspect of the way in which responsible innovation ideas are playing out in Australia, and one to watch over the next few years. Thank you, Sujatha. As I mentioned, I think it is part of the intention of the network to turn a little bit more of its face toward the global context, and as part of that I will now start to go to some of the questions that are rolling in from the audience. Another kind of global framework, from Jennifer Hirsch: have any of you been engaged with the UN Sustainable Development Goals?
She says: we are doing so, and so is BlackRock and lots of other stakeholders; it seems like a way to push PIT further within a framework being widely adopted. And I would also inflect that to ask: what similarities or differences might you see between a framework that emphasizes sustainability and a framework that emphasizes public interest technology? Who might want to jump in here? David's smiling, let's give it to him. That'll be a lesson to teach anybody to smile at a good question. So, the one thing is, I actually would very much like for PIT to have a more global focus and a global attention. There's a story I often tell. One of the ways that I have some privilege in the United States is that people make assumptions, and one thing they assume is that I'm American, and I'm not. And I remember very well when the Snowden disclosures happened, and the president at the time came down to the Rose Garden and made the big announcement that Americans should not be worried, that no one is being illegally spied on. And I can say with absolute confidence that the other 6.7 billion people, or whatever it is, on the planet really parsed those words pretty carefully, because they realized what the inverse of that statement was. And all too often, I feel, because many of the companies and infrastructure were built in the United States, the assumption is that the solutions are going to come from within the United States. And these are now global issues. Even just the other day there was a nonprofit organization I saw get started up that was going to try to help solve some of these problems, and I looked at the advisory board and it was entirely composed of Americans. These problems are not American problems; these are Brazilian problems, Indian problems. These are global problems.
And at some risk of being overly candid, I just don't think Americans have really caught up with the fact that these are global issues and that the world actually wants a say in thinking about how global public digital infrastructure is going to be managed. That makes the problem a lot more difficult for us to address. And I'm deeply concerned about issues of sustainability and of freedom and of democracy, and I want to make sure those are at the heart of it, but this does need to become a global conversation. And I want to get this back to Sujatha as well. Yeah, I think one of the ways in which maybe we should be thinking about global problems like those exemplified by the SDGs, that's relevant to our context, is that very often you will have American engineers or scientists, or European engineers, who go off into other countries and, if you like, offer technological solutions to some of the problems that might be seen as solutions to the SDGs. But I think when we start bringing questions of public interest technology into that equation, it helps to highlight some of the reasons why that kind of globally roving activity, if you like, of many of these researchers actually needs to be more dialogical and participatory. There are plenty of examples of that sort of thing emerging. But it's really about learning the lessons, even in very simple cases, like trying to develop improved cookstoves, which is one area that I've been involved in. You know, it's been a movement for 40, 45 years.
And very often it's gone awry because the assumption is that you can design something in a lab in Sydney or, you know, Phoenix, for example, and then just literally transport it elsewhere in the world. So I think the public interest framing can help bring attention to the importance of doing that kind of global work in a better way. Great. So I've got another question from the audience, from Mark Royovich Presslar, and I think I'm probably going to start with Sylvester on this one, so pay attention, Sylvester. Mark is curious about what we could and should do better to track and influence the impact of technological change on society, particularly focusing on the disruptive nature of technology. Our society, he writes, seems to be suffering from the unintended consequences of these revolutionary changes. Now, attention to that question was built into the definition, and there's been scholarship over a long period of time, going back to folks that we mentioned like Winner and even Mumford, trying to grapple with this. Sylvester, what are your thoughts about what kind of progress we've made in grappling with these things that are hard to predict, hard to anticipate, but yet necessary to govern? Yes, this is a great question. I think certainly one clear example, one that you are already working on at ASU, Dave, is the future of work. The World Economic Forum has produced some very sophisticated anticipatory analysis of the disruptions in labor structures, which is a huge concern. There's one economy, the global economy, and that's the only one there is. And so the ripple effects and the impact of the disruption in labor are always something that needs to be understood transnationally, globally, which is clear with supply chains, with the shifts in GDP, with the concentrations of wealth. There are other things that we have also been tracking.
One area that I think has demonstrated clear progress and robust attention is the use of AI systems, particularly in surveillance. That has drawn a lot of analysis from people in government, in our research institutions, think tanks, and human rights and civil liberties organizations. There are other things that are not necessarily as easy to study, but that are nonetheless important. And that has to do with what you could call cultural shifts that aren't necessarily reducible to labor, to jobs and employment. It could be, for example, the way these technologies affect our social relationships, the sense of closeness or intimacy that people experience. And that can be unpredictable. So we should not assume that digital technology isolates people. David Eaves, your point about the enabling effects of this technology is something: the fact that we're able to have this conference right now, despite a global pandemic, is just indicative of one way that technology can actually enable people to connect and to experience a world out there. We've come to rely on it like nothing else as we've been sheltered in lockdown or confined to our homes for safety reasons; we're still able to have birthday calls, that kind of thing. But that's cultural. And I think as we try to be forward thinking, to understand the impact of this on something like the delivery of learning in higher ed, that's less certain. It's not clear exactly how that's going to play out. It's harder to predict. But it's a safe bet, I think, that the capacity for disruption is not contained in any one sector. And I think it would be naive to assume that there's not going to be very much change to something like higher ed, or to the delivery of healthcare, telemedicine.
So there are some areas that have gotten more attention than others, but I think there's always room for us to consider what we're overlooking, in order to be vigilant and to try to understand where the opportunities and the challenges are going to continue to emerge. So, Sylvester, in part of your answer, and going all the way back to the original definition, there is built into this PIT perspective a disposition toward the future, whether we're talking about the changes that we normally associate with technology and innovation, or, and I think we've danced around the borders of this statement, changes that are also fundamentally changes in values, that come from our intentions and our institutions and the kinds of political and moral perspectives that we bring to the creative endeavor we're engaged in. Because it's importantly a future-oriented activity that we're engaged in when we're engaged in PIT, I want to use this as an exit question for you all, from Ken Fleischman at the iSchool at UT Austin: how can we use current and future media to orient today's five-year-olds toward PIT, or the five-year-olds in our own futures? Who wants to field that first? I'm going to go through everybody, so I'll take volunteers. Sujatha, go ahead. Yeah, I'm happy to jump in, though I'm not quite sure, I've missed the five-year-old stage, Dave. But my favorite book when I was five was Mike Mulligan and His Steam Shovel, which was a Depression-era book about technological unemployment. And so, I guess for me, one of the ways in which we can enthuse and motivate young people in this vision of public interest technology is partly by trying to change, or stimulate attention to, what we take to be novel. I think in our culture we're so used to defining novelty.
You know, the exciting stuff, the innovative stuff, in terms of the technological: something that's new gives you a new technological capability. So I think if we were actually to open up that question and provide other examples of how novelty can come from achieving the public interest, from achieving the public good, and do that in conjunction with the technological. I just want to acknowledge a former PhD student of mine from Malawi, who asked this amazing question. He was interested in technologies that were defined as pro-poor. He did research on these, and basically the finding was that these things didn't do very much to address problems of poverty. And his question was: maybe the issue here is that these technologies described as pro-poor are not innovative enough. They're not novel enough. And for him, the definition was that to count as genuinely novel, a technology has to address the public interest, the public good, the social good, as well as meeting some kind of technological vision. So I think if we can have more stories that convey that kind of intimate connection between the social and the technological, so that we change assumptions about what we take to be novel or innovative, that could be a way forward to enthuse young people. Great. Thank you. How about you, Deirdre: this question of enthusing five-year-olds into public interest technology.
So I think that some of the ways in which we would enthuse children, or build their capacity to question and see the values in technical design, are the same ways that we build their capacity to see the values in our society: travel, books, movies, things that help them understand that the ways in which we do things here, the infrastructure choices we've made about public transportation and the like, aren't given; those are all political choices about how we spent our money and what we thought was valuable. And those critical faculties that are built through comparative understandings are so helpful in getting people to question: oh, does it have to be this way, or why is it this way, rather than just saying, oh, that's the way it is. So I think that's super important. I think in this particular moment, you know, we have a bunch of five-year-olds who have all of a sudden realized they really love school, man, God they want to go to school, right, because they're experiencing the limitations, and some of them are like, oh well, turns out the iPad works for certain kinds of interaction, but it doesn't really facilitate these other kinds of interactions that I really want to have with my peers and my friends. And why is it so good at consumption, and so bad at conveying their minds or supporting other kinds of interactions? So again, whenever our lives get upended, there's this really sweet opportunity to be reflective about what we value and why we value certain things. And so there's this moment for five-year-olds, and I know many of you have them, and also for the teachers, who are experiencing all the parents in their classes right at the teaching interface.
But I think it's making us all very appreciative of well-designed technology, of the values built into technology, and of the things it doesn't do very well, and maybe the things we don't want it to do at all.

Great, great. Thank you. Sylvester?

Yeah, I think there are some wonderful conversations that can be generated with young children using this technology. I think about our granddaughter, who has Alexa in her house. She got into a conversation with us about who counted as family, who was a member of the house, and she wanted to include Alexa. All kinds of questions come up, and there are opportunities to talk to children about them. Why is the voice of this assistant a female voice? Do you notice any that aren't? That's a great way to begin talking to a child about how gender functions as a social force and about its impact. And there are other ways we can use these technologies, whether it's AR or VR, that aren't necessarily about technical things, though they could be, but that provide an in-road for young children to begin to understand and appreciate some of these larger societal questions. So I think the technology is actually very generative in that way, if we take it as an opportunity to engage with children in these conversations.

And David, thank you.

I don't think I can say anything as intelligent as the other panelists, but I do actually have a five-year-old and an eight-year-old, and there are two fun stories I'd share that have caused me to think a lot. One is that my five-year-old really likes to play Among Us, a game you play online, but he wells up with tears whenever he's the impostor, because he doesn't want to kill anyone.
And I think that leads to a really nice conversation about how the technology causes you to feel this way, and I agree with Sylvester that these are really nice entry points. But it's also causing me to ask questions. My elder one never types, even though he could; he only wants to dictate, he only uses voice recognition software to write. Part of me thought, this is wrong, he should be typing, and then I realized my parents probably thought the same way about handwriting versus typing, because I typed everything. So it's forcing me to think: what is the value in that, what is the importance of that? And then I try to have that conversation with him: what skills do you think you're getting from this, and from that? It can lead into a whole set of nice conversations. So nothing as smart as the others, but lived experience of the moment.

That's great, David, and I will share a story in closing. I have a now fourteen-year-old son, and when he was eight years old, well, rain and puddles are very scarce and wonderful things here in Arizona, and he was stomping in a puddle after a spring rainfall, and I shouted to him, "Watch out for the puddle gators!" He looked at me with wide eyes, as if there could possibly be such things as puddle gators hiding in the puddle, ready to get him. Then he thought about it for a second and said, "Daddy, there's no such thing as puddle gators." "But what if there were?" And we proceeded to have a very imaginative conversation, technically grounded to the extent that a bright eight-year-old can technically ground that kind of conversation, our imaginations infusing it with a little bit of science and technology. And that's actually a side point we might have gotten to here.
This conversation has been dominated by attention to digital technology rather than, say, biological technologies, which may be just as important in the 21st century, in how we live our lives, as the digital. But that puddle became, through our imaginations, a conversation about what the capabilities of emerging science and technology are, and about the kinds of values we might want to infuse into the decisions that are intertwined with decisions about science and technology. So I want to thank the panelists, Deirdre and Sujatha and David and Sylvester, for helping us work through the now couple-year-old definition of PIT and its continuing strengths: in my opinion, the way it relates to other ideas out in the world, like responsible innovation, like what we're seeing coming down the pike in terms of stakeholder capitalism, and its potential connection to broader global concepts like the Sustainable Development Goals. I want to thank everybody behind the scenes at New America and elsewhere for putting on this first of the PIT plenaries and the rest of the program, and I will turn it back to Andrean to close us out and lead us into the next session.