But good afternoon, everyone. I actually want to go ahead and invite our panelists to come on up and get seated, because I'm going to ask them to introduce themselves to you. They'll be able to describe their backgrounds and their work, but also dig a little bit into what PIT means to us, how we might define it, and how we hope it can be defined going forward. Thank you, our panel. So when we talk about public interest technology, as I just mentioned, the definitions may be evolving, emerging, changing. But we know that we mean technology, and we know that we mean by, with, and for people. And when we talk about those two halves of this work, when we talk about the business model side of the technology and how you bring PIT into industry and into the private sector, at times these two sides can be balanced, but they can also be very far apart, and sometimes there are tensions. I think a lot of the people in this room and on this panel have thought about those tensions, the balances, and the need to bring these two areas together. So one of the things we're going to talk about today is what kinds of models exist for partnering with the private sector, for investing in and supporting the work of PIT from the institutional side, the civil society side, and the public sector side. We'll also talk about the role of private philanthropy and venture capital in seeding new models for working together and new business models for where technology happens in the public space, and then how to partner successfully, or not, and what we've learned along the way.
I think something that makes PIT very unique, and honestly it's a real pleasure as an individual to be here, thinking back to my own time as a student when I didn't have a sense of what PIT was and I was looking for this exact community. I was a student of international affairs looking for this intersection of technology and society and trying to find ways to define it in order to look through the course catalog, right? So we've come a long way, even from my fairly recent graduate experience. But I think the thing that makes us all special here in this room, and what makes the PIT field unique, is the fact that we're all translators. Regardless of which side of campus or which side of industry we're coming from, we are translators. So one of the big pieces here is that I want to ask our panelists, as translators, how they've thought about these differences in bringing social impact into the private sector, and then, since they've also been innovators, what sorts of innovative models they've used in the past to do so and what we can learn from that for partnership going forward. So with that, I'd like to hand it off to our panelists to introduce themselves, and I'll pass it to you first, Professor Sweeney. Thank you, it's great to be here. My name is Latanya Sweeney. I'm the Daniel Paul Professor of Government and Technology at Harvard, but most importantly, when my hair was black is when I started doing work in this space. So it's been a while, with lots of lessons learned and lots of gray hairs caused as well. At Harvard, we've been fortunate to have a concentration for a while called Tech Science. So we've been at it a little while, and our students have done amazing work. They've gone on to change laws, regulations, and business practices at a lot of the big tech companies.
The kind of approach that we've taken is that there's a technology life cycle, and our goal is to bust these technology-society clashes. In what ways is technology causing harm to society, and how do we make it so that society can have the benefits of those technologies without those harms? With that goal and that vision, we work the entire technology life cycle, from vision and development through commercialization, the marketplace, maybe a legal problem, and then on to the next generation of technology. Over time we've developed tools in each of those stages and had enough cycles of students that we've had real impact in the real world in each of those areas. What's exciting about the venture capital space, the vision and development or funding space, is that the earlier you can intervene in a technology-society clash, the easier it is, because the truth is most of the clashes we hear about happen when the technology is already in the marketplace. At that point, they're not gonna go back and make a small design change. They're not gonna change their business model. It's a take-it-or-leave-it proposition that we find ourselves in. So we find a little band-aid patch policy or a band-aid patch technology, and we just sort of hobble along, and then the next wave comes and there's even more of it. There's only so far we can go that way. But you can adjust it early on, because most of it is not intentional. By the time it's in commercialization, by the time it's in the marketplace, there's no way to distinguish the technology from the business case, and either one of them could be the cause of the problem. Either one of them, early on, could have had a simple, easy fix, because the problem often happened inadvertently. If you're a parent, you have probably had to navigate your child using apps or being online. Why?
At the time people were developing this stuff, they were twenty-somethings who had no children, and they weren't thinking about that. It just wasn't on their radar. And now retrofitting, trying to fit it in, has been a nightmare. So what we like about working with venture capital, about working early in this space with entrepreneurs, is that you can teach them how to look for societal problems early and help them figure out how to solve those problems in the design. And then when they produce something, assess it: to what extent can it make a guarantee? In that way, we look a little bit like an Underwriters Laboratories. Can you at least guarantee it doesn't have that problem? It may have some other problem, but it doesn't have that one. So we're trying to start a new program with venture capitalists, because they are sort of the pipeline to these products, and trying to do it at scale, so that we can put technologists in those spaces who can do that issue spotting, who can do those assessments, and so forth. At the other end, a lot of times these clashes inspire solutions, and this has happened to a lot of our students as well, where the new solution is a new technology. It might be a new version of an old technology, or it might be a totally different technology altogether. One year, one of our students won a spot at Y Combinator, one of the biggest venture capital firms in Silicon Valley. It was a PIT project, and he won Y Combinator. You can imagine the inherent conflict in this: Y Combinator is trying to get return on investment like crazy, and he's trying to build an algorithm that's gonna make cities work better. This mismatch made us realize we also need a venture fund, and so Arabella has been working with us.
We need a venture fund, or I should say it the other way: I've been working with Arabella, but we need a venture fund to help support products that come out where the technology is really in the public interest. Yes, we want it to be sustainable, so that we're not writing a check every year, but at the same time we realize it's not gonna have a hockey-stick kind of return on investment. So those are two of the projects we've done, and I hope I didn't go over on my time. Hi, I'm Ish Limkakeng, I'm with Cisco Systems. I'm responsible for our technology strategy and our purpose organization. The purpose organization is actually pretty new to Cisco. We've always been a purpose-driven company, but a few years ago we took a look at ourselves, and actually it strikes me that one of the earlier panelists, Chris, I think from Public Knowledge, was saying this about the internet: all we focused on at the time was, of course this technology is gonna be great, the more of it the better, let's just put it out there. Cisco was like that. We talked a lot about connecting everything, full stop: we're just gonna build the internet, we're gonna make sure we connect everything. But we rethought our purpose a few years ago, and the new purpose statement is to power an inclusive future for all.
And if you think about that statement, and a lot of work goes into these kinds of statements, as anybody who has been part of that work at a company knows, every word of it has a lot of meaning for us as a company. It drives everything we do: what we invest in, the technology decisions we make. So we've created this purpose organization that brings together a lot of the assets we've had across the company, whether it's our people organization, our workplace organization, but also things like our country digitization initiative, government affairs, sustainability, social impact, all of these organizations in one place, so that we can connect them to our overall business. To the extent that we can rethink the model and how it comes together, it really creates a different conversation. I'll give you one example: sustainability. Driving the technology strategy, I came from the engineering organization. When we work between our engineering organization and sustainability now, we have an entirely different focus, really a different model, than we would have had just a few years ago. Most of IT, most big companies in Silicon Valley, for however many decades they've been in existence, have been chasing performance: faster processors, more powerful routers, et cetera, right? And the easiest way to do that, if you go way back, is to just use more power. That obviously does not work well with the reality in most of the geographies our customers are in. Even in California, we have power shortages all the time. And I couldn't agree more with the statement that you made, that you should think about the problem earlier in the design cycle. Engineers solve problems. If you give them a problem with the right constraints, that's what they'll solve.
So if the problem is just to get the most performance out of this chip, and you don't care about power, they'll use a lot of power. If you say power is something that matters, then it becomes a design constraint. That's an example, I guess, of the T part of PIT: to the extent that we can frame these problems and express them to companies and engineering teams, you can get a different answer, and it will be a lot easier if you do it earlier in the process. So I couldn't agree more with what you were saying. Great, hi. What a gorgeous day in Boston. Thank you, PIT-UN, so much for inviting us all to speak. I'm Ann Cleaveland. I'm the executive director of the UC Berkeley Center for Long-Term Cybersecurity, as well as the co-founder of something we call the Consortium of Cybersecurity Clinics, which you heard about a little bit from my friend and co-founder, Larry Susskind, this morning. And there are a bunch of colleagues actually in the room, so shout out to the cybersecurity folks within PIT-UN. My niche of this is obviously cybersecurity. And as you were saying, Shana, our piece is about both the people and the public interest parts. In 2018 at UC Berkeley, we started something we call the Citizen Clinic, which is a cybersecurity clinic modeled very much on what law schools do with law clinics. Our students, for academic credit, give pro bono cybersecurity defense services and consulting to what are sometimes called organizations below the cyber poverty line: resource-strapped organizations or civil society organizations who can't afford the kind of enterprise-level cybersecurity services that larger corporations can. So think about, as Larry explained this morning, small municipalities in rural Massachusetts, or small nonprofits who maybe have ten staff, none of whom have an IT background, or the K-through-12 school districts some of the clinics serve. This has been a win-win.
It fulfills universities' obligations for public service to the community. And crucially, it gives our students hands-on learning experiences with real clients that they can put on their resumes. We find that it really helps them get that first entry-level job, as well as attracting students into the field who wouldn't otherwise have seen a pathway or a place for themselves in cybersecurity. Fast forward: with Larry and some colleagues at the University of Alabama and Indiana University, we all realized that we were starting cybersecurity clinics around the same time and decided to form this national Consortium of Cybersecurity Clinics, which started out as just a community of practice for clinic directors and faculty to share what they were learning, because we were all struggling with the same problems. How do you secure your clients' data, for example? And to PIT-UN, we're very grateful for the first seed grant that got the Consortium of Cybersecurity Clinics up and running. Fast forward to 2023: this summer we announced a $20-million-plus collaboration between the Consortium of Cybersecurity Clinics and Google.org to expand cybersecurity clinics to 20 states around the nation by 2025, and hopefully growing from there as well. So I'm excited to have a conversation with all of you about what factors had to come together to make that partnership successful, and about some of the lessons learned that I hope will help any of you in your projects, or at least help us all grow the field. Good afternoon, all. My name is Alex Baldenko. First let me say it's a humbling and honorable experience to be on this panel with these folks, so thanks for the opportunity. I lead the data science team at MassMutual. So not a tech company, not an investment bank; think life insurance, okay? At MassMutual we help people secure their future and protect the ones they love.
And so I hope, with that mission, I can weave a different thread into the tapestry of this conversation: that mission statement, from a relatively different sector, has given me and this organization an opportunity to partner with some really great academic teams and groups on initiatives in areas of mutual interest, efforts that are multidisciplinary in nature, that we can't solve ourselves, and that need some hands-on experience from students. So wrapped into that mission of helping people secure their future and protect the ones they love, we've built partnerships with the University of Vermont on health and human wellness, both mental and physical, as well as on algorithmic fairness with Boston University here. One other angle I'd like to pull out, maybe it's more nuts and bolts, is that employees of MassMutual and other companies are actually interested in a lot of the same values that PIT-UN stands for and works toward, and these types of partnerships provide a mutual benefit. So yes, it's great to get students interested in and aware that MassMutual is a company at all and that we do cool stuff. But secondarily, our own employees actually have a lot of social mission in their hearts, and this is an opportunity to engage with that in an authentic way through their employer, rather than having to do something else. I'll leave it there for now. Thank you. Thank you, good afternoon. My name is Carlos Genatios. I'm the director of engineering and technology at Miami Dade College. And first of all, I really want to thank New America for all this, and also Boston College. I've been linked to this project from New America since the beginning, and I can tell you that I'm really amazed and grateful, and I congratulate them all, because you have done a great job. What we feel here is a family, and I have a feeling that these are the people I want to be with.
These are the people I want to discuss the main issues that concern me professionally and personally, and you have made that available for us. So thanks again, not only for the funding that has allowed me to do all the projects I've been doing, but also for creating this environment, this effort. Thank you all. Well, as I said, I'm director of engineering at Miami Dade College. Before the pandemic, Miami Dade College had 160,000 students; now we have about 110,000, and we have been recovering. But the thing is, 80% of the students are the first in their families to attend college, and more than 80% of them are minorities. Miami Dade College is sometimes described as the largest institution in Hispanic enrollment and the third largest in Black enrollment. So when we talk about underserved people, we are there. And some people ask us, how do you manage to have so many people from minorities? We just open our doors. People just come in; it's an open institution. And we have to deal with all the kinds of problems that come with people who don't have the basic preparation to be there. My task at the college is to create new programs, especially related to engineering, and the one I'm going to talk about here is EcologySec, which stands for geographic information systems for environmental awareness and community engagement. This project lets high school students, through a dual enrollment program, begin learning GIS so they can make their living from GIS, with a 21-credit college certificate that allows them to work in the municipalities and do things. At the same time, this program is embedded in, stackable into, an associate in science degree, which also leads into a bachelor's degree in information technology, and if they want to go on, they can continue to FIU to do the certificate in GIS. So I don't call it a pathway; I call it a highway.
If the students get in there, they have a track they can follow, and we're there to support them. But that's not the main thing, because this is a program that could be replicated in any college, and I don't know why we don't have it in all the colleges in the US. The main thing I have given to this program, besides creating it, has been to include environmental risk analysis as part of the projects that the students have to develop while they're learning GIS. When we do that, the students begin to understand the reality of a city like Miami, where we do have hurricanes, where we do have flooding, where we do have sea level rise, even though a lot of people want to deny it. But we do have sea level rise. And the crazy thing is that when I began trying to push this forward in that huge and wonderful college, people didn't pay attention to me. It was thanks to PIT-UN that I could get some money to begin developing this program, and now it's an important program within the college. But if I didn't have PIT-UN, I wouldn't have this. Now, you sent us so many questions that I don't know how to answer some of them. But when you ask why PIT, how do we each approach PIT? I'm sure each one of us keeps asking himself, what do I do in the PIT area? In fact, everybody has a different answer. So to me, the main thing is the question itself. When we don't frame it too narrowly, when we don't shut it down, we have the possibility of asking: what are we doing? What are you doing? How can we relate? How are we delivering some good for humanity, for the people? In fact, one of the questions you sent us at the beginning, one we've been receiving, is: when did you begin working on, or being in contact with, PIT? I'm going to relate my story to you, but I'm sure most of you have something similar. I've been doing this all my life.
I'm a structural engineer, a civil engineer by training. I became a professor at the Central University of Venezuela, where I taught for 30 years. Along the way I was working especially in low-cost housing and natural disaster risk reduction. When you have to deal with people who have suffered a hurricane or an earthquake, you really know what you have to do. It's like a doctor who receives somebody who's having a heart attack; he knows what to do. Is that public good? Of course it is. So I see that everybody, in his own profession, has the possibility of doing these kinds of things, delivering services for others, and that, as we all know in the end, is one of the most gratifying things we can have in life: the return we get from giving ourselves to others. Well, sorry for that. While I was a professor in these areas and doing research, one day I was called to the government, and I became minister of human development and then minister of science and technology. That happened 24 years ago; I was kind of young. And I think I did a great job. But things were turning so crazy in my country, Venezuela, that I left the government and went back to academia. The earthquake engineering community in Latin America is a family; we know everybody in every country, so we always work together. This family was mainly created by a professor at the University of California, Berkeley, an Argentinian called Vitelmo Bertero, who has since died. He had students from all over Latin America, and all of them became a family. They studied together, went back to their countries, and kept up their relationships. I was like the young kid among them; some of them were my professors. And whenever they wanted to organize a conference, they'd mention it a couple of times, and I'd say, okay, I'll take care of that.
So I became the secretary for all of them, and we created associations one after the other. We had this family that worked on these kinds of things, natural disaster reduction, mainly oriented toward earthquake engineering. And as I mentioned, when you see those disasters happen, you know who suffers the most. Why do we do this? This goes back to PIT again. When we work on natural disaster reduction, we try to mitigate the impacts of earthquakes, of natural hazards, but you know that poverty is already a disaster. Before the disaster happens, the poor are the ones who are already suffering, and sometimes they are not even able to recover from their own conditions. When the hurricane happens, or when the earthquake happens, they're really completely devastated. So when you see those faces, you know what to do: work hard for them. And that's all. PIT is something that comes very spontaneously when you are faced with the reality of people's suffering. Thank you, Professor Genatios. If I can just jump on that point, I think you said it really well: you spoke about your experience, but you've been doing PIT your entire career. And I think a number of our panelists would share the same sentiment. I wonder if we could pause on that point, though, and think about what we were talking about earlier. We see that PIT is happening all around us, and you've just touched on the heart of it: we have to talk about the communities we're serving and the communities we're protecting. I'm wondering if the panel could talk one level up from that for a moment, about what it takes to define that. I know, Alex and Ish, you work in public, sorry, private sector institutions where you've had to go through a process of defining: this is what we mean by PIT internally, or this is what we mean by social impact.
I'm curious what that process has looked like for you, to align the social impact values. Go ahead, Ish. Yeah, so in a private sector context there's always the importance of being able to define what you're trying to do, as well as being able to measure it, because at the end of the day you want to make sure you're getting the outcomes you were seeking, so that you can ultimately apply more resources against them. At Cisco we like to measure everything, so for all of our different programs we set specific goals. Probably the broadest one in our company: in 2019 we set forth an objective to positively impact a billion people by 2025, which is a ridiculously large number, but we started to think about all the ways we could have that impact and how we could measure it, whether it was what our employees are doing or what we do at our customer events. We usually have some kind of program at each of those events that our teams can bring their customers to participate in. And we're really excited that we're ahead of goal in actually achieving it; I think we're going to be at 1.1 billion at the end of this year. That's probably the broadest measure. But it runs through every program we do, whether it's our Networking Academy or country digitization, a program where we set up proofs of concept for countries: if they have a connected-citizen initiative or a cybersecurity initiative, we will set up a proof of concept for that country, run it, measure it, and see how well it does, and if it does really well, we'll do more of them. We do that in every single area within our purpose organization. So I think that clear objectives, the ability to measure them, and the ability to credibly track back in a closed loop make for a really important process. And Alex?
I can build on that real quickly and say that once an objective has been articulated within an enterprise, there's the acknowledgment that there are researchers, people working on the front lines in multiple fields, who are already pushing the boundary of human knowledge and connecting the dots. We're not uniquely positioned to do that ourselves within our own walls. So this starts a process of almost matchmaking with translators who can go across organizational boundaries and say: okay, I hear that you're speaking the language of academia, I'm speaking the language of my own company; how can we bridge that gap and achieve our common purpose? And that's something I think is at least worth discussing. We had mentioned as a group the concept of narratives, how the jargon might be quite different, and finding that common ground. I don't have an answer, but I think that's a meaty space for how to partner and drive things forward. Yeah, absolutely. I mean, we have to bring it back to Professors Cleaveland and Sweeney to respond to that. And I would ask about that shared-narrative piece, and then also: are there other pieces of the work that you have to translate differently to work together successfully? Because you've both built very successful partnerships in this area. So what are your reactions to this narratives piece and how to define this kind of work? So I'm taking this in two ways: my heart is dealing with one thing and my brain with the other. I thought you spoke so passionately about the people who can't help themselves, who need our help. And I do find that the work the students are most proud of is the work where they're really making a difference. You know, and that's everything. Well, we bust clashes.
So it's everything about, you know, only this community is getting these horrible credit card ads, regardless of their income, while that community is getting these amazing credit card ads. We have a law against that, but our regulators have no idea how to enforce it in technology. So the students tackle that and show regulators how they can do it, and that gives them a lot of great pride. They're often very motivated by the anecdotal pain point of individuals, and then we provide them with a way to do the work so that we all benefit from it. So that's the heart piece of it. The intellectual piece, though, how do I say it, is realizing that those people in that moment that you are saving, or impacting, or helping, in some ways are all of us. It's like: but what about me and my credit cards? Yes, you chose a particular one, but all of us are at risk from these same issues. Everything you care about, any rule you care about, any right you care about, is totally up for grabs depending on whether or not the technology's design allows it, or allows a regulator to enforce what they could enforce in a brick-and-mortar building. If they can't do it online, then do you really have that right? And so that's the intellectual piece that says, yes, we're all in this position; we're all losing traction in our society. Yeah, well, let me see if I can say anything more eloquent than that. Maybe I'll pick up on something a couple of my co-panelists have touched on, which is the values piece. I think that's a key part of academic and private sector partnerships, just in the sense that, in my own example, you're negotiating a $20-plus-million big bet. It's stressful on the university side, it's stressful on the sponsor side; everybody's got their objectives and executives breathing down their necks.
And in those moments of stress with my counterparts on the other side, several times we all just took a deep breath, looked at each other, and said, okay, let's remember why we are all here. We share a set of values that we articulated quite early on in the project. We care deeply about students. We had all found that we cared deeply about equity and inclusion in the profession, and about making sure that not just R1 universities but small colleges and community colleges and HBCUs were part of the community of institutions that had an opportunity to apply for funding to start a clinic. We all cared deeply about the fact that small municipalities and K-through-12 school districts are getting hit by ransomware from really bad people. Aligning around those values, around what you're really there for, helped smooth things over when frictions arose, which they do. And that goes back to another point my colleagues made, Ish in particular, about metrics. No surprise to anyone in this room: universities and private companies are on quite different time horizons. At a university, six years is nothing, no time at all. At a company, it's an incongruously long time to even imagine the future; whatever we're doing now, we will have forgotten about in six years. So coming back to that shared set of values when you're negotiating things like timelines, and what it means to have a social impact ROI at the company, and what it means to achieve the public interest objectives of both the university and the company, I think is incredibly valuable. I'd like to mention something. When we talk about PIT, we talk about models or data that deliver good for everybody. Take the disaster reduction models that we use to assess risks: normally we use data and models, and sometimes the models are very good in the way they analyze things, but the data is not that good.
In this program that I run, there are two parts. One is the CCC, the college core certificate. The other is a research platform that we have developed to be able to assess the reality of the people on the ground. When we use FEMA models to assess risk, we get some results: expected losses, social losses, people who might die. But when you go to the reality, it's completely different, and the risks are a lot higher. So we need to use high technology; we use a lot of drones and a lot of heavy processing to get better results, and we use GIS. And then we get things that are completely different. And why do we do it? Because we need the students to become aware of this. Because we need the students to get close to their families and make them aware of the risks to their lives. But normally, again, we want to go to the communities and be able to deliver those services for them. And this is something that we cannot do from the framework of the community colleges. Because community colleges have a great good to deliver, which is giving people the possibility to grow, but you cannot do consulting within that framework. So we need to create outside companies, non-profits, or other entities to get close to the communities that we are already engaged with, to give our students the opportunity to have internships, or for us to be able to work with them. Because there's something that I always think about, working in the educational world: we feel like we teach. Do we really teach? We don't teach. We are like players who expose knowledge to students. Students learn. That's a different issue. We don't teach; students learn. That's the important part. With communities, it's the same thing.
We want to go to the communities and deliver the tools for them to be able to empower themselves, so we can support them in their insurance claims, and to be able to assess the investments that the villages and the cities have to make for disaster reduction, to see whether they're worth doing or not. So in this sense, we are ready, with a group of consultants and people who are ready to deliver the results. We have been delivering them for years already, and we need to jump into the for-profit or non-profit side to be able to deliver those services to the people who really need them, the ones who are already living the disaster. And that's where we need you guys to help us do this. Sorry for that publicity. Go ahead. I might be able to synthesize something. You raise a good point around the timeline to deliver results across metrics. And to connect that with the concept of finding overlapping values: generally, at large companies, a mission is something that is robust through time. And so finding those shared values, that shared purpose at the roots of the tree, is the way to get there, so that through the noise of changing metrics, changing objectives, changing the name of an ESG program, or something else, through all that change, you can stay at the roots and be stable. That's just something to share, because lots of things change, but generally mission and vision are robust. Yeah, big plus one to that. We can keep coming down the line, actually. I was going to pose the question, what's one thing, as we wrap up the initial round? And do we have time for Q and A? We might then transition into that, so prepare your questions. But if we can keep coming down the line: what's one thing that you think can enable these successful partnerships?
Then it sounds like learning, a focus on learning. You mentioned, Professor Canatio, that there's already some shared language in the approaches and the data methodology, perhaps, that your students need to learn, and the tools that they need to use to apply it to this real-life problem. You just mentioned, Alex, the shared values; that shared language is the shared values, perhaps. If we can keep coming, Ann and then Ish, and then we'll wrap up with Professor Sweeney: what's one last note on enablement for these kinds of partnerships? Yeah, sorry, I wasn't sure where in the line you were starting. I guess my reflection, which folks will have heard before, is that every overnight success was 15 years in the making. And that's because at the heart of any of these partnerships, whether it's with the private sector or government or between universities, are the relationships that you have. And relationships take time. It seems like it happens overnight, but actually you've been building and strengthening those relationships for a long time, and we would not be in the position we are today if we didn't have a relationship among a national group of universities that could speak with a national voice, instead of one of us alone, to a huge multinational company where we had also really invested the time to develop relationships. A quick corollary, which is a way to not build relationships, is to ask for data sharing on day one. It's like going on a first date and asking if you want to share a bank account or something. And it's a common experience. So I would plus-one both of those, but since I'm third, I'll come up with my own. I think that understanding who your partner is, what their objectives are, where they're coming from, and finding the persistent common thread, is really important. When you're working with a Cisco or another technology company, there's often a core purpose there; for us, it's powering an inclusive future.
There's often a core technology capability that they bring to the table. But it's not about understanding how the internet works and routing; it's understanding how our technology, and how we think about our technology, powers our purpose. And if you can find that thread and have that in common, that is persistent. That will last longer than the next 90 days or the fiscal year, which is often what you'll hear within a lot of companies. Those things tend to be very persistent; they tend to be ingrained in the culture of the company. And really, a lot of these issues that we're talking about today resonate with our employees and our future employees. So as you think about working with technology companies, these are important things, not just for the bottom line, but also for our people and our future, really the future of the company, as we try to attract additional talent, et cetera. I wanted to follow after everyone else had gone, so I'll just give exactly the answer from my perspective: it is primarily about relationships. You know, my career started with data privacy. I sort of had the opportunity to be the first to show this problem to society, and to be able to watch as laws around the country changed based on a little simple experiment that I did back then. And what's interesting is how many of those relationships are still with me, and how those relationships made huge changes that people have no idea about. It's literally those relationships that you turn to. And you just don't know where they're gonna come from. I mean, I was a graduate student at the time, just seeing this unbelievable change happening and being able to connect with insurance companies and with public health and the Department of Defense and so forth. People that you wouldn't normally think would respond well to: I just re-identified your data; I just showed you've got a big privacy problem.
Things that were on their surface kind of embarrassing to them in public, a kind of shame-on-you kind of thing. But actually being able to do that and walk out of it as friends, trying to solve future problems together, has been incredibly powerful. I think that's such a beautiful note to round this out. The power of relationships, the power of the existing institutions that we're all at, but then also our trajectories as we go through our careers. And I think that's a great call to action for some networking and some relationship building after this. Is there Q and A? I can't read that, actually. 10 more minutes. Okay, okay. Can we open up for Q and A then? All right. Can someone carry the microphone here? Oh, that's good. Thank you. Thanks so much for the panel. Nick Zingali with Cleveland State. So my question, and I'm gonna try to get this out without stumbling: it's a two-part question for the panel. Do you think there is a widening gap in the public perception of value between the private sector, government, and universities? And if so, what do you think needs to be done to shrink that gap? First, in one way, this panel has been kind of like la-la land on companies, but there's a flip side of companies, too, who aren't at this table. And right now, especially in the United States, there's a lack of trust. We just don't know how to build trust at scale. Companies don't know how to do it, but their business model depends on keeping you in their product. And those kinds of business models, which drive a huge amount of our economy these days, create a real tension point and really do tear at our fabric in a really important way. To the extent that public interest tech also has a message in there, and a way to help solve those problems, and a path to solving them, that is crucially important. It's not quite the scope of this panel, but that is the flip side.
Yeah, a big plus one to that, and I would just add: we're doing a different project that we call Cybersecurity Futures 2030, and around the world, in a way that's remarkably distinct from what we heard five years ago when we did this project, people are saying the core concern of cybersecurity is to match the speed of trust with the speed of innovation. And I can't say that anyone has a silver bullet for that; it's probably a lot of silver buckshot. But I do think there's an increasing recognition that trust is the core problem, and it's got to keep up with innovation if we're gonna unlock all of the benefits of technology. Yeah, I'm gonna respond to maybe a reinterpretation of the question, or a different interpretation of the question, which is public trust in the private sector, universities, and government, all three of those combined. I don't know if there's a loss in public trust, but I think that all three of those parties are dealing with some really challenging problems, and a lot of them are in the public limelight. And that makes solving and talking about those problems challenging. So I think having some productive forums for those groups to come together, probably primarily through relationships, is the path towards resolving that. There's real tension and friction there, and forums like this are a really great start, but broadening them and making them even more productive and more specific is probably the way to get there. I think this tenet of trust is really important. Most companies would say that this is one of the most important assets that they can have with their customers, with their partners, with the institutions they work with. And as the reach of the internet and mobile devices has gotten broader and broader, and they hold more of your personal information, the notion of being a steward of that trust becomes more important.
And something, Professor, that you said earlier about thinking about some of these issues earlier in the design process and in the thought process around some of these technology trends is really important. I think we're seeing it in real time right now with generative AI. There is a massive amount we could do to improve, or irreparably harm, the trust among the three parties that you brought up. And I think it's something that we have to figure out. There's no one company that's gonna figure that out; there's probably no one university or one government that's gonna be able to figure that out. But I think it's a perfect example of something where, if we think about it now, we have a chance. And if we kind of let it go, maybe it'll work out, but I don't know that I would bet on that, right? So... I wanna take two words: public perception and trust. There is something behind all this, we all know. The country is getting deeply polarized. So it is very hard for anyone to trust if the narratives are so divergent. We don't build agreements to make the country stronger. So I think this is also a consequence of the political reality that we all have to be aware of. I come from a country where we lost the country in a similar situation. Of course, it's quite a different thing, but having such a polarized political environment makes it hard for people to understand each other's perspectives. That's valuable. I think I would actually like to bring an opportunity angle to the trust piece, where I think this is an opportunity for PIT, right? Technology has been seen as the silver bullet for trust, and it obviously isn't, but I think that when we are able to encourage ourselves as public sector...
I keep saying that; private sector organizations, to lean into that discomfort and kind of play with the edge. We can't walk away from building technology, and so we have to lean into the equity, the privacy, the technical and societal implications of building technology. We're actually seeing that at MasterCard: leaning into it is our way to build trust, because if we can't build trust in our data, we're not an entity anymore. And so leaning into data protection from the very beginning, and trying to be at the cutting edge of that kind of work, particularly from a societal angle, has actually been a way for us to generate business value, but also to think about trust from a strategic and, I think, a societal angle. Do we have one or two more questions in the room? Go ahead. Hi. Thanks for the panel. It's really nice to hear these industry perspectives. So, it's good that we got on to generative AI, I guess; it was always going to come up. But now that you've raised it: as soon as these large language models started coming out, there was an issue within academia very quickly about, well, what does this mean for assessment? How do we know about plagiarism? All these kinds of things. And some of the discussion that was raised was whether there were choices you could have made in the design of these models that would have meant they had different places in the world, right? You could have watermarked them, for instance, and then some of those questions would have gone away. So this question about how you build foresight and thinking into the design process is really, really important. But we also know that it's really hard to do, within academic labs but also within companies.
So, as people that work on this kind of stuff, as people that represent some of these organizations, what do you think are the most important organizational capabilities to do this work? Either that you have seen built, have been involved in building, or that you see are lacking at the moment? I'll jump in, and, you know, return back to the scope of the panel in answering that. So the argument to a VC firm or to venture capitalists or to a startup company is really clear: you can lose your technology. You get to the marketplace, and you have a disruptive technology; you can lose it completely. We don't see it happen that often, but we have some great examples. We almost lost face recognition completely. The Super Bowl in 2001 was the first time that face recognition came out. They surveilled everyone in the crowd. It was for law enforcement; they wanted to see who had a warrant. It freaked people out. There were laws all around the United States banning face recognition, and all the researchers working on it were prepared to lose their funding. Congress was going to basically defund it, and if it hadn't been for 9-11 happening a few months later, face recognition would have just been a goner. There have been other examples. Sony paid huge amounts of money over a camera that could see through clothes because it had infrared light: if you were photographed in dark clothes, the image would literally show your skin and tattoos and so on, and there were recalls and so forth. There are real risks to companies in just ignoring these problems. The problem is we're not able to hold them accountable, because there's a lack of transparency into their technologies, and they sort of masquerade it over. Where are all the places that the Cisco data, not to pick on Cisco, but where are all the places that data goes? I don't know, but shouldn't I know where my data might be? I don't want to go down that pipe.
But I did just want to say that that's usually the value proposition that we were making. And in my experience, the ones who come to the table first, the ones that want to talk to us first, are usually the ones with the most egregious technologies, because they're scared or they're worried about it. Anyone else want to weigh in? I mean, I'll say something. I think one of the earlier panelists was saying, specifically about AI, that it's a tool. And for the most part, most of the companies working on it think of it as a tool. Now, I think it is incumbent on the leaders of those companies and those organizations to think about how it could be used and to think about what safeguards they might want to bring in. I think some of the examples that she just gave have some pretty obvious challenges, whether it's privacy or just, you know, what's the right thing to do. The one thing that I would say, though, is that it won't be as effective if we just rely on the inventor of a technology to figure out all the uses of its tool. You'll never come up with all the corner cases. I mean, we develop technology; we don't carry any data, by the way, we just move it. I've been involved in many product development efforts, and every time you put something out into the world, somebody finds a different way to use it that you'd never anticipated. And if you don't have some kind of process to understand that, collect it, react to it, think about it, you will always have some unintended consequence. So hopefully you've done enough up front, and the more and better input you get from organizations like this, or other partnerships, or government, or what have you, the better chance you have. But, I promise you, you will still find some unintended consequence. It's a truism of almost everything that we've ever developed.
I think it's a truism of anything any technology company's ever developed. I'll just quickly say, I do think this is a place where you absolutely need collaboration between the private sector, government, and academia. There really is no silver bullet, but I also want to say: part of the solution has to be that it should not be an individual's responsibility to protect themselves from the harms of something like generative AI models or large language models. You shouldn't have to read through six pages of consent forms, be able to understand it all, and opt out of your data being processed with every single corporation on the planet. It shouldn't be individual responsibility, and yet we don't see a movement of individuals demanding better cybersecurity or better AI security from companies and from government in the way, for example, that you start to see grassroots movements of individuals demanding better climate policy, both from the private sector and from government. And so, to me, there's a real missing piece here: what would it take to build a movement where we as people were demanding better AI security, in your example, from both regulators and companies? Go ahead, Carl. You mentioned choices in the design of models; I caught those words when you began. In the risk analysis for environmental hazard models, if we had understood from the beginning that there is a social disparity, that some people suffer more than others because their social vulnerability is higher, and that when they lose something, they lose everything, maybe those models could have been designed differently. Now we're trying to solve this, but it doesn't really work, because the market is already saturated with models from very powerful companies that deliver those solutions for the municipalities.
And then we want to point out that there are some communities in there that are suffering more than others, and it's very hard to get there. And when you try to work with those communities so they can raise their voice, they don't have the money, they don't have the time; they have so many problems that it's almost impossible to work with them. I think that... I'll be very quick, I promise. Back to the initial question, and maybe this is a controversial perspective to end on, so apologies. But I think that large established companies that want to retain the trust of their customers might actually have something to offer to startups. One thing that I think gets done pretty well in general with generative AI, where there are examples of it being done well, is generative AI not being treated as a tool and a platform for any use, but instead looking at specific use cases and governing them with some sort of internal control partner council that thinks of customers first. And that comes from the nature of being an established company that wants to maintain trust, versus a startup that's looking for a niche or looking for some market. So there might be something to learn from that specifically. At the beginning of this panel, we tried to define what PIT really is at this point in time and how it's defined differently. Professor Hernandez was encouraging us earlier to try to keep that definition as open as possible, so I would say we can actually continue to add to it. It sounds like the PIT practitioners and the educators and the folks that are in this room are also going to be the translators of policy and of implications for society.
It sounds like we need the PIT practitioners to be focused on ahead-of-the-curve development, making sure that those considerations are there at the very beginning of the development process. And it sounds like there's also a huge need for advocacy and activism from this group, in terms of being able to organize around what the standards should be, what some of these ethics should be, and what some of the implications for society are. So with that, keep defining this. Please join me in a round of applause for our panelists. And also for you and for New America. Yes, thank you. A huge thank you to PIT-UN and New America, and to Boston University. Thank you.