Hello, everybody. Welcome to this session. Today's session's title is best practices in PIT programs, and we have an amazing set of speakers to talk about it. I'm your host, Dr. Faheem, and I'm a professor in the School for the Future of Innovation and Society at Arizona State University. Today we are going to talk about some disconnects: disconnects within PIT, and disconnects among the students, or potential students, of public interest technology. Sometimes what we have seen is that many people look at PIT as just a technology and do not focus on its other societal and humanitarian aspects. We don't talk about the adoption of human-centered design. We don't talk about product development. We don't talk about engineering and re-engineering, or about how it involves an iterative process, how it includes people. So here, with our different speakers, we are going to focus on some of the philosophical, equity, and social justice aspects of PIT. And we have an amazing set of speakers; I will first introduce them to you. We have with us Professor Alexandrina Agloro. She's a professor in the School for the Future of Innovation and Society at Arizona State University. Then we have Professor Elisabeth Graffy, a professor of practice in SFIS at ASU. Then we have Ms. Joana Abu-Ghazale. She's the president of Pivot for Humanity, an organization that has been working since 2018. Then, momentarily I guess, Professor Erik Fisher will join us; we're still waiting on him. And we also have Professor Robert Cook-Deegan, who is a professor in the School for the Future of Innovation and Society as well. I will start the conversation with Professor Alexandrina Agloro. Her specialty is as a media artist and a community-based researcher, and she believes in the possibilities of the decolonial imaginary when it comes to using ancestral technologies as liberatory tools.
So the first question I have for you is: when we imagine PIT, when we talk about technology, how do we contend with the inherent racial biases that we can see in our systems? How do you see that? How do you address it?

I was also muted; I get to open this up. Though this is a digital convening, we're all located on Native land somewhere, so I greet you from the traditional lands of the O'odham, the Yavapai, and the Maricopa people. And I understand that it's not enough to give land acknowledgments, and that real change is Indigenous land repatriation. I strongly believe that public interest technology can play a role in this kind of liberation. So when we're talking about bias in PIT systems, I'm going to tell a little story. I was speaking with another ASU professor just yesterday about the first cohort of Native students to go to ASU. They had come straight from the Navajo Reservation, and they had no experience living in the city. A few of them were homeless; they slept under the bleachers. It wasn't just college life that they were unprepared for; it was an entire culture and system and lifestyle shift that they weren't ready for, because they had no prior experience in this system. I'm telling you this story because it's also what's going on in our current tech systems. So when we talk about bias, I'm talking about education and curricula, and I want to start even before we get into the classrooms, because there are students, particularly young people of color, who never even make it into the classroom. How do we combat this bias? We have to think about things like intimidation and disadvantage, what it means to get into classrooms, and whether or not students feel welcome once they get into those classrooms.
When we think about careers, in general young people's sense of their options, of who they can be in the world, is based on who they see in their families at home and then what they see in the media. You can't be what you can't see. If they don't see people who look like them, that possibility is harder to imagine. I'll do a curricula point and then I'll stop and we can keep going. When thinking about curricula, as educators, for combating bias, think about your syllabi. Who are your students reading? At the minimum, do you have women on your syllabus? Do you have people of color on your syllabus? Do you have queer people on your syllabus? Do you have trans people on your syllabus? And it's also important how we represent who's in our syllabi. Diverse authors should be woven into the core concepts of your courses. They shouldn't be siloed into the race day or the gender day, as if, outside of special-interest topics, women and people of color don't exist. And can we talk about women, people of color, and queer people in technology without positioning them in a deficit-based or damage-based frame? Not always talking about what's wrong, what these people don't have. So I ask: can we have full-color conversations that depict the world as it is, instead of the narrow slice that we currently see? Okay, Faheem, back to you.

Thank you so much for your thoughts. You actually started to answer one of my follow-up questions, and I think we will come back to it. But one thing is to recognize the problem; the other is how we move forward, how we make it more equitable. And you started talking about that, so we are raising the questions.
Before we move on to our next speaker on a similar topic, would you like to add something about how we should make it more equitable? Because in many cases, for example within ASU, we have just started the program. What are the things we can do?

I like this phrasing that disability activists use: nothing about us without us. When thinking about design, even from a technological design perspective, do not create things for people in order to help them. You want to co-design with people, to teach skills that enable diverse stakeholders to take charge of the issues that most affect their own communities. When we use words like "giving voice" or "empowering," and I'm sure these words come up, especially in a relatively new PIT space, I would say don't do those things. Give skills to people; teach the skills so that you're actually not needed. If you can step back from the work that you are in and they can carry on without you, then you've totally done your job. To move away and to not be needed is exactly how I think we can make our systems more equitable: teach skills, move on, and let people use those skills. Don't be needed. And lastly, divest from unequal structures of power, which I know is really hard, because we're all coming from fancy places with fancy degrees and fancy educations and fancy jobs. But being equitable means that we might get a little bit less so that everybody can have a little bit more, and you have to be ready for that if you really mean it. That's it.

Thank you. We need to be ready for that, and I will come back with some specific follow-up questions; I'm pretty sure the listeners will have follow-ups as well. So I'll go to our next speaker, Ms. Joana. She is a doer. She's the president of Pivot for Humanity, which she founded in 2018.
And the idea of this organization, which I find fascinating, is to work toward a more responsible social tech industry and to foster a responsible, ethical, and accountable internet. How do we build an accountable internet that also serves humanity? That's my question, though not for this session; eventually, I think you will address it. So, Joana, I would start with you as a practitioner: you look into the industry, and you are striving to make it more ethical. How should we teach ethical codes of conduct and principles in STEM, within the curriculum? What needs to happen? What needs to connect the dots, to bridge the gap between theory and practice? You are muted.

Okay, now everybody else has to do it, because it's a pattern. Okay, Elisabeth and Bob, we'll all start talking on mute. Thank you so much. I'm thrilled to be here and really humbled by the panelists. You know, before we started, Faheem asked if he should address me as doctor, and I said no, I'm just regular old Joana. I don't have your qualifications and expertise, so bear with me as I bring this outside perspective, outside of academia. So, how do we teach codes of ethics in STEM? I would start by taking a step back and asking how we are teaching STEM in the first place. We typically think of STEM disciplines as objective, rational, empirical, cerebral disciplines, when in fact, in the real world, in application, they're not. They can get quite messy. There is bias in all of them, for example, so they're anything but objective and empirical. And one of the things I think is really important is to ask: are we grounding students of STEM
in the idea that these are applied sciences? That they're not just theories, but things that form the actual building blocks of the digital infrastructure of society? These aren't just concepts to learn; they're going to be part of how these students go out into the world, what they build and what they do. So one of the things I would say is that we shouldn't think about ethics in STEM, but the ethics of STEM. Not as an additive, extraneous thing that we bring into the discipline, but as something that's embedded in it, that's innate, that's part of the process. And with that, I'd say we first need to define what that is as a whole. What is STEM? What is STEM's role in society, and how do we think about its ethical consequences and implications? Importantly, in terms of education, there has to be a clear understanding of what ethics means in the context of STEM and the ethics of STEM, but also a quasi-universal definition, so that the framing is the same from institution to institution, from academic organization to academic organization. Because if it differs across institutions, you're going to have a patchwork of ideas about what's right and what's wrong. And then you asked how we connect the dots from that theory, from that academic environment where students are learning and getting grounded in the context of ethics, to practice. I think that happens in essentially two ways. One is, we need to re-humanize the data.
We're so enamored with big data right now, and Alex just mentioned "nothing about us without us," but we tend to see big data as having all the answers, the "numbers don't lie" kind of thing, and to take it as gospel, as though there were a real truth in it. Big data also obfuscates. It obscures; it makes us as individuals invisible; it hides our humanity. It dulls everyone into a number, and so people actually disappear. So one of the things I think is really important is to re-humanize the data, and you can do that with students by making it personal: by helping them, for example, whenever they're thinking about a problem or a process, to ask not just who it is for, but what if it is going to be used by somebody you really care and worry about on the one hand, and somebody you fear, somebody you think is very dangerous, on the other? And what if both people use it at the same time? What would you do differently? How would you think about creating this thing differently? That is about expanding the personas and making them personal, so that when we are creating things, we know these are things we eventually want the entire world to use, and we ask what that looks like. And second, if we really want ethics in practice, I think it has to switch from being something aspirational, something nice to have, to something that is required. Like it is with professions: it's a must-have; you are duty-bound to practice ethical behavior. If students just graduate with an understanding of what's right and wrong and go into the workforce with no power to enforce that understanding, to apply this learning, then we find ourselves at ground zero; all the education doesn't matter. It didn't help that they were trained that way.
They have to have the power to say: I need to apply these ethical standards in my work, even if the corporate values or the structure that I'm entering doesn't support it. Professionals are duty-bound, and I think if we ever want ethical practices in STEM, we have to make them required, not desired.

So what I'm getting from this conversation is that it's not just a tick box; we have to actually walk the talk and include those things as if we mean it. I have some more practical follow-up questions for you, related to how young practitioners in PIT can include that not just in their curriculum but once they're actually in the industry; I'll come back to you. Thank you so much. Our next speaker is a professor of practice, Dr. Elisabeth Graffy of the School for the Future of Innovation and Society. She focuses on issues related to disruptive changes at the intersection of science, politics, and culture. Her research explores and experiments with energy innovations that enhance social sustainability and resilience, and she has advised and worked with numerous NGOs and academic and government organizations. The first question I have for you is this: looking back on your years of federal policy experience, what capacities would you want PIT graduates to have if they would like to work in policy? What kind of curriculum do we need to provide for people who are interested in a career in policy? What can we do, and how do you see the connections, or the tools, there?

I'm going to break the trend and unmute myself from the start. Yes, we all learn technology; we share our knowledge with each other, right?
Thank you for the introduction. It's really difficult not to just respond to the people who have already spoken before each of us, because the conversation seems to want to get going. Let me address your question, Faheem, in a couple of different ways, and first put it in context. I was in the federal government for almost two decades, and for a lot of that time I was a national policy advisor: initially in Congress, working with folks around the country and in the executive branch, and then in the executive branch itself, working in science agencies with policymakers. So that gives me a particular perspective. One of the experiences I had there, and these experiences have also been reflected a little bit in my teaching at ASU over the last 10 years, is that there is a tendency for people with technical degrees to arrive in the policy process believing that their purpose is to shed rational light on the issues before policymakers, that the main knowledge they need is their technical knowledge, and that somehow there is a linear, pipeline sort of relationship into the decision-making process. It's actually not true. That's not really how it works. So one of the first capacities that any students interested in working in policy spaces should have, regardless of the level of those policy spaces, would be an authentic and deep understanding of the institutional arrangements within which decisions are being made. That means not just understanding the rules, and not just understanding that there's a thing called politics, but understanding these as systems that are as technology-rich, although they are social technologies as opposed to physical technologies, as anything they may have learned in their STEM disciplines. So, one, I think, is to develop a really authentic understanding.
We say "transdisciplinary" a lot, but I think there is a kind of integrated, holistic view that people really need to learn and embrace, and it's not typically taught. So one of the challenges for PIT curricula would be to figure out how to do that in a really solid way, so that people come out with mastery of the whole system, not just the technical disciplines. Another way of thinking about this is that PIT is not a way of softening STEM disciplines. PIT is not, I think, ideally, for engineers who want to learn about society or policy. PIT is a signal; it's a way of reclaiming the idea of technology as a public thing, a public phenomenon, a social phenomenon with multiple facets. What does that mean? Well, technology isn't just something that Apple designs. Technology occurs in all sectors, public, private, and hybrid, and within and among different kinds of disciplines, each of which has its own ethical premises, its own ways of thinking about how you plan and design, its own ways of thinking about risks, its own ways of thinking about who publics are. And in a lot of ways, each is governed by different policies and laws about what rights those, let's call them non-traditional co-designers, the public or people with a public interest, might have: what roles they may have, what rights they may have to get involved in the processes of technology, or even in determining what counts as technology and what doesn't. If you look at what happens in the policy space, there is a public interest technology theme in pretty much every agency and every legislative space, whether you're talking about elections or food or water or medicine, health, climate change, or energy development. Really, across the board, there are technology
elements. I forget whether it was Joana or Alex who talked about it, but if we try to split those things out into technology elements on the one hand and cultural or policy elements on the other, I think that would lead us astray. There are other ways of thinking about how messy the space can be, but the challenge for PIT, as an emerging discipline, is to recognize that messiness and make it tractable for students, so that they can learn how to think about those things in an integrated way, and function across them, without feeling that their role is to be a technology disruptor in a public space, or to become a more politically aware technologist. I think we're really talking about a new kind of holistic capacity, and we're learning, as we put our own program together, how to do that. I wasn't watching the time; I don't know if I went over or under. Sorry.

You're good, we are good, and these are important things to talk about. In continuation of your conversation, may I ask you further, in terms of looking ahead, hopefully to the incoming administration: how do you see the issues you are talking about, being disruptive, being inclusive, intersecting with social sustainability and resilience in this case?

Pretty much all the issue areas I mentioned are things the administration will contend with, but let's just look at a couple of them. First off would be the pandemic. The public interest technology issues there, I'm guessing, Bob is going to talk about, so I'm not going to get into them much at all. But the thing I want to stress here is the way we think about technology and about managing the pandemic, or whatever term you want to use.
It's not just about whether we have the right medicines, the vaccines and therapeutics. It's not just about whether we have the right kinds of digital maps that show us the different risk exposures around the country, even though they're very cool. It's not just about whether we have tracking apps. There is a fundamental socio-cultural, political resilience aspect to how we do it that is at least as important, if not more important, than the specific technological options we may develop, and I'm using "technologies" here in the more traditional sense. If we develop tracking apps, or really effective masks, or terrific vaccines, but somehow we do it in a way that depletes or fails to actively cultivate social cohesion, they aren't going to get us very far. And I think those kinds of considerations have so far not been very well represented in the discussions around the pandemic. We're seeing some of the results of those things not having been priorities from the beginning: a lot of resistance, defiance, and confusion that are in some ways a response to the fact that those were not priorities. The same thing is going to be true with climate change, whether we're talking about energy innovations or about specific kinds of climate interventions that may be necessary, some of which we're doing research on at ASU. These are issues that are going to have to be developed not in laboratories and then sprung on the public; they need to be brought into conversations where we are thinking about how we're building resilience together, as well as how we're solving problems that we might normally think of as STEM-y problems.
I think those challenges are really front and center for this administration in particular, and I'm hoping they're going to prioritize them in their selection of leadership in different agencies, as well as in how policies are designed and implemented.

Thank you so much. You also gave me a very good conversation starter with Bob, so I'll start with you, Bob. Our next speaker is Professor Robert Cook-Deegan of the School for the Future of Innovation and Society at ASU. Bob was also the founding director of the Center for Genome Ethics, Law, and Policy in the Institute for Genome Sciences and Policy, and he's the author of The Gene Wars: Science, Politics, and the Human Genome. My first question to you, sir: when we talk about the COVID-19 pandemic, the present situation, and about PIT and how to make it more inclusive, more equitable, more responsive, more complete, how do you see it? Not just in the short term, with PIT graduates addressing this COVID issue, but in the post-COVID-19 period. How can PIT graduates be ready for this? Where are the scopes? Where are the possibilities, and maybe some challenges?

Yeah, thank you, Faheem. Just focusing for a second on the area I work in: I'm a physician who became a molecular biologist and then went into policy. I think one way to frame what we're trying to do with PIT is to stop thinking about STEM as an end unto itself, and rather think of technology as a means to an end, and those ends are human ends. In the area that I work on, biomedical innovation and medicine, think about it in the context of the COVID epidemic. The United States has the most robust system for biomedical innovation that's ever existed on the planet, and yet we have horrible outcomes in connection with this pandemic.
How can we explain that? We have this incredible amount of expertise and resources; we spend more money on healthcare in the United States, by a factor of two per person, than any other nation on the planet, and yet our health outcomes are worse. We exclude more people from our system than any other jurisdiction among the rich countries, the OECD countries. We actually do worse than many countries that are not OECD members. For example, when the World Health Organization ranked health systems, a little over a decade ago, actually two decades ago, we came in between Cuba and Costa Rica, and yet we're spending 10 times as much per person as Costa Rica on health services. So that tells you: wow, we've got a system that is not focusing on outcomes; it must be focusing on inputs. That's what we're trying to educate folks about: wouldn't it be nice if we focused our technology on actually reaching the outcomes we intend to achieve, on healthcare outcomes? The COVID crisis has illustrated very starkly the inequities in the current system. It has also illustrated the incredible strengths of the system. It is totally amazing; just think of what has happened in the last week. Number one, the Supreme Court is considering dismantling a law we passed a decade ago to include more people in our health system. That's a great thing to do in a pandemic, right, kicking people out of the healthcare system? That happened in the last week. We also had the announcement of really optimistic data about a new vaccine. The technical people have been working 24/7, really, really hard, to develop vaccines, and it looks like maybe we will have some vaccines available early next year. That's a wonderful thing. But now we're going to have to solve the problem: who's going to get the vaccine? How is it going to be distributed?
Will it be fairly distributed? Will only people in rich countries get access to it? Those questions are all still hanging out there. The final thing I'll raise as an illustration: compare what's happening with COVID to another huge epidemic that affects even more people on this planet, hepatitis C. In the United States we have drugs that are actually curative; 90 to 95% of people who get hepatitis C can be completely cured. We have drugs that treat and get rid of an illness. Egypt had the highest prevalence in the world, and now it is on the verge of eliminating hepatitis C because of these new drugs that have become available since 2012. In the United States, fewer than one in six people who are infected with hepatitis C have actually been treated. The richest, most expensive healthcare system in the world has completely failed to deliver a powerful technology that can eliminate a disease. Why? Because our system is really, really poorly structured. Those are the sorts of issues we're trying to train our students to think about. It's not that the technology doesn't work; it's that our system of paying for things and doing things is not functioning properly. That's it.

We wanted to hear more on this from you. You shared a very strong example of focusing not just on the inputs but on the outcomes, and then you talked about the other cases. But imagine the PIT graduates: first, they need to be included within the system, and then they need to change the system. Some PIT graduates, when they see all the things going wrong, can feel a bit overwhelmed.
So I'm wondering whether there are some short-term wins that we can then turn into long-term wins. Can you quickly shed some light on this?

I think we could open this up to everybody, but I want to go back to something Alex raised, because I think she basically said the most important thing we can say, which is: teach skills and get out of the way. And I think that's what we're trying to do. I'm thinking about the master's students in our new public interest technology master's program. It's amazing, because they all have day jobs. This is very different from teaching a PhD student, for whom that's all they are doing. These are folks who have day jobs related to technology and actually want an intellectual background that gives them the tools to go out and do their work in such a way that it reaches the outcomes they went into their business for. Everybody usually picks a career path partly because they want to pay their mortgage, but they also actually want to make a difference in the world. These students are looking for the skill sets that would allow them to make more productive use of technology, so that it reaches the ends they intend. It's a really unusual group of students, and I think we're going to learn as much from them, distributed all over the United States, as they're going to get from us. What we can do is channel that effort and create a skill set and a set of exercises that allow them to be more powerful in the jobs they're already doing. I think that's probably the kind of education that will be a bit distinctive to PIT, compared to your standard PhD or your standard undergraduate education. Thank you so much.
I think I will revisit this question to ask everybody about it, but this is fascinating; I was actually taking some notes. At this point, before I go back to some of the other questions I have, let's look into some of the questions our audience has put to us. I see several. The first one, I think, is for Alex: in regard to Agloro's idea of teaching skills instead of helping, the question is, do they even want or need these skills, and what are the skills you have in mind, Alex?

That gets at a really good starting point, which is: don't go into places where you're not wanted. That's probably the precursor to all of this. It isn't just about teaching skills and, as Bob said, getting out of the way; it's also treading carefully, being cautious, listening, and then proceeding with care. Humility is a huge part of this. When thinking about teaching skills, I think humility is probably the number one skill we could be teaching our students. Humility brings ethics; humility brings listening. In particular, in the ways we've been trained as technologists, when we think about building technology, or in our current engineering and computer science curricula, there's an ethics day and then we move on and just carry on going. That's because we're not proceeding with humility, not thinking about what we do and don't know, and about who could potentially be helped but also who could potentially be harmed by this. So if the question is which skills in particular, it's the skills necessary to better whatever scenario we're in, and so it's really hard to say exactly which skills we should be teaching; it's whatever is needed at the moment. Obviously, if somebody wants water, they don't want to learn how to code.
There's a disconnect there. If we need health care, we don't need to learn how to code; those sorts of things. So we're thinking about a hierarchy of needs. Let's take care of first things first: let's get basic humanity on par, and then we can think about this other tech stuff that can make our lives easier. Food and water and health care and shelter are probably more important first, before whatever cool tech thing we all want to learn how to do or know how to teach. Thank you so much. So the second question actually starts with some reflection, and then there are a couple of questions, and anybody can take it. That's a great idea: if they cannot relate to the data, it does not make sense. Indeed. So the questions are: how do we make that happen, especially when the learner has never experienced it? For example, how do you make them understand extreme poverty when they have never come close to it in real life? I'll take it. Yeah, I think that's a really good question, but I think that's a lot of the work we need to be doing, because as we've focused more and more on big data and the bigness of the data, we've actually reduced our need and desire to really understand context, to understand where this data comes from and what we don't know. Alex was just talking about humility, and I think big data gives us an arrogance boost, because you think you have all the answers. I think we have spent a lot less time bringing in new perspectives, challenging our assumptions, making sure we understand that not everybody looks like us and wants the things we do, and making the space to include those voices and experiences. 
I mean, one of the things that I think is just fact today is that technologists are creating with the incentive, the measure of success, of: does everyone on the planet use your thing? And if that's how we're going to think about it, then it is imperative that we know who everyone on the planet is. And if not, then just say it's not for everyone on the planet, and make sure we understand who it's for and how to protect the people it's not for from the harms it causes. I think it has to come from, again, humanizing the data and making it personal. What would some of the questions be? We're ego-driven people, we're human, so what would we fear? We would fear not being understood or not being seen. And it's about making sure that, as students, as practitioners of PIT, we have in the back of our minds that there are people out there whose lives we absolutely do not understand, and the assumptions we make can be deeply, deeply harmful, not just to them but to the rest of the world. So a huge part of it is to say: let's relearn what the world is made of; let's develop a new, warm, small-data understanding of the planet and the environment we're in. Thank you. Anybody else? Yes, go ahead. I want to take that and bring it back up a little bit to the policy level, because I think there's something confounding about this, which is that all of it, whether we're talking about technology development and design or about implementation and use, is happening within an institutional context. By institution I don't mean an organization over there, but that there are certain rules of the game, ways that things work, that dictate how things will happen. 
And those rules can incorporate ethics and awareness and inclusiveness, or they don't necessarily have to. But I think one of the traps we sometimes fall into, not just with PIT but with anything that involves public engagement or the idea of any interest at all, and this cuts across all policy areas, and science in general, is the notion that we fix it by including people early and often and co-everything. One of the things that is just a reality is that when you are in a policy position in government, when you're in a position of responsibility, you don't always have the ability to do that in the granular way we might imagine. You don't always have a way of checking in with everybody; you don't necessarily have a process for actually contacting everybody in the world. And that's why I think building that kind of reflexive awareness into how we teach people to function in those kinds of spaces is really important, because sometimes there are mechanisms for directly engaging people, and sometimes there aren't. When they are not there, you need to be able to figure out how to interpret data and a few anecdotes; you have to be able to project and think about the implications of what you're actually doing for real people. At least, that's when I became really interested in finding this out: when I was in government and started doing research on this stuff. I was interested in finding out how much people really want to be included, because we talk about it a lot; there's a lot of academic writing about the need to be inclusive and to engage. So I did research about it. And part of what I discovered is that there are a lot of times when people don't care. They don't want to be engaged. It's the job of whoever got elected, or whoever was appointed, to fix it. 
You know, to be responsible, to be accountable to the public, or to the publics, and to do your job so that they can continue being nurses or teachers or wait staff at restaurants, or professors teaching something else, and not this. And I think that is one of the absolute trickiest pieces of all this: there is a constant balancing point between being directly inclusive and being able to somehow keep that frame of mind even when you don't have the ability to run focus groups about every decision you make, and you can't realistically co-design in a practical way with everybody. And I'm wondering if Alex might have views about this, because there's a very micro and a very macro aspect to this work. I think the micro stuff is a little easier to think about in terms of classes and curricula. But the kind of work our students are likely to be doing may be at that macro level, so how do we teach them to take that sensibility forward with them? Do you want me to respond? Okay. Yeah, you looked like you wanted to. So I think this is the moment for me to plug that we can't let go of the arts and the humanities, because we all stop being human without those lines of thinking. As we're defunding the arts, defunding humanistic thought, and piling everything into STEM education, STEM-ing everything, we need to not leave behind caring for each other, learning those skills of inquisitive thought, curiosity, asking questions, not just solving the problem set. And people do have to make on-the-ground decisions on their own. For as much as I am all about co-design, I also know that if we co-design everything, we get nothing done, because we'll be moving so, so slowly. 
And so my approach to it is that somebody has to be in charge; even if it is co-design, somebody has to be in charge, and at the end of the day somebody has to actually make those decisions. And people also want to feel safe: better co-design is co-design within some kind of parameter or structure, where you're deciding between certain sets of options, not dumping out the entire sandbox and starting from scratch, because then nothing gets done. So I guess, on a macro level, I think you're totally right that people need to do their jobs and make decisions. And again, humility and ethics and humanistic thought are things we should keep teaching our students going out into the world, to help them process better, think better, make better decisions, and be more thoughtful. Just a thought here: I also think there's a methodological thing we can insert into how we're teaching our students. This question is really about how we can get stuck in an office thinking about statistical lives instead of real lives in the real world. Lisa and I used to work at this place called the Office of Technology Assessment, which was part of the US Congress. One of the things I learned there was that the most powerful methodology I ever had as a study director at OTA was to actually go out on field trips and talk to the people doing the things we were supposed to be writing about. It's amazing how few people actually do that. And we've brought that into our curriculum: we're asking all of our students to go out and actually interview some stakeholders about the technology they're focused on, so they don't just think of this as statistical stuff. It's real people facing real questions in their real lives. 
And I think there are methodological things we can do like that to change the norms, so that it's not so academic and only focused on publications. I totally hear you. Joana, you have a comment? Just a quick pile-on here. I'm in violent agreement with all of the comments that are happening. I also get a bit anxious when I hear so much, and I know this is the panel we're on, but so much responsibility being put on the educators of PIT or STEM. Because you can be the best educator, the most inspiring; you can create a legion of students who want to go out there and do nothing but wonderful, beneficial things for humanity. But if they go out into the world and are confronted with institutions that want none of that, it's not something the educational system alone can fix. It has to go beyond that; there has to be something holistic. Elizabeth was talking about reclaiming the idea of tech as a social phenomenon, and I think that means it has to go beyond the walls of academia, into the workplace. So what are we also doing, what expectations are we setting, how are we preparing our students for that disconnect? If there is a way we can build this in, make it required rather than desired, we can say: no, you don't have a choice. And they can't make you not do it. They can't say, no, your ethics don't matter because my KPIs matter more. If there is a way to protect the students once they leave that environment, that's the Holy Grail. So listening to Bob and Alex and now Joana makes me think that language is really important. And we haven't talked about what our backgrounds are. I'm a political scientist and economist by background. 
So words like "public interest" actually mean something very specific to me, and it has to do with competition and coalition building within the policy space, to put it mildly. And I think if that's the perspective people take with them into these jobs, where they have to decide what ethical stance they're going to take, what are they going to bring if they're small fry in the organization, and what do they do if they're in a leadership position? Listening to you, I'm thinking that perhaps the language we should be using is "public service technology," because it just puts you in a different frame of mind. The frame of mind it puts you in is that, regardless of what role you have in your organization, whether you're someplace that really wants to have that perspective or not, it makes you outward-looking. It makes you focus on trying to understand how you can be of service, and to whom, instead of becoming complacent that as long as you can figure out what the interests are, you're good. And it turns it from a relationship-based way of thinking into a methodological or analytical one. I don't know what the rest of you think about that, but that public service ethic is something we've seen make a big difference in the last few years, and going forward into the next federal administration it's likely to change how governance gets done, if it comes back. We have about four minutes left, so we need quick responses. I think that's a really interesting framing, Elizabeth, because changing it from "interest" to "service" imbues the practitioner with an ownership, with a role, with an action, rather than just a knowledge. 
And I think that is a really interesting and really important shift, because it puts the learner in the driver's seat. It's an active term versus a passive term, so I'm all for it. Okay, so we are actually redefining PIT within an hour; that's awesome, this is a very good conversation. So we've talked about language, we've talked about contextualization, and we've talked about governance, policy, and the technology industries. I actually have a question for all of you, if you can reflect on it quickly. The globalization of PIT is not just about us, not just about the OECD countries; it's about the world we all inherit. So in terms of the globalization of PIT, and let's be very specific, imagine our program within ASU, which we have just started: what are a couple of things each of you would like to see to make sure that we are inclusive, not just domestically but globally as well? Whoever wants to take it first. Well, one thing I'll observe is that online technology, in many ways, is not the same thing as being in a classroom; it's not quite as rich an experience. But the big advantage is that you can do it from anywhere. And while our inaugural students in the master's program are all from the US, spread all over the US, and I don't think we have any international students yet, there's no reason we can't. The skill sets are at least as relevant, maybe even more relevant, in countries outside the US than they are in the US, and I think the methodology and the ways of thinking about emerging technologies are totally global. Can I jump in? 
I think Bob talked about it earlier when he mentioned the importance of people going out in the field. I think the idea of thinking about the field is a really big one, and we need to go into it. You don't just include voices by sending out a survey and seeing what people say. You do your homework up front, you go into the field, you spend time there. And the field means very different places, very far and very removed places, and I think we ought to spend a lot of time thinking about how we can enter them respectfully. Yeah, I would second what was just said by Joana and Bob, and also note that recruiting, if we're thinking about global issues here, is clearly a question: people who can't engage in online learning are going to be excluded by definition. I also think the newness of the term, and people not understanding what it is, could be an issue. And third, if we're trying to figure out how to embed this, then we may need to think about things besides degree programs. We may need to think about short-term weekend offerings, things that are more accessible to people who aren't ready for a degree. All right, closing it out: I think that learning together globally helps everybody learn better. If we're learning things as a whole, then it isn't just us looking at them and them looking at us; we have a lot to share. The US-based students can learn our own culpability for how the world has gotten to be the way it is, and some of the places where we've been at fault, and then not fall into the trap of making simple fixes for simple problems that leave everybody feeling good but never tackle anything of substance. We could do all of that better together, learning together. 
Thank you so much, thank you so much. You see, this was very fruitful, and if I end with some of your quotations: nothing about us without us, so we need to make sure that we're inclusive. We are going to reclaim the technology so that it does not just focus on inputs but on outputs as well. We need to make sure that we do co-design, as was mentioned, that we are asking real people real questions, and that we have policymakers with empathy, so that we can make sure PIT is inclusive, equitable, globalized, and definitely also contextualized. Thank you so much, everyone, for this conversation. I wish we could talk more, and hopefully we will continue the conversation offline as well. Have a great rest of your day, and I'll see you all soon. Thank you.