Hello, folks. Welcome to the Berkman Center. I'm happy to moderate this session, where we will be discussing the content and the implications of Haochen Sun's book on technology and the public interest. Just a few words at the outset. Quite a long time ago, Haochen came here for an LL.M. degree, and at the time he was working with me on this issue, and he's been focused on it ever since. It was prescient for him to take up this topic, because it has become ever more important in the years since. For a little more historical context: debates about the relationship between distributive justice and technological progress have been going on for a very long time. They've been running parallel to an equally venerable debate about whether technology, or specific manifestations of technological innovation, actually serves the public good at all. And on the assumption that a particular advancement is net socially beneficial, how to make it as widely available as possible, within a given society and globally, has been discussed for a long time. That discussion seems to have peaked within the legal system in the mid to late twentieth century, when there were, as Haochen indicates in his book, various expressions of the right of all persons to access technology. Two aspects of the pandemic have brought it to the fore once again. The first concerns access to information technology: when we all came to depend upon online education, it quickly became apparent that access to the resources essential to online education, including internet access and computers, was highly unequal, even within the United States. The second of the two aspects of the pandemic that brought this issue back to the fore concerns access to vaccines, when we, on the one hand, celebrated the extraordinary breakthrough and speed of the mRNA vaccines and their capacity to shield us against the coronavirus of the pandemic,
and then acquiesced in a pattern that resulted in a very slow and still highly unequal distribution of the vaccines globally, with Latin America, and even more so Africa, having to wait much longer for any access to vaccines. Now, people disadvantaged by these especially glaring instances of inequality are pressing for remedies, and most of the time they're pressing for remedies in the form of institutional change, like the COVID waiver, for example, currently being discussed in Geneva, which its supporters argue would increase global access. But Haochen is providing a different way of addressing this debate, which is rhetorical and legal: trying to re-elevate a theory, or understanding, of the nature of the public interest and its relationship to technology, one that is compatible with, and arguably could help support, the changes which are the main focus of activists at this time. So it's against that backdrop that it's especially timely for him to come and give us an overview of this book and provoke a conversation about it.

Thank you so much, Professor Fisher. As Professor Fisher mentioned, I actually started off this project as an LL.M. student working on a long paper dealing with copyright limitations and the public interest, and Professor Fisher was my supervisor at that time. So it is so great to have him here with me right now, introducing the book. I'm very excited to be back at the Berkman Klein Center, talking about my book. I'm going to talk briefly about the background of the book and why I decided to work on this topic. Professor Fisher mentioned the two aspects of the COVID-19 pandemic that have caused people's concerns about technology and the public interest. The first is absolutely the inadequate access to the internet for online learning. So I'm going to supplement that with some data.
For example, at the very beginning of the pandemic, around 15 million students shifted to online learning, but surprisingly, in the United States, one fifth of these students had no reliable access to the internet in their homes. And around that time, research showed that around 42 million people in the US were unable to purchase broadband internet access. It is also a global problem: in 2019, nearly half of the world's population had no access to broadband internet. You may take it for granted that the United States is the most technologically advanced country in the world, but still, internet access, as the pandemic has revealed, is one of the most serious problems in this country. On the vaccines, I also have a few figures to supplement what Professor Fisher said. As of September 2021, a mere 3% of people in low-income countries had been vaccinated, in stark contrast with the 60% vaccination rate in high-income countries. So we have great technology, such as the mRNA vaccines, but the question is how to distribute such technological benefits to more people, not just people in the high-income countries. And even under normal circumstances, the public interest may not be adequately served. One of the best examples is social media companies. We are dealing with such companies every day, and there has been a severe asymmetry between the power of such companies and the responsibility they assume. One textbook example is provided by Facebook. We have all been using Facebook a lot, right? The problem with this company is that its data policy has caused serious harm to the public interest.
For example, as you must know, the Cambridge Analytica scandal is the most serious breach of personal data, and Facebook's mishandling of fake news is said to have swayed the outcome of the 2016 US presidential election. So it's not surprising that a Utah judge even denounced Facebook as a tool for evil. This has caused a lot of trouble. And when we look at tech companies as patent holders, you can see a lot of irresponsible actions and decisions. For example, at the very beginning of the pandemic, a patent holder actually launched a lawsuit to block coronavirus testing involving the use of its patents. That was denounced as the most tone-deaf IP suit in history. And again in 2021, when the vaccines had been developed, the World Health Organization encouraged Big Pharma to participate in technology-sharing schemes such as the COVID-19 Technology Access Pool. But as of March 2021, no pharmaceutical company had voluntarily joined such a scheme, while some of them had even condemned it as nonsense and dangerous. All that has happened motivated me to work on this book project. My central question is how we can develop and apply technology in the public interest. My approach is to combine rights and responsibility, so I call it the rights and responsibility approach. The first part is about the protection of rights. I propose that we should recognize and protect a new right called the right to technology. First of all, we could protect it as a human right under the international human rights system. Secondly, we could further protect it as a collective right in domestic civil rights law. I also offer a thought experiment arguing that we could even protect it as a fundamental right under domestic constitutions.
That is the rights part. The second part of the proposed approach deals with what I call fundamental corporate responsibility. Tech companies should be obligated to reciprocate for users' contributions. They should also assume an active role responsibility and counter injustices caused by technological development. In general, I want to set the agenda for protecting the public interest in motion first, and then, through actions such as lawsuits, we can gain a better understanding of the nature and scope of the public interest. This is because the public interest is one of the most elusive concepts, and so is technology. So let me move on to talk about the human right to technology. Actually, this is not new. Some seventy years ago, international leaders came together to deal with the unprecedented harm caused by the Second World War, especially by military technologies: the weapons used in the war caused unprecedented harm to human lives and property. They came to the conclusion that technology must be utilized in the public interest, and they devised specific clauses promoting such an agenda. Article 27 of the Universal Declaration of Human Rights states that everyone has the right to share in scientific advancement and its benefits. And Article 15 of the International Covenant on Economic, Social and Cultural Rights has a similar mandate. Surprisingly, ever since the adoption of the UDHR, this human right has been what some scholars call a sleeping beauty. In my book, I call it an orphan right in the international human rights system. This human right has received scant attention and remains obscure, dormant, and totally ineffective. Why is that? In the book, I reveal three major contributing factors. First of all, this human right is inherently obscure.
How do we understand the nature and scope of technology? How do we share the benefits of technological progress? These are obscure concepts. Second, as you know, the international human rights system lacks teeth in enforcing human rights. When there is a violation of human rights, it is exceedingly hard to enforce through a human rights treaty, because the system lacks enforcement mechanisms. In my opinion, the third factor, the international community's overemphasis on intellectual property, is the leading contributing factor. Why is that? Because in the past seventy years or so, the international community has emphasized intellectual property protection, and a whole bunch of international IP treaties have been concluded. The international community was trying to promote the distribution of technological benefits through IP protection. IP protection means voluntary transactions in a marketplace: if you want to enjoy, for example, a patented product, you have to get approval from the patent holder or pay certain kinds of royalties. This market-based distribution of technological benefits has eclipsed the human right to technology. So how can we resurrect the human right to technology? In the book, I argue that we could first protect it as a collective right to technology in domestic civil rights law. Before I talk about collective rights, I want to situate them against the idea of individual rights, such as freedom of expression, property, and privacy. Those protect individuals' personal interests. In contrast, collective rights are designed to protect people's interests in their social membership. They focus on how people can become social members, not just on individual rights. This kind of bolstering of social membership is key to the idea of collective rights. There are two kinds of collective rights.
In my opinion, the first kind of collective rights is societal rights. Societal rights are about how we can take advantage of all kinds of resources to better develop a community or a society. The right to development is a typical example of societal rights: according to this right, people can rely on natural resources to develop the economy and make society much better. Group rights are the second category. They are designed to promote or protect resources, such as language and cultural symbols, held by minority groups of people. So if we protect technology as a collective right, first of all we can protect it through societal rights. When technological benefits are crucial to the development of society, they could be subject to the societal right to technology. I think internet access is one of the best examples: we rely heavily on the internet to participate in civic dialogue, and so many things are occurring on the internet that it is essential for people to have access to it in order to participate in civic life, politically and culturally. Group rights to technology are also very important. Professor Fisher mentioned pandemics, and I think the HIV epidemic is a very important example of how we can rely on the group rights idea. It is basically about identity: if we recognize HIV-positive people as a distinct identity, that status could entitle them to claim that they should enjoy the benefits of medical research in the area of HIV medicine. This right can also guard against harmful uses of technology targeting specific groups of people. For example, facial recognition technology has been widely used these days, but some facial recognition systems automatically identify, for example, people of color as criminal suspects.
This harmful way of using technology can be guarded against by the group right to technology. As I said, I also offer a thought experiment to see whether the right to technology can be further protected as a fundamental right under domestic constitutions. What are fundamental rights? Under US constitutional law, some liberties are so important that they are regarded as fundamental rights, which receive the most rigorous form of legal protection. Good examples are freedom of expression, protected by the First Amendment, and property, protected by the Fifth Amendment. These are enumerated fundamental rights. The US Constitution also allows courts to recognize fundamental rights that are not expressly recognized by the Constitution; these are unenumerated fundamental rights. How can we do that? Basically, we can rely on the concept of liberty under the Fourteenth Amendment. Courts have recognized unenumerated fundamental rights if they can be defined as liberty under the Fourteenth Amendment to the US Constitution, such as the right to travel, the right to vote, and the right to marry. In my opinion, we can rely on this Fourteenth Amendment conception of liberty to recognize a fundamental right to technology. If it is recognized as a fundamental right, that will set a further mandate requiring the government to provide a fair and equal distribution of technological benefits. What counts as a fundamental technology? Basically, if technologies are fundamental to the sustainability of people's lives and freedom, then they should be protected by the fundamental right to technology. Fundamental technologies such as electricity, transportation, telephones, and internet access should be protected, and the government should try its very best to distribute the benefits of these fundamental technologies to every member of the country.
But I should add a caveat here: we are not supposed to require the government to distribute fancy forms of technology that are not fundamental technologies. We should not require the government to let everybody have, for example, a ride on Mercedes-Benz buses. We should not require the government to distribute the latest version of the iPhone to every member of the country. We are talking about fundamental technology, not fancy or luxury forms of technology. Now I want to talk about the second part of the proposal. It is about the recognition and enforcement of fundamental corporate responsibility. First of all, I argue that we could rely on the ethical norm of reciprocity to require tech companies to reciprocate for users' contributions to them. Reciprocity, as I mentioned, is an ethical norm requiring that one respond to a positive action by another by returning that action proportionally. I think friendship is a very good example: friendship is normally fostered by the ethical norm of reciprocity. When a friend treats you to dinner or gives you a gift, you are supposed to reciprocate by treating them to dinner, or maybe just lunch, giving back a gift, or saying something nice on Twitter or Facebook about their kindness. That is the healthy way of developing a friendship. If we rely on the idea of reciprocity, we can see that users' contributions are very important here. First of all, users have contributed a lot of content to tech companies: we have uploaded a lot of videos and posted a lot of information on a variety of internet platforms. Users are also essential to tech companies' and social media platforms' advertising revenues. This is because companies know that social media platforms can use targeted advertising to reach out to users. Users are really essential to online advertising these days.
Users have also been essential to tech companies' innovation. A good example is the data that users have contributed to tech companies. When it comes to artificial intelligence, such as ChatGPT, users have actually contributed a lot of data; without users' data contributions, it would be really impossible to develop artificial intelligence. We need users' data to train the algorithms that are applied in artificial intelligence. Secondly, I also argue that we should require tech companies to assume an active role responsibility. The idea of role responsibility was first put forward by the British jurist H. L. A. Hart. He argued that responsibility should be imposed upon a person based on their specific roles, such as being a husband, being a sea captain, or being a judge. A husband needs to provide support to his family; sea captains are supposed to take care of their passengers; and judges are supposed to make impartial judicial rulings. If we take a closer look at tech companies' roles, we find that they play the role of disseminating information: users upload a lot of information onto their platforms, and they disseminate that information to the public. So they play the role of information disseminators. They also collect a lot of information from users, so they play the role of information collectors. They are also information creators; again, ChatGPT is really the best example of how tech companies can create information. Given that they play these roles, I argue that they should take their roles seriously and assume an active role responsibility accordingly. Last but not least, I argue that tech companies should also confront the injustices caused by technological development.
Social justice, or distributive justice, as Professor Fisher mentioned earlier, is really one of the core, fundamental values that we cherish dearly. Conventionally, we think of two forms of injustice: resource-based inequality, where the unequal distribution of resources leads to inequality, and the unequal recognition of social status, such as racial discrimination, which causes status-based inequality. These days we also have a third form of injustice, which should be called technology-driven injustice, because technology is actually causing it. Why is that? We have seen a lot of examples of unequal access to technological benefits, such as unequal access to the internet and to COVID vaccines. That is the first kind of technology-driven injustice. The ways in which technologies are used can also cause this kind of injustice: improper uses of technology can cause serious harm. For example, the emission of pollutants has caused serious degradation of water resources. In the last chapter of my book, I talk about how we can enforce this kind of fundamental corporate responsibility, using tech companies and patent holders as examples. If you are interested, you can read the last chapter of the book. This is pretty much what I wished to share about the book, and I look forward to comments from the panelists and from the audience. Thank you so much.

Okay, two comments. First, this is an excellent argument, presented with order and clarity, and it is written very well; I encourage you to actually buy the book.
One critique, to promote discussion and engagement with the audience, is that we have a very different situation in the US. In the COVID situation, of course, access to the internet was a problem. But more access, and more consumption, sometimes actually creates more problems; that was the case with COVID misinformation. I think some of the solutions of course make sense in the Global South context, but in the US we have a very different situation. It is the same in South Korea: they have a lot of access, but that does not mean they have better knowledge of how to use it, and access sometimes creates more problems. So how do we reconcile different demands across different tiers of nations?

Sure. Great. So first of all, I want to say this is a really important and timely book, and it is great to have an opportunity to have a conversation about it, post-COVID but also amid a lot of the key challenges that we are facing today when it comes to large-scale platforms and the development and deployment of AI systems. One of the real reasons why I like this framework is that it offers an alternative regulatory and ethical governance framework to the public health framework that we often see discussed for platforms, and I will come back to that in a moment. So I have a comment and two questions about the book. One of the really nice values and contributions of the book is the history you chart for Article 27: you look at the right to technology under international human rights law, and you are quite right that the right of everyone to share in scientific advancement and its benefits has lain dormant. I think you breathe life into that history, and that is important to understanding current debates, for sure.
And I would suggest an additional bit of history from the same period that has been somewhat less dormant. The way I came to engage with it was that, years ago, the UN Special Rapporteur for freedom of expression, Frank La Rue, in his 2011 report, recognized a right to the internet. It got a lot of attention, and shortly thereafter I wrote a sort of brief history of the intellectual origins of this right. As part of that history, I looked at the right to communicate, which got a lot of momentum in the 1980s, and the intellectual origins behind both that and La Rue's report on the right to internet access lie in the post-war context that you write about in the book. In addition to the interesting scientific right under Article 27, in the post-war period you had the free-flow-of-information paradigm. During the war you had propaganda and radio jamming at scale by states, and the solution in the post-war period was that information needed to flow across borders; knowledge was key to that, and access to shortwave, long-range radio technology was seen as the long-term solution to ending the propaganda that could lead to war. So there is a really interesting history there: the right to communicate gained momentum within UNESCO in the 1980s, and some of that history speaks to access to technologies. Part of how it lost momentum in the 80s is that it got tied up in contrary movements within the international community around censorship and radio jamming, and so it lost the support of the West. But I think that is another historical angle that supports exactly what you are arguing, at least internationally. Two other quick comments, and then questions. Secondly, for those who were here earlier for Biella Coleman's presentation, she talked about the politics of technology, so I am thinking of Langdon Winner's famous piece.
Do artifacts have politics? Winner argued that they do. The famous example he gave is that the bridges on Long Island were built low, intentionally, so that public transport to the beaches on Long Island would not be as accessible to marginalized communities, like African Americans or working-class families who would be using public transit, because the buses could not pass underneath. So maybe one question I would pose to your work, because you are quite sensitive to the politics of the public interest: how do you respond to the politics of technology? To use the COVID example, and Yong also mentioned this in his comments: during the pandemic, Zoom enabled great opportunities for connection and communication during lockdowns, but we are also seeing in retrospect how dreadful the educational and learning outcomes of remote education have been, and it is marginalized communities that have been disproportionately affected. So remote video-conferencing technology enables certain social outcomes, but it also precludes social outcomes. That is one question. And the last question, which I would love to hear more about: I know that in your final chapter you talk about patents, but I would like to understand a little more about how you think your framework works in relation to some of the new governance proposals that we are seeing in the US and around the world for social media platforms. In Europe you have the Digital Services Act. The conversation in the US includes leading BKC folks like Jonathan Zittrain talking about information fiduciaries, imposing responsibilities on platforms and technology companies.
And lastly, my home country, Canada, is thinking about imposing on platforms a duty of care. Often the justification for those frameworks is a public health framework: technology has health and harm implications, and therefore we should justify these regulatory frameworks on the basis of public health conceptualizations. But I think what is interesting about your proposal is that it provides a rights-based framework for those same proposals, though maybe I am taking it too far. So my question is: how do you view your framework in relation to some of those new regulatory proposals? I think it is interesting because it provides an alternative to public health in this way. Thanks.

Thank you so much, Yong and John; these are great comments, and I learned so much. A quick response to Yong's concern about having too much access to, or too much enjoyment of, technology. This is indeed a concern these days because, for example, smartphones have become our best fellows, or lovers; they could be more important than our beloved ones. So you are talking about the other aspect of technology: while technology can do good things for us, technology can also do bad things. I talked about the example of the Second World War, where military technologies dealt an unprecedented blow to our world. Dealing with the harmful, even egregious, effects of technology is absolutely part of this project, because the right to technology, or the human right to technology, as I mentioned, is meant to prevent technology from harming the public interest of humanity. That is the idea. So we can rely on this idea to deal with the harmful effects of technology.
For example, I think the group right to technology is a very good example of how we can rely on this idea to prevent harmful uses of technology against, for example, a group of people defined by their race. And children could be seen as a special group: how can we prevent technology from being used in ways harmful to children's interests, for example on social media? If we have such rights, we can prioritize these agendas, because these are the rights that deserve urgent attention and legal protection. Now, with respect to John's first comment.

Yes, the additional history behind the right to communicate, which, along with the right to internet access, gained momentum in the 1980s within UNESCO. One conceptualization that gained momentum was a positive right: that states ought to provide access to communication technology. So it is additional history that, I think, provides further foundation for your argument.

Yes, this is really relevant; I actually cited La Rue's report in my book. According to that report, access to the internet relies on the human right to freedom of expression, instead of the right to technology. So this is an interesting element showing how this human right has been forgotten. Now, with respect to your questions: I think the other regulatory approaches are also really important. For example, I talked with Professor Jack Balkin yesterday about the information fiduciary approach.
I totally agree that these theoretical and policy-oriented approaches are really important. But one of my concerns with the information fiduciary approach is that it is a little bit narrow: it only deals with tech companies' responsibilities regarding the collection of personal data. Actually, tech companies do much more than collect personal data. As I mentioned, they also play the role of disseminating information, they create information, they moderate content, et cetera. In my book, I show that the rights and responsibility approach offers a broader way to deal with the whole range of problems that tech companies have caused. The approach I propose in the book shares some commonality with regulatory approaches such as the information fiduciary approach, but in general it extends to several other important areas as well; I intended to broaden the information fiduciary approach.

That is really interesting. And just so I understand you: from your view, from your framework, information fiduciaries is too narrow a framework. I think that is really interesting. So maybe your approach is more likely to justify, for example, the EU's approach under the Digital Services Act, which imposes an obligation on companies to manage their systemic risks, and that can include a range of different contexts: not just data collection, but content moderation, recommender systems, all of that. Okay, that is really interesting.

And on your other question, about how we should better distribute technological benefits to a range of disadvantaged groups: I think the rights and responsibility approach also has this kind of edge.
So, for example, the information fiduciary approach, as I said, is more or less dealing with the collection of personal data, right? In my opinion, we should also deal with broader projects, for example, the distributive justice that Professor Fisher mentioned. The whole landscape we're seeing is that tech companies are trying to dominate our daily lives, but the problem is that we still lack the legal tools, legal doctrines, or regulatory mechanisms to counter their dominance. How do we do that? I think the right to technology, as well as fundamental corporate responsibility, can shed light on how we could come together to redress a lot of problems, for example harms to specific groups of people and to society at large. We need to give people the legal tools to protect their interests. For example, if we rely on the recognition of the collective right to technology, then we can empower citizens to sue the government or tech companies if their collective right to technology is not adequately protected. And if we have legal actions like that, we can start talking about not only technological benefits but also the nature and scope of the public interest. Right now, a lot of these issues hide in black boxes, for example the algorithms developed and utilized by tech companies. So, yeah.

I think at this point we should engage the audience some more. We started a little late, so can you give me a sense of how long we can go? We have three questions from the audience here, and then we have a couple from the virtual audience that were handed to me. Okay, great. Do you have a microphone to pass around, or do you want to take mine?

Thank you so much. Thank you. Yeah, I just wanted to ask about your comment on distributive justice, the question of how we can better distribute technology to the disadvantaged.
I'm glad you started with the examples of COVID-19 and HIV/AIDS. My father died of COVID. That's fine. But I want to give an example. The way he died is that the government was instructed by the West to push everyone into quarantine. So schools closed, and schools there don't use technology like Zoom here, so there was no school for about six months. And my dad, who lived alone, now had to accommodate about 20 people in the house. That was one of the ways in which he actually ended up getting COVID. The second thing is that within the communities there were people coming up with ideas, saying, oh, I think this can work, and the governments were instructed not to follow those ideas but to wait on the West, the Western world, to create a vaccine to help those people. People are living without a solution because the West dictated that. And it's the same with HIV. I know some people would say, oh, those people are not qualified. But the question is, who decides who's qualified? I know, for that matter, a scientist who actually came up with what he called a vaccine or a cure for AIDS. He was highly scrutinized. He still uses it. The government of Zambia, where I'm from, actually tried it, and they said, we said you can use it, but it's useless. That was the point. So the question that comes up, when we talk about the fundamental right to technology, is: whose technology are we talking about? To what extent can you actually say, I'm not feeling well, I'll go to Zambia and use their medicine? Or to what extent are you going to say, I'll go to the Global South and have surgery? So I think those are some of the things. Saying "a fundamental right to technology", to use an African phrase, is a pregnant point. We need to define what we mean by technology.
Do we really mean an inclusive technology? Are we open to actually accommodating other ways of thinking, or does somebody else get to decide what I need for help?

Should I respond? You probably should respond to that question. Okay, great.

Thank you so much. First of all, I'm very sorry for your loss. What you said absolutely reminds me of a paper that Professor Fisher published on the local production of vaccines in low-income countries. I think it is very important to promote local capacity. So, thank you for this great point. What you shared with us reminds me of a very important dimension of the public interest. When it comes to the public interest, we normally think only in terms of national considerations: we protect the public interest, for example, just within the United States. In my opinion, this is a narrow-minded understanding of the public interest. The public interest also has a global dimension, and this pandemic has proven that if we only have a national conception of the public interest, it's not going to be workable, because a coronavirus can spread around the globe. Right. And the idea of the right to technology actually promotes this kind of global vision of the public interest, because we need to think about how we define the public interest for society at large and for particular groups of people, within one country or in another country. And a fundamental right basically cares about every individual's personal needs. So it does have a global dimension.

Yeah, I think we should take one from the online audience and then wrap it up from there. One person asked, and I think this really helps tie up the conversation, that in a global world with borderless technology,
How can we actually partake in a borderless, united dialogue and effort when it comes to unifying us around this concept of the public interest, as we move forward with making sure that technology is accessible to all? What does that look like when we live in a world that is borderless in many ways and we have very different backgrounds? How do we take that into account?

This comment is absolutely related to what this gentleman shared earlier. The pandemic has proven that we absolutely need to have a global vision and think beyond borders. The human right to technology, as I mentioned earlier: national leaders 70 years ago actually saw beyond borders by protecting this human right. This human right is regarded as a universal right; it's embodied in the Universal Declaration of Human Rights. So from the very beginning, this human right has had a borderless mandate. But the problem is that when it comes to intellectual property protection, it's actually protected domestically; intellectual property protection has boundaries. So that's another way of thinking about it: the potential conflict between the protection of the human right to technology globally and the protection of intellectual property rights within a specific country. This could cause a conflict of interest, and it's a matter of how we should deal with it. I think the last chapter of the book sheds light on this. We could think about rights, and at the same time we should also think about tech companies' responsibilities. When it comes to responsibilities: intellectual property rights are protected not only in the United States but in other countries as well. So for an intellectual property owner who, for example, lives in the United States, their rights are also protected elsewhere in the world, although there are boundaries, right?
Okay, so the responsibilities attached to their intellectual property rights should be enforced not only here in the United States but in other countries as well. If we think through this kind of rights and responsibilities approach, we can have a much broader vision of how to deal with a whole range of global matters.

Just two things in closing. The first is that, following this lead, Haochen and his colleague Madhavi Sunder have contributed to a course, an online course that Ruth Okediji and I put together on intellectual property and global public health. It is now in its 10th of 12 weeks, and most of the participants are from low- and middle-income countries that were intentionally selected, so there are 500 people from 90 countries, and it is a global conversation about these issues. With respect to the question from the gentleman from Zambia: all the materials are open to the public. You might find useful a recorded interview I did a week ago with Martin Friede, who is the head of vaccine research at the World Health Organization. He comes from Namibia, was educated in Cape Town, and is now the head of vaccine research in Geneva. He is very interested in the issues that you identify and reflects upon them in this interview, which is accessible through the website for the course. The answer is not simple, that's for sure. But that's one source of information. Here's the second comment. I would urge Haochen, as you deploy this book into the world, not to put too much weight on formal adoption of a right sufficiently vigorous to support lawsuits against the companies.
I think that's unlikely to happen, at least in the United States, and the climate in the EU is only a little bit different. What I think is more generative, very helpfully generative, about your book is altering the rhetoric, the way in which we think and talk about these things, which can support a wide variety of initiatives to augment the distribution of justice vis-à-vis technology. Thank you, everyone, for coming and for participating in a lively conversation.