My privilege as well to welcome Dr. Amy Gutmann to the stage. She's the president of the University of Pennsylvania, as well as a political scientist, writer, author, and editor of many books and academic articles, and an ethicist as well, an evolution of her work in politics and academia. And I think we're going to try not to come all the way down to Earth, but to extend the last conversation. Since neither of us is a scientist, we're not going to have any discussions about Philosopher's Toe Cheese and other fun subjects that preceded us. But to some extent, we are going to extend this conversation about frameworks, practical frameworks, governing frameworks, and regulatory frameworks that was part of the last conversation, with the advantage that Dr. Gutmann has actually overseen what I think history would record as the only specific review of synthetic biology governance. It's just been completed. A report was released about three months ago. I hold the executive summary in my hand and direct you to it after this conversation if you haven't already absorbed it. But I thought what we would do with our time first, since not everyone is equally acquainted with the work that you've been doing as the leader of the Presidential Commission on Bioethical Issues. Is that the way it's framed? Was it stood up to accommodate your leadership, or did you join an existing body? It was created. I was appointed chair, Jim Wagner, president of Emory University, was appointed vice chair, and then a commission was appointed to serve. And it was newly created. Every president has had a commission, and each one has been newly created, more or less, by the president. So let's talk first in narrative terms about the study you did of synthetic biology, which seems by the account of your own report to have begun with the media coverage of Dr. Venter's event, whatever it's best described as being.
And so tell us how that generated this study and what you actually did to reach the conclusions in this report, in summary form. So I believe it was May 20, 2010, the J. Craig Venter Institute announced that the first synthetic genome had been created, injected into a bacterial cell of a different species, and that cell became self-replicating with the new genetic material. That day, I received a letter from President Obama charging me, or as the New York Times put it, ordering me, to take up our investigation of the ethical and social responsibility implications of this new discovery and do it in, I think it was nine months' time, Val Bonham, as executive director, six months? Six months' time. And we did it in six. We actually did not ask for an extension, as many of my students have done in the past; we actually did it on time. And I'd like to tell everybody who works for me at Penn now to follow this model: when the president asks you to do something within reason, you do it, and you do it on time. So we took up a- Can I just stop you? Yeah, please. You said the letter commissions you to examine the ethical and social responsibility issues. Was that short of examining the regulatory framework or legal issues? No, it included that. In fact, one of the things that I would say to all of you is that, partly I think because of the way ethics has been used in politics, which is very narrow, we and the president in his charge have defined ethics far more broadly to include issues of oversight, regulation, anything that would be required for us to consider to guide the federal government and the public in how synthetic biology, moving forward, will be used to maximize social benefits and minimize social harms. So that definitely includes looking at the regulatory situation.
And in fact, that was a central part of our investigation, because if you think about bringing, as you said, Steve, ethics down to earth, one of the big issues is: OK, what kinds of regulations are needed, and what kinds of regulations should we avoid, in order to have this science proceed in a way that maximizes public benefit and minimizes public harm. I want to talk about those philosophical principles, but just before we do: during that six-month process, you took testimony, you held hearings. What else did you do to reach the conclusions that you published? It's fair to say that despite the fact that the members of the commission are national leaders in science and medicine and ethics, none of us was an expert in synthetic biology. So the first thing we did was really bone up on what synthetic biology is and what its main discoveries are to date. And to do that, we not only read a lot, but we asked some of the main members of the synthetic biology community to testify before the commission. We also asked some of the main critics of synthetic biology to testify before the commission. And we asked people who are members and, in some cases, leaders of groups that have taken on the issue of both genetic engineering and synthetic biology to testify. And we questioned them. They questioned one another. And, not surprisingly, given my specialty as a scholar and teacher, we engaged in a form of democratic deliberation on the topic. So to come to the frameworks and the principles that guided you as you approached your conclusions, because you do issue some findings of fact that I also want to explore. But let's start with the framework, the philosophy. And it's the title of this conversation, Public Beneficence in the Pursuit of Science. And you used language that sounds, without my being able to quote my Jeremy Bentham and my John Stuart Mill, pretty utilitarian: maximize the public benefit, minimize the risks to the public.
And I would assume that in reaching that framework you reviewed, but must have set aside, the competing school of thought, the precautionary principle, which has risen in some regulatory frameworks to argue against risk analysis on the basis that where there are technological or other environmental risks that may be severe and irreversible to a certain degree, government or the state has an obligation to intervene even before the science is certain, if that's a fair summary. Did you go through a kind of philosophical review in addition to the fact-finding? So that's a really fundamental question and a complicated one. And here's what we did, and let me simplify without making this simplistic. We felt that if you're building a house and you have the drywall and nails and insulation, but you don't have the frame, you're never going to build the house. You may have a lot of the materials, but the house isn't going to be structurally strong. So what we needed were some guiding principles. After all, we are the Presidential Commission for the Study of Bioethical Issues. So we needed some robust principles. But we didn't want to take them out of thin air. We wanted the principles that would be the most relevant to this topic. And so we picked public beneficence, because it is part of the charge of the president to recommend things that are in the public good. But we also picked principles that critics, as well as defenders, of our final recommendations, which we didn't know at the time, could subscribe to, principles that are not purely utilitarian. For example, responsible stewardship. What does responsible stewardship mean? Well, it means that we are trustees of the interests not only of ourselves, but also of future generations. And we have to be responsible stewards of nature as well. We also thought that the principle of intellectual freedom and responsibility is a core principle.
Part of the greatness of our democracy depends on the freedom and responsibility of scientists and other intellectuals. And another principle, and I won't go through all five of them because I know you have other questions and you all want to hear more about our conclusions. But another principle, which again is not utilitarian in nature, is democratic deliberation, which is: we would hear all views, and then, on the basis of those deliberations, come to a set of recommendations. So another way of titling this, since the overall conference is called Here Be Dragons, is: if the dragons are synthetic biology, our goal was dragons do good. We would like to make recommendations so that the dragons do good. And that, I think, is a goal that everybody could agree to. And in coming to that goal, did you, looking at the particular risks and state of synthetic biology, come as a commission to an explicit finding that the precautionary principle was not warranted in this case? We did. We did. So here, we began by looking at some of the more purely philosophical objections that could be made to synthetic biology. For example, a lot of the headlines, and if it wasn't in the headline, it was in the first paragraph of almost every story, said the Institute had created life. And if there were the creation of life, and indeed, if there were the creation of any life, let alone higher-order life, from inorganic materials, that would raise a series of really interesting philosophical questions that have not been posed by any other discovery, except in science fiction. A lot of these stories also used the term Frankencell. Well, it turned out that there wasn't the creation of life. And we figured that one out pretty quickly: this synthetic genome, which was created from inorganic material, was inserted into a living cell. So as a matter of fact, they did not create life.
So once we figured that out, then the question is: should we put a moratorium on synthetic biology until all known risks could be mitigated? The precautionary principle roughly says: do not proceed with the science until you know all the risks and you mitigate them. And there's something very appealing about that principle. And we heard from people who defended that principle. And indeed, that principle is pretty ascendant in Europe in the area of genetically modified organisms, GMOs. So why did we reject that principle? And we did reject it. We rejected it because we also learned and heard that synthetic biology in the not-too-distant future has the capacity to save lives. And this is not to put too fine a point on it. Jay Keasling, for example, a synthetic biologist working at the University of California, Berkeley, has managed to find a way of synthesizing an anti-malarial drug called artemisinin. Artemisinin is found in, and can be made from, something occurring in nature called Artemisia, which is also called sweet wormwood. And it's a very scarce material. In a year's time, it's possible that this will be able to be produced synthetically, or partially synthetically, in large quantities and save up to 800,000 lives a year, largely of poor young children in sub-Saharan Africa. If we had recommended the precautionary principle because not all risks are known at this point, and it is a fact that not all risks are known, we would at the same time have prevented the possible and likely saving of hundreds of thousands of lives. So we rejected that principle. It's a very important conclusion.
And as I was reading this and assuming that you must have flipped through those possibilities and set them aside, I was also trying to think about historical analogies in the regulation of emerging, sometimes transformative technology arising from sudden spectacular events, such as the emergence of nuclear fission technology, first expressed in a military setting, and then, after the war, quite a lot of utopian thinking about its potential to transform human welfare. But then, secondly, the development of commercial and medical and other applications of chemicals, the chemical industry, which also, in a less dramatic way, offered to save lives and to improve lives, but where, clearly, over time, risks that were not easy to anticipate at the beginning accumulated. And it was really from that experience as much as any other that the precautionary principle eventually arose in Europe. So I guess I'm wondering, in your sense of synthetic biology and in your discussions, did you find yourselves locating where you think synthetic biology resides in this world of analogies? So the answer is yes. And here's the flip side of rejecting the precautionary principle. We also rejected the principle on the other end of the moral spectrum, if you will, which is also at the other end of the political and social spectrum, which is, to put it colloquially, let science rip. And we heard from some scientists that you should just let scientists do their thing and engineers do their thing. No regulation, because all regulations are counterproductive. Science will out in the end. And we rejected that as well, because there are risks for synthetic biology. Most of them are prospective. Right now, synthetic biology doesn't have the capacity to release very unusual new organisms into the environment, which might wreak havoc with environmental balance. Or suppose there are possible new biofuels that are more efficient and they're made of algae, and the algae escapes into ponds.
And this is science fiction because it's not yet possible, but it's not at all ridiculous to think that something like that might happen, because it's happened in other cases. Then the fish stock dies, and all kinds of things we've seen happen as unintended consequences of science. There's also the possibility of malevolent use of new synthetic organisms. For those reasons, because there are prospective risks, we thought it would be irresponsible for us to recommend science without any boundaries, without any regulations. So instead, what we argued for, and I think we gave robust reasons for this, is to have the Executive Office of the President constitute oversight of synthetic biology moving forward, in a process that would engage all the agencies that right now have responsibility for doing this but cannot assure us that they're actually coordinated. So we recommended no new laws or regulations, because we thought that would be inefficient and indeed ineffective, but we did recommend an oversight process and asked the government to report back to us in 18 months' time and tell us what it's done in coordinating oversight. And just to carry on with that kind of Washington subject, because it will be where the rubber finally meets the laboratory. Frank, in the earlier session, I think, reminded us of the emergence of trucking and then aviation as a challenge to the extant regulatory system for railroads, and pointed out that from time to time the government says, this is sufficiently new that we need to constitute a new regulatory agency with new expertise, as in the Federal Aviation Administration. So what did you learn about the location of delegated regulatory powers now that are relevant to synthetic biology?
And if, in the unlikely event, there were a sudden surprise, an accident, either malign use or some kind of loss of control, and the president called you up and said, my god, we've got to get a handle on this, not in six months but in six hours, where would you instruct the White House to go to locate the authorities that it already possesses to go out and change the way this is being handled? Well, it won't surprise you that there are multiple agencies who have responsibility, depending on the nature of the discovery. So just so you understand what we recommend: we recommended that before any release into the environment of a new organism, there be a reasonable risk assessment. So who has the authority to order that? And that's the question that we have asked the Executive Office of the President to answer, not only for us but for the American public, because we were told in no uncertain terms that agencies were somewhat uncertain about who had the authority in different cases. And so we could have recommended a new agency, new laws. I can assure you that that would have been the most inefficient, ineffective thing to do. It's the easiest thing. It's crystal clear: create a new agency, create new laws. And it's bonkers. We have so many agencies and laws now, we don't need more of them. What we need is for the Executive Office of the President to convene the heads of the agencies, which include the Food and Drug Administration, the NIH, lots of agencies in Health and Human Services, and the biosecurity agencies, so the FBI is involved. Convene them, figure out who has responsibilities for what developments in synthetic biology. Then they know how they're coordinated, and they can assure us. It's not our job to assure them. They can assure us, the commission and the American public and the world public, frankly, that there is an ongoing review of developments in synthetic biology.
By the way, you all know this, but it's not at all clear to the American public, even the educated American public, that synthetic biology is in its infancy. It really has not progressed that far in manufacturing these kinds of new organisms. We also recommended ethics education for all of our scientists. Bioscientists now have it, but engineers don't. And we need more education of the public in science. So another thing that we recommended is the bio equivalent of factcheck.org. I don't know how many of you know what factcheck.org is. See, that's great. So with factcheck.org, when a politician makes a statement and it's publicized, you can just click on factcheck.org and they'll tell you whether it's truthful or not. And they'll tell you if it's ambiguous in its truthfulness. When claims were made about this new synthesized genome, there was no place for journalists or the public to turn for that. So if you could have a bio factcheck.org, not run by the government, not run by industry, but run by a nonprofit, like factcheck.org is, that would be a great step forward in science education in this country, especially since now we're all so online. It would be terrific for young people and for journalists alike. Not that journalists aren't young, but there you go. Given that you say, I'm sure correctly, that the risks as you found them are prospective and to some extent uncertain, then we could spin out, and the science fiction writers could help us spin out, an almost infinite number of potential scenarios that would require government intervention or regulation. But you mentioned one specifically that would seem, at a common-sense citizen-taxpayer level, to be pretty important, which is, you're suggesting that it would be wise, before the environmental release of synthetic biological material, whatever exactly that means, its use in treating malaria potentially, or other kinds of releases, that there be a risk assessment.
So as a citizen, I guess my question is: are you confident that there is somebody in the government that, A, has the authority to define and order the carrying out of such a risk assessment, and that it will be done under the current regime? Or is this something that you in your role now need to monitor and make sure actually flows from the recommendations that you've made? So the answer is no and no. No, we're not confident. So now what will you do to ensure that it will happen? So we're not confident; if we were confident, we would have made different recommendations. Basically, if we were confident, we would have said: we're happy to report, Mr. President, that the regulatory regime not only needs no new laws or agencies, but it is totally prepared and geared up to assure the American public, and assure you, that all the reasonable risk assessments would be done. We did not get that assurance, and therefore we recommended that there be an assessment of who would do this before the release into the environment of organisms. We think, by the way, and we have reason to think, that there are agencies that are charged with doing this. But again, this is not an argument that the agencies aren't doing their job. It's a very new science, and we had a unique opportunity to make recommendations before this happens, before the Dolly is cloned. And we have asked that within 18 months' time the Executive Office of the President tell us that those mechanisms are in place. Is it possible that extant laws, such as those requiring environmental impact statements and assessments and so forth, could just be adapted or ruled to apply? It's not only possible, we think it's likely.
And we also asked that if there are laws or regulations that are needed, we be told that too. But it's very likely that existing rules could apply, because this form of synthetic biology, and it's not the only form, but the form that would create modified organisms and release them into the environment, is not dissimilar from what genetic engineering, which is far more advanced, has already been able to contemplate. So it's likely that there are rules, and agencies prepared to apply them. But again, it's about being proactive: having the government assure us, and also asking scientists to take responsibility for putting in suicide genes or comparable safeguards, so that when something is released into the environment, there's a way of shutting it down. We think it would be publicly responsible for us all to know that that's a process that will be undertaken and will assure us of the safety of this kind of release. So I want to open it up to the audience, but before I do, I want to ask one last question about your sort of finding of fact, if not in a formal way, your own intuition and the extent of agreement or debate within the commission, about a question that was on the screen to frame the last conversation, the sort of thing that conferences put on screens: will synthetic biology end human history, I think was the way it was phrased. But what's implied, obviously, is a concern or a fear or an assessment that it is at least conceivable that, like nuclear weapons, the only other man-made technology in the last 60 years about which you would probably make that claim, this has embedded in it the potential of a global catastrophic event. And I wondered if you think that's true or not. Or do you not know? So here's what we know.
We know that scientists are working with the methods of synthetic biology, and making some progress, on creating an anti-malarial drug, which is not now possible to provide in the needed quantities from nature itself, that has the potential for saving millions of lives. We know that they're working on possible biofuels that would be more environmentally safe and have the potential for mitigating global warming. We know that they're working, and making progress, on vaccines: better, more efficient and effective vaccine production, faster than we can manage now. I begin with that because if there were no benefits, then any risks would not be worth taking, but there are really significant benefits. And I suppose that's not dissimilar in that sense from nuclear energy. At the beginning there was nothing but hope about that. Right, and nuclear energy for some societies like France provides a tremendously environmentally sound way of producing energy without raising the risks of a nuclear holocaust in any way, right? So the question really is, can we provide the safeguards to prevent the more rapid occurrence of some horrific risks? The answer, I think, is clearly yes at this point. There is no reason for anybody to worry about the end of the world from synthetic biology in the near future. Now, as Yogi Berra said, it's very hard to predict, and it's particularly hard to predict the future. If you're talking about what could happen decades from now, I would say it depends on whether the US government takes our recommendations seriously. And if they do, there won't be any reason to worry about the end of the world from synthetic biology. I mean, if I worried about the end of the world, I would not focus on synthetic biology. Well, and that is a very specific answer in reference to the nuclear analogy, because. And by the way, I do worry about the end of the world.
I worry about climate change, about nuclear proliferation. There are reasons. I don't dismiss the end of history. I just do think that life is short. You have to have priorities. You have to want to help save people's lives and make lives better. And if you're spending a lot of time worried about synthetic biology creating the end of history, I think you're just putting all your eggs in the wrong basket. Right. And that seems to me, if I'm hearing you, a kind of a finding of fact on your part. Absolutely. Because when we gave birth to the International Atomic Energy Agency and immediately established a global regime to place all fissile materials under international control, it was because Hiroshima and Nagasaki had already occurred. So there was an understanding of the scale of catastrophic risk that this technology had given birth to. And what I hear you saying is that while there are uncertainties and there are important developments to monitor, it's the judgment of the commission that the risks, as they can be best rationally assessed now, do not warrant that kind of intervention. So we're saying something more than that. We're saying don't wait till there's a Hiroshima and Nagasaki. We have a unique opportunity, not just the opportunity that the commission had, but we as a society have a unique opportunity to get ahead of this science, and get ahead of it not by shutting it down, but by having oversight over what it's capable of doing, giving it maximal freedom to do the good, and putting safeguards against the bad. And so a safeguard that says don't release new organisms into the environment unless you have some way of shutting them down if they start proliferating is a way of getting ahead of the potential Hiroshima or Nagasaki scenario. That's what we're saying. Good. So let's take some questions. Yeah. Sir, we'll start with you, and I guess just wait for the microphone.
Oh, well, it doesn't matter. Go ahead, you're there. Josh Calder, Foresight Alliance. I'm wondering two things. When you say that if the US government follows your recommendations, then the problem is partially solved, that sounds very parochial, given that in 20 years we can imagine Nigerian biolabs just like we have Nigerian hackers now. Is that not parochial? I mean, it comes back to the WikiLeaks thing. Power is changing, capabilities are changing. The second thing is, isn't it not a finding of fact but a philosophical judgment, the weight we give to what happens to our kids and not to us? I mean, you discounted greatly because it's not gonna happen in the short term. That sounds like a philosophical position. Yeah, okay. So first, is it parochial that we're advising the US government? We are the Presidential Commission that was charged by the President of the United States to advise the government of the United States. So our charge was to give advice to the government of the United States. The advice we give is not parochial, because it would be good for all governments to take this advice. It happens to be that very few, and certainly not Nigeria, there isn't much synthetic biology going on in Nigeria, so it really wouldn't be a good use of our time. So it's not parochial. But in the larger sense you may be asking: are we just operating as an American bioethics commission in a vacuum? And the answer is no. One of the first things I did, before we even met as a commission, was travel to Singapore, where the global conference of bioethics commissions from all around the world was meeting, and I heard what other bioethics commissions in Asia, Africa, Europe, and Latin America were thinking about this. And we asked international bioethicists and leaders of international groups to testify before the commission.
So we're not acting in a vacuum; we're really acting as one actor on a global stage, and we recommended in our report that there be coordination internationally. And the second question was: are we just pushing out risks onto future generations? And the answer is no. We called for prudent vigilance for the precise reason of mitigating those risks that would place undue burdens on future generations. But it's also the case that if we had recommended the precautionary principle, as best we can figure, and this is a matter of both fact and empirical judgment, there would be lives that don't exist in the future because we did not allow lifesaving medicines to go forward. So we have to actually judge what is responsible now for the future, and that's what we did. We did not push out risks into the future. We actually made a judgment of how to mitigate risks in a responsible way so as to allow the benefits to move forward. Yes, hi, my name's Ryan Ogle. I'm with the Biomorgenoblasts in Maryland; we're one of the DIY slash crossover-academia organizations, and I want to applaud you on the committee for actively engaging the community science organizations, as well as Ed You, who's been with us this weekend, for taking the FBI WMD directorate's approach of actively engaging these communities so that we can create that foundation and framework. Since I first got involved in this only last February, we've seen dramatic movement within the public and media on these issues, especially with Venter's discovery. And I just want to say that people in these community organizations are actively, with your guidance, contacting ABSA and getting that kind of training. And what many of us foresee is that this whole question of will synthetic biology destroy humanity is really more of: can synthetic biology save humanity? And you see this with Keasling, you see this with Endy, you see this with Venter.
The main problems that we see societally, toward end-of-the-world scenarios, these offer very rapid tools to attempt to approach those. And one of the things we've seen on the DIY side is the possibility of taking a citizen corps of bioliteracy and just educating our public and our populace toward understanding these processes, in the same way that Stephenson approached education in The Diamond Age. Can I draw you toward a question, or do you have a question? I just wanted to say thank you very much for approaching that, and what do you think about the idea of a bio corps of citizenry, distributed, that might have labs-on-chips with local pathogens, that can print out what is the pathogen of the month and be able to give decentralized feedback on what's going on in the environment? So we took up and heard from the do-it-yourself community, and it's great to hear that more and more members of the do-it-yourself community are taking on the responsibility of ethics education and moving forward with us in a responsible way. It is one of the challenges of a very decentralized science, but it's also one of its glories. We learned from the history of computer science that if you put undue restrictions on science, and the key is undue restrictions, because we all live with some restrictions, you don't have the capacity to actually respond when nefarious people do bad things. So we want people to know how to hack into computers in order to fight terrorism. If we shut that down, that's not a good thing. So thank you. Thank you for your comment. So one there in the back on this side, and then we'll come right in. Hi, LeMore Schaffman from Keystone Tech Group. Question. I didn't hear, I'm sorry. LeMore Schaffman from Keystone Tech Group. And my question, it's wonderful that everyone is all for the positive and everything's good.
I'm just going to take the other side of it for a second. I'm curious: if you have all these agencies that are going to be overseeing different things, who's going to control the kill codes? And what happens if one particular type of biomatter is created that is beneficial to one agency but detrimental to another's interests? What are the decision-making processes going to be? It's a little bit about control and power, which I would envision is what's going to happen coming down the line in certain situations.

Well, that's precisely why, and I'm not painting a rosy scenario here, we asked for there to be coordinated oversight of synthetic biology moving forward. The question you asked, and a host of other questions, need to be answered through an ongoing process of coordinated oversight. So it's an excellent question, and it's precisely the kind of question that a commission can't answer but the government not only can but should answer, and that's what we asked them to do.

Right here in front, the gentleman.

So my question is: is our society getting too risk-averse? Some background on that. As a science writer, I've talked to scientists who tell me about technologies, like cochlear implants, that were developed in a very regulation-free environment, where you could basically implant patients without having to report to the FDA, which really allowed those technologies to come into existence. And people say to me now that the regulations governing things like retinal implants are so burdensome that it's very difficult to develop these technologies today. So the question I want to put to you is: could we as a society be getting so risk-averse that we are inhibiting the development of useful technologies because we are so afraid of possible negative effects?

So: are we getting too risk-averse as a society because we're so afraid of the negative effects?
Well, I think the commission's perspective was precisely to navigate a road between the extreme risk aversion of the precautionary principle and the extreme libertarian view of doing nothing even when you know there are risks that could be mitigated. So I don't know the answer; I don't think there is a single answer to "are we too risk-averse." Here's what I do think is the case, and it maybe underlies your question and concern: even when you have reasonable regulations, they're often entrenched in bureaucracies that take a long time to act and don't always act efficiently and effectively. Part of the drive behind the commission's recommendation of a coordinated oversight mechanism was to encourage, and we are only advisory, alas, we can't order; the president can order me to do something, well, he actually can't, he can charge me to do something, but I can't order the government to make a law. What was behind our recommendations was to avoid unnecessary bureaucratization and overlapping charges, and really for the agencies to sit down together, and this is very difficult, we've seen this in security issues, but to sit down together and come up with a clear understanding of who's overseeing what, and to be as efficient as possible in that. So I don't know that we're too risk-averse, but I do know that a lot of our regulations, and President Obama has said as much, are not efficient and effective.

Just a couple more questions. The gentleman there, right by the microphone.

Hi, I'm JD Hansen from the International Center for Technology Assessment. My question for you is this: looking at analogous technologies like genetic engineering, the government has tried to force new technologies into old models. Right now the FDA is regulating genetically engineered salmon as an animal drug. Mike Taylor, who's at the FDA, said before he was at the FDA that it's a stupid way to do it; it's not going to be anything that encourages public confidence.
The coordinated framework has basically failed for genetically modified organisms. We don't have a way to do anything other than my organization suing the USDA. We keep winning because they can't figure out how to do an environmental risk assessment. So when these other coordinated frameworks, for security and for genetically engineered organisms, have failed, what advice do you have for the agencies to make it work this time?

Well, let's start with the one that had the clearest risks, which is security. The coordination is far from perfect today, but it's a lot better than it was before 9/11, a lot better. Now, there's no crisis requiring coordination with regard to synthetic biology, but there's at least now a report that urges it. And we're not urging new regulations and laws. So I think what we're recommending is all to the better. I can't tell you whether it will succeed, but it has a far better chance of succeeding than the alternatives. You know what Churchill said about democracy: it's the worst form of government except all the others. Well, you might think that coordinated oversight is the worst way forward except all the other possibilities, which are really not acceptable. You can't just have no oversight at all of science that does potentially pose biosecurity and environmental risks.

But in saying that, are you ruling out the potential of a dedicated agency in time, should the facts lead you there? You're just saying not yet?

The facts do not lead us to recommend that, and the overlap between synthetic biology and genetic engineering is very great. In fact, a lot of the scientists are doing both. I mean, is Jay Keasling a synthetic biologist, or does he do genetic engineering? He does both. So it would be a mistake at this point, but I don't rule it out in the future.
I think it's very unlikely that the best way forward would be to create a separate agency for synthetic biology and a separate one for genetic engineering. It certainly doesn't make sense now, and it doesn't look like it will, but...

Yeah, you can come up with a title for the agency that unifies regulation.

The End of History Agency.

The Federal Administration for Life Modification, something like that, yeah. One more question and then, Sean, you can have the last word.

Luckily it's a very brief question. What was the commission's minimum threshold for considering something to be synthetic biology? And it relates to what you were just saying, actually.

Right, so the question is: what was the minimum threshold for us to consider something synthetic biology? We began, even before we talked about the principled framework, by asking: what is it? What is synthetic biology? And basically to a person, every scientist who considered himself or herself a synthetic biologist said, you'll never answer the definitional question, because it has the fuzziest edges, which goes to the answer I gave to the last gentleman's question. And to what Steve has asked: it doesn't make sense to create a separate agency for a form of science that doesn't have sharp boundaries. We've talked about, and Steve, you've focused on, as have many of the questions, creating novel organisms, but there's a huge part of synthetic biology that doesn't do that. What it does is create BioBricks, building blocks that standardize the way you can modify existing organisms, and here we're talking at the level of bacteria, of E. coli, and thereby synthesize things in a very environmentally effective and efficient way. So there was no minimal threshold. Or, to put it differently, the minimal threshold was a little bit like what Justice Potter Stewart said about pornography.
I can't define it, but I know it when I see it. We knew who was a synthetic biologist by what he or she said he or she and others were doing, and we couldn't define sharp boundaries around it. That said, there were certain markers, and we describe those in our report. There are bottom-up and top-down approaches, and most synthetic biologists take one or the other.

Do you want one more?

We can take one more, sure. One more, gentleman, just... okay. Thank you very much.

So, no, one thing synthetic biology is not: it doesn't map onto a kind of drug. It's actually a method for producing things.

Right, right. But is there a chance that the poor usage of steroids is not in the public interest, and to what extent did the commission look at that?

Okay, I'm thinking about that. Well, I will answer your question, but not as chair of the Presidential Commission on Bioethics, because we have not taken that up. I do think that it's not just potential; I think steroids are commonly misused in our society. But we haven't taken up that question. Let me conclude by telling you, very quickly, what we are taking up. We are taking up two very important issues that, unlike synthetic biology, I think most Americans have at least some rudimentary understanding of. One is clinical trials, which use human subjects, and specifically clinical trials that the US government sponsors overseas, so international clinical trials. What are the standards that now govern them, and what should the standards be?
And we were charged with doing this in the wake of the revelations of what happened from 1946 to 1948 in Guatemala, where US government-funded doctors actively attempted to infect people, prisoners and prostitutes among them, with syphilis in order to see whether penicillin could prevent its development. So that's one thing. And the other topic we're going to take up is the ethics and social responsibility involved in genetic and neuro testing. So that's what we're focused on now, and I look forward to it.

You keep an eye on the DIYers in the meanwhile.

Exactly, exactly.

Well, thank you for your openness and your clarity and energy.

Thank you.