Good morning. Welcome to the Toxicity Testing Press Conference hosted by the National Institutes of Health and the U.S. Environmental Protection Agency. This press conference will last for 60 minutes. There will be four primary speakers who will provide brief remarks, and then members of the media will be able to ask questions. To ask questions, you can press star one on your touch-tone phone to enter the queue. You may remove yourself from the queue by pressing the pound key. This call will be recorded, transcribed, and made available on the websites of the three sponsoring organizations: the EPA, and from NIH, two sponsors, the National Human Genome Research Institute and the National Institute of Environmental Health Sciences. Now I'll turn the program over to moderator Larry Thompson, Chief of Communications at the National Human Genome Research Institute.

Morning, everybody. This is Larry Thompson. On behalf of the National Institutes of Health and the U.S. Environmental Protection Agency, I am pleased to welcome all you news reporters who have joined us for this telebriefing. I remind you that this is tied to a 2 p.m. embargo today for a publication in Science magazine. I'd also like to welcome all the folks up in Boston at the AAAS meeting who are participating in this telebriefing. It's great to have you all here. After 2 o'clock, background and visuals related to this announcement will be available on the websites of the respective institutes. Certainly you can go to genome.gov and you'll find all the material in our press room, as well as at the National Institute of Environmental Health Sciences and at the EPA press room. Our expert panel, in speaking order, will be: Dr. Elias Zerhouni, who is the Director of the National Institutes of Health; Dr. Francis Collins, Director of the National Human Genome Research Institute, which is part of NIH; Dr. Robert Kavlock, Director of the National Center for Computational Toxicology, Office of Research and Development at the Environmental Protection Agency; and Dr. Samuel Wilson, who is the Acting Director of the National Institute of Environmental Health Sciences and the National Toxicology Program, which is also part of NIH. And then we have a number of additional experts from the agencies who will assist in answering your questions, especially the technical ones: Dr. Christopher Austin, who is the Director of the NIH Chemical Genomics Center; Dr. John Bucher, who is the Associate Director of the National Toxicology Program; and Dr. Raymond Tice, the Acting Branch Chief of the Biomolecular Screening Branch at the National Toxicology Program. So now let me introduce Dr. Zerhouni, who has some brief opening remarks. Dr. Zerhouni.

Well, thanks, Larry, and thank you for joining us. I think this is a very exciting time, and today we're seeing the birth of what I would consider a new approach to a crucial problem in public health, arising from the NIH Roadmap for Medical Research. It's obvious that we have needed for a long time a way of exploring the toxicology space, if you will, in systems other than animal systems. And the Roadmap, when presented five years ago, was really a space to explore and pilot new approaches.
And it wasn't really designed initially to think about toxicology, but when we envisioned these projects and collaborations, the idea was that, with the current need for larger scale, more complex, larger scope experiments that are bigger than what any single scientist or even institute can support, this would lead to scientific projects that at that time could not be predicted. And it's clear that this announcement today is showing an outcome that was completely unexpected three or four years ago. We did intend the Roadmap to be strategic, to fund high-risk projects that had potentially big payoffs. And we selected the molecular libraries and imaging components at the time. When we did it, we envisioned creating a library of chemicals that could be used by individual scientists to probe the complexity of biological systems, to help us understand them better in health and disease, and hopefully to screen for compounds that might not only provide treatments but provide clues, more importantly, to the complexity of the biology underlying disease. We're clearly early in the development of this, but already we're seeing dramatic progress, in the sense that we now have new technologies that have been scaled up to the extent that you can in fact envision the ability to screen for toxicity in a completely new way. We had 10 high throughput screening centers operating within the network, from the University of Pennsylvania in Philadelphia to Emory University in Atlanta to the Scripps Research Institute. And all the data that is produced is made public in a public database called PubChem. This is a fundamental tenet, I think, of the NIH strategy here, including the announcement today in collaboration with EPA, NIEHS, and NHGRI. We think it is very important for the entire public worldwide to have access to these very precious experimental results so that more insight can be gained. These systems to do research were aimed at developing treatments; for example, next week there will be an announcement about a breakthrough, the first in 45 years, in the treatment of schistosomiasis, a disease that would not be of interest in terms of financial return, or Gaucher disease, which is also in the works. But I think today what we really want to report to you is this remarkably unique collaboration, one that I could not have foreseen when I started the Roadmap initiatives with my colleagues, Dr. Francis Collins and Dr. Tom Insel. And this is, in some ways, an example of how we can go forward when we see technologies that arise from completely unexpected corners of the field of science and have an application which I personally did not envision just a few years ago. So with that, I think I'll turn it over to my colleague, Dr. Francis Collins, who will brief you more on this collaboration.

Thanks, Elias. And good morning to all of you. The research collaboration we're announcing today really has the potential to revolutionize the way that toxic chemicals are identified. As you know, historically, such toxicity has often been determined by injecting chemicals into laboratory animals, watching to see if the animals get sick, and then looking at their tissues under the microscope. And though that approach has given us valuable information, it is clearly quite expensive, it is time consuming, it uses animals in large numbers, and it doesn't always predict which chemicals will be harmful to humans. The correlation is not as precise as we would like.
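[Editor's note: Dr. Zerhouni mentions above that all of the screening data are deposited in PubChem. As a minimal illustration of that open access, the Python sketch below pulls a compound record from PubChem's present-day PUG REST interface; the compound chosen (aspirin, CID 2244) and the properties requested are illustrative choices by the editor, not part of the announcement.]

import json
import urllib.request

PUG = "https://pubchem.ncbi.nlm.nih.gov/rest/pug"

def fetch_compound_properties(cid):
    # Ask PubChem's PUG REST service for a couple of basic properties.
    url = (f"{PUG}/compound/cid/{cid}/property/"
           "MolecularFormula,MolecularWeight/JSON")
    with urllib.request.urlopen(url) as response:
        record = json.load(response)
    return record["PropertyTable"]["Properties"][0]

# CID 2244 is aspirin; any PubChem compound identifier would work here.
print(fetch_compound_properties(2244))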
So what's being proposed today, and outlined in a paper being published tomorrow in Science magazine, to which your attention is drawn (and again, note the embargo that Larry Thompson mentioned), is to bring together the skills of three established scientific organizations from two different federal agencies into a whole that is certainly stronger than any of the parts alone. So what are these three components? First of all, you heard already from Elias about the National Institutes of Health Roadmap project in small molecules, and the Chemical Genomics Center at NIH, the NCGC, is the component of today's announcement that brings the high throughput screening technology to the table. Its staff, led by Chris Austin and many other really talented colleagues, knows how to test millions of compounds quickly and cost effectively. The scientists at the National Toxicology Program, part of the National Institute of Environmental Health Sciences, know more about chemical toxicity than just about any other group on Earth. They have decades' worth of animal research experience to guide this project. And the Environmental Protection Agency's Office of Research and Development is filled with real experts in chemical informatics, whose computational skills will put all this data together, compare it to the historical animal data, and draw inferences about what kind of new approaches we could take for high throughput identification of toxicities associated with compounds that haven't previously been tested. So together, this information will help the NTP and the EPA do their jobs to protect all of us and the environment from harmful chemicals. I should be clear, however, that despite the promise, this collaboration is still a research effort. There's a lot we have to learn, and that's why a memorandum of understanding has been signed for a five year project to research, develop, and validate these new and innovative toxicity testing methods. We suspect, although this is for five years, that this collaboration will probably last many years into the future, the idea being to usher toxicology screening into the 21st century, providing the kind of capacity that was envisioned in a recent report by the National Research Council, which called for the development of just this kind of systematic screening system that could eventually take the place of animal-based designation of chemical toxicity. Now, let me give you some details about how this will work. The NIH Chemical Genomics Center provides a public sector capability in industrial scale technologies for high throughput screening and chemistry. The center is already highly capable of identifying small molecules that can be used as chemical probes to study the functions of genes, cells, and biochemical pathways. It's even possible that some of these chemicals, like the one that Elias mentioned for schistosomiasis, may end up being new medications for rare diseases, which is a great hope of ours. But of course, there's a flip side to this. If you can use this screening system to identify beneficial compounds, you may also apply the same technologies to see whether certain compounds have toxic effects. It is basically the same strategy, but with a different output. We can use the robotic plate handling and quantitative high throughput screening, which is really amazing to see if you have the chance to come and visit the center. You would be, I think, quite taken by the robotic capabilities that now exist here.
So because the NCGC can test so many chemicals at one time, it can also test one chemical at 15 different concentrations, which is really important for toxicology. You want to know not just whether this compound might be harmful, but at what concentration that harm would occur, because almost anything, if given at a very high concentration, could be toxic. And you want to know what that dose response looks like. This is where the NCGC is extremely experienced, and that makes it a wonderful partner for this enterprise. Finally, let me say that the scientists involved in this collaboration didn't just decide to get together this morning to make this announcement; they've been working with each other for quite a while, testing their ideas. Already, the NCGC has analyzed something like 2,800 compounds, including pesticides and industrial chemicals. And in fact, there's a publication in the journal Environmental Health Perspectives reporting the results for about 1,400 compounds that were supplied by the National Toxicology Program. They were tested against 13 different types of cells to see what their consequences would be over this range of concentrations. So this is an early publication of what will undoubtedly be a very large output from this collaborative effort. Finally, I'd like to say that the marriage of this Chemical Genomics Center technology with the experts in our collaborating institutions at the NTP and the EPA is a really powerful and promising advance. I think it is both good science and a wonderful example of how federal agencies, seeing a real opportunity, can get together and do something collaboratively to try to benefit the public. So let me now turn this over to my good colleague from the EPA, Bob Kavlock. Bob?

Let me first express the regrets of Dr. George Gray, Assistant Administrator for Research and Development at EPA, who was scheduled to participate in this briefing but who was called to testify to Congress during the same time period. EPA and regulatory agencies around the world are facing an increasing gap between the number of chemicals for which we need to assess toxicity and the ability of traditional lab-based animal studies to provide that needed information. EPA recognized the opportunities afforded by advances in molecular biology and computer science to provide faster and more effective chemical assessment procedures when it established the National Center for Computational Toxicology three years ago. The NCCT has the lead within the Office of Research and Development for bringing these new tools to environmental protection. Its researchers are working closely with their clients in the regulatory offices to provide solutions on how to determine which chemicals, from the long lists of candidates for which they are responsible, are the most important for us to study. These approaches can also be used to reaffirm or modify existing approaches, such as the category approach to chemicals under the high production volume chemical program. Our ToxCast program, launched in 2007, is nearing completion of its first phase of development. It is evaluating the effects of more than 300 chemicals in nearly 400 different in vitro assays to develop predictions of the outcomes of tests in traditional animal models. We expect to be in the next phase of extending and validating the predictions by the end of this year.
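[Editor's note: to make the dose-response testing Dr. Collins describes concrete, here is a minimal Python sketch of fitting a Hill (sigmoidal) curve to one compound tested at 15 concentrations and reading off the half-maximal concentration (AC50). All numbers are invented, and the code illustrates the general analysis, not the NCGC's actual pipeline.]

import numpy as np
from scipy.optimize import curve_fit

def hill(conc, zero_dose, high_dose, ac50, slope):
    # Four-parameter Hill curve: response as a function of concentration.
    # zero_dose and high_dose are the plateaus at no chemical and at very
    # high concentration; ac50 is the half-maximal concentration.
    return zero_dose + (high_dose - zero_dose) / (1.0 + (ac50 / conc) ** slope)

# 15 concentrations spanning five orders of magnitude (micromolar), with an
# invented cell-viability readout (100 = fully healthy, 0 = all cells dead).
conc = np.logspace(-3, 2, 15)
viability = np.array([99, 98, 99, 97, 96, 95, 90, 82, 70,
                      55, 38, 22, 12, 6, 4], dtype=float)

params, _ = curve_fit(hill, conc, viability, p0=[100.0, 0.0, 1.0, 1.0])
zero_dose, high_dose, ac50, hill_slope = params
print(f"estimated AC50: {ac50:.3g} uM, Hill slope: {hill_slope:.2f}")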
The ability to interact with the National Toxicology Program and the NIH Chemical Genomics Center will allow us to move faster by uniting the complementary expertise of the three partners. Already we are working together to coordinate the chemicals we are studying, the toxicity pathways that will be studied, the data we are obtaining, and the methods to interpret and understand that information. The international community is also interested in this issue, as evidenced by the establishment of a working group on molecular screening under the Organisation for Economic Co-operation and Development, for which EPA is the current lead. We welcome additional partners to join us as we move to integrate these more efficient and effective methods into the assessment of the human and ecological risk of chemicals, and we look forward to questions from the media. Thank you.

Okay, and Dr. Wilson from the National Institute of Environmental Health Sciences, please go ahead.

Larry, thank you very much. We at NIEHS and the NTP are very excited to participate in this collaboration with the EPA and the NCGC. The power of the collaboration is in bringing together new strategies and technologies to address important needs in toxicology. The collaboration comes at a time when the biochemical pathways underlying responses to toxicants are becoming well understood. Making use of this information within the collaboration will lead to a new toxicology paradigm that will transform toxicology and toxicity testing and provide a path toward better protection related to chemicals in the environment. Now, the National Toxicology Program has been committed to characterizing the toxicity of chemicals through its bioassay program and to the development of methods to improve our ability to identify hazards in the environment. As an interagency program, the NTP collaborates with many groups to develop the information needed for regulatory decision-making. Our commitment to the NTP bioassay and to meeting the public health needs of regulatory agencies will continue. The NTP released its vision and roadmap for the 21st century in 2004, and this included initiatives to integrate high throughput screening of chemicals into the NTP testing program. The NTP's expertise in toxicology and its large database of chemical effects in animals will play critical roles in evaluating the high throughput testing process that we're announcing today. We recognize that full implementation of a new toxicology paradigm will require substantial effort over many years, but ultimately it will allow us to generate data more relevant to humans, as Dr. Collins said earlier, and to reduce animal use in toxicity testing. In closing, we look forward to working with the EPA and the NCGC in this collaborative effort that will greatly benefit public health.

Thank you very much, Dr. Wilson. So what we'd like to do now is open this up for your questions. I would ask you to join the queue as instructed by our operator, and please identify yourself and your news organization so we know who we're talking to. I remind my colleagues that the reporters can't see our faces, so say who you are when you start your answer so that reporters will quote the right person. We also have some of our experts gathered here to help answer any of your really technical questions. So why don't we start with the first question, please.

Our first question will come from Pat Rizzuto. Your line is open.

Hi, a clarification and then a question. Dr.
Kavlock, could you repeat the number of in vitro assays that you said were part of phase one?

This is Bob Kavlock. Within phase one, depending on actually how you count them, it's somewhere between 350 and 400 assays right now. We've added a few additional partners over the last few months, so the number has grown.

And then my second question, to whomever: there's great interest in alternative tests and the nanoworld. Is there a component of this that will help determine whether these alternative tests can also predict toxicities for intentionally engineered nanomaterials?

This is Bob Kavlock from EPA. We certainly think some of these technologies are going to be used for looking at nanomaterials, and in the next phase of our ToxCast program we plan to test about 10 or so for which we have some animal data already. So we're optimistic it will be useful for them.

And how many years do you think it will take before you're confident in the predictions for regular chemicals?

This is Francis Collins. Let me say that this is, after all, being announced as a research enterprise, and we need to figure out exactly what the correlations are going to be between the results of animal testing and this new high throughput approach, which will involve both cellular assays and, in some instances, assays of model organisms like zebrafish, for instance. Of course, we have a wonderful legacy database of information where you know what the answer is, and that's going to have to be used to tune the system, to try to assess which of these new assays are most predictive of the toxicities we care about in humans. But exactly how long it will take to develop that, well, that's why it's research. I think we would have a pretty tough time predicting exactly what that pathway is going to look like. I think the news today is that we do now have the pieces in place to be able to ask that question and answer it effectively. Thank you.

One more response by Dr. Wilson, and then we'll move to the next question. Dr. Wilson?

Yes. I think the answer to your question actually would depend on the type of chemical or the type of exposure under investigation. In some cases the pathways are well enough worked out, with attendant models in cell culture systems and animal systems, that we could very rapidly do the cross-species and cross-system validation necessary to document the information obtained through the high throughput approach. So we think that in the relatively short term, that is, in the next two or three years, there will be some examples where the high throughput information can inform, in a very meaningful way, the priority setting for cell model testing and then for the use of animal models to confirm the information we glean from these other approaches. Let me also ask my colleague in the National Toxicology Program, John Bucher, to comment on this point.

Thanks. This is John Bucher.
We really look upon this as an iterative process, because we're going to be learning so much in the very beginning that is going to guide us in the further development of this program, so that predicting exactly how long it's going to take at this stage is very, very difficult. But the process is going to involve stages: first convincing ourselves that the predictions and the output of these assays are really making sense; then convincing the scientific community at large that this is the right approach to take; and finally the regulatory communities, which are going to have to be involved in these steps from the very beginning and brought along with us.

Okay, let's go on to the next question, please.

We'll go next to the site of Maggie Fox from Reuters. Your line is open.

You've got me now, and I want to just say: huh? Can you guys start all over again and do this in cocktail party language? Because I'm not clear exactly what it is you're saying. Are you saying you're shifting over to non-animal testing and you think it's going to be more accurate? I apologize, but I need words of one syllable.

Hi, Maggie, this is Francis Collins, and I guess I'm the elected cocktail party participant here, so let me see if I can try this again. I'm sorry if it's been a little too technical. Basically, we really need to know, for all kinds of good reasons, whether a particular compound has the potential of doing harm to human beings. We'd all agree with that, right? That is the science of toxicology. The way in which that has been done over many decades has depended heavily upon the use of animals, where you basically decide, okay, I'm going to take a particular animal species, or more than one, and apply this compound at a variety of concentrations, and I'm going to look to see: does the animal get sick? And if it looks like it's getting sick, or sometimes even if it isn't, you then look at the tissues of the animal and try to determine what exactly was the damage that was done here, what organ system was affected. This has been our mainstay for trying to make predictions about human toxicology, but I don't think anybody would say we're totally happy with it. It's slow, it's expensive, and its precise predictive ability has often turned out not to be as good as we would like. There are differences between species. We are not rats, and we are not even other primates. And so the desire here is to try to see if we could do better. Now, there has been much discussion, including a recent report by the National Research Council, about moving toxicology testing into a new 21st century kind of era, where we take advantage of a lot of the things we can now do based on technologies that come from other directions, many of them, I will happily tell you, from the genome project, that make it feasible to begin to imagine toxicology in a totally new way. After all, ultimately what you're looking for is: does this compound do damage to cells? So could we in fact, instead of looking at a whole animal as our first line of analysis, look at individual cells from different organs of different animals, with different concentrations of the compound? And taking advantage also of what we're learning as molecular biology comes along, we have a better sense of the pathways involved in the cell than we used to, and you might be able to begin to make inferences about what's going on by tapping into that information about so-called systems biology.
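[Editor's note: Dr. Collins describes looking at cells from different organs across a range of concentrations. Below is a toy Python sketch of how such results might be summarized: one fitted potency (an AC50, as in the curve-fitting sketch earlier) per compound per cell type, with a compound flagged wherever it is potent below a chosen cutoff. The compound names, cell types, values, and the 10 micromolar cutoff are all invented for illustration.]

# Each compound gets one potency number (AC50, in micromolar) per cell
# type; None means no response at any tested concentration.
AC50_UM = {
    "compound_A": {"liver": 0.8,  "kidney": 12.0, "neuron": None},
    "compound_B": {"liver": None, "kidney": None, "neuron": 45.0},
}

THRESHOLD_UM = 10.0  # flag anything half-maximally active below 10 uM

for compound, by_cell in AC50_UM.items():
    hits = [cell for cell, ac50 in by_cell.items()
            if ac50 is not None and ac50 < THRESHOLD_UM]
    print(compound, "->", hits or "no flagged cell types")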
So the proposal here, Maggie, and it is a proposal which is now going to be tested in a rigorous research environment, is that at least in some instances, in the longer term, we might be able to do a better job of predicting toxicity by using these high throughput cell-based assays, where you basically try the compound, in a laboratory situation, on cells growing in a little tiny well of a plate, and would in that situation be able to assess: is it likely to hurt the liver or the kidney or the brain of an individual? Because you've collected that data in a high throughput, low cost, much more manageable situation than what we've currently done. But we don't know if that's going to be as good as we would like, and so hence the purpose of this collaboration is to test this out, particularly now using situations where we know the answer, because we have a list of compounds that we know do create human toxicities. If we didn't already know those answers, could we actually develop a system that would have predicted them accurately? That's kind of where we're trying to go. You've got to tune the whole process before you can just assume that it's going to have the predictive power that we want. Now, does that help, or do you want to push back with a question or two?

That helps, but you're not going to use human cells, you're going to use animal cells?

Oh no, we're definitely using human cells and animal cells, and again, we have access to lots of different human cells growing in culture these days, and probably the goal here will be to test out as many different ones as possible, to see which of those is the most predictive of what we really want to know, which is toxicity in human beings. Thank you.

Okay, one more short, quick answer from Dr. Wilson before we move on.

Yes, I think I'll take a crack at this question also, Maggie. This is Sam Wilson. In the toxicology community we've known for a long time that we need to improve the throughput of our testing, to be able to test many thousands of compounds that today we're not able to test because we simply don't have sufficient throughput. Secondly, the cross-species extrapolation of information that we gain from animal model studies to humans is not always as efficient and precise as it should be, so we have a big, recognized need in the field of toxicology for more precise ways of predicting human toxicity. This collaboration we're announcing today really is a milestone, because it for the first time gives us the power, the research power and opportunity, to apply a whole new generation of approaches to this question of toxicity. Now, as I said in my comments earlier, in the vision, or strategic plan, for the National Toxicology Program several years ago, we recognized the opportunity, or the need, for taking this step. But today is the first time that we've actually formalized the collaboration, at least here in the United States, that will give us a shot at achieving this vision that we laid out in our strategic plan and that we've been talking about today.

Cool.

[crosstalk] This is Dr. Zerhouni. Maggie, I think you're posing the fundamental question. I think what you're seeing here is the fact that, as a society, we need to be able to test thousands of compounds in thousands of conditions at a much faster rate than we did before.
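[Editor's note: one plausible way to do the tuning against known answers that Dr. Collins describes, sketched in Python under the editor's assumption that each assay yields a simple toxic/non-toxic call per compound: score those calls against the legacy animal answers for compounds whose toxicity is already known, then rank assays by how well they agree. All compounds and labels below are invented.]

def sensitivity_specificity(assay_calls, animal_labels):
    # Both arguments map compound -> True (toxic) / False (non-toxic).
    tp = sum(assay_calls[c] and animal_labels[c] for c in animal_labels)
    fn = sum(not assay_calls[c] and animal_labels[c] for c in animal_labels)
    tn = sum(not assay_calls[c] and not animal_labels[c] for c in animal_labels)
    fp = sum(assay_calls[c] and not animal_labels[c] for c in animal_labels)
    return tp / (tp + fn), tn / (tn + fp)

# Invented legacy animal outcomes and one assay's in vitro calls.
animal = {"c1": True, "c2": True, "c3": False, "c4": False, "c5": True}
assay  = {"c1": True, "c2": False, "c3": False, "c4": True, "c5": True}

sens, spec = sensitivity_specificity(assay, animal)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")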
These technologies, which have come up through genomics, chemical genomics technologies, we have really developed to screen, for example, 300,000 potential drugs in less than a couple of days. This would have taken two years, five years ago. And because of that scaling up, the natural idea that I think is being proposed here is to move from the 20th century paradigm of testing one compound at a time in many animals to the 21st century paradigm of testing five or ten thousand compounds against five, ten, twenty thousand conditions in cells that are very specific to human toxicology. And by doing that, what we hope to do is accumulate enough knowledge, based on what we already know, to crosswalk from the 20th century method, if you will, crosswalk reliably, in a validated way, to what we would call high throughput 21st century toxicology. That's what this whole project is about. This is why I think we're excited about it: to move forward at a pace and a rate that will be consistent with the toxicology risk that we see in human society, including new compounds.

This is Larry Thompson. I want to observe that we have about 20 minutes with our friends in Boston, and then they'll be having to get out of the newsroom, so let me go to a question in Boston first, and let me ask all my colleagues to keep your answers as short as we can so we can get in as many reporter questions as possible. So we'll take a question from Boston and then go back to the general queue.

We do have a couple of questions from Boston, and I'll ask the reporters to give their name and affiliation.

Okay, thank you. My name is Rachel Ehrenberg, and I'm with Science News. I have a couple of questions. One is: by high throughput, do you just mean a lot at once? Could you also please just take us through an example: okay, there's a chemical you want to test, you have your little dish of cells, how are you doing various concentrations, how frequently might you check on it, how do you then determine if damage has been done? And also, how are you prioritizing which compounds to do first?

So, Dr.
Austin, please introduce yourself.

This is Chris Austin. I'm the director of the NIH Chemical Genomics Center. It's a great question. How this actually works is: we take a dish that's about three inches by five inches that contains 1,536 different little wells in it. Those little wells are a fraction of a millimeter across, and we put the same cells in every one of those 1,536 little containers within that dish, within that tray. Then we take 1,536 different chemicals, one in each of those containers. [audio garbled] That means that you can tell, for example, whether you killed off half the cells. If you want to do all the conditions, that is why we need to have such a high throughput system. [audio garbled]

We have two more questions from Boston; we'll try to make them quick.

From the BBC: could I just press you a little more on the time frame? And also, if this becomes a reality, how many animals might it save a year?

I don't know. Has anybody actually done that kind of calculation? We don't know if anybody's done that kind of calculation yet, have we? So this, I think, would be sufficiently hypothetical that it's probably not wise on our part to try to give a specific time frame or a prediction about the change in animal testing that would result. I hope you understand that what's being talked about here is an assessment of a brand new approach, to try to do high throughput screening for toxicity testing of very large numbers of compounds, and exactly what the trajectory of that science going forward will be is not really possible for us to say at the moment. Having said that, I think it's fair to say the organizations involved in this are not interested in going slowly. We're interested in pushing this agenda as fast as we can, and certainly one of the hoped-for outcomes would be a reduction, as soon as possible, in the number of animals that are necessary for testing.

This is Dr. Kavlock at EPA, and I echo the urgency with which we're working on this. I know the groups are planning a meeting in March to begin to lay out a timeline for actually working through this memorandum of understanding.

This is Dr.
Zerhouni. I'd like to point out to everyone that the work has already started. In other words, this isn't theoretical. If you go to the online issue of November 22nd, 2007, of Environmental Health Perspectives, you'll find the work that has piloted this; it's really been the pilot behind this whole proposal: more than 2,800 NTP and EPA compounds, which were tested with over 50 biochemical and cell-based assays. I think if you went to that paper, you'd see exactly the format of what needs to be scaled up. As far as starting and testing and road-testing the concept, it's already been done.

Last question.

Yes sir, go ahead, please.

Hi, this is Phil McKenna from New Scientist. I'm wondering if you could go over again just how much faster this is compared to your animal tests. And if this works, are there applications outside of toxicology? And again, what was the regulatory approval that would be needed for this?

Okay. This is Chris Austin again. I'll give you a sense of the timing. An example we frequently use is that to test 100,000 compounds at 15 different concentrations takes us two days. We calculate that if a person were going to do the same thing, one person would have to work eight hours a day, seven days a week, for six months to get that done. We do it in two days. So it's much, much faster. As far as doing it in animals: 100,000 compounds in animals is logistically impossible. We know the throughput of that; it's been 2,500 over 30 years. We do 2,500 at 15 concentrations in a single afternoon. So: a single afternoon versus 30 years.

Larry, you might want to mention the footage that's available, if people want to see what the system looks like.

Yeah. At genome.gov at two o'clock today, we'll post on our website B-roll footage of the high throughput screening system. It's a series of robots and incubators and carousels that have these collections of chemicals, their chemical library. It's kind of cool stuff to look at. You can watch it right on the website in Flash, and for broadcasters, you'll be able to download QuickTime movies that are in full resolution and can be imported directly into a non-linear video editor for use in a broadcast piece. There are about 15 or 20 clips up, I can't remember how many, and a couple of sound bites from Dr.
Austin, who's the director of the center. So if you want to just see the thing, see the system that we're talking about, you'll be able to look at it in about two hours. Next, one more question; let's go back to the regular queue. We're done in Boston?

Yes, that's it for Boston.

Okay. This is Bob Kavlock from EPA. The last question had two elements that we didn't get to. One was the application outside toxicity testing. These actually are inside toxicity testing, but you can think about applying this to mixtures, for instance, which is a very difficult issue to deal with in toxicology. You can deal with lots of different chemicals and understand whether there are differences in contaminants between them, or you can develop high throughput screening assays to look for genetic susceptibilities that might be useful for finding susceptible subpopulations. And then in terms of regulatory acceptance, I think that depends on the use. In terms of screening and prioritization, those kinds of acceptances I think could come quite early, as we just identify those chemicals that are most important to study. If we talk about actually replacing animal tests with these in vitro tests, it will be a much longer route and will require perhaps even different legislation than we're dealing with right now.

Yeah, this is Chris Austin again, just very briefly. One of the major points about this, just to reiterate, is that this is a dual use technology. In fact, this technology was developed, and we use it every day, to develop chemical probes to understand the genome, and starting points for the development of new therapeutics, particularly for rare and orphan diseases that are of interest to NIH. It's the same technology, essentially, with some important modifications, that's used within pharma. And so we've taken that technology and applied it to a completely different use. So it already is a multi-use and dual use technology that has been developed elsewhere and applied to a new application.

Let's go to the next question, please.

We'll go next to Lauran Neergaard from the Associated Press. Your line is open.

Hi. I'd like you all to put this in a little more global context. My understanding is the European Union is about to enact a ban on animal testing of cosmetics, and it sounds like, from what you're telling us here, you all are at the very beginning of this in terms of how reproducible using this instead would be. So if you could just comment a little bit on what's going on globally and how your work will apply to that. And then I would love some specific examples of chemicals today where animal testing has not been able to answer certain questions that this perhaps could.

So, can I ask Dr. Wilson to go first and Dr.
Kavlock to answer the pieces of Lauran's question, and we'll just follow up.

Yeah. So, concerning the first portion of your question, about some of the trends and thinking in the EU: it's true that they are considering the approach of discontinuing animal testing, and this is broader than the cosmetics industry. It actually extends to the entire chemical industry, in the initiative under the acronym REACH. Now, their planning and research on alternate approaches, other than animal testing, for how to assess hazard and toxicity is at a very, very early stage, probably even earlier, by a considerable margin, than our research here in the United States. The alternate approach that we are talking about here today really is a step in the right direction toward a more robust method, or technology, for assessing a whole range of chemicals, as we've said. But I think that our approach here in the U.S. is actually a little more mature than theirs in the EU. Let me ask now John Bucher to say a few words about a specific example of limitations that we have with animal testing.

Well, in many cases the things that distinguish the response of animals and humans are related to the rates of metabolism that the various organisms have for various chemicals. So in one case you'll have animal strains and species that are particularly sensitive to particular chemicals, and in other cases those same chemicals don't seem to have the same effects in humans. There have been some historical instances where humans have proven to be more sensitive than the animals to substances, such as thalidomide in an earlier example a long time ago, an example that made us really aware that the animals are not always giving us the right answer, and that we obviously have to use all the information that we can get from all kinds of different systems to generate data that are going to be predictive of human health effects.

And Dr. Kavlock, can we ask you to address the international regulatory part? Dr. Kavlock, are you still with us?

Yes, I am. I'm sorry, we were just on mute. This is Dr. Kavlock at EPA. This is why we are working with the Organisation for Economic Co-operation and Development in Paris, to bring an international perspective to this. We introduced this molecular screening initiative to them about two years ago. There's now a working group formed that has representatives from Canada, from Japan, from the Netherlands, from England, from Germany, and a number of other countries, who are meeting to begin to understand how we can bring this kind of technology to the international field. We've also had discussions with the European Chemicals Agency and with the European Commission in Brussels about supporting research in Europe that would complement the kind of research that we're doing here in the United States, and those discussions are continuing.

Thank you. Dr. Zerhouni?

This is Dr.
Zerhouni. I'd like to point out, however, Lauran, that it's very important to also state: you cannot abandon animal testing overnight. As I said, there is a very important crosswalk that needs to occur between animal-based technologies and non-animal-based technologies, and at this point I don't think one could say that you could validate the new technologies and abruptly abandon animal testing. They will have to be intertwined for a few years, until we fully understand the scope and scale of the problem.

We'll go to the next question, please.

We'll go next to the site of Elizabeth Weise from USA Today. Your line is open.

Yeah, I wanted to ask if you could go back to the beginning, because it sounded as if this was a fortuitous outcome of research that you hadn't expected to lead to this. And again, I'll ask for the words-of-one-syllable, cocktail party discussion of exactly how that happened.

Dr. Zerhouni again. I think the intent initially was really to be able to analyze biological pathways in health and disease, with the purpose of developing therapeutics or diagnostics. We really did not see the paradigm of looking at this approach, at that time, for toxicology studies. However, as we progressed, what was really amazing is the fact that, A, we realized that we could in fact test many hundreds of thousands of compounds in the space of a week for therapeutic purposes; we had shown that this could be done reliably at different doses, which is what the work of Chris Austin was. And all of a sudden the eureka moment occurred, because you had a conjunction of interests here: the National Research Council was looking at the issue of toxicology; NIEHS was looking at the issue of scaling up our ability to study many compounds; and, as you know, globalization is leading to the production of many, many hundreds of thousands of compounds. And one plus one became three, and we turned the logic on its head: instead of testing compounds for therapeutic purposes, what about testing them for their potential to disrupt normal physiology, in an analytical way that would be completely different? That was the unexpected nature of it, because you really cannot predict, in our minds, where the breakthroughs are going to be, because you see progress on multiple fronts which all of a sudden creates a new horizon.

And about when would you say that happened? Was that two years ago?

I would say two years ago.

Dr. Bucher and Dr.
[inaudible]? Yeah, this is Francis Collins. And isn't this actually a wonderful example of what you hope will happen in science: that you develop a technology for one purpose, with the hope that it's going to work there, and it starts to work, and then you realize, goodness, we could take this approach and apply it to a totally different problem that we hadn't initially thought of. This happens over and over again. I mean, who thought, when the ARPANET was being put up to try to handle a small amount of data that needed to go from one place to the other within the defense industry, that we would end up with the World Wide Web and the Internet and all of the spin-offs from that? Again, I think that's what we're trying to do: bringing together technologies and scientific needs that you couldn't have predicted, even a few short years ago, would fit together, realizing that they do, and then trying to put together the appropriate framework to push the science forward as quickly as possible.

Thanks so much. Next question.

We'll go next to the site of Robert Stevenson from International Scientific Communications. Your line is open.

Thank you. I'm struck by the apparent parallelism between the Human Genome Project, which involved competition between the private sector and the public sector, and Dr. Collins was involved in that, and what we're doing here. I mean, the drug companies: Novartis, for instance, has done 350 screens in the last three years, screens against a million compounds, run in an afternoon or so. It just seems like we're reinventing the wheel here. This information has already been collected by society and is being processed daily. I'm just really not so impressed.

Well, I'm sorry. This is Francis Collins. Let me try to impress you, or clarify this a little bit. I think you're mixing up a couple of things here. What Novartis is doing with their screens, which we all celebrate and hope they will do more of, and find more wonderful drugs for common diseases, is to look for compounds that have positive effects on a potential disease state. What we're talking about here is to turn that around: to try to identify compounds that may be toxic in the environment, the air, the water. That is not something that Novartis sees as their business plan, I can promise you that. And even in the other aspect of this that Dr.
Austin referred to, in terms of the fact that we do have the capabilities at the NCGC for doing this kind of high throughput screening to look for compounds that might ultimately be valuable in the long term, it's many, many steps away, and the main focus would be on rare diseases that Novartis and other pharmaceutical companies are really not interested in. So I think you've misunderstood the circumstance here. We are fully aware of, and find very valuable, our opportunity to work with the private sector in this area and in many others. And what you're hearing about here is filling a niche which the private sector really would not be interested in putting their resources into.

This is Bob Kavlock at EPA. I would just like to add that I think one of the unique things about the partnership that we're developing is that it will be done in the public domain. All of our information will be released publicly, so that scientists can look at it, the public can look at it, and regulators can look at it. And I think that's a major difference between what happens in the private sector, in the pharmaceutical industry, and what we're trying to do here. And I think Ray Tice here also has a statement to make.

Well, you actually just said it, Bob. This is Ray Tice. The only other thing I'd point out, though, is that they're not going to be testing the same kinds of compounds that we test. Our focus is on things that pose environmental hazards to humans. They're looking at a specific class of drugs that might be beneficial. And again, they keep that data private. One of the biggest problems is trying to get the release of information that they have, so that we can use it as part of our process of moving forward. I think that's a real problem there, one that you should really be addressing.

But that information is there. I think their tremendous libraries have many of the compounds you're interested in.

We're running out of time, so let's move on to the next question, please.

We'll go next to Larry Greenemeier from Scientific American. Your line is open.

Hi. I was hoping you could talk a little bit more about the technology behind this. There have been reports of MetaChips and DataChips that have been used. Are there any breakthroughs on the technology side that are enabling this, or is this more an announcement that this work is being done?

Dr. Austin, would you like to answer that, sir?

Yeah. I think Bob Kavlock mentioned this a minute ago: there are many ways to approach this problem and many different types of assays. We plan to use all of those. Yes, the MetaChip that you're referring to is something that we're actually quite familiar with and are interested in using in this collaboration. So yeah, we're always looking for new technologies, always looking for new ways to approach this problem. The problem with a lot of those technologies, like the MetaChip, is that they're just not high throughput enough. That's fundamentally what it comes down to.

Let's go to the next question.

We'll go next to Liz Buckley from Pesticides and Toxic Chemical News. Your line is open.

Hi, I have two questions, if possible.

As long as they're short.

Yeah, well, maybe not.

Take what you can. Okay, please proceed.

First of all, if there are limitations to animal testing, and the existing data to which you'll be comparing the new test results that you get is from animal models, how do you address that uncertainty?
And my second question is: I have heard some toxicologists criticize some of these assays, saying that toxicity does not affect just a cell or cells but a whole animal. So how do you plan on addressing those concerns?

This is John Bucher. Part of this breakthrough, if you will, is the recognition that toxicity pathways, as identified by the National Research Council, are really conserved across species. And we know enough now about how cells respond, how organisms respond to toxic agents, that we can try to design systems that will allow us to probe those particular pathways, such that the data that we get will be applicable both to animals and to humans. So that's how we hope to bridge that particular problem.

Cool. Can we go to the next questioner?

We'll go next to Bob Grant from The Scientist.

Okay. Hi, thank you. I'm glad I finally got to ask my question. Anyone's free to answer this, but Dr. Collins said in the beginning that you guys didn't just get together this morning and decide to pursue this line of inquiry. There's nothing particularly new about high throughput testing or computational toxicology, and there's certainly nothing new about in vitro assays. So I'm wondering, given the fact that I'm hearing very little about, specifically, the infrastructure of this new collaboration: what's the news here, really?

This is Dr. Wilson; I'd like to answer that. I think the news is the capacity to test many thousands of compounds over a broad concentration range and against a whole range of target cells or target biochemical pathways. And that is a capacity that we really haven't had until this collaboration. I'll stop there.

Okay. This is Bob Kavlock at EPA. I would say what's new is that we're really laying out a very logical framework to use these technologies, to understand them, and to bring them into a situation where they can be used in a regulatory framework. This really takes a systematic approach of examining these and comparing them to traditional approaches, and to gaps in the traditional approaches. And I think that's what's really news: there's a strong commitment by three organizations to study this really in depth and to make progress on it in a very rapid fashion.

So does anyone care, I'm sorry, does anyone care to flesh out that framework?
I'm not hearing a lot about the specifics of how data is going to be shared between agencies, or the funding that's behind this, anything like that.

Sir?

Well, I think this is not just anything. I think there is no question that what you're seeing is the combination of the 30-year history of the NTP, the National Toxicology Program, with their 2,500 very well documented compounds, with very, very specific animal phenotypes which have been characterized over the years and which we know well. Then you have the EPA knowledge base, which is combined with what we do best, which is using large-scale testing of thousands of compounds, at 15 different concentrations, in hundreds of different assays, and combining this into the same format as we have in PubChem. If you go visit PubChem, you'll see that all of what we do for beneficial research, in other words, finding therapeutics, is made public. The same thing is going to happen here. But at the end, here's the dream. The dream is this: could you, in a battery of tests, end up with very specific molecular signatures that will be predictive of human toxicology in ways that you just can't do in animal testing today? That is the value. In other words, like I said, it's one plus one equals three; the whole is much greater than the sum of the parts here.

So, this is Francis Collins, just to answer your question then: what is the news? I think you've heard why this is a novel new paradigm for doing toxicology testing. What's special about today? Well, there is this memorandum of understanding, signed by all three of the agencies, which will be up on the web, which you can look at, and which lays out the agreement between the groups to work together. That's new. The paper in Science magazine being published tomorrow will in fact lay out this approach in a fashion that, I promise you, most of the scientific community has not heard about, and if they look at the paper, which I suspect many of them will, they will be excited about this. I think this is not something that will get a sort of, oh yeah, we knew that. Of course, in any new announcement of this sort, you get together and figure out what the scientific opportunities are. So yeah, we've been at it for a couple of years, getting to this point. But this is the launch. This is the moment where you can say we're officially starting something that could really change the way in which the toxicology of compounds is assessed in the 21st century. We thought that was news.

So, this is Larry Thompson. I'm afraid that we're at the end of our hour. I want to thank everybody for joining the briefing today. I want to remind you that this material is embargoed until 2 o'clock today, with the Science magazine embargo of the paper. At that time, on all of the websites of our respective organizations, you'll see lots of material going up, with background information and the MOU and the scientific paper, the Science paper. And I want to remind you that there's footage where you can go and look at the system, and you'll get a little bit more of a sense of the physical infrastructure of it all, which is very, very cool. And with that, I'd like to bring this briefing to a close, and thank you all very much for participating.