Once upon a time, there was a golden age of science. It was an age when, having helped America win World War II, scientists were rewarded with ample budgets to pursue their curiosity about nature wherever it might lead them, and when the nation and society benefited immeasurably through economic growth, military superiority, health, employment, and social well-being.

This is a picture of Vannevar Bush, perhaps the last and only true hero of American science policy. He was the director of the science and technology enterprise during World War II, including the Manhattan Project and the projects that led to the development of radar. But more importantly, he was the author of the narrative that we've all internalized about how science works and how it leads to social benefit. It is a narrative whose utility has perhaps come to the end of the line, and yet one that still maintains a powerful hold on our imagination.

Now, the way we think about science, this golden age myth, starts with the benign presence of Albert Einstein. Here's a statue of him that adorns the front of the National Academy of Sciences; it is also their statement about what science is and who most embodies it. And from the brain of Einstein, sitting in the patent office in Switzerland, thinking about time and space and how the universe ought to be reconceptualized from the ground up, we end up directly with something like this: the first nuclear weapons test at Alamogordo, New Mexico, prior to the dropping of the bombs on Hiroshima and Nagasaki.

Embedded in this golden era myth are a number of assumptions. The first obvious idea is that the more money you put into the system, the more benefit comes out of it. So here's basically the war on cancer, the billions spent since 1971, and here are rates of breast cancer mortality. The obvious point is that it's hard to see how those two might be connected to one another.
The second assumption is scientists working, pursuing their curiosity wherever it might lead; those are Vannevar Bush's words. Then there is accountability, which is actually a really difficult one: the notion that science can only be accountable to itself, because otherwise it risks being politicized and ruined. So we have peer review, we have reproducibility of results, we have what the philosopher of science Karl Popper called conjecture and refutation as ways that science can maintain its quality. And yet, here's a provocatively titled article by a very serious scientist named John Ioannidis, now at Stanford: we are discovering systematic, discipline-wide biases in particular directions that make it very difficult to know whether the mechanisms of internal accountability for science are actually functioning.

Here's an obvious one, a no-brainer: when we have a problem, we do research, and that research informs the way we take action. It turns out to be much more difficult than that. This is, I suppose, a slightly irritating slide, but it shows the number of pages of climate change science summarized in the periodic assessments of the Intergovernmental Panel on Climate Change, and carbon emissions over the same period of time. Again, I don't want to claim a causal relationship, but on the other hand, one was supposed to help make the other better.

Finally, and maybe most importantly, there is the notion that science and society operate according to an unspoken social contract: we provide the resources and institutions for science, and science provides us the tools to make society a better place. And I think it's obvious to everyone that there are all sorts of assaults on this notion.
Just one recent example is the work that was done on the H5N1 avian flu virus in the Netherlands and the US, where there was a very difficult, divisive debate over whether, A, the science ought to be carried out at all, and B, if it were carried out, whether it ought to be published, because it could be used as a weapon of bioterrorism as well as a public health tool.

So, just a couple of quick things to help guide what I think are some of the issues that ought to be on the table. The first is that it's simply not about money. This graph shows the percentage of the discretionary budget that the government has spent on non-defense research and development over the last 40 years or so. The hill at the beginning is the Apollo project, which was an applied engineering project. But if you set that aside, what this shows is that the government is basically willing to spend a certain percentage of its money on R&D.

The other point is that Albert Einstein was not responsible for the development of nuclear weapons. The real story of what went on was an incredibly complicated organizational and institutional arrangement that brought together universities, the private sector, the government, thousands of brilliant scientists, and thousands of civilians and others to translate a whole array of scientific knowledge into one particular technology.

So: for what are we creating knowledge? How is it going to be used? What are the institutions that are involved? And how do we ensure that we're creating the knowledge we need to actually resolve the immense challenges that we face today as a nation and as a society?