In nuclear power, the term risk combines the probability and consequences of a potential reactor accident. In the 1950s, there was so little operating experience that experts could say only that the risk of a major accident was very low. Quantifying that risk would have required a complex probabilistic risk assessment, or PRA, which could take thousands of engineering hours to develop, with uncertain results. Nevertheless, in 1975, the NRC published the world's first PRA, the Reactor Safety Study, nicknamed the Rasmussen Report after its principal author. The Rasmussen Report aimed to prove that the risk of a serious accident was negligible. The claim was controversial, flawed, and for a time seemingly repudiated by the NRC itself. Yet today the study is considered the foundation of the risk assessment profession, and its methods are used by the aerospace, aircraft, chemical, and even medical device industries, as well as government agencies, including the NRC.

I'm Tom Wellock, the historian at the NRC. In this video, I will trace the birth, death, and rebirth of probabilistic risk assessment as an essential tool for safety.

In 1957, the NRC's predecessor agency, the Atomic Energy Commission, published WASH 740, a report on the hypothetical consequences of a major reactor accident in which the primary cooling piping and the containment building failed. Up to 3,400 deaths were possible. While the AEC believed the chance of such an accident was, quote, exceedingly low, it admitted that reliable accident probability calculations were impossible and that, quote, no one knows now or will ever know the exact magnitude of this low probability. As AEC Chairman Dixy Lee Ray put it, WASH 740 was an albatross around our necks that the agency could not live down.

The AEC, however, did not rely on probability estimates to ensure safety.
It simply assumed that a major event could happen, such as a large loss of coolant accident, and it designed safety systems conservatively, with extra margins of material strength and redundant components, an approach it called deterministic design.

By the late 1960s, however, the AEC's approach to safety was thrown into doubt. The growing size of nuclear reactors meant that in a loss of coolant accident, the fuel could get so hot that it might melt right through the bottom of the containment building, a hypothetical accident popularly called the China syndrome. Environmentalists in Congress pressed the AEC to prove that the China syndrome had a low probability. By the early 1970s, hope grew for defensible probabilistic methods as aerospace and nuclear experts developed fault trees, logic diagrams that broke safety systems down into their components and could roughly estimate safety system failure probabilities. Some experts thought such methods could be used in regulation. Chauncey Starr, a veteran of the Manhattan Project, called for making quantitative comparisons of the risks and benefits of various technologies. This approach, he wrote, could give a rough answer to the seemingly simple question, how safe is safe enough? The pertinence of this question to all of us, and particularly to government regulatory agencies, is obvious.

The AEC got the message. In 1972, it hired Norman Rasmussen, an engineering professor at MIT, who, along with AEC staffer Saul Levine, led a 60-member team of regulatory staff and contractors to create a probabilistic risk assessment. AEC leadership hoped the report would, quote, bury WASH 740.

Published in 1975, the Rasmussen report came to several groundbreaking conclusions. The probability of a major accident was higher than previously thought. Its consequences were much lower than WASH 740's estimates.
And, importantly, the greatest risk came not from large loss of coolant accidents but from a collection of lesser mishaps, such as small loss of coolant accidents, human error, and station blackouts, events that had received only limited regulatory attention.

Yet the report's flaws came in for intense public scrutiny. Most glaring were the explicit risk comparisons between nuclear power and well-known technological and natural hazards, which indicated nuclear power risk was extremely low, comparable to freakish deaths from meteor strikes. Critics countered that the report's estimates had large potential error due to limited data and dubious calculational techniques. An NRC-appointed committee reviewed the Rasmussen report and rendered a jarring combination of approval and harsh criticism. It praised the report's innovative use of fault trees and called for greater use of PRA in regulation. But the committee excoriated the report's calculational techniques and agreed with critics that its absolute estimates of risk had potential error bands so large that they could not be used in comparisons with other well-known risks. The report, the committee's chairman concluded, had overstepped the state of the art.

In January 1979, the NRC's commissioners announced that while the Rasmussen report had made many praiseworthy advances, they withdrew support for the executive summary that contained the flawed risk comparisons, and they cautioned the staff to make limited use of PRA in regulation. The press treated the announcement as a repudiation of the entire report. The future of PRA in regulation seemed bleak.

Two months later, the accident at the Three Mile Island nuclear power plant in Pennsylvania destroyed a reactor, but it saved PRA. The accident involved a small loss of coolant accident and human error due to inadequate training, exactly the kind of risk the Rasmussen report had identified. Post-accident review committees criticized the NRC for not using risk assessment more in its regulations.
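As an aside for viewers curious about the fault-tree methods mentioned above, the basic arithmetic can be sketched in a few lines. This is a hypothetical illustration, not drawn from the Rasmussen report: the gate functions, the pump system, and the failure probabilities are invented, and the calculation assumes independent failures.

```python
def and_gate(*probs):
    """AND gate: the system fails only if all inputs fail.
    For independent events, multiply the failure probabilities."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    """OR gate: the system fails if any input fails.
    One minus the product of the success probabilities."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical cooling system: two redundant pumps (each failing with
# probability 1e-3) and a shared power supply (failing with 1e-4).
# Cooling is lost if both pumps fail OR the common supply fails.
p_both_pumps = and_gate(1e-3, 1e-3)            # redundancy drives this to 1e-6
p_loss_of_cooling = or_gate(p_both_pumps, 1e-4)
print(f"{p_loss_of_cooling:.2e}")              # prints 1.01e-04
```

Note how the overall risk is dominated by the single shared power supply rather than the redundant pumps, the same kind of insight that led the Rasmussen team to flag lesser mishaps and common failures over the large pipe breaks that deterministic design emphasized.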
On this turn of fortune, the Rasmussen report's co-author Saul Levine glimpsed vindication. The NRC's attitude, he wrote, has now come almost full circle. The nuclear power community is finally taking to heart the words of Cicero: probabilities direct the conduct of wise men.

In the coming decades, the NRC sought to use PRA wisely, remaining sensitive to its limitations and to the value of a traditional deterministic approach to safety. This led to what we now call risk-informed regulation. We will cover that part of the story in our next video on the history of PRA. Thanks for watching.