Hi, my name's Tim Waters. I'm a member of Longland City Council representing Ward 1, which is the east side of town. I've posted a couple of videos like this in recent weeks sharing the four Rs that I think are essential for getting us through this pandemic with a minimum number of casualties. Those four Rs are responsibility, reliability, resilience, and resourcefulness. In other videos, I've reflected on some of those Rs. In this one, I want to reflect on the second R, reliability, a concept familiar to all of us.

Reliability means we can count on people or systems to perform the way they've said they'll perform and the way we expect them to perform. Think of everyday systems where reliability matters, like the bus schedule: you expect that when you show up to take the bus, it will arrive at a particular time and you'll get your ride. A TV viewing schedule is the same kind of thing: your program comes on at the time you expect it to come on. Both are examples of systems we expect to be reliable, but in both, reliability is a nice thing, not a matter of life and death.

There are systems where reliability is a matter of life and death, where if the system doesn't perform the way we expect it to, errors or mistakes can cascade into catastrophic failure, meaning people die. The electric power grid, air traffic control systems, nuclear power plants, chemical processing plants, and, interestingly enough, aircraft carriers have all been the subject of studies by researchers wanting to answer the question: what do those systems, organizations, or entities do that results in their performing in ways that are highly reliable? They get it right every time, every day, for everyone, because if they don't, failure is catastrophic and people die.
In those studies, researchers have identified five sets of practices that differentiate high-reliability systems from systems with more variability than is tolerable in settings like air traffic control or the electric power grid. I'm not going to go through all five, but one of those areas of practice is deference to expertise. What that means is that in a high-reliability system, whether it's air traffic control, the power grid, or whatever, whoever has the most expertise, the most knowledge and experience, is the one who matters. If things start to go wrong, it doesn't matter what your title is, what your level of compensation is, or how long you've been in the system, what your seniority is. People defer to those with the most knowledge or expertise about the problem and about the ways to fix it or intervene, so that errors or mistakes don't cascade into catastrophic failure with people dying.

Right now, as a society, we have to achieve high levels of reliability to minimize catastrophic failure, which would be people dying as a result of this virus. We have experts, public health experts, who have given us guidance: practices that we need to both recognize and implement, whether it's staying safe at home or, when we do go out, how we go out, with social distancing and the use of masks, ways to protect ourselves so we can protect others. So that's where we are right now with our second R, reliability. If we want to minimize catastrophic failure, which would mean people dying from this disease, we need to defer to the expertise of public health experts, implement what they've advised us to do to avoid the kinds of errors and mistakes that cascade into catastrophic failure, and we need to be able to count on one another to do that for one another.
So, we can minimize catastrophic failure, we can minimize the casualties in this pandemic, but we're only going to do it if, number one, we defer to the experts, and number two, we can count on one another to implement what they've advised us to implement. We can do this, we can minimize catastrophic failure, but we can only do it if we're all in it together.