 is Professor Bob Ulanowicz, who is at the Department of Biology at the University of Florida. Professor Ulanowicz has had a long-standing career in various fields of ecology, and some of you might know Bob from his well-cited book, Growth and Development: Ecosystems Phenomenology. Today, Professor Ulanowicz will talk about process ecology, a deep step beyond physics. There you go. Your slides are up, and I think you're ready to talk, Bob. Thank you, Dr. Keplin. And thanks also to Drs. McCrady and Fath for their invitation to come here and present some controversial ideas and results. I'll start by saying that most who are involved in ecosystem dynamics strive to make it a hard science, like physics. In fact, most believe that if enough mechanisms are described, we'll finally be able to understand how ecosystems behave. Now, it may not be popular in this forum, but my personal opinion is that such an enterprise is not fully adequate to the task, because physics is ill-suited to capturing all living phenomena. Physics is limited in how it can deal with history, dimensionality, logic, sufficiency, and the contingency of living systems. Take dimensionality, for example. Physics is all about objects moving according to timeless laws. Life, however, is not an object but a process; in fact, it is a configuration of irreversible temporal processes. Logic? Well, long ago, Whitehead and Russell proved that the laws of physics can operate only on homogeneous variables. Living systems, by comparison, are massively heterogeneous, which leads to sufficiency. Laws always require boundary conditions, and even moderately heterogeneous systems commonly defeat our ability to make closed boundary statements. The complexity of the boundary statements grows roughly as the factorial of the number of distinguishable tokens, and that figure rapidly becomes unmanageable.
The consequence of all of this is not that the laws are somehow violated, but that as the dimension of heterogeneity increases, they lose their ability to determine outcomes. That is, they still constrain, but they can no longer determine the manifold behaviors of very heterogeneous systems. How, then, are we to treat a host of irreversible processes having at least moderately heterogeneous dimension? My answer is that we portray the relationships among processes as a network, which can be represented as a matrix whose elements are the magnitudes of the individual processes. For example, the hypothetical network of processes shown to the left in the figure can be represented by the matrix to the right. The algebra of matrices can then be invoked to good advantage. For example, the elements of any column can be normalized by the sum of that column to yield, in the case of trophic interactions, dietary coefficients: the element in row i and column j gives the fraction of everything entering j's diet that comes from source i. For example, taxon 4 provides 8.6% of the diet of taxon 3. The matrix whose columns are calculated in this way is called the system dietary matrix, G. The integer powers of G allow one to calculate how much of what leaves i reaches j over all pathways of all integer lengths. For example, this network displays the direct exchanges of carbon among the 36 major taxa of the mesohaline Chesapeake ecosystem. The top predators are striped bass and bluefish, shown here on the right circled in green, which seem to occupy the same carnivorous niche in the system.
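The column normalization and the use of integer powers of G can be sketched in a few lines of NumPy. The flow matrix below is a made-up 4-taxon example for illustration only; it is not the Chesapeake data, and its numbers do not reproduce the 8.6% figure quoted above.

```python
import numpy as np

# Hypothetical 4-taxon flow matrix (illustrative values, not the data
# from the talk): T[i, j] is the flow, e.g. of carbon, from taxon i
# to taxon j.
T = np.array([
    [0.0, 10.0, 2.0, 0.0],
    [0.0,  0.0, 5.0, 3.0],
    [0.0,  0.0, 0.0, 4.0],
    [0.0,  0.0, 1.0, 0.0],
])

# Dietary matrix G: normalize each column by its sum, so that G[i, j]
# is the fraction of everything entering taxon j that comes from i.
# Columns that sum to zero (primary producers) are left as zeros.
col_sums = T.sum(axis=0)
G = np.divide(T, col_sums, out=np.zeros_like(T), where=col_sums > 0)

# Integer powers of G trace dependence over longer pathways:
# (G @ G)[i, j] is the fraction of j's sustenance that reaches it
# from source i over pathways of exactly two steps.
G2 = G @ G
print(G[:, 2])    # direct diet composition of the third taxon
print(G2[:, 3])   # two-step dietary dependencies of the fourth taxon
```

Summing such powers over all path lengths is what yields the total (direct plus indirect) dependency coefficients mentioned next.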
But using the powers of G, one can calculate that the mid-water predator, striped bass, depends indirectly on zooplankton for fully 66% of its sustenance, whereas the more bottom-feeding bluefish relies overall on the same zooplankton for only 29% of its own. The integer powers of G also allow one to assess the average trophic level of any omnivore. One sees from the table that the top carnivore, striped bass, feeds effectively at level 3.87, not very much higher than the lowly and pesky sea nettle at rank 26, which feeds at level 3.44. Alternatively, one can apportion the flow through each species to virtual compartments that record how much transpires in the system at each integer trophic level. That is, one can map complicated food webs, like the one you saw for Chesapeake Bay, into trophic pyramids or chains like the one shown. Such trophic pyramids can help an investigator evaluate how healthy the system is in processing resources or in responding to external changes. Now, control in ecosystems is often tied to cycles of resources. Backtracking algorithms can usually identify all simple cycles in a network, and each cycle can be quantified by its limiting flow and extracted from the network without disturbing any balance. When all the cycles in the Chesapeake system are extracted and aggregated, the pattern of cycles reveals a bipartite trophic dynamic: there is a nexus of cycles among the planktonic subsystem up here in the upper left, and a separate one toward the bottom right that describes the benthic and nektonic realm. Resources are transferred from one realm to the other by a series of filter feeders, which do not themselves explicitly involve any cycling; their ecological function is to transfer resources from the planktonic to the benthic and nektonic realm.
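The average trophic level of an omnivore can be sketched as follows. The dietary matrix here is an illustrative assumption, not the Chesapeake data. Because a consumer feeds one level above the diet-weighted mean of its prey's levels, summing contributions over all integer path lengths (a geometric series in powers of G) is equivalent to solving one linear system:

```python
import numpy as np

# Hypothetical column-stochastic dietary matrix (illustrative, not the
# Chesapeake data): G[i, j] = fraction of taxon j's diet from taxon i.
# Taxon 1 (index 0) is a primary producer, so its column is all zeros.
G = np.array([
    [0.0, 1.0, 0.250, 0.0],
    [0.0, 0.0, 0.625, 3/7],
    [0.0, 0.0, 0.000, 4/7],
    [0.0, 0.0, 0.125, 0.0],
])

n = G.shape[0]
# Average trophic level: t_j = 1 + sum_i G[i, j] * t_i, i.e. one level
# above the diet-weighted mean of the prey's levels. Summing the
# geometric series of powers of G amounts to solving (I - G^T) t = 1.
t = np.linalg.solve(np.eye(n) - G.T, np.ones(n))
print(np.round(t, 2))   # the producer sits at level 1.0
```

Note that the system remains solvable even when the web contains feeding cycles, as long as the cycles do not retain 100% of the medium.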
Finally, information theory allows us to quantify the level of constraint binding a network together, as distinct from its flexibility for further change. One begins with the familiar Shannon diversity index, H, as applied to the flows in the system. Here the joint probability p_ij of a flow from i to j can be estimated by the quotient T_ij / T.., where T_ij is the flow from i to j, and a dot indicates summation over that index, so that T.. is the sum of all the flows, sometimes called the total system throughput. I wish to emphasize that this measure, the Shannon flow diversity, is neither a pure entropy nor the information that binds the system together, although it has been described as both; rather, it is an entanglement of the two. To see this, we decompose the Shannon index into two non-negative, complementary terms. The first is the average mutual information, A >= 0, which measures the constraints that inhere in the system. The second, Phi, is called the conditional entropy; it gauges the residual freedom, or flexibility, that the system has to adapt to perturbations and new circumstances. We then normalize A by H to obtain lowercase a, which is what I call the magnitude of relative constraint; it, too, varies between zero and one. It is remarkable that the values of a for various types of systems tend to cluster near a value of about 0.4, or 40%. One further notices that the cluster occurs somewhere near the maximum of the function F = -a log(a). Now, lowercase a is a surrogate for system efficiency, how well on average the system moves medium across the network. This is an extremely important result. Why? It says that ecosystems cannot reach their maximal efficiency without losing their capacity to adapt to novel circumstances, thereby leaving themselves highly vulnerable to catastrophic collapse.
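The decomposition H = A + Phi can be sketched numerically. The flow matrix below is the same kind of made-up 4-taxon example used earlier, purely for illustration; the clustering near a ≈ 0.4 is an empirical observation about real ecosystem data, which this toy example does not substitute for.

```python
import numpy as np

# Hypothetical 4-taxon flow matrix (illustrative values, not from the
# talk): T[i, j] is the flow from taxon i to taxon j.
T = np.array([
    [0.0, 10.0, 2.0, 0.0],
    [0.0,  0.0, 5.0, 3.0],
    [0.0,  0.0, 0.0, 4.0],
    [0.0,  0.0, 1.0, 0.0],
])

Tdd = T.sum()            # total system throughput T..
Ti = T.sum(axis=1)       # row sums T_i.
Tj = T.sum(axis=0)       # column sums T_.j

I, J = np.nonzero(T)     # restrict sums to the realized flows
p = T[I, J] / Tdd        # joint probabilities p_ij = T_ij / T..

# Shannon flow diversity H, decomposed as H = A + Phi:
H = -np.sum(p * np.log(p))
# Average mutual information A: the constraint inhering in the network.
A = np.sum(p * np.log(T[I, J] * Tdd / (Ti[I] * Tj[J])))
# Conditional entropy Phi: the residual freedom, or flexibility.
Phi = H - A

a = A / H                # relative constraint, between 0 and 1
F = -a * np.log(a)       # real ecosystems cluster near the maximum of F
print(round(a, 3))
```

The base of the logarithm only rescales H, A, and Phi together, so the ratio a is the same whichever base is used.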
This finding contradicts conventional optimization theory, where the practice is to optimize distribution networks for maximal efficiency. That practice can lead to major catastrophes, like power grid blackouts or water system failures. To avoid such misfortune, one can apply simple calculus to the function F to guide how to change each flow T_ij so as to move a system in the direction of maximum F, or maximum fitness. Engineers are now using this method of biomimicry to design network systems that fail soft, that is, they don't collapse completely. In these days of the COVID emergency, when optimized networks such as food supply chains are failing all around us, fail-soft designs could save many lives and avoid a lot of suffering. Indeed, physics can help us to understand how certain ecosystems behave, especially ones like those we've seen that are strongly driven by physical geomorphology. But the larger story of how living systems function demands that we address processes and heterogeneity with phenomenological tools such as network analysis. Doing so means that ecosystem analysis is no longer a derivative science reducible to physics, but is moving into a position of leadership in our endeavor to understand how life functions in this beautiful world of ours. Thank you very much for your kind attention, and I now open the floor to questions. So, Dr. Ketner. Thank you. Thank you, Professor Ulanowicz. We do have a few questions already that came in during the presentation, so that's great. Let me start with this question from Jim Selbert. His question is: thanks for the insightful comments about biological and physical systems. Referring back to an earlier talk, can it be... hold on, it moves around because other questions come in; I'll start halfway. Referring back to an earlier talk, what is the reproducibility of these trophic level estimates across seasonal and other time frames? And what are the error estimates? Yes.
The error estimates, as in most of ecology, are usually reasonably large, anywhere from 20% to 30%. The reproducibility over annual intervals is probably commensurate with that. The indices that I've presented are logarithmic, and errors of that magnitude tend to be compressed a bit, so what we find is that we have qualitative reproducibility, even if we don't have precise quantitative reproduction. Thank you. Our next question is from Martin Kleinhans, who is in Europe. So, thank you for being with us, Martin. In addition to James' question, how reproducible is such a trophic network in this particular system? Can it be transported to other areas? And I realize this is much more complicated than water and sand. Well, how much can be transported? For example, the work with the cycles in Chesapeake Bay, something that is now commonly known, and I'm sure most of you are familiar with it, shows that there tend to be two sub-ecosystems within any estuarine system, one that is pelagic, and the other that is benthic and nektonic. That does transfer to other estuaries, and it's reasonably reproducible; I have colleagues in Scotland and South Africa and so forth who come up with similar depictions of the estuaries in those areas. Notice that all of this is postdictive, not predictive. In other words, the network tells you what your data tell you about how the system has been operating. You can use that to perhaps forecast qualitatively what is going to happen, but it's rather difficult, and I would maintain it's also difficult with mechanistic models, to predict quantitatively what's going to happen in ecosystems. Moving on to our next question, which is from Brad Murray. A very interesting talk, thank you. Does H represent the constraints on a system, or does H represent what we know about the constraints on a system? If the latter, how do you interpret the clustering of a around the peak of the curve? I'll show that slide.
What I'm trying to do is to find... oh, there it is. Okay. The clustering. First of all, it's very difficult to create these networks; it takes a lot of human power and time to create them, and this is actually a small fraction of the data that we have. It represents only about 17 networks that have been quantitatively defined. How do I interpret the spread? My interpretation is that ecosystems tend to cycle. They tend to creep up the curve toward maximum fitness, they tend to overshoot, and then they become vulnerable and are cut back, so that these systems are probably cycling around the maximum and staying within this window of vitality. That's the way I interpret it. I'm open to other interpretations, of course. Okay, then our next question is from Jeff. Jeff Hannavery. The Shannon-Wiener information metric assumes stationarity and, by extension, ergodic dynamics. What are the implications of your analysis when you relax this strong assumption? Yes, so we tend to create these networks as snapshots, either annual snapshots or maybe seasonal snapshots of ecosystems, and obviously we're talking about a dynamic system. The basic algebra and information theory are readily extended into the temporal domain and into the domain of multiple media, you know, carbon, nitrogen, phosphorus, and whatnot. We've done some work on the multi-media extrapolation for Chesapeake Bay, and Bob Christian in North Carolina has done some work with temporal extrapolations. The ideas are there and they can readily be extended. The big problem is the data. Even these stationary snapshots require a lot of data and a lot of estimation; to do it temporally requires almost another order of magnitude more. So we have many fewer examples that we can examine in the way we did these 17 particular examples. Conceptually, it's easily extendable; data-wise, it's hard to extend. Okay, thank you, Professor Ulanowicz.