OK, so welcome back from lunch, everyone. I hope you had an enjoyable lunch and an interesting set of clinics this morning. For this set of keynote talks, the first talk will be given by Joannes Westerink from the University of Notre Dame, on the evolution of process and scale coupling in coastal ocean hydrodynamic modeling.

First off, thanks for the invitation. What I'd like to do is give our take on model integration and model coupling, and some of the things that might be helpful as we pave a path forward in this very rapidly moving business. My co-authors, Rick Luettich and Clint Dawson, are, with me, the original developers of the ADCIRC coastal hydrodynamic model.

The processes we're interested in are predominantly the hydrodynamics of the coastal ocean, all the way from the outer continental shelf to the estuarine systems, the river systems, and the adjacent coastal floodplain. Obviously the morphology is tremendously important, but we don't directly model its evolution; we take it into account through data ingestion. The ecology is important too. But we're really looking at the rich scales of the hydrodynamics.

The ocean is incredibly rich in processes, and there's a variety of forcing mechanisms. We start with tides, which are gravitationally driven; tsunamis, which are tectonically driven; and then weather and storms, which drive a lot of the circulation energy in the ocean. Basically, we go from global circulation engines to wind waves to storm surges, and in between those you have gravity waves, and then you have rainfall runoff.

Now, if you're going to model all this, all you need to do, as Jenny said this morning, is go to the Navier-Stokes equations and you're all set; they cover everything. The only problem is that you have to go from scales of thousands of kilometers down to millimeters. I did a little back-of-the-envelope calculation just for fun, and you would need on the order of 10 to the 34th unknowns per day of simulation time (I'll sketch the arithmetic in a moment). Right now we can do roughly 10 to the 12th to 10 to the 16th per day, so we're way off from 10 to the 34th.

So historically, instead of solving the Navier-Stokes equations, we've compartmentalized processes and scales quite a bit. I think it's fun to look at the origins of this. Laplace actually predates the Navier-Stokes equations; he came up with the Laplace tidal equations, the LTE, which are long-wave equations for non-dispersive processes. They do a really good job on tides, on tsunamis for the most part anyway, and on storm surges, so they're really at the heart of a lot of coastal ocean models.

Then you go next and say, OK, I want to deal with dispersive waves, and you go to the Boussinesq-class models, and there's a whole variety of those. They started with Boussinesq himself in the 1870s, although most modern theories build on Peregrine's work in 1967. Those cover transforming waves and infragravity waves really well, for example when tsunamis are making their way onto land.

Next up is dealing with wind waves in the open ocean in general. There's just no way you can resolve them sufficiently, given their length scales; you would absolutely kill yourself computationally. So the French, actually, in the 1950s, came up with wave energy transport models, wave energy density models, and a whole bunch of variants on that.
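To make that back-of-the-envelope count concrete, here is a rough sketch of the arithmetic. The grid spacing, mean depth, and time step below are illustrative assumptions on my part, not numbers from the talk; the only point is that the order of magnitude lands in the same ballpark as the 10 to the 34th quoted above.

```python
# Rough arithmetic for direct Navier-Stokes simulation of the global
# ocean down to millimeter scales. Every number here is an illustrative
# assumption; the only point is the order of magnitude.

ocean_area_m2 = 3.6e14   # global ocean surface area (~3.6e8 km^2)
mean_depth_m = 3.7e3     # mean ocean depth (~3.7 km)
dx = 1e-3                # assumed grid spacing: 1 mm

cells = ocean_area_m2 * mean_depth_m / dx**3
print(f"grid cells: {cells:.1e}")            # ~1.3e27

dt = 1e-3                # assumed CFL-limited step at ~1 m/s flow: ~1 ms
steps_per_day = 86400.0 / dt                 # ~8.6e7 steps per simulated day

unknowns = 4 * cells     # u, v, w, p per cell
updates_per_day = unknowns * steps_per_day
print(f"unknown-updates per day: {updates_per_day:.1e}")   # ~5e35
```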
The first really big operational model that was fully physics-based and allowed the spectrum to change came from Hasselmann and company, who wrote the WAM model. That's a pretty widely used non-phase-resolving wave model: it doesn't give you the phase, but it tells you where the wave energy is going.

Then there's Lighthill, who derived the kinematic wave equation, and there's also the dynamic wave equation, for rainfall runoff (I'll write these equations out below). Those equations are used when you're trying to model very thin-layer overland flows, before you jump over to the shallow water equations; the shallow water equations are extremely difficult and expensive to use for resolving and computing those types of processes.

And last but not least, you have the global ocean circulation models, which were really initiated in the late 60s and have evolved since, focused on the global heat engine, the baroclinicity of the ocean, and a lot of the energy that exists in the ocean off the shelf.

So really it's this scale separation and process separation, and it's somewhat unfortunate that all these communities live a little bit in silos. It's nice to see that they're finally coming back together a little. Each community provides affordable resolution, covers the domain that it can, and just aliases the rest or forgets about it and deals with it through boundary conditions.

A lot of nesting goes on within these model classes. I'm always a little bit scared of nesting, because you've got to do it really right, unless you have a really diffusive model, because that will kind of wipe out your sins. I'd like to make two points of caution on nesting. First, the boundary conditions are always the most inaccurate part: they're typically lower order than the PDE solution inside the domain. Second, if there's any kind of incongruity between the physics you're forcing at the boundary and the interior domain physics, you run into trouble, unless, again, you have a pretty diffusive model. Either you get extra energy being generated because of the mismatch between the interior and boundary energy, or you get stuff generated inside reflecting back. You can just spend a lot of time on the boundary conditions.

And last but not least, you have data assimilation. We don't do everything right in our little process-separated world; the rest of the energy exists out there, we can't resolve it well enough, we don't have a large enough domain, so we just assimilate something. And that has its own set of issues, but also its own set of benefits.

So where are we coming from? There's been enormous progress, obviously, since these models were developed, and where we've been going is much more component interaction between the silos. I'm a finite element person, so obviously unstructured grids are great: you can really resolve things where you need to, localize resolution, and do large domains. And as computer capacity just grows in this wonderful age of our profession, we get better resolution together with better algorithms, so the models today are a lot less numerically diffusive than they were in the past. That goes right along with better subgrid-scale physics, as well as improving parallelism. The bad is that we've still got largely siloed communities doing development; we're not talking enough to each other, in my opinion. And the grids have been very suboptimal; we really haven't put in all the homework on that that we should.
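For reference, here are those thin-layer flow equations written out. This is the standard 1D Saint-Venant hierarchy in my own notation, not anything taken from the slides: depth h, discharge q, lateral inflow r, bed slope S_0, friction slope S_f.

```latex
% Continuity and momentum (dynamic wave) for a thin 1D layer:
\frac{\partial h}{\partial t} + \frac{\partial q}{\partial x} = r
\qquad
\underbrace{\frac{\partial q}{\partial t}
  + \frac{\partial}{\partial x}\!\left(\frac{q^{2}}{h}\right)}_{\text{inertia}}
+ \underbrace{g\,h\,\frac{\partial h}{\partial x}}_{\text{pressure gradient}}
= g\,h\,(S_0 - S_f)

% Kinematic wave: drop inertia and pressure gradient, so S_f = S_0 and
% discharge becomes a function of depth alone, e.g. through Manning:
q = \frac{1}{n}\, h^{5/3}\, S_0^{1/2}
```

The dynamic wave equation keeps all the terms; the kinematic wave keeps only the bed-slope and friction balance, which is why it is so much cheaper on thin overland flow.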
If you look at the models that existed in the 60s, for example in my business, the shallow water equations, they were second-order accurate, and for the most part, most models are still second-order accurate today. That doesn't match well with the computer technology, and we often have very inefficient parallel processing.

So, to give you a little bit of a story about our model, ADCIRC: look at the spectrum of the ocean and where things are really energetic. You have the wind waves between, let's say, one and 30 seconds, you have the infragravity waves, and then you have the long waves. What we do is cover all the long waves with ADCIRC, couple tightly to SWAN for the wind waves, and we just close our eyes on the infragravity waves and say they're not so important. And this tight coupling works quite well.

ADCIRC solves the shallow water equations in 2D and 3D; it's a fully 3D baroclinic model if you want to make it that way, although I'll give you a little different twist on that. We use Galerkin finite elements, very simple linear elements, and we typically do very large domains. The model is used by the Corps of Engineers to design levees: all the levees in and around New Orleans were designed using ADCIRC, and all the post-storm and storm-risk studies the Corps does for hurricanes are done with ADCIRC. NOAA has extratropical and tropical surge models, ESTOFS and HSOFS, whose foundation is ADCIRC; there it's being coupled to WaveWatch III, a non-phase-resolving wave model, as well as to the National Water Model. And FEMA uses it for flood insurance studies: along the whole East Coast, Gulf Coast, and Great Lakes coast, ADCIRC is used to establish flood insurance rates, which can actually make me a target of some dislike, because if your flood insurance rates are too high, you can blame me.

Then we have SWAN, a non-phase-resolving wave model from Delft. It's a great model, more focused on the coastal ocean. They developed an unstructured rendition under a joint project for ONR, and we coupled the two models very, very tightly; in fact, the parallel communication engines in the unstructured version of SWAN come directly out of ADCIRC. Basically, ADCIRC informs SWAN in terms of water levels and current speeds, and SWAN gives ADCIRC back wave radiation stresses. That's the push you get when waves break, or the pull you get when waves are being formed and generated. The two models work in these parallel worlds, where we domain-decompose using ParMETIS or METIS, and we have equally loaded subdomains: not equally sized per se in terms of area, but equally loaded in terms of the number of degrees of freedom. SWAN and ADCIRC pass information locally (I'll sketch the exchange below). When you look at wall-clock time versus number of cores on a log-log plot, you get a linear reduction in runtime, which is what you want.

So a typical model might look something like this. It's a Gulf Coast model focused on southern Louisiana; you see the continental shelf in yellow and orange, and you can see the grid is much larger in the deep ocean than it is on the continental shelf. Focusing in on southern Louisiana, or let's say southeastern Louisiana, New Orleans is here, the Mississippi River delta is here, et cetera, and you can provide a lot of resolution.
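Here is that exchange as a minimal sketch. The class, method, and field names are hypothetical stand-ins, not the actual ADCIRC or SWAN interfaces; the point is the two-way exchange on a shared decomposition and the neighbor-only communication that keeps the scaling linear.

```python
# Toy sketch of a tightly coupled circulation/wave cycle on one subdomain.

import numpy as np

class ToyCirculation:
    """Stand-in for the shallow water (circulation) model on a subdomain."""
    def __init__(self, n):
        self.eta = np.zeros(n)       # water surface elevation
        self.uv = np.zeros((n, 2))   # depth-averaged currents
    def advance(self, rad_stress):
        self.eta += 0.01 * rad_stress  # placeholder: wave setup pushes water up
        return self.eta, self.uv
    def exchange_halos(self):
        pass  # neighbor-to-neighbor MPI exchange in a real code

class ToyWaves:
    """Stand-in for the spectral (non-phase-resolving) wave model."""
    def __init__(self, n):
        self.rad_stress = np.zeros(n)  # wave radiation stress forcing
    def advance(self, water_level, currents):
        # placeholder: stresses grow where the water is shallower
        self.rad_stress = 1.0 / (1.0 + np.abs(water_level))
    def exchange_halos(self):
        pass

n = 1000                              # nodes on this subdomain
circ, wave = ToyCirculation(n), ToyWaves(n)
for hour in range(24):                # e.g. one day of hourly coupling
    eta, uv = circ.advance(rad_stress=wave.rad_stress)  # waves push the water
    wave.advance(water_level=eta, currents=uv)          # water steers the waves
    circ.exchange_halos()
    wave.exchange_halos()
print("final mean setup:", eta.mean())
```

Because both models run on the same decomposition, the exchange is local memory plus neighbor halos, with no global gather.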
All of those geographic features come in with the high-resolution finite element grid. This is a bathy/topo chart: you can see all the river systems, channel systems, islands, levees, et cetera. All that geographic detail is supported by the unstructured grid, which is shown here. And this is the grid resolution: for example, over here the orange is resolved at about two to three kilometers, and it goes all the way to five kilometers, while the high-energy conveyances are resolved at 30 to 50 meters. So we really try to pack a lot of resolution into the high conveyances.

A typical simulation might look something like this. We have the Hurricane Gustav winds over here from a wind model, and then we have the waves computed with SWAN and the water surface elevations and currents computed with ADCIRC. These two models interact two-way, and the wind model feeds down to the wave and shallow water models. You can see the winds coming up here and the hurricane making landfall over here, to the west of the Mississippi River. You can see the waves; this slide shows significant wave heights greater than 10 meters, so a lot of wave radiation stress is being generated there. And a lot of the storm surge here, to the east of the Mississippi River, is being trapped by the river on the shallow continental shelf. So it's kind of an interesting simulation, and we do very well simulating this.

So obviously we're evolving the capacity, making up for the things that we're missing. The good is that we're advancing model integration. Part of it is very heterogeneous; part of it, in terms of the numerical drivers, is actually quite homogeneous, and we'll talk about that a little bit. The resolution is getting better, and we're understanding better and better how to resolve our grids and where to put our costs. And we've been working a lot on discontinuous Galerkin methods. The way to think about discontinuous Galerkin is that it's really a finite element problem on a single finite element, and you communicate between the elements with Riemann solvers. So it's kind of like finite volume, except that your whole high-order stencil lives on one element by itself. You don't have larger stencils that reach across cells, and that's a really, really nice feature (there's a toy sketch of this below).

The bad is that the grids are still largely static and the physics is static; we basically decide on the physics beforehand. The load balancing is less than perfect, and another thing to worry about is that we reach only a small fraction of peak performance on the processors. A lot of these things are pretty ubiquitous across all modeling communities.

So here's what's happening now. We have, let's say, the wind, shallow water, and wave coupling there, but we've now coupled to HYCOM, and the way we're using HYCOM is really as an internal-mode driver for the two-dimensional ADCIRC. NOAA runs HYCOM, or variants thereof. So instead of trying to mine the baroclinicity out of our own model, which we can do, but which would be very expensive because of the high resolution that we use, we actually mine it out of HYCOM. The baroclinicity, i.e., the temperatures and salinities, drives all these currents that you're seeing in deep water, and that in turn affects the surface water at the coast quite a bit.
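Here is that idea in toy form: a from-scratch 1D discontinuous Galerkin solver for plain advection, with a linear polynomial (a mean and a slope mode) on each element and an upwind Riemann flux at the shared edges. This is a generic textbook sketch and assumes nothing about ADCIRC's actual DG implementation.

```python
# Toy 1D discontinuous Galerkin solver for u_t + a*u_x = 0, periodic.
# Each element carries its own polynomial and talks to its neighbors
# ONLY through an upwind (Riemann) flux at the shared edges.

import numpy as np

a, L, n = 1.0, 1.0, 100            # wave speed, domain length, elements
h = L / n
x = (np.arange(n) + 0.5) * h       # element centers

ubar = np.exp(-100.0 * (x - 0.5) ** 2)   # mean (P0) mode per element
slope = np.zeros(n)                      # slope (P1) mode per element

def rhs(ubar, slope):
    # Upwind flux at each element's right edge: for a > 0 the Riemann
    # problem picks the LEFT state, evaluated at that element's right end.
    f_right = a * (ubar + slope)         # flux at edge j+1/2
    f_left = np.roll(f_right, 1)         # flux at edge j-1/2 (periodic)
    dubar = -(f_right - f_left) / h                           # mean update
    dslope = (3.0 / h) * (2.0 * a * ubar - f_right - f_left)  # slope update
    return dubar, dslope

dt = 0.1 * h / a                         # CFL-limited time step
for _ in range(round(L / (a * dt))):     # advect once around the domain
    du1, ds1 = rhs(ubar, slope)          # two-stage SSP Runge-Kutta
    u1, s1 = ubar + dt * du1, slope + dt * ds1
    du2, ds2 = rhs(u1, s1)
    ubar = 0.5 * (ubar + u1 + dt * du2)
    slope = 0.5 * (slope + s1 + dt * ds2)

print("pulse max after one period:", ubar.max())  # stays close to 1
```

Notice that the update for an element touches only its own two modes and the fluxes at its two edges; raising the polynomial order grows the work per element, not the stencil width.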
Back to HYCOM: the two things we drive with it are these baroclinic currents, and the internal tide dissipation, which is heavily dependent on the baroclinic structure. That's actually good for us as hurricane modelers, because we move the dissipation off the shelf, at the shelf break, and that's a very nice thing. So it's not ADCIRC computing what's called the internal mode; it's HYCOM. What you can see here, in terms of a monthly mean sea level (this is in September), is that the water is much higher in the interior part of the ocean, and the water at the coast is actually quite a bit lower. All that intra-annual and intra-weekly structure we can capture by ingesting that baroclinicity from the HYCOM model, which is pretty cool. And when you match it with the water surface elevations at the coast, it does a really good job.

So what's the future? Our vision is to have much more dynamically coupled models, to do it in a much smarter way, and to have them be living, breathing things instead of these static things where we predetermine what to pick. We want dynamic physics, dynamic grid resolution, and a dynamic order of interpolants, and we want to match all of this with, and support it with, dynamic load balancing. The focus areas are the frameworks that allow this coupled, dynamic physics to occur, optimizing the grids for that physics as we implement it, higher-order methods, and advancing the engines for load balancing.

The first thing we've been doing, together with NCEP and NOAA, is coupling to the WRF-Hydro National Water Model. For a NOAA project we're coupling to sea ice as well, and to WaveWatch III, which has great ice physics in it. And we're doing all of this not through our own coupler but through ESMF and NUOPC, which, as I understand it, share some of the elements and ideas of BMI as well.

So this is pretty heterogeneous coupling, right? Heterogeneous physics, and heterogeneous grids and models. But for some other things, we want to be heterogeneous in the models and the physics but homogeneous in the actual algorithmic frameworks. So what we're doing now is growing algorithmic frameworks that support multiple physics within them. For example, if you want dispersive waves together with non-dispersive waves, you've had to make a choice between the shallow water equations and Boussinesq-class models, right? So if we now want to transcend that, get to the infragravity waves, and do a better job at phase-resolving waves, we can take a shallow water equation model and couple it to a pressure-Poisson solver, which gives you a Boussinesq-class model, and that's what we've done. And then we can couple that to the standard shallow water equation model in other portions, through this interleaving.

The way this discontinuous Galerkin framework works is that you actually blow up the elements and couple them with a Riemann solver. So some elements you could assign the shallow water equations, and some elements you could assign the shallow water equations plus the pressure Poisson equation, which then couples back into those shallow water equations. It's a very cool mechanism for doing multi-physics within the same computational framework and selecting it when you need it. For example, when things are calm, you might not want to solve the Boussinesq equations anywhere; when the waves get really big and infragravity waves are being generated, you might want to turn on the pressure Poisson solver (there's a small sketch of this selection below).
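As a small sketch of what that per-element selection could look like: tag each element with a physics set from a local indicator, and re-tag as the state evolves. The tags, thresholds, and indicator here are illustrative assumptions, not ADCIRC's actual criteria.

```python
import numpy as np

# illustrative physics tags
SWE = "shallow-water"                         # non-dispersive long waves
SWE_PP = "shallow-water + pressure-Poisson"   # dispersive, Boussinesq-class
KIN = "kinematic-wave"                        # thin overland sheet flow

def assign_physics(depth, wave_energy, e_on=0.5, h_thin=0.05):
    """Tag every element with the physics set it should run this interval.
    depth and wave_energy are per-element arrays; thresholds are made up."""
    tags = np.full(depth.shape, SWE, dtype=object)
    tags[wave_energy > e_on] = SWE_PP   # energetic: switch on dispersion
    tags[depth < h_thin] = KIN          # nearly dry: cheap thin-layer physics
    return tags

# calm offshore, energetic surf zone, thin wet floodplain:
depth = np.array([10.0, 2.0, 0.5, 0.02])
energy = np.array([0.05, 0.8, 0.9, 0.0])
print(assign_physics(depth, energy))
# ['shallow-water' 'shallow-water + pressure-Poisson'
#  'shallow-water + pressure-Poisson' 'kinematic-wave']
```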
Andrew Kennedy, and some of his students and mine, have been playing with this for a long time, and there's a whole hierarchy of model orders and precision that you can generate. Just as a little example, for wave shoaling over this berm here, you can match the data really well. So it becomes this dynamic type of framework.

The next thing is the hydrologic models. In some areas of the floodplain, for example, you might actually want to be solving the kinematic wave and dynamic wave equations, and that would be a very important thing to do when you want to have an integrally coupled model. So over here you might want to solve the shallow water equations; in the next cluster or zone, you might want to solve the Boussinesq-class models; and then, for example, in the flat portion of the floodplain you might want to solve the dynamic wave equations, and in the higher portion of the floodplain the kinematic wave equations. And this is all integrated into the same discontinuous-Galerkin-based computational framework.

The other thing is that we might want to adapt in p. As I said, we've been living with these second-order models. So instead of living with just this simple second-order, p equal to 1 model, let's jump to quadratic elements or cubic elements, and then, when things get quiescent again and not as high-energy, drop back to the linear elements. The idea is that today's models are basically simple linear models at very high resolution; let's back off a little bit and have coarser models with much higher-order interpolants. The advantage is a much faster convergence rate: you're buying convergence speed on the order of the interpolants, plus you're allowing yourself bigger time steps, and you're loading up the vector-based processors that we have today with much more work, increasing your workload toward peak performance (I'll put some rough numbers on this below).

Examples over here: on the left side you have high resolution with a channel over here; here you have the higher-order interpolant on a much coarser mesh, if you can see that; and here you have the linear interpolant on the coarser mesh. Of course, that last one miserably fails at representing both the bathymetry and the currents. Here we get identical solutions, but it runs four times faster. So we're really mining this, and you can do it in a dynamic way: when you need that high resolution, you can provide it.

We also have adaptive h-refinement, so we can basically add resolution on the fly. And another nice thing about discontinuous Galerkin is that you can do it non-conforming: you don't have to mate with the same interpolant or the same type of element as the adjacent element. You can mix and match p, and you can mix and match resolution. So we can simply insert more resolution in some elements when we need it, and take that resolution away when the flow becomes more quiescent.

A kind of cute example is here. This is a hurricane study of New York Harbor that we did post-Sandy. It turns out that this level of resolution is very fine for tides: you do a really good job getting the tides right in the whole system. But when the hurricane comes along, there's the Hudson River canyon; OK, you can see it a little bit there, but it's been smoothed out because of the poor resolution.
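Here is the rough arithmetic behind "coarser mesh, higher p". The constants are illustrative; the "identical solution, four times faster" result above is a measured comparison, not something this sketch reproduces.

```python
# Error for a smooth solution scales like h**(p+1), so higher p buys a
# coarser mesh at the same accuracy. Constants are illustrative (C = 1).

def h_for_error(err, p, C=1.0):
    """Element size needed to reach a target error with order-p elements."""
    return (err / C) ** (1.0 / (p + 1))

target = 1e-4
for p in (1, 2, 3):
    h = h_for_error(target, p)
    dof_per_length = (p + 1) / h   # 1D degrees of freedom per unit length
    print(f"p={p}: h={h:.4f}, dof/length={dof_per_length:.0f}")
# p=1 needs h=0.01 (200 dof); p=3 gets away with h=0.1 (40 dof)
```

On top of the degree-of-freedom savings, the denser per-element work maps better onto wide vector units, which is the peak-performance argument.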
Coming back to New York Harbor: it turns out that this canyon is a major release valve for the hurricane. Hurricane Sandy actually pushed water into New York Harbor by blowing the wind over here; its maximum winds just walloped New York Harbor. But the depth of that channel allows it to act as a backflow valve, and it actually lowers the water levels. The water levels come out much higher in New York Harbor if you don't resolve that feature correctly. So we don't want to have to run with that fine grid just for the tides; we'd rather work with the coarser grid and, when a storm comes up, put that level of resolution in. And we can do that with dynamic h-adaptivity when we need it.

Of course, all of this now has to be accommodated by a framework that allows for changing load balancing, because we're changing the physics, and some of the physics is much more demanding. Obviously, a pressure Poisson solver is much more demanding, and the kinematic and dynamic wave equations are much cheaper than the shallow water equations. So as we're changing that physics, and as we're changing the p's, you're increasing the cost, and as we're changing the h's, you're increasing the cost as well. We're going to have to redistribute and rebalance that whole calculation on the fly.

We've been working with MPI forever, these last 15 or 20 years, and we've now turned to a piece of code called Zoltan, from Sandia National Labs. It allows us to dynamically redistribute, and locally diffuse, the elements so that we stay load balanced. The idea is: let's say you start with a domain decomposition with an equal number of nodes and elements per subdomain. Well, when it's dry, or let's say you're doing the dynamic or kinematic wave equations in this land-based area, that part of the calculation actually has a much smaller workload, so you can put much larger subdomains over there and everything stays synchronized. We've developed a lot of technology to smartly and intelligently redistribute those workloads (a toy version of the idea is sketched below).

We're also, under an NSF SSI project for software innovation and sustainability, looking at HPX, which we've integrated with a group at LSU. HPX is a much more fine-grained redistribution mechanism: it migrates data instead of clusters of the subdomain, as we did in the MPI-based subdomain decomposition and redistribution. And this is going to be important as we go to larger and larger numbers of cores, to have that greater flexibility and greater capacity. It accommodates all those things that we want to change dynamically.

So, to sum up: I think we come from years and years of things being pretty static. You had your geophysical system, which translated to and was supported on an unstructured grid, which was static. That fed the PDEs, in our case in the finite element method, on HPC, and that was fed not only by the domain but also by the forcing functions and forcing physics. You put that all together and basically generate solutions, and there's no feedback mechanism anywhere. Where I think we should be going is having this very, very flexible, interactive feedback, represented by all those green lines spaghettiing all over the place, informing things and doing it dynamically. I think that in order to have cheap, effective, and accurate calculations, we need that total interaction. And I think I'll end there.
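To make the work-weighted rebalancing idea concrete, here is a from-scratch toy: per-element costs depend on the physics currently assigned, and the partition targets equal work per rank rather than equal element counts. This is not the Zoltan API; Zoltan does this properly with graph and geometric partitioners and local diffusion of elements between neighboring ranks.

```python
import numpy as np

# illustrative per-element costs by currently assigned physics
COST = {"kinematic-wave": 1.0,
        "shallow-water": 4.0,
        "shallow-water + pressure-Poisson": 20.0}

def partition_by_work(tags, n_ranks):
    """Greedy contiguous split so each rank gets roughly equal total cost."""
    w = np.array([COST[t] for t in tags])
    target = w.sum() / n_ranks
    owner = np.empty(len(w), dtype=int)
    acc, rank = 0.0, 0
    for i, wi in enumerate(w):
        # move to the next rank once its cumulative share is filled
        if acc + wi > target * (rank + 1) and rank < n_ranks - 1:
            rank += 1
        owner[i] = rank
        acc += wi
    return owner, np.bincount(owner, weights=w)

# a floodplain that was cheap (kinematic wave) next to a surf zone that
# just switched on the expensive pressure-Poisson physics:
tags = ["kinematic-wave"] * 600 + ["shallow-water + pressure-Poisson"] * 100
owner, work = partition_by_work(tags, n_ranks=4)
print("elements per rank:", np.bincount(owner))  # many cheap vs. few costly
print("work per rank:    ", work)                # roughly equal
```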
Wonderful, thank you very much, Joannes. Questions?

In a lot of the recent hurricane events, rainfall flooding was really important in addition to the storm surge. You said you're working to link with WRF; is that the approach you take?

WRF-Hydro, yeah. NOAA has decided that, as part of their quote-unquote national water, total water model, ADCIRC is going to do the coastal part of it and WRF-Hydro is obviously going to do the hydrologic side of it, and we're working at that interface with them. Right now it's point-to-point coupling; the group at Oklahoma is doing that work, where WRF-Hydro is computing fluxes into ADCIRC. We're actually doing the gridding part of it, providing grids that go far enough upstream in a really cheap way so that you can couple the rivers far enough up. It used to be no man's land, you're exactly right. And obviously the downstream water levels affect things enormously, and that adds to the upstream fluxes.

Other questions? Over here.

I was just wondering, for these problems where you have real bathymetry that is pretty non-smooth, how much do you think it matters to have higher-order accurate methods, versus just having good refinement in the region?

So we've done a lot of sensitivity studies, and the critical high-conveyance regions are really important; there's a lot of sensitivity there. In general, the nearshore seems to be fairly sensitive, down to scales of less than 20 meters, to what the bathymetry is; we've worked with different bathy sets. If you compare some of the global data sets against the locally lidar-derived ones that we're lucky enough to have in the United States, with water-penetrating lidars in the back bays and things like that, the local data really improve things quite a bit. So I think bathymetry in the conveyances is going to be really important wherever there's a lot of energy; that's what we're finding. But the high order we're doing there is just to be cheap, so that we can have bigger elements, right, and larger time steps, while we still capture that physical shape. That's why we're doing it. And by the way, it pays off to have the higher p, because you get much more efficiency on the processor. So it's a matter of a cost function.

OK, one more. Irina?

I wanted to ask whether you feel that the sediment transport formulations that we use with these models are sufficient for the sort of unique extreme cases, like big storm surge events.

So, yeah, it's important. Dune systems erode in every major hurricane, and obviously there are breakthroughs. A typical tidal inlet channel will erode by as much as 50 percent and become really deep. So yes, the morphology is very important. We do have sediment modules, but we haven't exercised them much. I would guess, and this is just an educated guess, that it's probably easier to get sediment transport accurate for high-energy events than for the workaday, day-to-day morphology. I was part of a team for the Corps on a big project called MORPHOS some years back, and one of the sediment guys made the point that when they looked at all the sediment transport formulas, 80 percent of the data points were within a factor of five, and only 50 percent were within a factor of two. And of course the morphology is the gradient of the sediment flux. So, boy, there's a lot of work to do.

OK, terrific. Thank you, Joannes.