Another way GIS matters is by helping people prepare for and respond to emergencies and disasters. In the video prelude to this case study, you heard Professor David Maidment say that floods claim more lives and cause more damage than any other kind of natural disaster. Here we'll consider flood prediction as an example of modeling with geographic information systems. In the late 1960s, U.S. state and federal government agencies began building geographic information systems for flood risk estimation and response. Lisa Warnecke, a longtime observer of GIS applications in the states, tells us that the Texas Natural Resource Information System, one of the earliest state GISs in the country, was established in 1968 in response to flooding. This image shows flooding on the Texas coast in the wake of Hurricane Beulah, which took 688 lives and caused a billion dollars in damage in 1967. At about the same time, the National Flood Insurance Act of 1968 created federal subsidies to help private property owners pay for flood insurance. This map shows the distribution of subsidized insurance policies by state in 2013. Demand for insurance among property owners in flood-prone areas hastened the production of elevation data needed to map floodways. This is how terrain contours used to be compiled: skilled operators traced lines of constant elevation from ghostly stereoscopic images of the terrain projected from overlapping aerial photo pairs. Established in 1979, the U.S. Federal Emergency Management Agency took responsibility for administering the National Flood Insurance Program. FEMA's responsibility included overseeing production of Flood Insurance Rate Maps, or FIRMs. Eventually, FIRMs became DFIRMs as the maps went digital. FEMA has been active ever since in flood risk mapping and public outreach. 
Hazus-MH is FEMA's suite of software models that quantify the human, property, financial, and social impacts of natural hazards such as earthquakes, hurricanes, riverine and coastal floods, and tsunamis. Numerous flood prediction models are in use by state and local government agencies, and many private contractors provide services in spatial modeling and GIS related to flooding. To predict the impacts of a natural hazard, it's necessary to understand how the various processes that contribute to the hazard actually work. Models are computational expressions of that understanding. First and foremost, models designed to predict flooding from rivers and streams need to account for the landscape characteristics of an area of interest, including watershed size, terrain, soils, stream channel geometry, and others. Then the model needs to factor in forcing variables like rainfall intensity and duration. GIS is useful in integrating the various data inputs that flood models require. The Hazus-MH flood model, for example, is available as an extension to ArcGIS. According to Kevin Mickey, lead instructor for FEMA's Emergency Management Institute, Hazus-MH offers a variety of options for defining flood hazards and modeling flood impacts. Have you worked with this or other GIS-coupled flood models? If so, please add a comment to this VoiceThread. To make sure we all know what flood modeling with GIS entails, let's walk through a workflow recommended by the United Nations Platform for Space-based Information for Disaster Management and Emergency Response (UN-SPIDER). Although there are many different approaches to flood modeling and prediction, I chose this one to discuss with you because it is fairly well documented and in the public domain. Like other models, this one involves multiple stages, each of which produces its own intermediate outputs. 
In fact, although the diagram indicates that the output from workflow A becomes an input for workflow B, that's not exactly the case in the step-by-step demonstration of the model that's included in the recommended practice. A, B, and C are actually three distinct workflows in this example. Workflow A calculates potential runoff from a given landscape. Workflow B delineates a drainage network and calculates stream flow direction and rates at various points along the network. And workflow C produces an inundation polygon that represents the area and depth of flooding for a given geography. Let's look at these three stages one by one. The output of stage A is labeled multi-data CN grid. CN stands for Curve Number, the traditional name for an estimate of potential precipitation runoff. To create the Curve Number grid, the recommended practice uses ArcGIS Desktop software with the Spatial Analyst extension and an additional toolkit called HEC-GeoHMS. HEC-GeoHMS is the geospatial hydrologic modeling system developed by the U.S. Army Corps of Engineers Hydrologic Engineering Center. The integration of HEC-GeoHMS with ArcGIS is an example of what's called a loosely coupled model. Data requirements for stage A include USGS land cover data, SSURGO soils data, digital elevation data, and a lookup table of curve numbers associated with land cover categories. How familiar are you with the characteristics of these data sets? You'll have a chance to investigate them later. Notice that the elevation data input is labeled HydroDEM. That's a digital elevation model that's been processed to remove errors like artificial sinks and peaks that can distort the flow surface. More about that in a minute. The CN grid produced at this stage quantifies the amount of potential runoff at each grid cell as a function of land cover, soils, and terrain. The CN grid is one of the inputs required for stage B. 
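To make the Curve Number idea concrete, here is a minimal sketch of the standard SCS-CN runoff calculation that a CN grid encodes at each cell. It assumes U.S. customary units (inches) and the conventional initial-abstraction ratio of 0.2; the function names are illustrative, not part of HEC-GeoHMS.

```python
def potential_retention(cn):
    """Maximum potential retention S (inches) for a curve number CN (1-100)."""
    return 1000.0 / cn - 10.0

def runoff_depth(precip_in, cn):
    """Direct runoff Q (inches) for a storm of depth P, via the SCS-CN method."""
    s = potential_retention(cn)
    ia = 0.2 * s              # initial abstraction: rainfall lost before runoff begins
    if precip_in <= ia:
        return 0.0            # all rainfall absorbed or intercepted; no runoff
    return (precip_in - ia) ** 2 / (precip_in - ia + s)

# A near-impervious surface (high CN) sheds far more of a 3-inch storm
# than a wooded, well-drained soil (low CN):
print(runoff_depth(3.0, 98))   # roughly 2.8 inches of runoff
print(runoff_depth(3.0, 55))   # roughly 0.2 inches of runoff
```

A cell's curve number is looked up from its land cover and soil group, which is exactly why the workflow merges land cover, soils, and the CN lookup table before this calculation runs.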
Workflow B of the recommended practice involves ArcMap, Spatial Analyst, the Corps of Engineers' Geospatial Hydrologic Modeling System, and an Esri toolkit called ArcHydro. ArcHydro provides tools needed to correct those anomalous sinks and peaks in DEMs. It also enables you to calculate a flow direction grid from the HydroDEM. The flow direction grid quantifies the direction of downstream flow from each raster cell to other cells in the grid. From the flow direction grid, ArcHydro generates a flow accumulation grid by calculating the number of upstream cells that flow into each cell. The larger the number of accumulated upstream cells in a given cell, the more runoff will occur at that location. Are you following this okay? By setting a threshold value for flow accumulation, you can use ArcHydro to define the cells that correspond with stream channels. From the stream grid, ArcHydro can segment the channel cells to represent individual stream reaches that have unique IDs. ArcHydro can then delineate drainage areas, or catchments, for each segment by backtracking the flow direction surface from channels to ridges, thus defining the area that drains to a specific location. With all those grids as inputs, and with an additional grid that quantifies precipitation for a given period, the Corps of Engineers' extension can generate a computational model that calculates stream flows for drainage points in a stream network. In the process diagram shown here, Outflow Hydrograph refers to the amount of discharge at a particular location over a given period of time. Keep in mind that the National Water Model, which we'll discuss a little later, calculates 2.7 million hydrographs every hour. Workflow C demonstrates how a flood inundation polygon can be calculated from digital elevation data, digital representations of stream channel geometry, and characteristics of the floodway, such as obstacles that affect flow. 
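The flow direction and flow accumulation steps of Workflow B can be illustrated with a toy version of the D8 logic: each cell drains toward its steepest-descent neighbor, and accumulation counts the upstream cells draining through each cell. This is a conceptual sketch, not ArcHydro's actual implementation, and the example DEM is invented.

```python
import numpy as np

# The eight D8 neighbors (row, col offsets)
OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

def d8_downstream(dem):
    """Map each cell to its steepest-descent neighbor, or None for pits/outlets."""
    rows, cols = dem.shape
    down = {}
    for r in range(rows):
        for c in range(cols):
            best, best_drop = None, 0.0
            for dr, dc in OFFSETS:
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    # Divide by distance so diagonal drops are compared fairly
                    drop = (dem[r, c] - dem[nr, nc]) / (dr * dr + dc * dc) ** 0.5
                    if drop > best_drop:
                        best, best_drop = (nr, nc), drop
            down[(r, c)] = best
    return down

def flow_accumulation(dem):
    """Count the number of upstream cells draining through each cell."""
    down = d8_downstream(dem)
    acc = np.zeros(dem.shape, dtype=int)
    # Visit cells from highest to lowest so upstream counts are final when passed on
    for cell in sorted(down, key=lambda rc: dem[rc], reverse=True):
        target = down[cell]
        if target is not None:
            acc[target] += acc[cell] + 1
    return acc

# A simple tilted plane: every cell drains toward the low corner,
# which accumulates all 8 upstream cells.
dem = np.array([[9.0, 8.0, 7.0],
                [8.0, 7.0, 6.0],
                [7.0, 6.0, 5.0]])
print(flow_accumulation(dem))
```

Thresholding the accumulation grid, as the narration describes, then amounts to keeping only cells whose count exceeds some value, and those cells trace out the stream channels.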
This stage uses another toolkit produced by the Corps of Engineers called HEC-GeoRAS, the Geospatial River Analysis System. Required inputs to HEC-GeoRAS include a high-resolution digital representation of terrain called a triangulated irregular network. Are you familiar with TINs? If not, don't worry; you'll have a chance to investigate them later in the lesson. In addition to a TIN, the Geospatial River Analysis System requires a number of digitized vector feature classes, including stream centerlines, river banks, stream cross sections, and structures such as bridges and culverts. The connectivity of stream segments needs to be specified explicitly. In other words, the model needs to know the topology of the drainage network. The model also needs to know the estimated upstream flows in the drainage network, such as the hydrographs produced in stage B. To predict the flood inundation polygon, the model compares a calculated water surface grid with a digital terrain grid derived from the TIN. After the terrain grid is subtracted from the water surface grid, any cells left with positive values represent flooded areas. Here we see the area and roads inundated by the Onion Creek flood that was discussed in the video we watched before the case study. Now, my description of the three interrelated flood models was grossly simplified, of course. Even so, the complexity of such models should be obvious. It follows that flood models are computationally intensive. The 81-page step-by-step demonstration of the UN's recommended practice includes several warnings about time-consuming operations in ArcMap. Until recently, flood modeling and hazard assessment have been limited to basin or sub-basin scales due to the practical processing limitations of desktop software and computers. That's where the National Flood Interoperability Experiment, or NFIE, described in the video, comes in. 
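Before moving on, the water-surface-minus-terrain step at the heart of Workflow C can be sketched in a few lines of NumPy: subtract the terrain grid from the computed water surface grid, and cells with positive differences are inundated, with the difference itself giving flood depth. The grids here are invented toy values, not HEC-GeoRAS output.

```python
import numpy as np

terrain = np.array([[12.0, 11.0, 13.0],
                    [11.0, 10.0, 12.0],
                    [10.5,  9.5, 11.0]])       # ground elevation (m)

water_surface = np.array([[10.8, 10.8, 10.8],
                          [11.0, 11.0, 11.0],
                          [11.2, 11.2, 11.2]])  # modeled water surface (m)

depth = water_surface - terrain
flooded = depth > 0                         # boolean inundation mask
depth = np.where(flooded, depth, 0.0)       # keep depth only where flooded

print(flooded)   # which cells are under water
print(depth)     # how deep the water is there
```

In the real workflow, the boundary of the `flooded` mask is what gets converted to the vector inundation polygon that appears on the map.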
The video describes a proof-of-concept for a national-scale stream-flow model that's engineered to take advantage of distributed cloud computing capabilities. A key element of NFIE was a simulation model called RAPID, the Routing Application for Parallel Computation of Discharge. What a name. RAPID is a river-routing model comparable to those in the Corps of Engineers' Hydrologic Modeling System that was used in Workflow B earlier in this presentation. In addition to being engineered for high-performance computing, the RAPID model uses the National Hydrography Dataset to delineate the national stream network and its connectivity. RAPID relies on GIS to prepare the stream network data for input and to display maps and hydrographs of its flow rate computations. Not long after the successful NFIE experiment, NOAA, the National Oceanic and Atmospheric Administration, implemented a National Water Model that uses data from more than 8,000 USGS stream-flow gauges to produce hourly simulations for 2.7 million stream reaches in the continental United States. Previously, the press release reports, NOAA was only able to forecast stream-flow for 4,000 locations every few hours. The National Water Model is based on a framework called WRF-Hydro that enables various component models, like RAPID, to be coupled with other simulation models and observations that together can simulate the complex physical processes responsible for stream-flow and floods. You can see a near-real-time display of National Water Model simulations in ArcGIS Online. This view combines 15 one-hour forecast intervals, visualized by flow rate and anomaly compared to normal monthly flow values. The visualization is rendered from 40 million data rows fed by NOAA data services. The WRF-Hydro framework is built to operate on the high-performance computing network that's being developed with support from the U.S. National Science Foundation. 
NSF envisions a distributed computing environment it calls cyberinfrastructure that supports data-intensive research at government agencies and academic institutions around the country. Some envision a CyberGIS that brings geoprocessing and spatial statistics tools to high-performance computing environments for collaborative use by researchers across the network. Meanwhile, mainstream GIS architectures are maturing too, and the performance advantages of parallel computing are becoming available in desktop software like ArcGIS Pro. Of course, flood prediction isn't the only use of GIS that's scaling up to supercomputers in the cloud. Vast volumes of financial transaction data, social media interactions, and observations from ever-expanding sensor networks create opportunities for insights in the commercial, national defense, infrastructure, and natural resources domains that only big data analytics can realize. Data science is the emerging field that specializes in data-intensive discovery. Yet through all this, space and place still matter. And so does GIS, whatever you may call it.