OK, I think it's time to get started. It's my great pleasure to welcome all of you to this year's Gilbert White Lecture. My name is Carol Harden. I'm a professor emerita of geography at the University of Tennessee, and I chair the Geographical Sciences Committee here at the National Academies. This lecture series, the Gilbert White Lecture Series, is named for and honors the memory of Gilbert White, a geographer, a leader in natural hazards research, and a member of the National Academy of Sciences. Gilbert White was an influential leader of national and international efforts to improve the response of governments to hazardous events, especially those involving flooding, and he has sometimes been called the father of floodplain management. He served in various capacities here in Washington during the Roosevelt administration. And in the last five years of his life, from 2001 to 2006, he served as a member of the FEMA Steering Committee for Evaluation of the National Flood Program. I was lucky to know him when I was a graduate student at the University of Colorado, and so it's really a special privilege for me to be able to introduce this lecture. His work in Washington predated cell phones. But if you have one in this modern age, please be sure it isn't going to make any kind of noise. I would say, put it into listening mode, or, as Gilbert might have phrased it, put it into Quaker meeting mode. This year's Gilbert White Lecture will be given by Budhendra Bhaduri. But before I introduce him, I'd like to take this occasion to recognize and honor Dr. Douglas Richardson, who recently completed 16 years as Executive Director of the American Association of Geographers. And I think I'll just ask Doug to come forward and have everybody beaming at him when I say all these nice things. We take this occasion to honor him because he has made truly outstanding contributions to geography. 
And when he became Executive Director of the AAG, that's the American Association of Geographers, in 2003, he brought new vision and new life to the association. In the first decade alone, membership doubled, annual meeting attendance tripled, and the association attained a strong financial position. And under his leadership, the organization dramatically increased its reach and reputation, not only nationally but internationally. In his pre-AAG career, Doug founded and was president of the company GeoResearch, Inc. He and his team invented, developed, and patented the world's first real-time, space-time integrated GPS/GIS mapping technologies. Some of us have been talking about these technologies in a meeting today. These technologies transformed the collection, mapping, integration, and use of geographic information and are now at the heart of many applications familiar to us, like location-based services and cell phone navigation. The fact that he sold this company and its core patents in 1998 just shows us how very far ahead of the curve he was at that time, and he still is. As the AAG Executive Director, Doug actively reached out to other professional societies. Just as one example, he co-founded the Science and Human Rights Coalition of AAAS and chaired that group in its first five years. He also created the AAG's first public policy office, which greatly helps connect geographers with proposed legislation that might affect their work. His record of public outreach is astounding. He's been an active public voice for the achievements of geography and geographers. And again, as one example, he's been writing a quarterly column in the publication ArcNews, which reaches over 600,000 readers. And he's been doing this since 2003. I understand you might have written your last one. 
At the AAG, he brought people together around important issues and new ideas, and then in some cases edited volumes on those topics, including one on geography and drug addiction and one on the geo-humanities. One of his most ambitious projects has been his work as editor-in-chief of the International Encyclopedia of Geography. As you might imagine, he's received quite a number of prestigious awards, but he is not resting on either his laurels or his hammock. He recently accepted a new position at Harvard University as a distinguished researcher at the Center for Geographic Analysis in the Institute for Quantitative Social Science. The AAG and geographers everywhere are just profoundly grateful to you for all you've done for geography and for all of us. And we continue to be impressed by your intellect, vision, creativity, and your humanity. So I would like to present to you this certificate of recognition. Thank you so much for all you've done. Thank you. Thank you. I'm going to give Doug a chance to say just a few words here too. Thank you so much, Carol. I'm the one who should be thanking all of you, because I've had such a wonderful career, multiple careers, and such a wonderful life working with geographers. My wonderful family, Suzanne, is here. I've had, for me, the perfect life. I got to do exactly what I wanted to do. And one reason I started my own company is that I just realized I wasn't cut out to be working for other people and that I wanted to make my own decisions. I wanted to do the research I wanted to do. And if it meant hustling up the money to do it, I would do it. It ended up being a good thing for me to do. I learned that all of the major sectors of geography, academic geography, the private sector, government, and the NGOs, are all doing the same things, and they're all doing a good job of it. 
And so to be able to work in the private sector, and with federal agencies and the NGOs, and to have the ability to follow the research inclinations that I had, it's just been wonderful. And geographers are pretty good people. Not too many people sign up to be a geographer for fame and fortune. So you encounter a lot of people who are modest, who are genuinely interested in what they're doing, care about it, care about society, care about making a difference. And that's all of you. So all I can say is that I'm very privileged to have had all of these experiences. I don't seem to be able to do anything halfway. So for my next one, though, I promise, Suzanne, that I'm only going to go halfway at Harvard. And that's also a nice little career capstone for me. And I think it'll be useful for me and hopefully for them as well. So Carol and everyone else, thanks for all of what you've done for geography. And I've enjoyed it very much. Bye. Now it's my pleasure to introduce Dr. Budhendra Bhaduri, whom we call Budhu. His name doesn't roll off my tongue, actually. He is our 2019 Gilbert White lecturer. Dr. Bhaduri is a corporate research fellow at Oak Ridge National Laboratory in Tennessee, where he directs the National Security Emerging Technologies Division and is founding director of the Urban Dynamics Institute. His work involves novel ways of implementing geospatial science and technology in support of a variety of programs for energy, environment, and national security missions across the U.S. Departments of Energy, Defense, and Homeland Security. His research has benefited US federal missions, international organizations, and private foundations. And he, at the same time, actively publishes in the scientific literature and holds professorial appointments at the University of Tennessee. His achievements have been recognized with numerous honors and awards. 
He's a fellow of AAAS, a recipient of the Carolyn Merry Outstanding Mentor Award from the University Consortium for Geographic Information Science, and a recipient of the Anderson Medal of Honor in Applied Geography from the American Association of Geographers. He's a founding member of the Department of Energy's Geospatial Sciences Steering Committee and a recipient of that department's Outstanding Mentor Award for his dedicated service developing the future workforce for the nation. Budhu is no stranger to many of us here today and certainly no stranger to the National Academies. He currently serves on the Geographical Sciences Committee and has previously served on the Mapping Sciences Committee, the Committee on Geographic Information Science and Applications, and the Strategic Highway Research Program of the Transportation Research Board. He received his PhD in Earth and Atmospheric Sciences from Purdue University. He holds a master's degree from Kent State University and both master's and bachelor's degrees in science from the University of Calcutta. It is indeed an honor and a pleasure to have him address us today. His talk is titled "Synergy Between Geography and Mapping within the Nation's Energy Mission: An Oak Ridge Perspective." Please join me in welcoming Budhu. Thank you, Carol, for that generous introduction. I always compare introductions with how my mother would have done it, and you come very close. So it's a distinct privilege for me to be here giving the Gilbert White Lecture. For those of you who do not know, my graduate advisor, Jon Harbor, was a fellow graduate student with Carol at the University of Colorado. So I had my fair share of Gilbert White dosage, from reading his papers to learning everything. So this is a distinct pleasure. I want to thank my colleagues and peers for thinking that I can do justice to this talk, and I will try. 
And I picked a topic that I feel I know a little bit about, but I had to look up a lot of things, because I thought this would give me an opportunity to tell a story that people do not always know very well. So my plan for the next hour is to give you a little bit of background on the Manhattan Project, how the DOE national labs evolved over time, and the relationship with geography from the early days of this mission. Then I want to tell you a little bit about how Oak Ridge became part of this movement and part of the space of geographical sciences and mapping sciences. And then fast forward to some of the things that we are doing today that are relevant for geographical sciences and mapping sciences. At the onset, I want to remind you of three things that distinguish us in this community, three things that I felt were unique at Oak Ridge. One is we do things at scale, very large. The guiding principle that I was given is that if your buddy in academia is going to propose the same project, then don't think about it, because then it should be done in academia, not at Oak Ridge. The second thing is everything that we do is interdisciplinary. So we have successfully turned a lot of English and biology majors into geographers by forcing them to work with geography people. And the third thing is that we are almost forced and pushed to develop solutions, to do a lot more translational science than just pure R&D. So it has to go and benefit agencies for the work that we are doing. So when I dug into the history, I found that during the Manhattan Project, the classic problem of siting, using geographic data, was the number one driver in deciding where to create these three sites. Oak Ridge, Tennessee was picked because of its location away from the coast, with an abundant supply of water and an abundant supply of electricity. 
So at the peak of the Manhattan Project, the Oak Ridge facility, the graphite reactor and the entire complex, which is Y-12, X-10, and K-25, the three different pathways for enriching uranium and producing plutonium, was using about 7% of the nation's electricity supply, about 30% of the region's electricity supply. So TVA was extremely key for that decision. In fact, the TVA built a grid that was so resilient for that purpose that 60 years down the road, Oak Ridge was selected for siting the largest supercomputers because of that power infrastructure that you need to run the supercomputers. Los Alamos, since there was going to be testing and development of the weapon, was selected to be placed with a very sparse population and with lots of canyons, for a totally different reason, and the same with Hanford, in Hanford, Washington. By the way, the best Manhattan Project exhibit is right across the street in the National Building Museum for $5. I saw it last fall. It is an amazing display of the entire history, if you're interested. So this is the map that shows how the entire Manhattan Project and the nuclear operations slowly gave rise to these dots across the nation, where there was something to do with atomic energy or where nuclear operations were going on, which eventually down the road became the Department of Energy's 17 national laboratories that serve the nuclear mission, the science mission, the energy efficiency mission, and the environmental management mission. Oak Ridge in Tennessee is one of the Office of Science labs and is the largest multipurpose science lab among the six large science labs in the community. One of the things that we get to do is serve the mission of the Department of Energy, which is energy, environment, and national security. And that has allowed us to share some of the work that came out of the Department of Energy across the federal mission space. 
And there was a lot of enrichment that came back into the science that we were doing, both in terms of geography and mapping. And for the rest of my talk, I'm going to try to highlight how some of these things have really come to fruition. This is the oldest map in the Oak Ridge archive that I could find. And if you notice carefully, there is a tilted north arrow with no real coordinates mentioned. So this was one of the mechanisms by which exact locations during the Manhattan Project were hidden. If you look, there is a laser, right? Is this the one? So if you look right here, it says this was declassified in 1958. So this map was actually classified for several years before it came out. This is the other map that I found in a colleague's office, which came off our lab shift superintendent's wall when they were trying to clean that place. It has no date. It is hand drawn by a gentleman called E.L. Hutto. That's all you know. And it's the first sort of representation. And you can see that it says Clinton Laboratories external area. That's a planning map. So mapping was evolving as the Manhattan Project was taking shape. The question that I often get is, how did geography become part of the Oak Ridge mission? And the answer is not validated. It's an urban legend, if you want to call it that. But the best story I could hear was that the three planned sites, K-25, X-10, and Y-12, all had their own projection systems. And I have seen that data. With those projection systems, you needed special software to take that data and put it into regular geographic coordinate systems. And that was, again, a way of not just siting facilities but disguising the exact locations of the sites. So these things came out after several documents were declassified, showing the exact relationships between the different projection systems that Oak Ridge was using and the real projection systems. 
GIS at Oak Ridge really dates to the late 60s, the earliest evidence being a group called Geographic Data Technology, which soon took on the name Geographic and Computational Modeling, GCM. So people recognized that there were a lot of needs. Large-scale data analysis was already a theme when I look back at some of the examples that were created. One was a vector map of electricity flow across the transmission system, across the nodes, where the thickness of the lines indicates how much power is flowing. Today, it's probably an eighth grade exercise to create this thing. But I have seen what it took to make that map back in 1977: a big stash of punch cards, and especially printing it on a large-scale plotter. The one in between is a very interesting one; that analysis, about water demand in the Ohio River Basin, was actually demonstrated live to President Carter when he came to visit Oak Ridge. And this was a futuristic exercise that tried to guess the numbers for 2020. So it would be very interesting to see, actually, how good we were at doing those kinds of estimates. So that's a more raster-based approach that came in just following the vector support. And then in the early 80s, we saw analysis of three-dimensional data. This is air traffic density contours, condensed into a 2D map. But the data was three-dimensional, about how the airplanes were flying and at what altitude. The map on the left, the strip mining impact, is known as one of the earliest examples of using gridded data on an elevation surface to create a 3D visualization perspective for showing the impacts of strip mining. And this was right north of Oak Ridge. The work was done with the Department of Interior to demonstrate the environmental impacts and just the visual degradation of the landscape. Again, these were significantly computationally challenging tasks to perform. Railway geography is something that Oak Ridge championed. 
The University of Tennessee was one of the very few universities that trained students in railway geography. And most of them worked at Oak Ridge. And the need came from understanding nuclear fuel transport. It is still a strong capability that resides in the federal sector, particularly because the networks had to be programmed based upon the nuclear requirements. So you cannot use Google Maps to ship nuclear material on a path from New York to LA, because it will take you through Colorado, and Colorado will not let you transport nuclear material through it. So maintaining these kinds of networks was a huge undertaking. In fact, Oak Ridge used to produce all the basic transportation data layers that are today part of the NTAD, the CD that used to be produced by the Bureau of Transportation Statistics. We had a significant role to play in national land cover mapping. Jerry Dobson and Eddie Bright were the designers of the C-CAP program, which had a major influence on the MRLC. The grid work that we were doing in terms of large-scale data significantly influenced the evolution of the Spatial Data Transfer Standard, SDTS. So it happened because of a number of people who built and spent their careers at Oak Ridge who were geographers, and they brought that expertise into all aspects of Oak Ridge. So I was fortunate enough that when I arrived at Oak Ridge back in 1998, all six of them were there. And they all had significant stature in the geographical sciences community. Tom probably beat me up the most, but I probably learned the most from hanging out with him. And so did Jerry. John Sorensen taught me a lot about risks and hazards. And the risks and hazards research was instrumental in the atomic energy, in fact the DOE, world because of the nuclear operations. So whether you want to shelter in place when something goes wrong versus evacuate, that science came from the Department of Energy, or, at that time, the Atomic Energy Commission. 
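The routing constraint described above, where a nuclear shipment must avoid certain jurisdictions such as Colorado, can be sketched as a shortest-path search over a network with forbidden nodes. This is a toy illustration of the idea, not Oak Ridge's actual transportation routing system; the network and costs are invented.

```python
import heapq

def constrained_route(graph, start, goal, forbidden):
    """Dijkstra shortest path that never enters a forbidden node."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            # Walk the predecessor chain back to the start.
            path = [node]
            while node in prev:
                node = prev[node]
                path.append(node)
            return d, path[::-1]
        if d > dist.get(node, float("inf")):
            continue
        for nbr, w in graph.get(node, {}).items():
            if nbr in forbidden:
                continue  # e.g. a state that bans nuclear shipments
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    return float("inf"), []

# Toy network: the cheapest NY-to-LA path goes through CO, but CO is off-limits.
graph = {
    "NY": {"CO": 5, "TX": 8},
    "CO": {"LA": 5},
    "TX": {"LA": 9},
}
cost, path = constrained_route(graph, "NY", "LA", forbidden={"CO"})
# path == ["NY", "TX", "LA"], cost == 17
```

The point is that the network itself has to encode the regulatory constraints, which is why maintaining these special-purpose networks was such an undertaking.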
John is the one who taught me it's not the hazard, but the perception of the hazard. If you cannot see it, it's very difficult to motivate people. So that's why it was so important. A few years before I joined Oak Ridge, head-out parking was mandatory. You could not park head in, which would disqualify half of my staff today, because they cannot do it. They have confessed that that would take away their eligibility. That includes Marilyn Brown, who is one of our fellow committee members on the Geographical Sciences Committee. Marilyn had a tremendous impact in shaping the energy systems, the sustainable energy systems, research with DOE. So it's an eclectic crowd. So I interacted with all of them. They taught me how to be a jack of all trades in geography. And I practiced and perfected how to be a master of none. So there were two things going on when I arrived. One was an effort called the Oak Ridge base mapping program: the 4,400 acres of the Oak Ridge reservation being mapped, and spatial data, the foundational layers, being derived from aerial photographs. And right across from my office was, well, you all know about the current largest, smartest supercomputer in the world, called Summit, which sits straight down from where my office is. But at the time, the one that was the smartest and fastest was called the Intel Paragon 150, because our colleague John Drake was asked, how big of a machine do we need to solve the climate problem? And John and his colleagues came back and said, 150 gigaflops. And so the Intel Paragon 150 was put on the floor to solve the climate problem. And we are now at petaflops and going into exaflops. And we are still solving the climate problem. But Yang Chen, a geographer from the University of South Carolina and a John Jensen student with whom I overlapped, did this marvelous experiment, the first geographical experiment using a supercomputer that I am aware of. Yang Chen took all these image chips. 
And you can see that as the plane flew the flight path, the illumination had changed by the time the plane came back. So you can see the differences in the image quality. And so Yang wrote a color balancing algorithm that ran very efficiently on the Intel Paragon 150. And as this thing moves, you can see how the waters and the seams were actually taken care of by this color balancing algorithm. So computing was an attractive proposition even then. The other thing there is a long history of is understanding human dynamics on the landscape. And that also came from assessing risk around nuclear operations. Going back half a century, this was a requirement for every nuclear facility, including universities which had nuclear material. They had to run this diagram and analysis on a periodic basis to understand how many people were at risk from these facilities. So that's how Oak Ridge got involved in creating higher resolution population data. And back in 1986, there was this intersection of geographical spatial analysis with transporting M55 rockets, which had chemical agents in them, from one point in the nation to another point. And the risk analysis was done as part of this exercise. And the decision was not to transport them, but to destroy them in situ. So they built an incinerator and had to destroy them. So understanding population at risk, that's the history behind it. So these are the two things I would like to illustrate. Of course, you know that we have spent the last 20-plus years creating global population data sets. So LandScan Global is a one kilometer resolution global distribution of ambient population. The first version came out as I was arriving; they were working on it. My contribution was more on LandScan USA, which is a time-variant population model using high resolution geographic data and combining a bottom-up with a top-down model. 
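The top-down side of such a population model, spreading a known census total over fine grid cells in proportion to ancillary weights, can be sketched as follows. This is a minimal dasymetric-mapping illustration; the weight layer here is invented, and the actual LandScan weighting draws on far richer inputs (land cover, roads, slope, building data, and so on).

```python
import numpy as np

def disaggregate(area_total, weights):
    """Distribute a census count over grid cells in proportion to a
    weight surface. Cells with zero weight (e.g. open water) get no
    population; the cell values always sum back to the input total."""
    w = np.asarray(weights, dtype=float)
    if w.sum() == 0:
        return np.zeros_like(w)
    return area_total * w / w.sum()

# A county of 1,000 people over a 2x2 grid: one uninhabitable cell
# (weight 0) and three cells of varying suitability.
weights = np.array([[0.0, 1.0],
                    [2.0, 2.0]])
pop = disaggregate(1000, weights)
# pop == [[0., 200.], [400., 400.]]
```

The bottom-up side then adjusts or validates these surfaces with directly observed local data, which is what makes the combined model time-variant and higher fidelity.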
And then around the mid 2000s, we set this ambitious goal, saying, if we have a lot of data, what are the possibilities? How far can you go? So we said, could we scale the US data resolution to the rest of the world? So, map global population at 90 meter resolution. The opportunity that came to us was the availability of the US government's data resources, mostly in terms of high resolution commercial satellites. And I am not going to go into the details, but we used very basic principles, saying, if we could map every building on the planet, then that's a strong indicator of where people are and where their activities are restricted to. So back in 2006, we had the first paper at IGARSS, or at any professional meeting, which showed how you can use a supercomputer, or a parallel cluster, which was called XTORC, to create these sorts of high resolution settlement maps. And there are two things that happened. One is, as a community, including us, we made much progress in terms of creating better algorithms. And the computing evolved, because we had the emerging architectures, including the GPUs, that changed the game at one point for us. So we are mapping buildings. We are creating higher level semantics from them, what we call neighborhoods. We characterize those neighborhoods using imagery-driven techniques, and we are creating high resolution elevation data, much like the ArcticDEM that you heard about today. This is one-and-a-half meter resolution or finer data that gives you the digital surface model. We are creating estimates of land use and population density from open source information. And that's all magically leading to a 90-meter population data set. So AI has become part of this activity. We are calling it GeoAI, or Geographic Artificial Intelligence. And we are a very ripe target, or discipline, because we have this level of data set. 
So I'm very happy to tell you that Oak Ridge, as an Office of Science national lab, has an artificial intelligence initiative. And of all the different things the initiative is investing in, GeoAI has emerged as one of the demonstrations of how you can take a very large machine, a smart machine, and make progress in artificial intelligence. So we are using a deep learning convolutional neural network. And we are after two things. One is the location of structures, and the other is the geometry of structures. The research insight we have gained by doing this kind of thing at scale is very different from the community's work using what the community calls ARD, Analysis Ready Data, which is largely what NASA provides the research community. You rarely ever do any kind of correction with those data sets; you just run with them. With commercial satellite images, doing it at scale, you run into huge issues with differences in illumination, cloud cover, and haze, and your algorithms are extremely fragile when you run into these data issues. So with us running experiments at scale, these are some of the research findings that we have brought back into the community. As geographers, we would call these generalization issues: how well does your algorithm generalize? But we have been successful in generalizing one model because of the consistency and the balance in the NAIP data set, the National Agriculture Imagery Program data set, which is at one meter. That's 10 trillion pixels, about six terabytes of data. We used enough GPUs to map every building in the United States, the lower 48 states, in less than three months. We are repeating that process with commercial satellite data, at half meter resolution, for FEMA. And we are doing two things. One is we are getting the geometry. And we are also finding the highest and the lowest intersection points of the structure with the ground surface, for flooding purposes. 
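That last step, finding the highest and lowest elevations where a building footprint meets the ground, can be sketched against a raster terrain model. This is a toy illustration of the idea behind flood exposure screening, not the FEMA production workflow; the terrain values and footprint are invented.

```python
import numpy as np

def ground_range_under_footprint(dtm, footprint_mask):
    """Return the lowest and highest ground elevation under a building
    footprint: a rough proxy for where floodwater would first reach
    the structure versus where it would need to rise to."""
    vals = dtm[footprint_mask]
    return float(vals.min()), float(vals.max())

# Toy 4x4 terrain grid (metres) sloping gently, with a 2x2 building
# footprint in its centre.
dtm = np.array([[10.0, 10.2, 10.4, 10.6],
                [10.1, 10.3, 10.5, 10.7],
                [10.2, 10.4, 10.6, 10.8],
                [10.3, 10.5, 10.7, 10.9]])
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True  # the building occupies the centre cells
lo, hi = ground_range_under_footprint(dtm, mask)
# lo == 10.3, hi == 10.6
```

With extracted footprints and a digital terrain model, this per-building min/max is what turns geometry into flood-relevant information.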
So this is part of the Flood Apex program. So you get tempted to ask, if you have a really, really large machine, what can you do? Titan, which no longer exists, was at one point in time the largest and fastest machine. In fact, when it got retired a few months back, it was still the fifth largest machine in the world. It had what my scientists call dumb GPUs. So the algorithms and the software had to be dumbed down, because the GPUs were so much older, K20s. But we took the imagery for the entire country of Yemen and used about 5,000 GPUs. And every building was mapped in about one hour and 41 minutes, which is pretty unprecedented. So it's pretty exciting that you can conquer the data flow challenge if you really have good algorithms and machines. Here is another example of commercial imagery, from Caracas, Venezuela. And you extract all the buildings. But the exciting thing is that you can extend those kinds of deep learning algorithms to create what we call settlement patterns, or neighborhoods. And these are visual patterns by which, in computer vision or human vision, we distinguish between different kinds of structures. So you can see the formality, the scale of formality, in a geographic region. More formal, not so formal, and very informal. Economists would dare to call them slums. I refrain from doing that because I have no economics background. But you can exploit the spatial features, hundreds of them, in your data to understand the linearity, how straight the lines are, and how much contrast you have between the structures and the roads. And the reason you are seeing the lack of contrast in the top right, in the refugee camps, is because they use the background material to build their homes, the structures they're living in. So you train the machines with these kinds of learning mechanisms and neural networks. And you create data products like this, which we call the neighborhood maps. 
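Two of the simple spatial features just mentioned, edge structure and contrast between structures and background, can be sketched per image tile as follows. The feature definitions and the threshold here are hypothetical simplifications; the production models use hundreds of such features feeding a neural network.

```python
import numpy as np

def settlement_features(img):
    """Compute two toy per-tile features: edge density (a proxy for
    built structure and linearity) and mean local contrast between
    adjacent pixels. The edge threshold of 20 is an arbitrary choice
    for illustration."""
    img = np.asarray(img, dtype=float)
    gx = np.abs(np.diff(img, axis=1))  # horizontal neighbour differences
    gy = np.abs(np.diff(img, axis=0))  # vertical neighbour differences
    edge_density = ((gx > 20).mean() + (gy > 20).mean()) / 2
    contrast = (gx.mean() + gy.mean()) / 2
    return edge_density, contrast

# A formal grid-like tile with hard building/road edges, versus a
# near-uniform tile where shelters are built from the same material
# as the surrounding ground (as in the refugee camps mentioned).
formal = np.tile([0, 0, 255, 255], (8, 2))  # strong vertical edges
informal = np.full((8, 8), 120.0)           # almost no contrast
ed_f, c_f = settlement_features(formal)
ed_i, c_i = settlement_features(informal)
# formal tile scores higher on both features
```

Low contrast and low edge density are exactly why camps built from background material are hard to separate from their surroundings.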
These are different kinds of structural, contiguity- and similarity-based classifications. I know that neighborhood mapping is a very strong area in geographical sciences, and there are many nuances to what a neighborhood is. But this is just based upon the spatial features of pixels. The real question is one we do not yet know the answer to. And let me give you another example. So we have been doing this consistently for large cities: Lagos in Nigeria, cities in Tanzania, Nairobi, Dakar, Addis Ababa. The first question is, is this new knowledge? If you went and asked somebody living in Nairobi and showed them this map, would they be surprised? Or would they say, everybody here knows this? We do not know that yet. The only validation we have so far is that we gave six cities' worth of data to USAID, which is doing some health research. And they essentially said, this was great; it matched up with what we were looking for. So it is a very, very important question: just because you have a lot of data and a lot of computing and you can create data sets, what does that change in terms of our scientific understanding of place and space? The second thing is, as we are producing these kinds of data sets, what is becoming evident is that we, as a community, do not have a very good semantic understanding of what we want to call them at a global scale. We have been happy calling them slums in Lagos, Nigeria, because we had only that type of data. But once you have a global scale data set, you have to come up with a consistent way of understanding these kinds of settlements, which I believe is an exciting area of research that is going to be coming out. We are providing the AI community with lots of things to think about. And so are we, the geographical sciences community and the mapping sciences community. So this is Johannesburg, South Africa. 
When you look at output like this, it is actually fairly good: when you look at the underlying structures, you feel like the blue and the green essentially distinguish two different kinds of settlement. However, that excitement is pretty short-lived when you start looking at other areas. And this is where we are far away from translating geographical principles, or our notion of data conflation, to the reality on the landscape. We have studied many parts of the world using these types of approaches. When you look carefully, the data is not lying, because that's not what data does. The challenge is that, as we built our environment and infrastructure, we rarely thought about how these kinds of boundaries could be useful. So if you go to vast parts of the world, you will see that there is one type of settlement pattern that extends to a road and just spills over that road before the land use essentially stops. There is no way we will draw a boundary that does not conflate to that road, because that's what we do: we draw administrative boundaries conflating to a structure that is easy to map and manage. So it's going to be a nightmare, in terms of our own cognition of space, how we are going to handle these types of data sets. So I had a PhD student, who now works with Dan Brown, who looked at validating these kinds of neighborhoods using nighttime lights data, so electricity usage. Do you really see a difference between these different types of neighborhoods in terms of how much electricity gets used in them? And this is just scratching the surface. The short answer is, there is a positive trend, but you cannot really jump to conclusions looking at just three cities in one continent. So it's an exciting area. We did end up detecting swimming pools at scale by looking at a very simple example of NDWI, the Normalized Difference Water Index, and then scaled it, again, to Titan, and mapped every swimming pool in Texas in about 10 minutes. 
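The NDWI computation behind that pool detection is genuinely simple: (Green - NIR) / (Green + NIR), which is strongly positive over water because water reflects green light and absorbs near-infrared. A minimal sketch, with invented pixel values and an illustrative 0.3 threshold (not the value used in the Texas run):

```python
import numpy as np

def ndwi(green, nir):
    """Normalized Difference Water Index per pixel; water tends
    toward +1, vegetation and bare surfaces toward negative values.
    A tiny epsilon guards against division by zero."""
    green = np.asarray(green, dtype=float)
    nir = np.asarray(nir, dtype=float)
    return (green - nir) / (green + nir + 1e-9)

# Toy 2x2 scene: one pool pixel (high green, low NIR) next to lawn
# and rooftop pixels (high NIR and/or low green).
green = np.array([[80.0, 30.0],
                  [25.0, 40.0]])
nir = np.array([[10.0, 90.0],
                [80.0, 60.0]])
water_mask = ndwi(green, nir) > 0.3  # only the pool pixel passes
```

Because the index is a cheap per-pixel operation, it parallelizes trivially, which is what made the statewide run on Titan take minutes rather than days.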
And you can actually see the difference in the quality of the water in the pools. This was actually driven by CDC's request to look at Puerto Rico after Maria, because all the waterlogged areas, essentially the locally flooded and water-stagnant areas, are the hotspots for mosquitoes. So mapping these small water bodies is a very, very high priority. So that was one of the exercises that was done. And you can extend these. DOE encouraged us to map solar panels, because they wanted to know how easily you can penetrate the market. The cost of customer acquisition is the highest; that's what the private sector told DOE, because as they drive around, they have no idea who has a solar panel and who doesn't, because you may have it on the side of your roof facing away from the street. So by exposing these kinds of data, the industry would know how to market itself very effectively. In fact, we did a prototype, and then it was followed by a large-scale implementation at Stanford later on. The other thing that we ended up doing was mapping mobile home parks, which are a collection of particular types of structures. And the nation did not have a consistent data set for mobile home parks. So this was originally done for the emergency response community. I learned a lot about how they look so different in Florida as opposed to in Michigan. They are hard to detect in Alabama, because there's so much canopy, and that canopy doesn't shed in winter. The other thing is, there were two data byproducts, false positives, that came out and created fantastic data sets, which are all the truck stops and the railway yards, because they look exactly like mobile home parks with those structures. Believe it or not, the biggest beneficiary of this work was not the emergency response community. It was the Census Bureau, because they have to go and survey and account for all of these things. The way it was working is they would hire contractors who would drive around and account for these mobile home parks. 
And then the government gets billed by the miles driven. So it was not a very efficient system. So this came across like magic. In fact, it was one of our postdocs' abstracts at one of the AAG annual meetings that triggered somebody at Census. And they called and said, is this true? And we said, yes, it is. And we had lots of email exchanges and meetings. And hopefully, this is being implemented. One of the joys of being at a national lab is that you get to participate in the federal mission; you are part of the federal mission. So beyond these mobile home parks, we produce about 47 different basic and critical infrastructure layers for the nation. A lot of the data that you see in the USGS national map actually gets produced at Oak Ridge. So with this mobile home parks data, we got a call that the data was needed within a couple of hours to create these cartographic products, information products, for hurricane response. And FEMA essentially worked with us, taking the data. And about 24 hours later, we got a validation saying, what you did actually was pretty useful. I will tell you, when you are a national lab, this counts 10 times more than a publication. You can have 10 publications, but if this doesn't happen, then your existence will be severely questioned. Large-scale spatiotemporal data has been one of the big interests for us. And this is what I call the changing face of the nation. This is the MODIS 16-day NDVI composite. So you can see that it's stepping through 2012, every 16 days. And you can see how vastly the breadbasket changes over the seasons, but not the southeast; the southeast remains the greenest for most of the time. But you can see winter in action in the northern half, where most of our agriculture essentially happens. So spatiotemporal data flows and data streams are extremely attractive, especially in terms of running machine learning algorithms to understand what is going on today. 
So today, we call this the 100 trillion pixel challenge. If you believe the industry's promise of giving you 5-meter-pixel optical data every day, that is 100 trillion pixels covering the planet. That's what Planet Labs is promising: a 5-meter pixel every day for the entire planet. That means 100 trillion pixels. So what can you do to understand the pulse of the planet in 24 hours, before the next 100 trillion pixels arrive? And mind you, I'm simplifying the problem, because 100 trillion is really with no overlap. But when you are collecting satellite images, there is tons of overlap. So at a minimum, 100 trillion pixels every 24 hours; what is the realm of possibility here? And I'm showing you this visualization. And if you did not pick it up already, that's what we call not an epidemic. Everybody thinks it's an epidemic, but it's the march of Walmart. It starts in '62 in Arkansas and then just takes over the country. But what we demonstrated is that you can use that MODIS time series and machine learning to automatically detect Walmart constructions on the landscape. So there are three different ecoregions: Apple Valley, California; there is one in North Carolina; and one in Maine. And the machines can learn from these temporal patterns. Essentially, you're looking at an EKG, right? The moment you strip all the vegetation and then you put down concrete and asphalt, the patient is dead. The signal just dies from that standpoint. But you can do this automatically. So in 2004, we published a paper at ISPRS and then in PE&RS that showed this possibility. But I think the reality is that, 15 years down the road, we now have an opportunity to really test at scale whether we are ready to do this. So collecting real-time open source data at scale is something that Oak Ridge has been really interested in to support a lot of the things we do, particularly human dynamics. 
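The "dead EKG" idea can be illustrated with a toy change-point check on an NDVI series: construction shows up as a large drop relative to the preceding window that never recovers, unlike ordinary seasonal dips. This is a hedged sketch, not the method from the published paper; the window size, drop threshold, and synthetic series are assumptions for the example.

```python
import numpy as np

def construction_onset(ndvi, window=6, drop=0.25):
    """Return the index of the first composite where mean NDVI falls by
    `drop` versus the preceding window AND never recovers afterwards
    (the flatlined 'EKG'); seasonal dips fail the no-recovery test."""
    ndvi = np.asarray(ndvi, dtype=float)
    for t in range(window, len(ndvi) - window):
        before = ndvi[t - window:t].mean()
        after = ndvi[t:t + window].mean()
        if before - after > drop and ndvi[t:].max() < before - drop / 2:
            return t
    return None

# Synthetic 16-day composites: a seasonal cycle, then clearing and asphalt.
t = np.arange(46)
series = 0.45 + 0.25 * np.sin(2 * np.pi * t / 23)  # ~23 composites per year
series[30:] = 0.12                                 # vegetation stripped; signal dies
print(construction_onset(series))  # 30
```

The no-recovery condition is what separates a parking lot from an ordinary winter: both drop, but only one stays down.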
PlanetSense is a capability that we have stood up to support the national security community, particularly NGA, that collects real-time information from 20 different open source outlets at a pretty astronomical rate. It's collecting about 3,000 records per second, and about 2 terabytes of photographs per day. So it's a huge body of open source data. The goal we have is to create not a data resource that is only useful for forensic analysis, but one that is going to define human dynamics at a global scale, creating a baseline for how places behave in terms of human activities. And here is the reason we were pushed towards doing that. We produce a data set called the population density tables, which covers 55 different facility types. And it has three other columns: daytime, nighttime, and episodic events. And every cell represents people per 1,000 square feet. That is the lifeline for our defense department in terms of casualty estimation. So if there is going to be a strike anywhere, what is the collateral damage? They call it the collateral damage estimate, and those numbers are the ones used. If the estimate is above some threshold, it would never get approved because of the collateral damage. So at one of the meetings, the question was asked: when you say day, what does that actually mean? Is it daybreak? Is it sunrise? And afternoon, what are the bounds of afternoon? These are very semantic descriptions of time. So what we ended up doing is, we basically said that the only way we know we can scale this kind of a model consistently and globally is if we define those boundaries based upon human activities. So evening would be defined by certain human activities that are typical of evenings. We do not expect a lot of population to be at nightclubs at 9 in the morning. That's just the nature of how we name these things. So we started looking at these activities. 
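In the spirit of those tables, a toy lookup might look like the following. Every facility type and number here is invented for illustration; the real population density tables and their values are not reproduced.

```python
# Hypothetical table: people per 1,000 sq ft, by facility type,
# for the three time columns described (daytime, nighttime, episodic).
DENSITY = {
    "office":      {"daytime": 4.0, "nighttime": 0.2, "episodic": 6.0},
    "residential": {"daytime": 1.0, "nighttime": 3.5, "episodic": 3.5},
    "stadium":     {"daytime": 0.1, "nighttime": 0.1, "episodic": 25.0},
}

def occupants(facility, area_sqft, period="daytime"):
    """Estimated number of people in a facility at a given period."""
    return DENSITY[facility][period] * area_sqft / 1000.0

print(occupants("office", 50_000, "daytime"))    # 200.0
print(occupants("office", 50_000, "nighttime"))  # 10.0
```

The day/night gap in a single row is exactly why the semantic question "what does day mean?" matters: the same building can hold twenty times more people in one period than another.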
And then we started defining these evening hours. So evening hours in Doha, Qatar don't even start till about 9:30 PM, when the temperature drops to something tolerable. At 9:30 here, people are starting to go to bed, almost, right, most of the time. Some of the unique things we found came out of characterizing people and their behavior. So this is retail space across three different nations: A, B, and C. And since I'm in a room full of geographers, I do not want to quiz you, but this is clearly Spain, because you can see the siesta, right? And then you can see similar kinds of patterns. But by collecting data over time, you start understanding baselines of what humans do. So this is a new aspect of human geography that's coming out of these kinds of open-source data and open-source data analysis. In the same vein, we have been able to create what we are calling the World Spatiotemporal Analysis and Mapping Project, or WSTAMP. Again, this is an effort that was started with NGA; it's still part of NGA, but it's an open capability that researchers are taking advantage of across the world. May Yuan is running a special issue of IJGIS, and it's a competition on using these data sets to create knowledge. The data set started with the CIA World Factbook as the core, and then we added 30 different sources. So now we have 200-plus geographies, which are nations and continents; 50-plus years; 16 million-plus records; 14,000 attributes. This is the largest socioeconomic data cube that the US government has ever created. And it allows you to run spatiotemporal mapping, so you see trends in your data. It allows you to ask questions when you look at data. And you can create any kind of cluster of countries in the world. I'll give you a simple example. You can say, I'm interested in these 10 African nations for these 15 years, and I want to just look at incidence of malaria and see how these countries have behaved. 
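Defining a period like "evening" from activity rather than clock time can be sketched very simply: take an hourly activity profile for evening-typical venues and keep the hours above some fraction of the daily peak. The counts and the 50% cutoff below are invented for illustration.

```python
def activity_window(hourly_counts, frac=0.5):
    """Hours whose activity is at least `frac` of the daily peak; a
    data-driven bound for a period like 'evening' in a given city."""
    peak = max(hourly_counts)
    return [h for h, c in enumerate(hourly_counts) if c >= frac * peak]

# Toy 24-hour check-in counts at nightlife venues in one city.
counts = [0] * 18 + [5, 20, 40, 50, 45, 15]  # activity ramps up after 18:00
print(activity_window(counts))  # [20, 21, 22]
```

Run the same function on a city like Doha and the window simply slides later, which is the point: the boundary comes from the behavior, not the clock.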
And immediately, with a few clicks, you get the trends and patterns in the data that allow you to ask questions like, why do these countries behave similarly and the others don't, although they are contiguous to each other, right? So it's amazing. We are using the World Bank Open Data Initiative, their data. And the funny thing is, now they are using our analytical API on their own data sets, which is a good success for us. We do take part in operational missions, as I mentioned. At Oak Ridge, we created this capability right after the famous 2003 blackout; you remember it. This is a capability called Eagle Eye, the Environment for Analysis of Geolocated Energy Information, that reports people without electricity. Right now it reports for nearly every county; 90-some percent of the customers are covered. There are small co-ops that provide electricity, and we do not get that data. This is a real-time operational system. It runs 24-7-365 for DOE right now, and it's soon to become an operational system for the rest of the federal mission. And if I can mention funny things that happen: I got a text from my deputy lab director saying there was a blip with Eagle Eye. It was a Tuesday night, and there was no hurricane, no natural hazard, nothing. So I called and said, what is going on here? And they said, we got a message from the White House Situation Room. They have an account, and the president is giving the State of the Union address. Don't ask me what the relation is between the State of the Union address and some people not having electricity. But people were watching, and it was important for them to know that the system doesn't go down. So we learned a lesson: now we pay attention to the State of the Union address, not just the weather. But these kinds of data integration efforts are becoming very important, collecting data from the bottom up and serving to support the federal mission. 
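The bottom-up integration behind a county outage feed can be sketched as a simple aggregation: combine per-utility reports into customers-without-power per county. Utility names, customer counts, and outage numbers below are all invented; this is not Eagle Eye's actual data model.

```python
# Toy per-utility reports: (county, utility, customers served, customers out).
reports = [
    ("Knox",   "UtilityA", 120_000, 3_400),
    ("Knox",   "UtilityB",  45_000,   900),
    ("Blount", "UtilityA",  60_000,     0),
]

def county_outages(reports):
    """Percent of covered customers without electricity, per county."""
    totals = {}
    for county, _utility, customers, out in reports:
        served, dark = totals.get(county, (0, 0))
        totals[county] = (served + customers, dark + out)
    return {c: round(100.0 * dark / served, 2)
            for c, (served, dark) in totals.items()}

print(county_outages(reports))  # {'Knox': 2.61, 'Blount': 0.0}
```

The small co-ops the speaker mentions are simply counties or partial counties with no rows in this feed, which is why coverage tops out in the 90-percent range rather than 100.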
Low-altitude remote sensing was a big part of the Oak Ridge base mapping effort, particularly flying aircraft to collect aerial photographs, and then flying what used to be called remotely piloted vehicles. That is what I found in the documents; nothing called UAS or UAV, because you do not want to send people to detect radiation over the reservation where there was no existence of radiation. So that was the real driver for creating these unmanned or autonomous systems. By the way, I should mention something about that Oak Ridge base mapping effort: I became the base map manager, and the person who was right before me managed that effort, left the lab, went to the State of Tennessee, and started the first state base mapping program in the nation, which significantly influenced the eventual creation of the national map by USGS. So it goes back to the Oak Ridge base mapping effort. One of the cool examples was K-25, which was one of the big buildings and had a leaky roof. For detecting that leakage, pre-dawn thermal images were flown using UAS over that rooftop. And that got successfully commercialized to a company in Florida. This was the late 80s, early 90s, that sort of time frame. Fast forward to today: I have a group of people who work on these UAS technologies. There are two important aspects of their work. One is integrating different kinds of sensors on a single platform. The second is command and control of a swarm of UAS. Right now, what you see is one person with a remote controlling one aircraft. But if you had 10 at the same time, how do they communicate with each other and move data around? It clearly is a significant step towards HD mapping. But I want to give you another example that we are getting drawn into. This group of people gets called frequently to help with search and rescue missions in the Great Smoky Mountains National Park. And there was a hiker called Susan Clements. 
She went missing on September 25 of 2018. She and her 15-year-old daughter started this hike on this yellow path. The goal was to go up to Clingmans Dome, which is the highest point in the Smokies, and then come back. Of course, Susan was slower, and the 15-year-old did not have the patience; she felt that she might miss the sunset from Clingmans Dome. So Susan said, why don't you go up? I will meet you down at the parking lot from which the path goes. Unfortunately, if you are not careful, and if the light is not good, you miss this hairpin bend that actually takes you back. So she missed that, kept continuing, and eventually got lost. These were the two probable paths where she got lost. The daughter came back; Susan didn't show up, so she reported it to the authorities. So this group showed up with the sensors, flew them, and collected a lot of data. Eventually, Susan was found at this location. It was a spectacular failure of technology, partly because, even though they had a thermal sensor, she had hypothermia, so she started taking off her clothes. The gradient between her body and the background diminished very fast. So collecting data really, really fast in environments that look like this is extremely challenging. We cannot quite map every environment that we would like to, but we are making progress in that respect. This is the last thing I'm going to talk about: elevation data, and how we are translating this to the translational part of the science. Knoxville, or Knox County, has a partnership between the city, the county, and the local utility to collect LIDAR, one-meter LIDAR, for the entire county, and they mapped every building. So six or seven years back, we first gave them an estimate of how much energy the city could generate if it decided to put solar panels on all the city-owned buildings. Very obvious. 
But about five years back, we started working on what we call precision deicing, which is calculating solar potential for every street in Knoxville. And it's not really just solar potential, although there is a long-term initiative, because one of the technologies that Oak Ridge is working on for the Department of Energy is wireless charging of cars. As you are driving over the substrate, the cars will charge wirelessly, just like we charge our phones wirelessly. So you could potentially imagine that street surfaces will become energy generators and transmit wirelessly. But for this one, because it was LIDAR, and you could calculate the amount of solar radiation, irradiation, and shadow, along with the slope, we came up with what we call a risk map. The higher the slope and the more the shade, the redder the score; you can see the reds and the greens and the blues. The reason we partnered with the City of Oak Ridge is that the City of Oak Ridge has $180,000 in their budget for deicing every year, which, on average, serves two and a half winter storm events. They have 23 trucks. There are 200 in the greater Washington area. There are two in Atlanta. So it's an order of magnitude. The 23 trucks work on this amazing engineering, which is: when the driver starts moving, the valve opens, and the brine and the salt go on the surface. And if the driver brakes, it stops. So there is no science behind it; why would I want to put the same amount of brine and salt on a stretch of road that gets a lot of sunlight and will melt very easily as on the other ones? And it has tremendous economic impact on the community, because your schools are closed, you cannot run the school buses, productivity loss, you name it. So we created a lab prototype as a national lab, and then a University of Tennessee group of business school students, using less than $1,500 and one of their dads' borrowed trucks, created a field prototype. 
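A risk score of the kind described, combining slope and shade, can be sketched as a simple weighted index. The weights, the 15-degree saturation point, and the 50-150% application band are invented for illustration; the actual model behind the risk map is surely more involved.

```python
def deicing_risk(slope_deg, shade_fraction, w_slope=0.6, w_shade=0.4):
    """Toy 0-1 risk score: steep, shaded segments refreeze first and
    warrant more brine; flat, sunny segments melt on their own."""
    slope_term = min(slope_deg / 15.0, 1.0)  # saturate at 15 degrees
    return w_slope * slope_term + w_shade * shade_fraction

def brine_rate(base_gal_per_mile, risk):
    """Scale the application rate between 50% and 150% of the base rate."""
    return base_gal_per_mile * (0.5 + risk)

flat_sunny = deicing_risk(2.0, 0.1)      # a green segment on the map
steep_shaded = deicing_risk(12.0, 0.9)   # a red segment
print(round(brine_rate(40.0, flat_sunny), 1))    # 24.8
print(round(brine_rate(40.0, steep_shaded), 1))  # 53.6
```

The point of the exercise is exactly this asymmetry: the sunny stretch gets roughly half the brine of the shaded hill instead of the same fixed dose.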
Last week, the city of Knoxville implemented the second truck. And what I'm going to do is show you a video that came out; it's been covered on the Weather Channel nationally, showing that 90% of US counties that get snow or ice break their budget trying to clear it, because last-mile access becomes a huge thing. So how do you stretch that budget? Here is what they created: multiple valves mounted on the tank on the back of the truck. They took an iPad program using that baseline risk data, so as the truck moves over those areas, the valves automatically get controlled, and it treats the street surfaces carefully. So this is huge. The environmental cost is also non-negligible, because the water utilities will tell you that the number one cause of fish cancer is the brine running into the water. So it's tremendous. So that's the fun part. I have to thank all the people that do the work; I come and get to tell you about it. And you will be happy to know that a large section of these people are geographers. We have a very healthy population of geographers at Oak Ridge right now. And we are constantly having interactions between mapping and geography and the federal mission that we are tasked to support. So with that, I will thank you all for your attention. Thank you. I think we might have time for just a couple of burning questions. And other than that, I imagine we could ask Budhendra to stand here for just a few minutes in case somebody wanted to follow up with a question. Anyone? Yes. So that's where the partnership with the University of Tennessee's business school came through. These marketing students are the ones, every semester, making some progress. And there is at least one person saying, I would have to quit my job if I wanted to chase that technology out and make it successful in the market. 
That's one thing we have a really hard time doing at the national lab: the commercialization part, or forming a company. But we are making steps. And really, the city is the one that has to act. You know, my biggest worry was whether they would have a second truck, because they were doing it out of goodwill at the beginning, right? They wanted to try something out. But it's a great sign when you go from one to two; that means they are seeing some value. So I think there is still potential, and there are people interested in it. As part of our DOE mission, we commercialize these technologies. So if somebody is interested in running it out to the market, they will have an opportunity. Thank you.