Hi, everyone. Excuse my voice; it's been great running the conference for two days. This is my journey of how I built a satellite water monitoring tool for Chennai city. When I did a Google image search for "Chennai water", this is what I found. It's good in one way; it's bad in another. A quick fact: Chennai needs 830 million liters of water a day, and most of it is provided by Metro Water in various ways, like groundwater, lakes, reservoirs, and desalination plants. Now, the motivation for my work. I was working with the MIT Media Lab's City Science group on an urban planning tool called CityScope, and we had been using tools like GeoPandas, QGIS, GDAL, projections, and OSM. I'm also part of a Facebook group called Tech for Cities, and in early September of this year I found a web app there where Chennai Corporation had listed 290 water bodies. These are major water bodies, and any organization or institution can adopt one and then maintain it. Fast-forward two weeks, and I found a paper from Sama Technologies in Chennai where they were trying to find parking spaces using satellite images. They basically used DL models, and most of the tools I used in CityScope projects were used there too. So the idea is this: I have the locations of all the water points, and I have an approach for how to find something on a base image. For people who don't know what the third image is: early this year, Chennai city was running out of water, and we had to bring in water by train from a nearby place called Jolarpet. All of this combined is what I was building, which I call the Chennai Water Monitor. So, step one: I tried to get all the required GeoJSONs, which are nothing but the points and polygons of all 290 water bodies, by scraping the Tech for Cities web app and OpenStreetMap, and I did some manipulation to fit my use case.
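Step one above could be sketched roughly like the snippet below, using only the standard library. The GeoJSON string, the `name` property, and the single feature are made-up stand-ins for the scraped web-app data, not its actual schema:

```python
import json

# Toy FeatureCollection standing in for the scraped water-body data.
# Real data would have 290 features with polygons as well as points.
geojson = """{
  "type": "FeatureCollection",
  "features": [
    {"type": "Feature",
     "properties": {"name": "Chembarambakkam Lake"},
     "geometry": {"type": "Point", "coordinates": [80.05, 13.01]}}
  ]
}"""

collection = json.loads(geojson)

# Map each water body's name to its (longitude, latitude) point.
spots = {f["properties"]["name"]: tuple(f["geometry"]["coordinates"])
         for f in collection["features"]}
# spots -> {"Chembarambakkam Lake": (80.05, 13.01)}
```

In practice a library like GeoPandas (`gpd.read_file(...)`) would load these files directly into a GeoDataFrame for the kind of manipulation mentioned above.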
Then I looked at a lot of the free satellite imagery constellations, like Sentinel and the other big ones. Step two is the approach I took to build this: I tried to replicate that team's paper. USGS is nothing but a portal where you can get open-source satellite images, and these are almost real time; they take a snapshot once every month. Then I parse the GeoJSONs, which is nothing but saying what my spots of interest are, do annotation and augmentation, and then try to apply it. There's always this: when you try to code up a research paper, you always fail first. So that's what happened. USGS imagery has a spatial resolution of 30 meters per pixel, whereas you need something like 1 to 3 meters per pixel to do anything with DL. That's when I found eo-learn, an Earth observation learning module in Python from Sentinel Hub. It's used to process image sequences acquired by satellite fleets over a time series, and it's used to build workflows: you have hundreds of satellite images of the same area over time, and the same workflow runs over all of them. So it's very good to build on, and it connects easily to Copernicus and Landsat, which are the best open-source satellite imageries you can get. Going further, I had two options: I could either do image processing, or I could use the spectral bands that are available in satellite imagery. Of course, image processing I couldn't do, because of the spatial resolution and also because there is a lot of noise. So I stuck with the Normalized Difference Water Index (NDWI), which is based on this: a satellite image is nothing but different bands of colored images taken with sensors, including near-infrared and shortwave-infrared inputs. So I built a workflow like this on Sentinel-2 satellite images.
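As a rough illustration of the band-based approach, here is a minimal NDWI computation in NumPy. This sketch uses the McFeeters formulation (green and near-infrared bands); the toy reflectance values are invented for illustration, and real pipelines would read the bands out of an eo-learn EOPatch instead:

```python
import numpy as np

def ndwi(green: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """McFeeters NDWI = (Green - NIR) / (Green + NIR).
    Water reflects green light but absorbs near-infrared, so
    values above ~0 typically indicate open water."""
    green = green.astype(float)
    nir = nir.astype(float)
    return (green - nir) / (green + nir + 1e-10)  # epsilon avoids /0

# Toy 2x2 reflectance patch: top row behaves like water (low NIR),
# bottom row like vegetation/land (high NIR).
green = np.array([[0.30, 0.30],
                  [0.10, 0.10]])
nir = np.array([[0.05, 0.05],
                [0.40, 0.40]])

mask = ndwi(green, nir) > 0  # boolean water mask
# mask -> [[True, True], [False, False]]
```

Thresholding at 0 is the common starting point; tuning the threshold per scene is often needed in practice.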
Then I took the data set from 2010 to 2019, found water through NDWI, and removed EOPatches that had more than 30% cloud cover; an EOPatch is nothing but a container that stores a single satellite image. I don't need those cloudy ones, because I can't get any inference from them. And this is a sample output I got: the red one indicates my area of interest, and the blue one indicates the water available in each of them. Then I played around. I wanted to add weather data and see how it correlates with the water available in a single place, and then I wrapped it all in a web app. So this is the final POC I made, where you can select any of the 290 water spots. It's a D3 graph: the black line is the water level, and the blue one is the rainfall in that particular region over time. What you have on the right is an image over that time, so if I select 12, it shows the image from that time. So what do I want to do with this? Monitor the water bodies that are being maintained; add more data points like population and pollution and see how they correlate; and finally, as a computer scientist, write an ML model to see whether there will be another Chembarambakkam happening or not. So, this is me. I'm an urban mobility enthusiast, and I build products around it. I co-founded Flita, an enterprise mobility company; we deal with global relocations. And I'm a proud alum of Hindustan University. Thank you.
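The cloud filter in the workflow above could be sketched like this. The function, the 30% threshold, and the boolean-mask representation of each acquisition are my own simplification for illustration, not eo-learn's actual API:

```python
import numpy as np

def filter_and_measure(patches, max_cloud=0.30):
    """patches: list of (cloud_mask, water_mask) boolean arrays,
    a stand-in for per-acquisition EOPatch layers.
    Drops acquisitions with more than max_cloud cloud cover and
    returns the water fraction of the AOI for the rest."""
    water_fractions = []
    for cloud_mask, water_mask in patches:
        if cloud_mask.mean() > max_cloud:  # too cloudy: no inference
            continue
        water_fractions.append(float(water_mask.mean()))
    return water_fractions

# Toy example: two acquisitions over the same 2x2 AOI;
# the second one is 50% cloud-covered and gets dropped.
clear = (np.zeros((2, 2), dtype=bool),
         np.array([[True, True], [False, False]]))
cloudy = (np.array([[True, True], [False, False]]),
          np.ones((2, 2), dtype=bool))

fractions = filter_and_measure([clear, cloudy])
# fractions -> [0.5]  (only the clear acquisition survives)
```

The resulting per-date water fractions are exactly the kind of time series the D3 graph in the POC plots against rainfall.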