So our next talk has a very different topic. We all know the problem: we go to a grocery store and have to queue in line, and we wait a long time. And Mickey, Mickey Lombardi, has a solution for that. Hi, Mickey. Hi, everyone. How's it going? How are you? I'm fine, thank you, Adil. I'm fine too. I'm streaming from Switzerland here. And where are you streaming from? I'm from Florence, Italy. OK, so not that far. Yeah, not that far. OK, so please start your talk about the map of supermarket wait times. Thank you. There we go. Well, I had hoped to give this talk physically in front of an audience, but unfortunately we are still in this situation, so I'm looking at a screen in front of me. I hope to see you in person next time and to meet you all. I hope to have time to answer your questions; otherwise you will find me in the chat room, and I'll reply as best I can. In this talk I'm going to speak about my 2020 project, an open source project that tried to help people during the COVID-19 lockdown that affected everyone. So let's start. Please let me introduce myself first. I am Mickey, from Florence, Italy, as you may guess from my accent. I'm a full-stack engineer at a growing digital company in Italy, and I'm also a co-founder of the Schrodinger Hub podcast and its GitHub organization, where we have fun building open source things in Python and other languages. Of course I'm in love with Python, but, being Italian, I'm especially in love with pizza. I won't post every social link, but you can find me everywhere as "the join". And if you like, you can visit my website; the link is michelombarti.com. Before we start, let's have a look at what we are going to see in this talk. I divided the talk into four parts. The first one is an intro, just to give you the context of everything.
Then we'll go deeper: I'll show you how the architecture works and how the cloud setup was built. Then I'll show you some of the APIs, and especially Redis, which definitely saved my life in this project. And at the end we'll look at the economic side and the conclusions of this project. I know we're just programmers, we don't like to think about money, but for this project money is something we need to talk about, because there is a plot twist. So let's start. As you may know, in 2020 things just fell apart. I love these GIFs; I found them on GIPHY. Wear a mask and avoid the bug: that has been the mantra of this last year and a half. When the pandemic hit, everyone was panicking. No one knew what to do, no one had masks, and so on. It was a very big deal. Italy was the first European country to have a national lockdown, and we were all panicking. Everyone started taking everything from the supermarkets, the grocery stores, the pharmacies, literally everything. And when you went to those places, you could look at the line and just wonder: how much time do I need just to get in, and to find something, if there is anything left to take? I think this issue happened everywhere during the pandemic. I talk about Italy because I live here, but reading the newspapers and watching videos from around the world, back in April 2020 we were all in the same boat. So the issue was that, immediately after the announcement of the national lockdown, every supermarket and pharmacy was swamped by people in search of basic goods. Just to give you a little context about this project: it wasn't for profit, nor for commercial use, but just to try to help people stay safe and, unfortunately, stay away from each other.
Because the longer you stay out, in contact with other people, especially in closed and unventilated environments, the more likely you are to contract the virus. So that was my idea for this project. I'll leave you the link; you can find it in the slides that I'll leave on the EuroPython website. Of course, the project is now offline, but you can find the open source code, both the front-end side and the back-end side. So what was the idea? It started from the Google traffic APIs, because I was thinking: people are like cars, and inside each car there is, physically, a person. So my idea was to also collect data from the Google Places API to retrieve the average time a person stays in a place. Say I go to the supermarket and, according to Google, I stay there for 20 minutes on average. I was thinking: that's a good metric, but on its own it's not that much data. So my thought was: what if I build a map with those APIs, plus the other APIs we'll look at later, and give the end user the possibility to leave feedback, a temporal feedback, about the store where they are? And with that feedback, try to estimate how long you need to wait to enter those places. This is called crowdsourcing; it's a pattern you now find in many modern applications. By mixing the data from the Google APIs with the user feedback, I managed to estimate the waiting time almost correctly, at least for my local markets, and later around the world. So you may wonder: what was the map like? It was a full-screen map built with Leaflet, with a modal that gave the user a getting-started guide: what the icons mean when you click them, how the website and the progressive web app work. There was also a frequently-asked-questions section, to give everyone the possibility to navigate the website and find the data they wanted.
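As a sketch of the idea, here is one way the Google average dwell time could be blended with crowdsourced feedback. The talk doesn't show the actual formula the project used, so the prior weight and the blending rule below are purely my own illustrative assumptions:

```python
def estimate_wait_minutes(google_avg, feedback_minutes, prior_weight=3):
    """Blend Google's average dwell time with user-reported wait times.

    google_avg: average minutes Google reports for the place.
    feedback_minutes: recent crowdsourced wait reports, in minutes.
    prior_weight: how many "virtual reports" the Google figure counts as
                  (an illustrative assumption, not the project's real value).
    """
    total = prior_weight * google_avg + sum(feedback_minutes)
    count = prior_weight + len(feedback_minutes)
    return total / count
```

With no feedback, the estimate is just Google's figure; as user reports accumulate, they gradually dominate the prior, which is the point of the crowdsourcing feature.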
It turns out that this website, this application, went around the globe. This is a graph; the exact numbers don't matter that much, but from late April to mid June, while the website was alive for the emergency, it collected those numbers from all around the world. I'm so thankful to those users, and I hope I gave a little help to those people. It started in April, just for me, my family, and my friends in Florence, and then it went kind of viral, I can say, first around Europe and then worldwide. So let's dive into the architecture and the cloud environment. Before that, let me describe the stack I used. Of course, Python; the version I used was 3.7, but that's not very important, because you can run the project on any Python later than 3.4. Flask for the API layer, uWSGI and Nginx for the web server, Netlify for the front end, to put the Jamstack site in the cloud behind a CDN, Redis (a big thanks to Redis), and a Progressive Web App following Google's guidelines. Let's go back to April 2020, the beginning of this journey. I was using a Raspberry Pi to put everything online, literally everything on this one Raspberry Pi. I love Raspberry Pis, by the way. I was running the front end, the Redis server, Nginx, uWSGI, the Flask API, literally everything. For testing purposes and beta testing it was very good, and quite fast for me. And I mean, the beta testers were just friends, my mom, friends of friends, my grandma, the local priest, and so on. After the first two weeks of the project's life, it started to grow and went a little viral, without me actually doing anything. And I needed a way to go worldwide, to give every user the possibility to find the places near them, or to find a place by searching.
So I went worldwide, and I had to decide whether to go to AWS for the cloud environment and spend all my money on that, or to try a different way. I chose OVHcloud and also DigitalOcean; you probably know both of them. In each case I chose a private server with roughly the same capacity and performance: 2 gigabytes of RAM and 30 gigabytes of disk. Why did I decide to go to the cloud? Well, because after those first two weeks, at the start of April, I had more than 100 concurrent users and more than 20,000 page views per day. The Raspberry Pi 4, I mean, I love Raspberry Pis a lot, but it was dying; I needed to do something. Also, going to the cloud gave me much better performance, horizontal scaling, availability, and a way to plan for the future. Because, as you may have seen from the Google Analytics graph, around May, when literally the whole world went into lockdown, even more users came to visit the app. For the future that meant around 1,600 concurrent users and almost 90,000 page views per day from all over the world. The Raspberry Pi couldn't handle that, unfortunately. So let's also have a look at a schema of how those instances were managed. As you can see from the schema on the right, I centralized Redis, simply for my own convenience, so that all the data lived in just one Redis rather than one per instance. But I had four API instances, each with Nginx, uWSGI, and the Flask API layer. And of course, when I went worldwide, I bought the instances in different DigitalOcean data centers, one in New York, one in the UK, and one in the San Francisco Bay Area, to try to serve those users with low latency. It's just a small optimization, just to try to care about the details. Maybe I've said too many times that Redis saved my life on this project, but it's true, and now you'll see why. Just to give you a recap of the endpoints: on the right, you can see all the endpoints that I used.
For geocoding, based on the HERE and Google APIs (HERE is an alternative to Google Maps). Then the most important endpoint, to get the estimated waiting time for a place, and the places endpoints, to retrieve data about a place, such as the name, the location, the coordinates, the geocoding, everything about the place. And of course, the feedback endpoint for the crowdsourcing feature, which contributed a lot to making this project useful for the users. And in the end, just a little endpoint to log activities, such as tracing errors or info messages, purely for backend purposes. Of course, every piece of user data was anonymous, compliant with the GDPR, so no worries there: I don't know who was on the website or where. On the API side, thanks to uWSGI, everything was multithreaded, which is a good performance improvement, and it works well with Flask. Now let's talk about Redis. I tried a lot of different solutions to make things work with good performance; we were talking about 30-second timeouts most of the time. The good solution came with Redis. Redis saved my life with its geospatial index and geospatial commands, such as GEOADD and GEORADIUS, for search purposes, and by also acting as a caching layer. On the right side I put a little snippet of code; it's the core of the project. It basically finds the waiting times in Redis starting from a pair of coordinates coming from the user, so either the user's own place or the places nearby. And Redis is very, very fast: this query took, I don't know, maybe less than a second to retrieve the results, and it can return up to 150 results, which is a lot of markers to put on a map. 150 markers on the map are quite a lot, trust me.
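The actual project stored places in Redis and queried them with GEOADD and GEORADIUS. To show the shape of that nearby lookup without needing a running Redis server, here is a pure-Python sketch of the same idea, using hypothetical place data; the Redis equivalents are noted in the comments:

```python
import math

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# Hypothetical place data: name -> (lat, lon). In Redis this would be
# loaded with: GEOADD places <lon> <lat> <name>
PLACES = {
    "supermarket-a": (43.7696, 11.2558),  # central Florence
    "pharmacy-b": (43.7750, 11.2480),
    "supermarket-c": (44.4949, 11.3426),  # Bologna, ~80 km away
}

def nearby(lat, lon, radius_km, limit=150):
    """Rough equivalent of:
    GEORADIUS places <lon> <lat> <radius> km WITHDIST ASC COUNT 150
    Returns (name, distance_km) pairs, closest first."""
    hits = [(name, haversine_km(lat, lon, plat, plon))
            for name, (plat, plon) in PLACES.items()
            if haversine_km(lat, lon, plat, plon) <= radius_km]
    hits.sort(key=lambda h: h[1])  # closest first, like GEORADIUS ASC
    return hits[:limit]
```

The 150-result cap mirrors the marker limit mentioned in the talk; in Redis the COUNT option does the same job server-side, which is part of why those queries came back in well under a second.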
So for every request the user made to the back end, to the API layer, the response was served from the Redis database and cached in Redis with a TTL, just to give the user the best possible performance. Because with that amount of data, I had never seen so many users on a single progressive web app or website that I had made, so I was worried: what if I don't return the correct result, or the requests start to time out? Oh my god. This solution turned out to be the right way to deal with that amount of data and that amount of traffic, from my side. Let's also have a look at the costs, because I've talked about all this data, the performance issues, and the traffic. I know money is sometimes not a real topic for a programmer, but it's something we should care about, to have the knowledge to build this kind of project. And I hope this can be a sort of inspiration for you, for new developers, and especially for the open source community. So let's look at this entire journey from a Raspberry Pi to a cloud environment. Apart from already owning a couple of Raspberry Pis, the cost of one in 2020 was around $30, because that was before the Raspberry Pi 4 came out, so I can say the hardware cost was around $30. As for the domain, I said this wasn't for commercial use or any kind of money-making purpose, so I reused one of my domains and just created a subdomain on it, because it was just for personal use at the beginning, and who knew it would go viral and worldwide. That was the prerequisite. After that, for the cloud part, I spent about $25 per month for the four instances on DigitalOcean and OVH, so the total is around $75 for three months of usage. It's quite a lot, if you think about it, but for that data and for the goal of this project, it seemed good and fine to me.
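The caching pattern described above (compute once, store with a TTL, serve from cache until it expires) can be sketched in plain Python. This in-memory version, with an injectable clock, only illustrates the pattern; the real project used Redis, where the same idea is a SETEX (set with expiry) followed by a GET:

```python
import time

class TTLCache:
    """Minimal in-memory sketch of a Redis-style TTL cache (SETEX/GET)."""

    def __init__(self, ttl_seconds, clock=time.monotonic):
        self.ttl = ttl_seconds
        self.clock = clock          # injectable, so the cache is testable
        self._store = {}            # key -> (expires_at, value)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        expires_at, value = entry
        if self.clock() >= expires_at:   # expired, like a lapsed Redis TTL
            del self._store[key]
            return None
        return value

    def set(self, key, value):
        # Equivalent of: SETEX key <ttl> value
        self._store[key] = (self.clock() + self.ttl, value)

def cached_wait_time(cache, place_id, compute):
    """Serve the wait time from cache; recompute only on a miss."""
    value = cache.get(place_id)
    if value is None:
        value = compute(place_id)
        cache.set(place_id, value)
    return value
```

Under a traffic spike like the one in the talk, the expensive estimation runs at most once per place per TTL window; every other request within that window is a cheap cache hit, which is what kept the timeouts away.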
The purpose was worth more than the money. Also, a little note about this: the traffic was quite sustainable for this project, even with so many users on the page, maybe thanks to the caching layer. And in any case, on both DigitalOcean and OVH, inbound and outbound traffic was free up to 10 gigabytes, which was a big advantage for me while trying to make everything better and better every day. In the end, having the front-end side, which I haven't talked about in this talk, on a CDN in the cloud allowed me to serve more than 1 million users and more than 2.1 million sessions in three months of life, with more than 1,500 concurrent users on average. Those are very big numbers for me. But numbers are just numbers; the purpose of the project was worth more than money and numbers. I'm very happy to have had the chance to help people from all around the world, and people from the open source community helped me build this. I would like to thank one of them, Daniel Defoe from Canada, for helping me at the beginning of this journey with the accessibility of the website and the design of the first modal that you saw in the previous slides. I would also like to thank the open source community, which has given me almost everything, I can say, and I'm very happy to give back as much as I can. And of course, again, I'm really pleased to have had the chance to help people all over the world. I hope that with this talk I can, in my small way, inspire someone else to do the same: share the code, share the knowledge, help the next one. And yeah, of course, the code is still available on GitHub, at the link shown in the earlier slides. Thank you for listening; if there are any questions, thank you. Yeah. Thank you very much. Very interesting talk, very interesting project. I hate queues too, of course, so it's a perfect solution. I see people are typing questions.
There is already a first question about geocoding: why didn't you use geocoding from OpenStreetMap? I think you used Google, and what was the second? HERE. HERE, yeah, exactly. Yeah, OpenStreetMap had some issues geocoding the same places compared to Google and HERE, so I decided to play it safe with the Google and HERE APIs, just for data consistency for the end user. That's why. OK, at the moment there are no more questions coming, but there are comments like "super inspiring" and "very inspiring talk". Thanks. Oh, thank you, guys. So I think there are no more questions at the moment. Thank you very much again. Thank you too. Have a great day. Have a great day.