We have our next speaker up. The next session is Serverless Machine Learning 101 by Tanya from Microsoft. Tanya, can you come up on stream? Hi, Tanya. Can you hear me? I can hear you. Okay, awesome. So hi, and welcome to PyCon India 2020, and we can get started with your talk. For everybody who's new to the Delhi stage, we will have a 25-minute talk by Tanya, followed by a five-minute Q&A session. So please feel free to put your questions on Hopin, and Tanya will be glad to answer them. Does that work, Tanya? Yeah, that's perfect. Thank you very much. Okay. All the best. Thank you. So thank you, everyone, and I am very, very happy to be joining PyCon India. I'm going to give a very brief demo on how you can use Azure Functions with machine learning libraries like TensorFlow, so you can serve and deploy your models in a very seamless way. So let me share my screen, and you should be able to see that. As I've already been introduced, I'm a senior developer advocate at Microsoft, and I specialize in all things machine learning, data science, and scientific computing. Something comes up a lot when I talk about serverless computing, because the name is very, very misleading. A lot of folks get confused and aren't sure whether there really are no servers. But yes, there are servers. The only thing that changes from a more traditional architecture or a more traditional serving paradigm is that you are not responsible for their management or maintenance, meaning you don't have to worry about the server hardware being updated every so often, or the software being updated, because your cloud provider takes care of all of the provisioning. In this case, because we're talking about Microsoft Azure, Microsoft Azure takes care of all of this.
There are still a few things that you have to take care of: your libraries, your code, some security constraints and the security of whatever you're deploying, as well as your environments. But most of the infrastructure load is carried by whoever is provisioning your cloud service. This opens up a lot of opportunities, because instead of spending a lot of time dealing with things like Kubernetes, for example, when you need highly scalable, elastic compute, you have much more time to focus on the code and on what you're developing rather than on infrastructure. This frees up a lot of developer time. Again, I've already mentioned that one of the main characteristics or advantages of serverless computing is that it's managed. But another super important thing that can lead to very significant improvements is that it all operates on a pay-as-you-go model: you only pay for what you use. In more traditional paradigms, where you have local servers or you maintain your own servers, you're paying for electricity, maintenance, and different services whether you're using them, or using them to their max capacity, or not. In serverless computing, you only pay for what you are actually using. If you don't have customers or you are not sending requests to your API, then your function goes idle and you don't have to pay for that compute time. As soon as somebody else or another customer needs to access your function or your service, the Azure function goes from an idle state to an active state, and you can access the compute. Another major advantage is that it's highly scalable. It allows you to go from serving one user to a hundred or a few thousand users very easily. You don't have to worry about, as I said before, things like Kubernetes and all of that.
This is especially important if you only have a small team, for example, or you are in a very small company, or your scalability demands really only relate to one product or one service. Some of you will probably have heard of Azure Functions, which I've mentioned before; Azure Functions is Microsoft's offering for managed serverless. Our offering not only allows you to take advantage of serverless architecture, but, as I mentioned before, Microsoft Azure takes care of the software. It gives you tools to do monitoring in real time. Scaling up and down is enabled by default, and the hardware is maintained, updated, and upgraded by Microsoft Azure. Also, once you've deployed your function — let's say you are generating an HTTP API endpoint to do your machine learning predictions — we take care of the host management as well. Within this serverless approach, you — in this case the developer, data scientist, or machine learning engineer — are responsible, of course, for the application code, for developing the solution or your product, for making sure that everything works okay, and, again, for deciding what services to integrate. Azure Functions has a lot of bindings, so you can directly integrate your Azure Functions with services like Blob Storage, email providers, or databases. You can start integrating it into a lot of workflows, and, especially when I talk about databases, it doesn't necessarily have to be Azure-hosted. You can hook up your Azure function to whatever data warehouse or database you are currently using, whatever that is, whether it's PostgreSQL or something else. So again, these terms can be easily confused.
I've already talked about Azure Functions, about serverless, and about why serverless and Azure Functions, and you're going to find that as I go into the demo, I'm also going to make reference to a function project or to a function itself, and these terms can be quite confusing. So let's focus again: serverless is also called function-as-a-service because it consists mostly of self-contained code snippets or self-contained code applications. By self-contained, I mean that you provide the runtime environment, or the requirements and dependencies — in this case for Python. We bundle up your requirements.txt file, from which we create a virtual environment, along with your Python runtime environment of choice — it can be Python 3.6, 3.7, or 3.8, and we're currently working on support for 3.9. You have your code, your runtime environment, and your dependencies, so that it can all be deployed as a zip straight away, and it will be directly servable on the cloud. And because of this approach of having self-contained code snippets or self-contained scripts, that unit is called a function, and that's why the whole serverless approach is called function-as-a-service: because it's readily deployable. Again, when we talk about serverless, there are a lot of things that are very, very interesting. Apart from all the characteristics I mentioned before — scalability, pay-as-you-go, managed infrastructure — I also mentioned that functions can be idle, and an event can be triggered that will, let's say, wake up your function. And you can link to many, many different events. Through this demo, we're gonna be creating an HTTP API, so what is gonna trigger a response, or the function, is gonna be a POST request to the API endpoint, but it can be something else.
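The self-contained "function" unit described here can be sketched in plain Python. This is a simplified stand-in, not the real azure.functions API: plain dicts replace the HttpRequest/HttpResponse types so the sketch runs anywhere, and the handler itself is hypothetical.

```python
import json

# Sketch of the self-contained unit: one handler plus its declared
# dependencies. Plain dicts stand in for azure.functions HttpRequest /
# HttpResponse so this runs without the SDK installed.
def main(req: dict) -> dict:
    """Hypothetical HTTP-triggered handler: greet the caller."""
    name = req.get("params", {}).get("name", "world")
    return {"status": 200,
            "body": json.dumps({"message": f"Hello, {name}"})}

# Alongside the code, the runtime is pinned by a requirements.txt
# (e.g. tensorflow, pillow, numpy) and a Python version (3.6-3.8);
# the whole bundle is zipped and deployed as one unit.
response = main({"params": {"name": "PyCon"}})
print(response["body"])
```

The point of the sketch is the shape: code plus pinned dependencies form one readily deployable unit.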
You can have a cron-style timer trigger, for example, if you need to do certain data analysis or data processing every day at a certain time, or every week. You can also hook up to changes to your databases or your blob storage, and this gives you the ability to couple a lot of services. We also now have Durable Functions, which allow you to do much more complex paradigms. For example, if you're familiar with tools like Airflow, where you have a DAG that can branch into multiple processes and then fan in again or branch again, you can now integrate those kinds of approaches, and you can couple many, many services in that process using Durable Functions. Also, by definition — and this is something we have to be very, very careful about — serverless by default is meant to be stateless and short-lived. So whenever you are creating a function or something on serverless, you have to remember that whatever assets you create within your runtime environment are ephemeral. Once your function gets shut down, they disappear as well. So if there's anything that you need to persist, you need to hook it up to another service or save it in another way. Also, because all of this is meant to be triggered by events and be very, very efficient, serverless functions are meant to be short-lived. So be very careful, because standard functions have a time span of about 10 to 30 minutes, depending on the plan that you are on — whether it's a Consumption plan or a Premium plan. If you are performing operations that take longer than 30 minutes, let's say, you will need to use Durable Functions so that you can use a longer processing time. And finally, serverless functions are meant to be asynchronous, meaning that you don't have to wait for a response from whatever process you're kicking off for it to be performed.
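The statelessness constraint can be illustrated with a tiny sketch: anything worth keeping must be pushed out of the function before it returns. The dict-backed external_store here is a hypothetical stand-in for a real service such as Blob Storage or a database.

```python
# Hypothetical external store standing in for Blob Storage or a
# database; in a real Azure Function this would be an output binding
# or an SDK client.
external_store = {}

def handle_event(event_id, payload):
    # Everything created inside the function (variables, temp files) is
    # ephemeral: it disappears when the function instance shuts down.
    result = payload.upper()
    # Persist anything that must outlive the invocation to the
    # external service before returning.
    external_store[event_id] = result
    return result

handle_event("evt-1", "cats and dogs")
```

The in-memory `result` would be lost on shutdown; only what was written to the external store survives.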
And because of all of these characteristics, because of all of these advantages, functions-as-a-service are really good for image and video processing. If you are doing some sort of image capture or video capture and you need to use things like TensorFlow, PyTorch, or Keras, functions-as-a-service are very, very convenient, because you can imagine that the function will only be triggered when you add a new image, for example, to your storage, or a new image is captured. Because of this, also, if you are working on things like the Internet of Things, functions-as-a-service are also very suitable, especially because you can have multiple sensors, and you can be recording or collecting information from many inputs at different times; as that information comes in, it can be processed and stored in a seamless way. Data pipelines — data processing pipelines — are also a very, very good case scenario for functions-as-a-service, because of the different paradigms I mentioned before, where you can hook up different services; using Durable Functions, you can now create patterns in which you can fan in, branch in, branch out, do some sort of parallel data processing, and then send results directly to another database or another data source. So this has all been about serverless so far, and if you have worked in machine learning or have tried to put machine learning systems in production, you're gonna find out straight away that machine learning systems can be very complex, both in their setups and in the infrastructure that they need, which also makes putting machine learning in production a non-trivial task. It can be very, very convoluted because of the infrastructure and the setup, but also because of the different libraries, packages, and dependencies that we currently use in the world of machine learning.
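The fan-out / fan-in pipeline pattern mentioned here can be sketched with standard-library concurrency as a stand-in for a Durable Functions orchestration: each chunk is processed independently (fan-out), and the partial results are then aggregated (fan-in).

```python
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk):
    # Fan-out step: each chunk is handled independently, so these calls
    # could run as parallel activity functions in an orchestration.
    return sum(chunk)

def pipeline(data, n_chunks=4):
    size = max(1, len(data) // n_chunks)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor() as pool:
        partials = list(pool.map(process_chunk, chunks))
    # Fan-in step: aggregate the partial results into one output.
    return sum(partials)

print(pipeline(list(range(100))))  # same answer as summing directly
```

In a real Durable Functions orchestration the orchestrator function plays the role of `pipeline`, and each `process_chunk` call becomes an activity function; this sketch only shows the control flow.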
And again, unless you're doing very, very large-scale machine learning, where your product is embedded in another product that is constantly used by millions of people, your resource usage is gonna be very variable, meaning that you're gonna have periods of time in which you'll have a lot of requests, for example, and it's gonna be a heavy load on compute time, and at other times you're gonna have long periods of idle compute time. So these machine learning systems are very suitable for pay-as-you-go, to make sure that you're only paying for what you're using in terms of your infrastructure and your compute. So I'm gonna move on to a live demo, and hopefully everything will run smoothly, because demos don't always do so. We're gonna be using TensorFlow. I'm not gonna be doing the retraining now, but I've used transfer learning to retrain an Inception v3 model to classify cats and dogs. It is a very simple machine learning — well, deep learning — example, but I want to focus on how we translate this into the serverless paradigm. And for that, I'm gonna be creating an HTTP-triggered Azure Function locally. I'm gonna show you how to do that using things like VS Code and the Azure Functions extension. Then I'm gonna integrate this TensorFlow model that I retrained using transfer learning to create predictions behind the HTTP API endpoint. And finally, I'm gonna show you how to use the Azure Functions extension again to deploy the function to Azure. As for requirements, if you wanted to do something similar — and I'm gonna share these slides as well as a GitHub repository and some other resources later on to follow up — you would need Python, the VS Code Python extension, and the Azure Functions VS Code extension. So, here we go. I'm gonna start by creating — fantastic — a simple directory called ml-functions. Oh, there it goes.
And I'm gonna open VS Code here, because that's what we're gonna be using for the whole demo. You're gonna notice that my VS Code is very tuned to my liking, and that is one of the things I like most about it, because you can make it your own. You're gonna see here on the right-hand side that I already have a few Azure extensions installed that I use very frequently, but in this case, I'm gonna be using Azure Functions. I've already logged in — we're gonna ignore that, because the extension is gonna take care of all of that for me. I am logged in with my Azure account by default; if you've logged in before, it's gonna persist that, so I didn't have to log in again. Now what I'm gonna do here is create an Azure Functions project in there. So I'm gonna use the folder that I have, use Python, and I'm gonna be using Python 3.7. I'm just waiting for this to load the templates; by default, this extension should give you a few templates to pull from. Again, as expected, it's not working. It's taking a bit longer. So let me see what the debugging says — problems. Let me see. Of course, because it is a live demo, it doesn't work, and I don't know why. Okay, so I'm gonna skip this for now, and I'm gonna be using the Azure Functions CLI; I already have that installed. For some reason, I think it is my VS Code that is dying. So I'm gonna try again, and if it doesn't work — oh, there you go. So I'm gonna create a function called classify. I'm gonna use anonymous authorization; that doesn't matter too much right now, because this is for development purposes. And you're gonna see straight away that it creates a virtual environment. You're gonna see a big bunch of files here, and probably the most important one, if I close this, is the host.json, which indicates the version of Azure Functions that we are using. We also have a local.settings.json file, in case you are binding to your databases and other things.
And by default, it also comes with a requirements file. It's gonna create the virtual environment here. If I go to the terminal, you should be able to see the logs in the output. Once that is completed — there you go. And you can see that— Hello everyone. I think we are facing some technical problems, and it seems that Tanya has disconnected; there might be some technical issues on her side. Hi, Shashankya. By the time Tanya is back, I was just going through a few comments. There are no questions as of yet, but there is one comment which says the speaker's headset is very cool, so probably we can tell that to Tanya. Sure. And feel free to join the Delhi stage on Zulip. All the speakers that you see today will be present on the Delhi stage on Zulip, so you can ask any questions you have regarding the talk, and feel free to connect. Just to add one more thing: let's wait for a couple of minutes to see if we can reach out to her. Well, let's go through something that you can do during the networking session that we have. So there is a stream — yeah, the networking session, right? Shashankya, I think she's up. We can just add her to the stream. Hi, Tanya, are you up? Yes, StreamYard kicked me out somehow. Oh, okay. Anyway, we brought you back, and let's... And it wouldn't let me in again until now. So, okay, fine. Okay, you can continue here. Sure. So sorry about that; I don't know what happened. Anyway, I'm gonna go super, super quick. I don't want to run into the other session, but you're gonna see by default that there are lots of new files that were created in VS Code. The most important one — I've already mentioned it — is the dunder init, the __init__.py file, where we are actually gonna be defining the methods and defining our function so that we can use them. I also went ahead, while StreamYard was playing with me, and imported my labels, my pre-trained model, and my predict script. So now what is left is to modify this.
So I'm gonna add JSON, and I'm also gonna import my helper script. And you'll notice that the main function takes an argument of type HttpRequest — sorry, not HttpResponse — because that's the trigger that we created. Because we want to do a prediction on an image, I'm gonna get the parameters from the request. When we do a POST request to the API endpoint, we're gonna pass a URL from wherever we're calling it — we can use the command line — and I'm just gonna print out what image we got. Now I'm gonna remove all of this because I don't need it; this is quite simple. I'm gonna create a new variable called results, where I'm gonna call my method, and because this is all going to be an HTTP response, I'm gonna send some headers. If you're familiar with requests, you're gonna recognize what I am doing here: setting the response type to JSON, and adding an Access-Control-Allow-Origin header, because I also want to add a front end, and if I don't add that, I can't hook up my front end to the Azure function. So I have that: I have imported the packages that I need and created my main function, and if everything is successful — I'm not adding any try/except at the moment — it's gonna return the response. And just to have it here: this is the helper function that I have. It's already using the model and the labels; in this case, I am predicting or classifying dogs and cats. This is a lot of extra code — a lot of manipulation so that we can use the images — and all of this is gonna be in the repository, so you're gonna be able to check it. So now, if I go back to the Azure function — it seems like this is still not very happy — I can use F5, and F5 is gonna — sorry, debug mode. I'm gonna actually run it here in debug mode so I can start up my Azure function. You're gonna see that it's installing a lot of requirements; in this case, it's TensorFlow, Pillow, NumPy, and all of that.
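The handler just described can be sketched end to end in plain Python. This is a simplified stand-in, not the real azure.functions API: plain dicts replace HttpRequest/HttpResponse so it runs without the SDK, and predict_image_from_url is a hypothetical placeholder for the TensorFlow predict helper.

```python
import json

def predict_image_from_url(url):
    # Placeholder for the Inception v3 transfer-learning classifier; a
    # real helper would download the image, preprocess it, and run the
    # model on it.
    return {"label": "dog" if "dog" in url else "cat", "url": url}

HEADERS = {
    "Content-Type": "application/json",
    # Without this header, a separately hosted front end cannot call
    # the endpoint from the browser.
    "Access-Control-Allow-Origin": "*",
}

def main(req):
    # Pull the image URL out of the request parameters, as in the demo.
    img_url = req.get("params", {}).get("img")
    if not img_url:
        return {"status": 400, "headers": HEADERS,
                "body": json.dumps({"error": "pass an 'img' URL parameter"})}
    results = predict_image_from_url(img_url)
    return {"status": 200, "headers": HEADERS, "body": json.dumps(results)}

response = main({"params": {"img": "https://example.com/dog.jpg"}})
print(response["body"])
```

The shape mirrors the demo: parse the request, call the predict helper, and return a JSON response with the CORS header attached.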
Ooh. Let me — I don't know why my function is not working. Let me do F5. For some reason... Let me see if I can find why it's not allowing me to; I should have this installed, unless something horrible happened. I need to try again, because I was running this demo this morning, and it seems that it definitely doesn't want to work. So let me see if I can run it from here, and I'm gonna send an image — I'm gonna get just a random dog image here — and it just doesn't want to run locally, which is a bit annoying. Well, this is not working, and it should work. So what I'm gonna try to do is deploy directly, and see if this works, using 3.7, and I am gonna deploy to West Europe. Something must have gone really wrong with uninstalling stuff recently. And you're gonna see that one of the nice things is that I can just deploy directly from the Azure Functions extension in VS Code, and if I go to my dashboard on Azure, you're gonna see that it's already creating all of the resources that I need. You can see the logs out here, and now it's creating the application — what did I do? I created a function app. It is still being deployed, and you can see here that it is using Python 3.7, and it's doing a lot of — well, it's zipping everything so that it can deploy all of this. It's collecting and downloading the different packages. I should be able to see my function app here. Let me maximize this a bit, and I can go and look directly. By default, it gives you a URL for your function app, and you can see the different activity logs; because we've not yet created any requests or anything, there are no events up there. Let me go back. This might take a while, because it's downloading again and creating the virtual environment here, and I am gonna wrap up here, because I've had so many technical issues this morning — I don't know why — but I'm gonna be in the booth session for the rest of the day.
I don't want to take more time from any other sessions, and I'm very, very sorry for all of these issues. It sometimes just happens. No problem, Tanya, thanks a lot, and yes, it's completely fine; we assume that these things can happen, and we can't do anything about it. We request all the attendees who are watching this session right now to go on Zulip, go to the Delhi stage stream — all the speakers from the Delhi stage will be present there, so you can have a conversation, ask any questions regarding the talk that you want to, and the speakers can also share the resources they were using, the presentations, the code base, or whatever they want to share with the attendees on the Zulip chat. So feel free to go there, and thanks a lot, Tanya. Any closing words? Thank you, and again, this is what happens when you try to do live demos. It's okay. Thank you. Thanks a lot, Tanya, thank you.