Okay, welcome everybody. We're glad that you came to PyOhio. We have Eric Appelt, who's going to be talking about concurrency and coroutines, so if you didn't know all about it before, he'll tell you everything you need to know.

Okay, thank you, and thank you to PyOhio for giving me the opportunity to talk about concurrency and coroutines. I'd also like to thank my mentors, Jacob and Sophia, who have been teaching me an intensive course in concurrency for the last two and a half years. They can be very tough instructors, but they're loving and fair, so it's good.

So let me start with the setup for this tutorial, an outline of how this should go, what we're doing, and the expectations, and then we'll do some warm-up exercises. The one thing you need, basically: coroutines are a new feature but a core feature of the language, so all you really need is Python itself and a terminal to type in. But it needs to be at least Python 3.5, because any earlier version of Python does not have coroutines and the syntax that we'll be using. There are two third-party libraries that you can just pip install. One is requests, which I think is pretty commonly used; the other is aiohttp. If you're unable to install those, you can still follow along with approximately 60% of what we'll do; there'll be a little bit where you'll need to connect to the web. There's a repository for this tutorial on my GitHub account (that's spelled A-P-P-E-L-T-E-L) called coroutine-tutorial, and in the examples directory all the code that we'll type out together is there. You don't have to check it out, but you can if you want to save yourself some typing, and I've printed it out, so I'll be using it as a cheat sheet. Any questions so far? If I'm going too fast, if I talk too fast, if I type too fast, please feel free to let me know, and also if
I'm failing to speak into the microphone. So here's the plan. We'll do a little warm-up with something I call the animals API, just to frame the example we'll use to show how you can really use concurrency, and then I have a metaphor for concurrency; that part will be more like a mini talk. I wanted to make it interactive, where everyone would cook something, but I don't think the fire marshal would have gone for that, and it would be expensive. Then we'll actually do some running of coroutines by hand. That's not something you normally do with Python, since you usually have a scheduler to run them for you, but in order to understand the syntax and what Python actually does, I think it's helpful to just run a couple by hand. Then we'll go through the scheduler that comes with Python, which is called asyncio, and then we'll use the aiohttp library to write a client and a concurrent web server, and if there's time, do some cool stuff with streaming web connections and publish and subscribe.

The other thing is, I meant to bring a See 'n Say. I don't know if everyone's familiar with a See 'n Say: it's a piece of hardware, you pull the cord, the arrow goes around, and it tells you what the animal says. For example, if it lands on the cow, it will tell you that the cow says moo. So if you're a business and you need to know what the animals say, this is an important piece of hardware. But being a piece of physical hardware, you can run into problems. For example, I left mine in the hotel, so I don't have it here to show you. And if you own one for your business, you need an operator to make sure it has batteries and to pull the cord for you. So I've offered an enterprise cloud solution to determine what the animals say, and if you compare this to the on-premise solution, with my solution there's no capital expenditure requirement, and we have capacity billing, so you
know, you pay as you go, and there's a very easy to use RESTful web service API. Let's just do something as an example, so let's get out the terminal and just run Python. For this we do need the requests library. Does everyone have requests? Lots of nodding. So I'm going to import it, and then let's just type in my name, animals, and then cow, and this should return to us... it takes a while, the cord has to be pulled and has to go around. Okay, so I messed up, I forgot to assign the response, so I'm going to have to do it again. Ah, good call: in the Python REPL you can access the previous return value with the underscore variable. So now we have the response, and what I want is the text of that response, which is moo. There you go. You can try this with your own animals, but yeah, the cow generally says moo.

So let's make an example application. Given a list of animals, we'll connect to the animals API to get the animal sound for each animal, and we'll say "the X says Y", where X is the animal and Y is the sound. What I want to do is structure it so there's a subroutine, a function that retrieves and prints a given animal, and we'll just call that function over and over for each animal. So I'm going to make a module called animals.py. The first thing we need to do is import requests, and to make my life easier, I'm going to make a constant string for the base URL. Now I'll make a function which will fetch what an animal says and return it. I'll call this function speak; it will take as an argument the name of an animal, and I'm going to give it a requests session object. What I need to do is just get the animal, so I'll do a format string to construct a URL from the base URL and the animal, and let's go ahead and throw an exception if we get a bad status. So if the animal doesn't exist in my database, my virtual See 'n Say, then this will raise an
exception. And then the text of that response will be our animal sound, so I simply return a string that says the animal says whatever sound we got. I'm going to go ahead and test this function. The way I'll do it is with python -i: that will load up the module, actually run it, and then dump me into an interactive loop, and I can make a session. Oops. Okay, that works. Let me go back to the code if you're still writing it.

Now I want a main routine that calls this. This main routine will have a list of animals; I'm going to do a cow, a pig, and a chicken. We'll make a session object, and then I'll loop over my list of animals. For each one I will call the speak function, pass it the name of my animal and my session object, and then I'll just print it. And at the end of this I'll just have it run the main function. Okay, can I go ahead and run it? Shake your head if you're not ready. Okay, give it another minute. Okay, I'm going to go ahead and try it. If you still want to bring it up, it's in the repository under examples and you can look at it while we're doing this. The cow says moo, the pig says oink, and the chicken says cluck.

Yes, why is there such a lag? That's a good question. The pretend reason is that it takes a while for the speaker to say it. The real reason is that this is a web service that just takes some time, and you can imagine if you're making a call to some database, you may make that network call and then the database just takes two or three seconds to process the query and get back to you. This is basically a simulation of that, in a silly way. Right now I'm pretending that on my web server there's just a sleep, but you can imagine a web service where it takes a couple seconds to get back. The important thing is that on the client side we're not doing anything but waiting, right? We're wasting time.
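The module typed out above can be sketched roughly as follows. The tutorial server's real address isn't shown in the transcript, so the `BASE_URL` here is a placeholder assumption, and the run line is commented out so the sketch doesn't depend on the network:

```python
import requests

# Placeholder assumption: the talk's real animals API address isn't shown.
BASE_URL = "http://localhost:8080/animals"


def speak(animal, session):
    """Fetch what one animal says and format the sentence."""
    response = session.get(f"{BASE_URL}/{animal}")
    response.raise_for_status()  # blow up on a bad status (unknown animal)
    return f"The {animal} says {response.text}"


def main():
    animals = ["cow", "pig", "chicken"]
    session = requests.Session()
    for animal in animals:  # one blocking request at a time
        print(speak(animal, session))


# To run it for real against the service:
#     main()
```

Each call to `speak` blocks until its response arrives, which is exactly the lag being discussed: three animals means three full round trips, one after the other.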
The server is actually doing some work, and that's kind of the key point here. Okay, did that work for everyone? I'll put this up here for a second. Any other questions? So yeah, this was clearly slow, and it takes a little bit of time to get each one. It'd be nice to fetch them all at the same time, and that's ultimately what we'll do.

But now for something completely different: I want to talk about cooking with coroutines, and then we'll come back to actually using coroutines, because coroutines, I think, are very much like cooking with recipes. I'd like to talk about a very simple recipe that I make after work when I don't have much time to cook and the family is hungry. This first recipe is orange ginger salmon. By the way, if I fail at this tutorial and you learn nothing about how to use coroutines, you'll at least have a nutritious meal that you can make, so it won't be a complete waste of your time either way. So: I preheat the oven to 350 degrees, put the salmon fillets on a cooking sheet, take this orange ginger dressing that they sell at my grocery store, put about two tablespoons on each one, throw it in the oven, and bake it for 18 minutes. That's it. With that I have the boxed rice pilaf. This is good stuff too. All you have to do is put about a cup and three quarters of water and two tablespoons of butter in a two-quart pot, bring it to a boil, then stir in the spice package. It's all pretty much done for you. You cover the rice pilaf, set it to low, let it simmer for 25 minutes, then fluff it with a fork and let it stand for five minutes, and it's good to go. And this is my favorite: the steam-in-bag green beans. To make these, you take a fork and poke holes in the bag, put it on a microwave-safe plate, and microwave for five minutes. And then voila, dinner in about 35 minutes. This is great, and the best thing about it is that for most of that 35 minutes I'm not doing anything, right? I mean, I can do
something else while all these things are cooking. I don't have to just stare at the oven and wait for it. With the kids, I can go play with them if I'm feeling like super dad, or if I'm worn out, maybe we'll put on an episode of Paw Patrol.

But this is Python, so let's talk about automation and what would happen if we tried to automate dinner. I want to reimagine these recipes as if they were Python functions. Here's our orange ginger salmon: we need a sleep function from time, and from the kitchen module we'll just import all the stuff, the food and tools we need, the oven, the baking sheet, the salmon, and the orange dressing. We have our cook salmon routine: it preheats the oven, you see it sleeps for 18 minutes, and at the end it returns everything that was in the oven. Here's a recipe for rice pilaf. Again, Python is batteries included, everything's in the standard library. And if you notice, there's just lots of waiting around: sleep for 22 minutes, sleep for five minutes. And here's a recipe for my favorite, the green beans: you just call poke on the green beans, insert them in the microwave, call this cook function, and then return the result of extracting all of it. Then we could put it all together in one function called make salmon dinner: meat equals cook fish, starch equals cook rice, veggie equals cook beans, and then return a tuple of meat, starch, and veggie, which you can then eat.

And what's great about functions is ordering and encapsulation. The details of cooking a fish are encapsulated in that function, right? When you call a function, all that really matters, aside from some side effects, is what you pass to it and what is returned. It doesn't matter who called it. We can make new dinners by switching out different recipes. This is all very nice, and one thing about programming with functions: the use of functions to organize code has become so ingrained that it's
easy to forget just how much simplicity that buys you. I can imagine going through the code with a stack of note cards. When I start reading a function, I write down on a note card that function, what I pass to it, and what the function is doing, and I put that on the table. When that function calls another function, I just pick up another note card, write down what parameters it has, and start working through that function, and when it returns, I take that card away and go back to the one below it. So you can imagine making a stack of note cards with these function calls, and you can understand the order of the code and the order of execution very well. And when you start dealing with multiple things at the same time, that's what you want to preserve: that ability to understand the order of your code as best as possible.

Now, with that said, let me look at the time analysis if I have my kitchen robot doing this. This is not good. It's going to cook the fish, and it's just going to wait and do nothing else while the fish is cooking. It will literally just stare at the oven for 20 minutes and then return, and then I have my fish. Then it calls cook rice and returns back, and I get my starch, and so on. Overall it just takes too long: it takes almost an hour, where I, a human being, can do it in 35 minutes. And that's the difference between a function and, what we'll get to, a coroutine. A function: you call it, it does everything, and then it returns control. But when you're cooking with recipes, recipes look like functions, except when you get to a point where you don't have to do anything, like when you're just staring at the oven, you pop out of thinking about that recipe, being an intelligent human being, and go to a different one. So recipes, unlike functions, can return control to something else, and you can come back to them later and finish. That's what coroutines buy you.
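The kitchen robot's sequential dinner can be sketched as below. The real recipes sleep for 18 to 25 minutes; these stand-in times are shrunk to fractions of a second (an assumption made purely so the sketch runs quickly), but the shape of the problem is the same:

```python
import time

# Scaled-down stand-ins for the talk's cooking times.
FISH_TIME, RICE_TIME, BEANS_TIME = 0.05, 0.04, 0.03


def cook_fish():
    time.sleep(FISH_TIME)   # the robot just stares at the oven here
    return "salmon"


def cook_rice():
    time.sleep(RICE_TIME)
    return "rice pilaf"


def cook_beans():
    time.sleep(BEANS_TIME)
    return "green beans"


def make_salmon_dinner():
    meat = cook_fish()      # blocks until the fish is done
    starch = cook_rice()    # only then does the rice even start
    veggie = cook_beans()
    return meat, starch, veggie


start = time.perf_counter()
dinner = make_salmon_dinner()
elapsed = time.perf_counter() - start
# elapsed is roughly FISH_TIME + RICE_TIME + BEANS_TIME: the robot
# cooks one dish at a time instead of overlapping the waits.
```

The total time is the sum of all three waits, which is the robot's hour-long dinner; a cook who can switch tasks while waiting gets the overlap for free.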
You can write code that looks like functions, but the order in which things are done is not sequential, and you can take advantage of times when you're just waiting on stuff to happen.

Finally, one thing that I should bring up is the difference between parallelism and concurrency, and if you wanted parallelism, I'm not selling it here. Parallelism is doing multiple things at the same time: it's where, if you have two CPU cores, you do both at once. In the cooking metaphor, you would need parallelism if you had a lot of work to do. If I wanted to chop dozens and dozens of onions, I need help; I need another CPU core, i.e., a family member or friend, to come chop onions with me, and we can get it done twice as fast. Concurrency is just dealing with multiple things going on at the same time. To chop dozens of onions I can call my brother and we can get it done twice as fast, but to cook that same dinner I don't need to call him; all I need to be able to do is switch between tasks intelligently, so while one is in a waiting state, I can work on something else. That's what we're doing, and that's why this works so well with I/O, where there's a ton of waiting. CPUs are extremely fast: gigaflops, billions of floating point operations per second, that is, billions of mathematical calculations performed every second. A CPU can get a lot done, but there are fundamental constraints on how fast we can do networking. If I want to send a message to France, I'm limited by the speed of light: I cannot send a message to France faster than, I think, about 40 milliseconds. So round trip, if I want to do some I/O across the Atlantic, that's 80 milliseconds, which is almost a tenth of a second right there, and in reality it's more like a latency of maybe 150 or 200 milliseconds. Okay, any questions about concurrency in general, or about what we hope to achieve with a coroutine? That is: a coroutine is like a function, like a subroutine, except instead of just calling
it, where it runs and then returns, when you call it, it will return control when it's waiting, and you can go back to it later.

Okay, so for this section we'll understand the coroutine syntax in Python, understand how coroutines are run, and see what libraries are available for scheduling coroutines. What I want to do is make some, again, rather pointless and silly functions to serve as examples. So I'm going to create a module called by_hand.py, and in this I'll import time because I want the sleep function. I'm going to make a function for squaring a number. This is a very simple function to write; all I have to do is return x times x, but I want this function to be verbose and slow. So when it begins, it will say what it's doing: it will say it's starting the square function for whatever its argument x was. Then let's go to sleep for a good three seconds (oops), and then we'll say we're finishing. And let's do one more: we'll cube a number, that is, multiply it by itself three times. We'll go to sleep for three seconds, and in order to cube things, one way I could do that is to call the square function and then just take that result and multiply it by x again. Take it nice and slow here. I'm glad no one's left yet, because I promise we're about to get into the cool new syntax and turn these into coroutines.

Okay, I'm going to go ahead and run this. Let's cube a number. See, it called square, and now that has to do its thing. All right, the cube of five is 125. Now, that's a function: it takes its argument, you call it, it runs, and then it returns its value. You can't stop it and come back to it later. So what I want to do is take the square function and turn it into a coroutine function, and here's the part where, if you don't have Python 3.5, you'll get an error. A coroutine function is defined by async def; a regular function, just by def. So now square is a coroutine function.
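The synchronous by_hand.py just described can be sketched as follows. The three-second sleeps are shortened here (an assumption so the sketch runs fast); everything else matches the structure being typed out:

```python
import time

# Shortened from the talk's three seconds so the example runs quickly.
DELAY = 0.01


def square(x):
    print(f"Starting square with argument {x}!")
    time.sleep(DELAY)          # deliberately slow and verbose
    print(f"Finishing square with argument {x}!")
    return x * x


def cube(x):
    print(f"Starting cube with argument {x}!")
    time.sleep(DELAY)
    y = square(x)              # an ordinary nested function call
    print(f"Finishing cube with argument {x}!")
    return x * y
```

Calling `cube(5)` runs everything to completion in one shot and returns 125; there is no way to pause it partway and do something else.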
Now, this isn't going to work yet, but let's see how it breaks. So what do I do with the coroutine function? I could try calling it, and I do not get 16; I get back "coroutine object square at" some memory address. And this is the key difference. When you call a function in Python, it instantiates, it creates an instance of that function with the parameters you gave it, it runs the function, and it returns to you the result. When you call a coroutine function, it creates an instance of that coroutine with your parameters, and then it just hands it back to you. It does not run it. It just makes the coroutine and gives it to you. So let's assign that to something so we can work with it. Okay, so now coro is the coroutine object that we got when we called square with the argument four. The question is: how do you run a coroutine? And this is the part where, in actual practice, you won't have to do this, but we're running these things by hand. If you want to run a coroutine by hand, you call its send method, which is normally used to communicate with the scheduler, and we're going to send it None. And that's interesting: it did its thing, right? It executed the code, the suite of instructions, and then it raised a StopIteration exception, and inside that exception is the value. So if you really wanted to do this, you'd have to catch that StopIteration exception and then get your 16 back. So far so good.

Okay, let's try calling cube and see what happens. Okay, anyone care to explain what happened here? It says TypeError: unsupported multiplication operand for types coroutine and int. Let's take a look at the code and see what's wrong. Here we said y equals square of x, and what happens when you call a coroutine function? You get back a coroutine object. So y is a coroutine object; it is not the number, because we never ran the coroutine. That didn't work, and in general, you don't run a coroutine from a plain function. You delegate to, in effect call and run, a coroutine from another coroutine.
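The by-hand run just performed looks like this in miniature (the slow sleep is omitted here, since the point is the send/StopIteration mechanics, not the delay):

```python
async def square(x):
    # A coroutine function: calling it builds a coroutine object,
    # it does NOT run the body.
    return x * x


coro = square(4)        # nothing has executed yet
try:
    coro.send(None)     # drive the coroutine by hand
except StopIteration as exc:
    result = exc.value  # the return value rides inside the exception
```

After the `send`, `result` holds 16: the coroutine ran its whole body and signaled completion by raising StopIteration with the return value attached.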
So what we need to do is change cube into a coroutine. Now, will this work? No, and the reason it won't work is that we're still making a coroutine object here when we say y equals square of x. We need this thing to make the coroutine object, in effect run those instructions, and then get the result back. And the way that a coroutine runs another coroutine, delegates to it, lets it run, and takes its return value, is with the await statement (is await a keyword yet? it's on its way to becoming one). When you await on another coroutine, it creates the instance, it runs it, and then it gives you the result. Now, this is a little different than calling a function, because it's possible that in the middle of running it will stop and let something else take over. We haven't gotten to how it does that, but we're almost there. Any questions?

Okay, let's try it now. Oops, remember, calling a coroutine function gives you a coroutine object, so we'll assign that to a name, and then to run the coroutine, we'll send it None. And you see it ran square. So coroutines run other coroutines by awaiting on them and getting the result, and then ultimately the result was handed back to us in this StopIteration exception, which was raised when it was done. So far this is not very useful. We haven't gone over how the coroutine could do what it really is supposed to do, which is, in the middle of what it's doing, send back control. When it's sleeping, what we want it to do is just return to us, let us do something else, and then we call it back in three seconds, rather than just blocking everything for three seconds. That's what it's doing now: it's blocking everything for three seconds. We want it to still wait three seconds, but we want to be able to do something else while it's waiting. And I'm not going to fully explain this: you need a special type of coroutine that actually interacts with the scheduler that runs these coroutines.
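The delegation just demonstrated can be sketched like this (sleeps omitted, since the point is how await hands the value back):

```python
async def square(x):
    return x * x


async def cube(x):
    y = await square(x)   # delegate: run square to completion, take its value
    return x * y


coro = cube(5)            # build the cube coroutine; nothing runs yet
try:
    coro.send(None)       # drive it by hand; await runs square inside
except StopIteration as exc:
    result = exc.value
```

One `send` is enough here because neither coroutine ever suspends: `await square(x)` runs square straight through, y becomes 25, and cube finishes with 125 in the StopIteration.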
So I'm going to get rid of this import time, and we're going to make a new sleep, a special sleep coroutine. This is what's called a generator-based coroutine, and unless you plan on writing a library like asyncio, unless you want to write your own scheduler, you don't have to understand this, but we need it to make this tutorial work. What this special coroutine will do when it's called is yield control back to whatever is running the coroutines, and tell it to start me back up in so many seconds. So I'm going to change these time.sleep calls to await on a manual sleep, and I'm calling it manual sleep because we are the manual scheduler: we're scheduling coroutines ourselves. So yes, manual sleep is a generator, if you're familiar with Python generators, and that decorator changes some flags in the code object to allow it to work with native coroutines. This is something you need to know if you're building a scheduler, but we're going to use a scheduler from the standard library that provides all of that for us, so when you're typically writing coroutines in an application, you would never need to do this.

All right, so far so good. Should we try it? I'll make a coroutine object, and then I'm going to run it. And look at that: it returned control back to me and gave me the value of this string that says please call me back in three seconds. Well, I've used up my three seconds talking, so I should call it back. I'm going to send it None again to start it back up so it can continue on its way doing its work. It awaited on the square coroutine, which awaited on the sleep, which returned control to me so I can keep talking. Again I've used up the three seconds talking, so let's let it continue, and now it finishes. So far so good.

So here's the real magic. Here's what I can do: I can do coroutine one equals cube of two, and I can do coroutine two equals cube of ten. I can start running coroutine one. Okay, now it's waiting.
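The manual sleep just described can be sketched with `types.coroutine`, which is the standard-library decorator that flags a generator so native coroutines can await on it:

```python
import types


@types.coroutine
def manual_sleep(seconds):
    # Generator-based coroutine: yield pops control all the way out to
    # whoever is driving, handing back a message about when to resume.
    yield f"Please call me back in {seconds} seconds"


async def square(x):
    await manual_sleep(3)   # control returns to the manual scheduler here
    return x * x


coro = square(4)
request = coro.send(None)   # runs until the yield inside manual_sleep
try:
    coro.send(None)         # "three seconds later", resume it
except StopIteration as exc:
    result = exc.value
```

The first `send` comes back immediately with the call-me-back string instead of blocking; the second `send` resumes the coroutine past the sleep and it finishes with 16. That gap between the two sends is exactly where you are free to run some other coroutine.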
So what can I do? I can get coroutine two running, because I'm able to. So now coroutine one and coroutine two have both done some work, and they're both waiting to be called back. I'll call coroutine one back, it's been three seconds; I'll call coroutine two back. And you can see I had both these coroutines doing stuff concurrently. What happens is, when a coroutine gets into a state where it's waiting for something to happen, in this case just waiting for three seconds to elapse, it lets me do something else. I'm the scheduler, and I can run all these coroutines. So far so good.

Okay, so you never actually have to call send when you're writing applications: you use a scheduler library which does all of that for you. Let me go back to the slides. Oops, the slides are out of order, I apologize. There are a number of scheduler libraries available in Python which run coroutines for you and provide a lot of operations. One of the most well-known and well-established is Twisted, and Twisted is really the grandparent of all these ideas. Not the syntax itself, but the ideas of how this works in Python owe themselves to that development, and Twisted is still well maintained today and supports the new syntax. The one we'll look at is asyncio, because it's in the standard library. There's a commitment from the Python community to support this: as of Python 3.6 it is no longer provisional, it's a full member of the Python standard library. You install Python, you get asyncio for free. It does predate the modern coroutines, and it has some additional abstraction layers for doing things like callbacks. Then there are some other new libraries I'd like to mention: curio is a newer library with a simple design based just on the new features of Python 3.5, focusing on coroutines alone, and similarly there's one called trio. And my apologies to everyone else who's writing cool
schedulers, whom I haven't mentioned, or who has written them in the past.

Okay, so with that, let's change this to something that uses asyncio. I'm going to delete this manual thing and import asyncio. One thing asyncio comes with is a sleep coroutine, and notice that the code I've written is now bound up: what I've written has become asyncio-specific code. I think one misconception that's been going around is that async and await have to use asyncio. They're features of the language itself, and you can use any of this variety of schedulers, but once you start using the coroutines specific to asyncio in your code, then aside from some compatibility efforts, your code is now specific to that scheduler.

Okay, so how do I use asyncio? The scheduler for asyncio is called the event loop, so I make a call to asyncio.get_event_loop to get a reference to that loop. And now I can take a coroutine object and call run_until_complete: I give that function a coroutine object, and it will run the scheduler until that coroutine object has completed running, give me the return value, and then stop the scheduler again. So far so good, but we're missing the big key thing, right? We want to be able to run multiple of these at the same time, have them all started up and running concurrently, with the scheduler running multiple ones while they're waiting. There is a function in asyncio called gather, and what it does is take coroutine objects as arguments and return you a single awaitable that, when you run it, will run all of the ones you gave it concurrently. So, for example, I'll give it these three. Okay, so now group will cause all three of these to be run by the scheduler concurrently, and I can call run_until_complete on it.
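The event-loop usage just described can be sketched as below. Two assumptions: the sleep is shortened from the talk's three seconds, and the loop is created explicitly with `new_event_loop` rather than `get_event_loop`, since the latter spelling is deprecated on newer Pythons:

```python
import asyncio


async def cube(x):
    # asyncio's own sleep yields control to the event loop while waiting,
    # instead of blocking like time.sleep would.
    await asyncio.sleep(0.01)
    return x * x * x


# The talk uses asyncio.get_event_loop(); an explicit loop avoids
# deprecation warnings on recent Python versions.
loop = asyncio.new_event_loop()
result = loop.run_until_complete(cube(5))
loop.close()
```

`run_until_complete` starts the scheduler, drives the coroutine (calling it back when its sleep elapses), hands back the return value, and stops the scheduler again.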
This should run them all concurrently, and you can see that the scheduler is doing its job, and now it returns to me a list of the cubes of these three numbers. Yes, that's a good question: it's in the order you passed them to gather. What order they actually execute in, I do not believe there's any guarantee, so they could be executed in any order, and which one finishes first varies; you could kind of see it started one, then it did three, then it did two. But your responses are in the order that you gave them. So you don't get any guarantees on who finishes first, but by specifying the order of the arguments, you specify what order you get the results back. Otherwise it'd be a mess, right? How could you use this?

Okay, so that's the basic syntax, but there are a couple more pieces of basic Python syntax that I need to talk about to complete the picture. Let me go back to normal iterators. Hopefully for loops are pretty standard, but I want to go into how for loops really work. You have a container, and what a for loop does is actually call a special method on that container object to return something called an iterator. It takes that iterator and calls a special method called next that gives you an item, and then it executes the code below, the "do stuff" in the suite of instructions. And that's a function call. What would be useful in terms of asynchronous I/O is a container that isn't really a list or a tuple or a dictionary, but actually has to make a database call in order to fetch things. In that process of fetching items, it's calling out to the network, and we want to do that asynchronously: while it's fetching, we want to yield control back to the scheduler and let something else run while we're waiting. For this, there are asynchronous iterators.
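The gather demo just run can be sketched as below; the sleep is shortened, and gather is awaited inside a wrapper coroutine so it runs cleanly on modern Pythons:

```python
import asyncio


async def cube(x):
    await asyncio.sleep(0.01)
    return x * x * x


async def main():
    # The three coroutines run concurrently, but the results come back
    # in the order the arguments were passed, regardless of which
    # happened to finish first.
    return await asyncio.gather(cube(1), cube(3), cube(2))


loop = asyncio.new_event_loop()
results = loop.run_until_complete(main())
loop.close()
```

All three sleeps overlap, so the total time is roughly one sleep rather than three, and `results` is `[1, 27, 8]`, matching the argument order.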
If you have a special container that is an asynchronous iterator, you can use the syntax async for, which will only work on an asynchronous container. What it does is, every time around the loop, it calls something called anext, and instead of running a function, that returns a coroutine that the scheduler will run. The important thing to know is that if you use async for, every time you get to the top of that loop, your code may return control to the scheduler and allow other things to run while the next item is being fetched. Any questions on that? One thing, though: this is still sequential, in that if I'm a coroutine doing an async for, the items come one after the other.

The second thing is context managers, and these are really useful. The way a context manager works is, when you say "with manager as m", the manager object's enter method is called upon entering the block, which can do cool things (if you use it with file open, it can open files, set up network connections) and give you back an object m that you can use to do stuff. And when you exit this indented block of code, or if you raise an exception, the exit method will eventually be called to clean things up. Here, in the context of asynchronous I/O, when you set up the network connection and when you clean it up, we don't want that to block. We want those to be coroutines, so you can yield control back to the scheduler, and that's what the async with syntax is for. You have to have a special context manager with these other special methods, which give you back a coroutine to be run. The important thing is, if you have an asynchronous context manager, when you enter that block of code and when you leave it, you may return control back to the scheduler and allow other things to run. And that's the basic syntax.
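The async-for protocol just described can be sketched with a toy container. The class name and the pretend fetch delay are invented for illustration; the special methods (`__aiter__`, `__anext__`, StopAsyncIteration) are the real protocol:

```python
import asyncio


class SlowContainer:
    """Toy async iterable: pretend each item needs a network fetch."""

    def __init__(self, items):
        self.items = items

    def __aiter__(self):
        self._index = 0
        return self

    async def __anext__(self):
        if self._index >= len(self.items):
            raise StopAsyncIteration      # the async analogue of StopIteration
        await asyncio.sleep(0.01)         # yields to the scheduler per item
        item = self.items[self._index]
        self._index += 1
        return item


async def collect():
    results = []
    async for item in SlowContainer(["cow", "pig", "chicken"]):
        results.append(item)              # items still arrive one at a time
    return results


loop = asyncio.new_event_loop()
animals = loop.run_until_complete(collect())
loop.close()
```

Each trip around the `async for` loop awaits `__anext__`, so while one item is "being fetched", the event loop is free to run other coroutines; but within this one coroutine the items still come sequentially.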
One feature of this whole system, and if you're familiar with threads at all, this is a very different model from threads, is that looking at your coroutine tells you something about the order of execution. Unless there's an async for, an async with, or an await, your code is going to execute in the order you see it. It's only when you await that the scheduler will allow other things to run; it's cooperative in that sense. That can make understanding the order of execution and the safety of your code a little bit simpler, because you know when other things might happen around you, and you know the order of your code happens as it's written.

So there are a couple more things you need to know to really be able to use asyncio, and I think we've seen some of them. One is gather, which you pass a number of coroutine objects, and it will give you the results in a list. We've seen asyncio.get_event_loop, which gives you the event loop, and then you can run methods on that loop; we've seen run_until_complete. There are three more things I'd like to go over briefly, because this set forms the minimal things I think a novice should know when actually using this.

But first let me go back to gather. One thing about gather that you might ask is: what happens if something throws an exception? So let's make that happen. What I'm going to do is change the square coroutine to just fail if you give it seven. All right, so the square coroutine here will admit that it's not really all that good at its job: it doesn't know how to square seven. So if I run a lot of these, let's see what happens. What happened is we didn't get anything back except an exception: that uncaught exception was the only thing returned. You can change that behavior when you're running gather. Let's make another group of coroutines, and you can call gather with the keyword argument return_exceptions set to True. If you do that, and any of the coroutines being run concurrently raises an
exception, it'll come back to you in the list of results. Okay, so let's have a new group of coroutine objects, and we'll run this other one. Okay, yeah, so now you can see it was able to do five and six, and it returned the actual exception object. So if you want to just let things blow up, you can let them blow up; if you want exceptions to be returned, so you can get the values of the things that actually worked, you can do that too. If you see here, when I called gather I gave it the keyword argument return_exceptions=True. In the interest of time, I'd like to come back to create_task and run_in_executor later if we can, but let me say one thing about each of the remaining three. run_forever just makes the event loop run forever; the way to stop that is to have a coroutine that calls loop.stop. So you can run a loop forever when you want coroutines to continue executing until they decide it's time to shut down. create_task returns a Task object, which you can use to wait on that task, check its result once that scheduled coroutine is done, or cancel it. run_in_executor is something for legacy code: if you have a piece of blocking IO, a blocking IO function, you can run it in an executor, and what it will do is launch a thread to run it separately, so the scheduler can still run asynchronous IO without that interfering. Okay, but I'm going to skip a couple of examples, because I want to get to the important part, the meaningful part: how do we do actual IO? Sleeping is great, but it gets boring after a while. asyncio provides two different APIs for socket programming, one called streams and one called transport/protocol. I'm trying to make this a novice tutorial, so I'd rather not get into socket programming; it's not in scope.
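Here's roughly what that square example looks like as a self-contained sketch; the exact code is in the tutorial repo, so treat this as my reconstruction (again using asyncio.run rather than the 3.5-era get_event_loop dance):

```python
import asyncio

async def square(x):
    await asyncio.sleep(0.01)  # pretend this takes a while
    if x == 7:
        # this coroutine admits it is not very good at its job
        raise ValueError("I don't know how to square 7")
    return x * x

async def main():
    coros = [square(x) for x in (5, 6, 7)]
    # With return_exceptions=True, a failure comes back as a value in the
    # results list instead of blowing up the whole gather
    return await asyncio.gather(*coros, return_exceptions=True)

results = asyncio.run(main())
print(results)  # [25, 36, ValueError("I don't know how to square 7")]
```

Drop the return_exceptions argument and the ValueError propagates out of gather instead, and you lose the results of the coroutines that succeeded.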
But the thing is, rather than using that sockets layer, there are lots of third-party add-on libraries for popular application protocols, and a lot of them are contained in the aio-libs project. For example, there's aiohttp for doing HTTP requests; it has a syntax that's very similar to the popular requests library, and it also has a web server with a framework that's kind of Flask-like. And you can see there are other ones: I like aioredis, because I like to use Redis; there's one for MySQL; and so on. Let's do some stuff with aiohttp and go back to where we started, the animals example. I lost my notes, so let's go back to the animals thing we wrote. For this you will need aiohttp installed, and what I want to do is rewrite it so that it uses coroutines and uses aiohttp to make all these requests concurrently, so we don't have to wait another five seconds for every animal sound; we can get them all in a group. Okay, so the first thing I'll do is start from the main routine and change it: I'm going to make this a coroutine, and we'll have a list of animals. Now, instead of making a session object directly, I'm going to use an asynchronous context manager, and before I do that, let's create an empty list to put our coroutines in. Then I do async with aiohttp.ClientSession() as session, and this is how you get a session in aiohttp. It still looks kind of annoying; in requests you could just say requests.Session() and assign it to something. Why do we need an asynchronous context manager? So that setting up the session and tearing it down, doing any cleanup, is done asynchronously, and the scheduler can still run other things should it need to. Okay, so I need to indent this a little more and change what I'm doing: for each animal in the animals list, I will get a coroutine. I will call this speak coroutine function, which we'll have to go back and change, with the animal and the session, and we'll go back
and turn speak into a coroutine, and let's append that to our list. You could use a list comprehension; I decided not to, just to keep things simple in terms of syntax. Okay, and then we'll run all those coroutines concurrently. To do that, you don't pass a list to gather, you pass several arguments, so we'll use the splat. If you're not familiar with the splat, you may not be (I think I was doing Python for years before I finally figured it out): if you put an asterisk in front of a function argument, it takes that list, or any sort of iterable, and explodes it out into one argument for each item. Okay, and then we'll just iterate through those responses and print each line. Okay, so that's the main routine. Any questions while people are still typing? Yeah, why does... ah, you're right. Yeah, I had indented this wrong, you're right. I was wondering how this ever worked: no, you're absolutely right, this would not have run, because the session goes away once you get out of that with block, and all these coroutines would be holding a session that had expired. Good catch. Any other questions? Sorry about that. Okay, let's go and fix up speak so that it's a coroutine: we'll change it to async def, and here it's a little different too from how you do it in requests. With aiohttp, the preferred way to make that request is as an asynchronous context manager. The reason for doing this, rather than just awaiting on it, is that an HTTP response can do things like stream: once you get the status code, you might want to get the initial response, do other things, and then get the body of the response. The other difference here is that text is not an attribute of an aiohttp response; it's a method, I mean a coroutine method, so you have to await it. Okay, right, yeah, that's a good question: what does gather give you back, and why did I use run_until_complete?
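The shape of the rewritten client looks like this. Here asyncio.sleep stands in for the aiohttp request so the sketch runs offline; in the real version, speak would do `async with session.get(url) as resp:` and `await resp.text()` inside main's `async with aiohttp.ClientSession() as session:` block.

```python
import asyncio
import time

async def speak(animal):
    # stand-in for: async with session.get(url) as resp: return await resp.text()
    await asyncio.sleep(0.1)  # the five-second animal service, sped up
    return "The {} says something".format(animal)

async def main():
    animals = ["cow", "pig", "chicken"]
    coros = []
    for animal in animals:
        coros.append(speak(animal))  # build coroutine objects; nothing runs yet
    # gather takes several positional arguments, so the splat explodes the list
    return await asyncio.gather(*coros)

start = time.monotonic()
responses = asyncio.run(main())
elapsed = time.monotonic() - start
for line in responses:
    print(line)
# all three "requests" overlap, so this takes about 0.1s, not 0.3s
```

The point is the pattern: collect coroutine objects in a list, then run them all at once with gather(*coros) rather than awaiting each one in turn.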
asyncio.gather returns an awaitable object, and I used run_until_complete because that's the thing I wanted the loop to run until it was done. If I'm calling gather from within a coroutine, you await on it. So, for example, what I'm going to do is... oops, I need to go to the top; I also need to import asyncio, because at the bottom here, if we want to run this... yeah, when I do run_until_complete, that main coroutine object will be scheduled, and when it's done, the loop will terminate. I have to have some coroutine for the loop to run on, I have to schedule something, but once main is scheduled, when the main coroutine awaits, the things it awaits on are automatically scheduled as objects that are delegated to. Sorry, I'm not coming up with a better answer to the question. Oh, good question, yeah: I got rid of just calling main, because if we just called main, we'd be sitting here with a coroutine object. We need to get the event loop and pass main to run_until_complete to have it actually run the coroutine. Exactly, yeah, and you can also schedule with create_task. So if you're in a coroutine, you can await other coroutines to get them moving and then get the result; if you're not in a coroutine, you can't await, so there has to be some mechanism to get the ball rolling. If you're in a plain function, you can cause a coroutine to be scheduled with create_task, or you can do run_until_complete, which will schedule the coroutine and run the event loop, running whatever coroutines it has scheduled, until that one's done. Okay, so let's run it and see if I messed up. There we go: we get all the responses at once; we caused the web service to check out the cow, the pig, and the chicken concurrently. Yes, okay, so I believe, yeah, I believe in principle there's implementation dependence
for any TCP/IP network IO, but it doesn't need to use threads: it can use a select call on sockets. So under the hood I'm 99.9% certain it's single-threaded, and it's using select on Linux, or I think any POSIX system. If you want to do asynchronous file IO, under the hood it has to fire up a thread, there's no way around it; but for basic network IO with sockets, it's doing select calls underneath with a single thread. Yeah, and this is a good point: you can do things with multithreading, and there are reasons why you might want multiple event loops to be running, and that can cause... this has been improved in 3.6. When you do asyncio.get_event_loop, that gives you the default event loop, and in some contexts you have to be careful about which event loop you're getting. Did someone else have a question that I skipped over? I apologize. Yeah, okay, so: why can you not call close? Calling close (I didn't put it here, but I put it in the typed examples) is something you should do after the scheduler has stopped running, and once you call close, it's done: you can never start that event loop back up, it will fully clean up. It's a cleanup method, so it doesn't make sense to call close within a coroutine. Right, oh yeah, as far as I understand, asyncio.get_event_loop does not support context management; does that answer it? Oh, why can you not use get_event_loop as a context manager? I don't know, that's a good question; I'd have to think about whether a context manager makes sense there. It hasn't been written yet. So yes, let's look at the docs. Oh, sorry, yeah: can you get multiple event loops, is there a way to get
different, named event loops, if you wanted to have multiple loops running? Is there a way to get a second one? I believe there's an argument for that, and I think if you're in a different thread and you call get_event_loop... okay, so, all right, I'm going to admit my own ignorance: I do not remember how to do this offhand, and I'm not a hundred percent sure I ever knew how, but I want to believe I did. Yeah, there are contexts in which that makes sense. So, for example, you could have a multithreaded web server where you have eight CPU cores; because this is a single-threaded idea, you want one asynchronous worker running on each thread, but you want some shared state, so you want it to be a single process. That'd be one example; there are much better examples that I don't know. Okay, so, a couple of words of caution. When you make lots of concurrent requests: you now have the power to make 500 requests at once and not wait for the results to come back, so you could do things like pushing all the buttons on the elevator. Make sure whatever web service you're connecting to and giving a hundred requests at the same time expects to handle that sort of load, or that that's an approved way of using it. And there's also an interesting thing you can get to, called the need for back pressure. Making asynchronous calls like this, you're telling something else to start doing something and not waiting for it to get done, so you could very easily give it more work than it can handle, and it may need a way to tell you "I can't handle any more work", or it will fall over. So, the next piece of this was to take the server side. I was reluctant to do this, as I run the successful enterprise service, but as this is in the spirit of open source, I figure I
should open-source a rudimentary version of the cloud See 'n Say. So, just as an example, here's how you can use the aiohttp server. We'll make this actual animal API service. Okay, oops, I went too fast with that, I apologize. So I'm going to make a module called webservice_animals, and this is going to be a web server which can handle incoming requests concurrently. What's nice about this is that I can have one single-threaded worker that can still handle several dozen requests before they're completed, because handling one incoming request will be a coroutine, and it can just schedule another for each request; the worker doesn't have to wait for one request to finish and return before it can start work on the next one. Okay, so I will need the sleep coroutine, and I'm going to import the web module from aiohttp, which I can use to make a server. Now I'll make a dictionary called farm, and I'm not going to give away all of my secrets, but I will let you know what a couple of the animals say; you have to subscribe to the service to get all of them and the premium animals. No, but you can add them to your farm dictionary on your local web server. I should at least have llamas? I think you can get the llamas, and the alpacas are on the $5-a-month tier; if you want the Komodo dragons and certain salamanders, you've got to go up to the $25 tier. Let me make a very simple hello-world request handler. A request handler on this server is written this way: it's a coroutine function. So this is our handler; we'll make this, and we'll get to the animals later, but when someone connects to this web service, we'll have it return to them the text "Welcome to the farm". And you notice that this handler is a coroutine, so this web service, even though it's single-threaded with one worker, can handle multiple incoming requests concurrently before the first one is even finished. Okay, so
here's how you make an app: we make an Application object, we add a GET route at /animals, and that route will call our hello coroutine function; then you call web.run_app on your web application, and that will run a server. The aiohttp server documentation has some good, more complex examples. So far so good? All right, so if I call this, it will run on port 8080, so I'm going to go to another terminal. Is anyone familiar with curl? You can also use your browser (you could type this into your browser), or use curl, or use Python requests. And there it goes: "Welcome to the farm". So far so good. Okay, I'm going to stop this now and add another handler, which will actually give you what the animal says. To get the name from the URL, we'll call it animal; the request object that's passed to the coroutine function has a match_info, which will get the name. If the animal is not in our farm, let's give them a 404 Not Found; you can set the status. No, you're right, thank you; that is a syntax error. Okay, right, we want to return a Response object that the server will then handle and hand to the client. If the animal is in the farm, we'll await asyncio.sleep for five seconds, giving it that artificial delay; if this was a real, serious web service, this would be some database connection, right, some complex query that took ten seconds to do, and we don't want it to clog up this web server. Okay, and we can return a response which is just the text of the animal fetched from the farm dictionary, and that's it. But we've got to add the route, so here we have to create a resource, and... oops, sorry, I did that wrong... for that resource we'll add the route for the GET method, which will call the speak handler. And what if you forget the await? Let's do that after we run it.
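Putting the pieces together, the server we just typed looks roughly like this. I've shortened the five-second sleep to 0.1 seconds and invented the farm contents, so treat it as a sketch rather than the repo's exact code:

```python
import asyncio
from aiohttp import web

# Illustrative farm contents; the talk's premium animals are not included
farm = {"cow": "moo", "chicken": "cluck"}

async def hello(request):
    return web.Response(text="Welcome to the farm!")

async def speak(request):
    # the parameterized part of the URL comes in via match_info
    animal = request.match_info["name"]
    if animal not in farm:
        return web.Response(status=404, text="Not found")
    await asyncio.sleep(0.1)  # stand-in for a slow database query
    return web.Response(text="The {} says {}".format(animal, farm[animal]))

app = web.Application()
app.router.add_get("/animals", hello)
# A resource with a {name} placeholder gives us the parameterized URL
resource = app.router.add_resource("/animals/{name}")
resource.add_route("GET", speak)

# web.run_app(app)  # uncomment to serve on port 8080
```

Because each handler is a coroutine that awaits during its slow part, one single-threaded worker can have many of these requests in flight at once.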
Yeah, that's a good question. So the question is: what if we forgot to write await there, what if we just wrote sleep? That's such a good question that I want to run this when everyone's done typing and then come back to it. And yes, I think we need to use the resource in order to have that name that we can pass into the handler; right, exactly, it's the parameterized URL. I don't think you can do a parameterized URL with add_get; there may be a way to do it, but it wasn't in the documentation, or I didn't see it. Okay, so with that added, let's go ahead and run it, and it said moo. One thing: let me go to another terminal and look. This web server can handle multiple requests at the same time, even though it has a single worker, because it's executing those coroutines concurrently, so it can handle more than one request at a time; it can work with multiple at once, simultaneously or concurrently. Now let me go back and mess this up. So the question was: what if we forgot to await that sleep and we just said sleep(5)? Calling sleep(5) makes a coroutine object, right, because sleep is a coroutine function, and calling a coroutine function just makes the coroutine object; you have to await it to actually run it. So in this expression we make the coroutine object and then we just throw it away: this is going to instantiate a sleep coroutine object and then do nothing with it. So what we expect to happen is that it's not actually going to sleep: that coroutine object will never be scheduled, it's never awaited, and this will just return. So I'll go here, and see, now it's instantaneous, because we're making the coroutine object and just throwing it out. Yeah, another good question; let me quit out of this. Okay, so this is a pitfall: if you forget to write await and you just call a coroutine function, you won't actually delegate to it and run it.
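The pitfall can be reproduced in a few lines; nap here is a made-up stand-in for the sleep call in the handler:

```python
import asyncio
import inspect

async def nap():
    await asyncio.sleep(0.01)

# Calling a coroutine function only builds a coroutine object; nothing runs.
obj = nap()
print(inspect.iscoroutine(obj))  # True
obj.close()  # close it so Python doesn't warn "coroutine was never awaited"

async def handler():
    nap()        # BUG: the coroutine object is created and thrown away
    await nap()  # correct: actually delegates to the coroutine and waits

asyncio.run(handler())
```

Python 3.5+ will usually print a RuntimeWarning about a coroutine that "was never awaited" when the discarded object is garbage-collected, which is a useful clue that you've hit this bug.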
Okay, so what if we just use time.sleep? This is a great question that gets at the fact that this is all cooperative multitasking: if your coroutine does something for a long time and does not return control to the scheduler, nothing else can run. So this would be a big mistake: when you call time.sleep, you're telling the kernel to put your whole thread to sleep, and all this stuff is just one thread, so if you call time.sleep, the scheduler never wakes up. So let's run it using time.sleep. Watch this: if I get cow, then get chicken, I get moo, but you notice this has taken longer than five seconds. That's because while it was waiting on the cow to say moo, it wasn't able to wake up and handle chicken concurrently; it was completely asleep. Yeah, and that's the problem: if you accidentally insert blocking, regular IO, you can lock the whole thing up, or if you just do too much computation for too long, you can make it so that no other coroutine has a chance to run. If you're doing heavy computation and you want to say, okay, I'm taking a long time to actually do work, I want to let other coroutines run, you can do asyncio.sleep(0): sleep(0) will return control to the scheduler but have you called back as soon as possible. This is a nice thing to do; it lets other stuff run. Let me put the code back up, just in case anyone else was typing. Oops, yeah, so let's fix it, let's change it back. Yes, these are some good questions, but we're running a little low on time, and I forgot to give you all a break, I apologize; I've never given any sort of presentation or talk over 70 minutes (I used to do 70-minute labs when I taught high school physics), but 110 minutes is sort of new to me. Okay, so, yeah: what's a good way to communicate between coroutines? I was going to, and I think we may run out of time, but let's try doing a streaming web service.
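A small sketch of that sleep(0) trick, with made-up coroutine names: each worker yields to the scheduler after every chunk of "work", so the two take turns instead of one running to completion before the other starts.

```python
import asyncio

order = []

async def cruncher():
    for i in range(3):
        # ...imagine a chunk of heavy computation here...
        order.append("crunch %d" % i)
        await asyncio.sleep(0)  # yield to the scheduler, resume ASAP

async def other():
    for i in range(3):
        order.append("other %d" % i)
        await asyncio.sleep(0)

async def main():
    await asyncio.gather(cruncher(), other())

asyncio.run(main())
print(order)  # the two coroutines interleave their work
```

Replace the asyncio.sleep(0) lines with nothing (or with time.sleep) and cruncher would finish all of its appends before other ever ran.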
The idea is publish and subscribe: you could do a GET call and you will subscribe to something, and other people can post messages. That's a queue. Queues are fantastic, and they're a great way of communicating between coroutines. They're also great if you walk out of here and say, you know what, I don't want to use asyncio, I'd rather use threads; that's fine, there's a queue object for threads as well, and for multiprocessing. No matter how you're doing concurrent, or even parallel, programming, this is a very useful object. A queue is effectively an ordered list of items: one coroutine can put items on a queue, and another can take them off after they've been added, and this is usually a safe way to communicate, in that the operations of putting something onto a queue or taking it off are atomic; they don't get screwed up by different threads or different processes or different coroutines running. asyncio queues work in the following way. If you want to make a queue, you say queue = asyncio.Queue(). You can give it a maxsize keyword; it defaults to zero, and if you give it zero, that means you can put as much stuff on the queue as you have memory for; if you give it a maxsize, then it won't let you put more than that many items on. It has two coroutine functions, and this is cool: queue.get gets the item that was added to the queue first, but if there's nothing on the queue, it will wait and return to the scheduler. So you can loop over queue.get and just wait for messages: you can have a coroutine that has a queue and a loop that calls queue.get, and if there's nothing there, it will just let other things run until it's time. Exactly. Then there's put: you put an item on the queue, and it can be any Python object, and again (oops, I typo'd that, it should wait if full) this is a coroutine, so if the queue is full,
if you've set a maxsize, then it will just wait until there's room, so that's nice too. Now, if you don't feel like waiting, you can do get_nowait, and that's a regular function, so it will not return control to the scheduler: it will get an item, or, if there's nothing in the queue, it will raise an exception. There's also put_nowait, which will put an item on the queue, or raise an exception if the queue is full. Let's see, I think we have 15 minutes; I think we can do this. The question is: do you not have to do a task_done, like with a threading queue worker? I either don't fully understand, or I don't know the answer to that. I think, depending on what your worker model is, you may need to do some sort of cleanup, but for the queue object itself, to my knowledge, there's no cleanup required. There are some additional methods that I didn't put up here, but they're in the documentation, and I think they can aid in that sort of thing if your model requires it. Right, yeah, it's all within one thread. And yeah, to my knowledge, you can put any object in it. So, here's a streaming web service. Again, I think streaming is a great example of where this sort of programming is extremely useful, because you have a lot of connections that are just waiting around. You can imagine a web service where people do a GET, it opens a streaming connection, and then they just wait, like maybe you're waiting for Twitter messages or stock prices to change; the client will just sit on an open connection and wait for text to come in. And so what we'll do is make another endpoint where you can send a message with a POST. So the idea is, we can have lots of clients that have done a GET request, opened that connection, and left it open.
Then, when someone does a POST, that message will go to all of our clients. Okay, so I'll make a new module called pubsub (you can call it what you want), and we'll import asyncio. To have a publication and subscription model, I need to think: what is a subscription? A subscription will be a queue. If a client does a GET, we'll create a queue for them, and that will be their subscription, and the publisher will throw a message on that queue. So first I'm going to make a helper class called a Hub; this hub will be the hub of all subscriptions, and in its __init__ I'm going to give it one attribute, which is going to be an empty set. What this is going to allow me to do is that, when we do a GET request, that handler will make a queue object and add it to that set of subscriptions; and the cool thing about a set is that you can call the remove method when you want to unsubscribe and take that queue off of the set. This helper class will have a publish method, which will take a message and put it on all of the queues; each queue is a subscriber, so we'll put the message on every queue, and we're going to have these queues be unlimited in size. You notice I'm going to use the put_nowait function: there are no awaits here, and that means that while it's looping over these queues, you don't have to worry about anything else running in the middle of that. This function will not yield back to the scheduler while it's doing this, nothing else runs, so we don't have to worry about anything messing with the set while we're in the process of looping through those queues. I think it would be okay anyway, but... Now I need another helper class, which is going to represent a subscription. I'm going to pass it the hub, and that subscription is also going to have an asyncio.Queue object. Let's make this a context manager, so that when you create it, you could create a subscription, be subscribed inside your with block, and
then, as soon as you leave it for any reason, or raise an exception, we can clean up and remove this queue. So I'm going to write an __enter__ method, and that will just add the queue to the set of subscriptions and then return the queue, so that the code calling this context manager can take things off of it. Oops. And then for the __exit__ we're going to keep this simple: the exit method has to take three additional arguments, just because that's how it may be called, when you exit, if there's an exception. Okay, so I'm going to make a singleton hub object to be used application-wide, and then I'm going to make two handler methods. A subscribe method: when you make a GET request, it's going to call subscribe, and this is going to use a StreamResponse. We'll set the header for the content type (this will just help some clients), we're going to set the status code to 200, this is successful, and then what you have to do is await on response.prepare. What that is actually doing is sending this information to the client, so now that you've got the response set up, the real client will receive the 200 and will be waiting for the body, which will be this text that continues to come. We can create a subscription, pass it our hub, and then we can loop forever, and while we're looping, we'll await and get messages off the queue, and then we will write to the response. We have to tack on a carriage return and line feed (that's an HTTP thing) so that the client will print it, and this has to be converted to bytes. And at the end, return the response, although we'll never really get here. Okay, I'm going to apologize for going too fast; this example is in the examples directory on GitHub. So now, if I want to publish, I will get a message from the query string, call publish on the hub, and then return a response that just says okay.
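The hub and subscription we just wrote can be sketched without the web parts, which also makes it easy to exercise; listen is a made-up coroutine standing in for the subscribe handler's loop:

```python
import asyncio

class Hub:
    """Holds one queue per subscriber; publish fans a message out to all."""
    def __init__(self):
        self.subscriptions = set()

    def publish(self, message):
        # No awaits here: we never yield to the scheduler mid-loop, so the
        # set cannot change underneath us while we fan out.
        for queue in self.subscriptions:
            queue.put_nowait(message)

class Subscription:
    """Plain context manager: subscribe on enter, clean up on exit."""
    def __init__(self, hub):
        self.hub = hub
        self.queue = asyncio.Queue()

    def __enter__(self):
        self.hub.subscriptions.add(self.queue)
        return self.queue

    def __exit__(self, exc_type, exc, tb):
        self.hub.subscriptions.remove(self.queue)

async def listen(hub, n):
    """Stand-in for the subscribe handler: receive n messages, then leave."""
    heard = []
    with Subscription(hub) as queue:
        for _ in range(n):
            heard.append(await queue.get())
    return heard

async def main():
    hub = Hub()
    listeners = [asyncio.ensure_future(listen(hub, 2)) for _ in range(2)]
    await asyncio.sleep(0)  # let both listeners subscribe first
    hub.publish("hello")
    hub.publish("goodbye")
    return await asyncio.gather(*listeners)

results = asyncio.run(main())
print(results)  # both subscribers got both messages
```

In the real server, listen's body is the subscribe handler writing each message to a StreamResponse instead of collecting it in a list.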
Then we just have to add these to our router, so we'll add the subscribe handler for GET. Thank you. Okay, so we'll run it, and let me start up one client here. Oh, is my URL correct? Oh yeah, you're right, there we go. Okay, now let me send a message; it says okay, and you can see I got the message. And we can open up more clients, and they'll also get the message. Okay, let me open up one more tab, I'll create another client, and let's post another message, and you can see this client got the message and the new client also got the message, so it was added to both their queues, and those coroutines pulled it off. Sorry, oh yeah, here's sending the message with curl: I just send a POST with the query string message=bar. Yeah, and this example is called pubsub aiohttp, so you can look that up. So, I think we're out of time. Thanks, everyone; I hope this was useful, and that it made some sense out of the new syntax.