Hello. So, that was "hi mom" in Bulgarian. My name is Pavlin and I work as a software developer at HackSoft, which is an outsourcing company. I do mainly Python, which means Django, and JavaScript, which means React. And you can find me on Twitter, just like this. So I want to start with this. This slide was not initially in my presentation. You might already know, but my talk was scheduled for Wednesday. But, sadly, my flight got delayed and we missed our connection. So I want to thank, firstly, Alexander Hendorf for the assistance, and Owen Campbell for stepping in and taking my slot, which made this talk possible. So if you can do a round of applause for them, that would be really good. And a little disclaimer: my talk was scheduled for 40 minutes, but now I have only 30, so bear with me, and let's get started. So Python is synchronous. What does that mean? It means that each line executes in order, from top to bottom. And knowing that, let's take a look at the following example. You have a line of code that makes a call to a remote server. And this means that the program is doing nothing while it's waiting for the response. It just waits. The program blocks and waits for the response in order to continue. So what if you have several lines of code that do the same? Each line will execute when the previous line is done waiting. And in short, you will wait more. You will be blocked for more time. So how can we solve this? How can we speed up the program? Can we make it non-blocking? Of course you can. And one way to do this is with threads. "Threads are dead" — if you haven't seen this talk by David Beazley, I highly recommend you to do it. So you can spin up multiple threads, each thread doing one thing at a time. And when multiple threads are running, each CPU can run one thread at a time.
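The blocking behaviour described here can be sketched like this (a minimal sketch; the `fetch` helper and the URLs are made up, with `time.sleep` standing in for the network wait):

```python
import time

def fetch(url):
    # stand-in for a blocking network call: the program just waits here
    time.sleep(1)
    return f'response from {url}'

start = time.monotonic()
# each call blocks until the previous one is done waiting
for url in ['a.example', 'b.example', 'c.example']:
    print(fetch(url))
print(f'total: {time.monotonic() - start:.1f}s')  # roughly 3 seconds
```

Three one-second waits happen strictly one after another, so the total wall-clock time is the sum of all the waits.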
So, if you have multiple threads, in order to allow all of them to share resources, the CPU very often context switches between the threads, in a non-deterministic order. Threads are great, but sometimes they can make the code more complicated and harder to understand. And there are also race conditions and deadlocks. And about the non-blocking thing, there is another way to achieve it. We can make our code asynchronous. With asynchronous programming, the software, e.g. our application, manages the threads of work and the context switching rather than the CPU, which means that the context is going to be switched at defined points rather than at non-deterministic intervals. And one way to do asynchronous programming is with an event loop. So there's an event loop and there's a queue of events, so-called tasks, and the loop constantly pulls tasks from the queue and runs them. One way to do asynchronous programming in Python is with asyncio. asyncio is a standard library module, which was added in Python 3.4 and provides an event loop. So let's take a look at how we can use asyncio to write asynchronous code in Python. This is what we have to do. In this example, we import asyncio at the top and then we define a coroutine using the async def keywords. So this function is the coroutine. It's like a normal function, but it has async in front of the def, and it has an await expression, and we're going to talk about that in a moment. What this coroutine does is: it prints "hello", it waits several seconds, and then prints "world" after the respective delay, and then finishes. So the next thing we do is: we get the event loop, we create tasks and schedule them in the event loop, and then we run the event loop forever. Let's see what is going to happen if we execute this code in Python 3.6.3. So "hello" is printed twice — as we saw, we print "hello" at the top. Then, after one second, the task with delay one prints, and after another second, the task with delay two prints.
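The slide being described is roughly this (a sketch, not the speaker's exact code; the function name and messages are assumed, and the loop here runs only until both tasks finish, rather than forever, so the script exits):

```python
import asyncio

async def say_hello(delay):
    print('hello')
    await asyncio.sleep(delay)
    print(f'world after {delay} second(s)')

loop = asyncio.new_event_loop()
tasks = [loop.create_task(say_hello(1)), loop.create_task(say_hello(2))]
# the talk schedules the tasks and calls loop.run_forever();
# running until completion instead lets the script terminate
loop.run_until_complete(asyncio.wait(tasks))
loop.close()
```

Both tasks print "hello" immediately; the second finishes one second after the first, so the whole run takes about two seconds instead of three.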
So both tasks start simultaneously, and the second task finishes one second later than the first task. This is exactly what we wanted, right? I mean, the program didn't block. We scheduled two tasks and they started at the same time. The second task didn't have to wait for the first task to finish, and we achieved that by defining a coroutine with the async def keywords, and we used an await expression inside it, which we're going to talk about in a moment. But what exactly is a coroutine? I mean, what does that term mean? Easiest thing to do: let's ask Google about that. So Google tells us: coroutines are computer program components that generalize subroutines for non-preemptive multitasking, by allowing multiple entry points for suspending and resuming execution at certain locations. Well, my understanding of this definition hasn't changed since the first time I read it. I mean, I remember looking at my screen just like the cat in the picture. And my first thought was: what if I have to explain this to someone else? This is a picture of me and my niece. She's two years old, she's really cute, and I love her so much. In this picture, she's teaching me how to draw, and believe me, she's better than me even though she's two years old. So what if I have to explain to her what a coroutine is? I will have to start with the basics. In this talk, we are going to talk about order of execution, the iterator protocol, generator functions, we're going to talk about yield from, and then we will make a definition of a coroutine in Python. Then we're going to look at the asyncio.coroutine and types.coroutine decorators. And finally, last but not least, we're going to talk about async and await. So, order of execution. As we already said in the beginning, each line executes from top to bottom, in order — until an exception is raised, for example.
In this example, the message after the raise will never be executed. Another way to end a function's execution is when a return statement is reached. In this example, the print below the return will never be executed. So return implies that the function is returning control back to the point from where it was initially called. And there's another way — there's yield. Yield implies that the transfer of control is temporary, and the function expects to regain it in the future. We will get back to yield in a minute, but before that, we have to mention a few more things. First, iterable. What's an iterable? An iterable is an object that is capable of returning its members one at a time — for example a list, a tuple, a dict, a string, et cetera. So an iterable is an object of any class that has defined the __iter__ method. In short, an iterable is something that can be used on the right side of a for loop. This was iterable; now, iterator. What's an iterator? That's an object representing a stream of data. Repeated calls to the iterator's __next__ method return successive items from the stream, and when no more data is available, a StopIteration exception is raised instead. Iterators are required to have an __iter__ method that returns the iterator itself. Let's check out an example. So this is a pretty useless iterator class. This iterator will return the numbers from one to three. As we already said, we have an __iter__ method that returns self. And in the __next__ method, we return the numbers from one to three, and if the next number is higher than three, we raise a StopIteration exception. Let's see how we can use it. First, we have to make an instance of our iterator class, and then we can manually call next on it. The first time we call next, it returns one. The next time, it returns two. The third time, it returns three.
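The iterator class on the slide probably looks something like this (a reconstruction; the class name is assumed):

```python
class CountToThree:
    """A pretty useless iterator that returns the numbers 1 to 3."""
    def __init__(self):
        self.current = 1

    def __iter__(self):
        # an iterator must return itself from __iter__
        return self

    def __next__(self):
        if self.current > 3:
            raise StopIteration
        value = self.current
        self.current += 1
        return value

iterator = CountToThree()
print(next(iterator))  # 1
print(next(iterator))  # 2
print(next(iterator))  # 3
# a fourth next(iterator) raises StopIteration
```

Because __iter__ returns self, the same instance also works on the right side of a for loop.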
And if we call it again, it will raise a StopIteration exception, which is really nice. Another way to get its values is to iterate over the iterator with a for loop. What the for loop does is: it calls the __iter__ method and then consecutively calls the __next__ method of the iterator, and when a StopIteration exception is raised, the loop finishes successfully. So let's get back to the yield keyword. What does that word mean? What does that word do? Let's talk about generator functions. If the body of a function contains yield, that function automatically becomes a generator function. And when we call the generator function, it returns a generator iterator. We can get the values from the generator iterator by calling next on it. And once the generator is exhausted, it will raise a StopIteration. And you can consume the values of a generator only once. Let's check out an example. This is a normal function, but instead of returning, we produce values by yielding them. And since the function has yield inside it, that function is a generator function. Let's see how we can use it. When we call the generator function, nothing will get printed — it will return a generator iterator. And from now on, when I say "a generator", I will be referring to the value returned after calling the generator function. So gen is a generator iterator, and I will call it a generator. How can we get the values? We have to call next. The function prints a string and yields one, and at the point where it yields one, the function pauses. It's not destroyed — it pauses, and it keeps its context. So when we call next again, it will resume from the place where it was paused, i.e. the first yield, then it will print that it's yielding the second value, it will yield two, and finally it will raise a StopIteration exception.
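The generator function being described is roughly this (names and messages assumed):

```python
def count_gen():
    print('yielding first value')
    yield 1
    print('yielding second value')
    yield 2

gen = count_gen()  # nothing is printed yet: we only get a generator iterator
print(next(gen))   # prints the first message, then 1; the function pauses at the first yield
print(next(gen))   # resumes after the first yield, prints the second message, then 2
# another next(gen) raises StopIteration
```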
So, generators have a send method. What does the send method do? Well, it resumes the execution of a generator and passes a value into it. Sending None with the send method is equivalent to calling next on the generator. And send will return the value yielded by the generator, just like next does. Let's see an example. Everything is the same except this line over there. What we've done is we've assigned the yield expression to a variable, and the value that we pass to the send method will be assigned to that variable. Let's see how it works. We have to call our generator function, which returns a generator iterator — which I call a generator — and we can call send with None, which is equivalent to calling next on the generator: it's going to print "going to yield value", and it's going to return 42. The next time, we can call the send method with a string, which will be assigned to the variable received, and it will be printed. And the next time we call send — with None or with a string, it doesn't matter — a StopIteration exception is raised. So in Python 3.3 a new feature comes into play: the yield from expression. If we loop over an iterator and yield each value, this is equivalent to yielding from the iterator. And when the iterator is a generator, it will be exhausted first — yield from will get every value from that generator — and then the execution continues. How is this helpful? It's helpful because yield from allows us to chain generators, which is really cool. Let's see an example. Okay, this is a little bit complicated. We have a first generator, which yields one, prints some string, and yields two. And in the second generator, we make an instance of that first generator function and we yield from it, then we print a string, and we yield three. So what happens when we try to run that code?
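The send example might look like this (a sketch; names and messages assumed):

```python
def echo():
    print('going to yield value')
    received = yield 42   # the value passed to send() lands in `received`
    print(f'received: {received}')

gen = echo()
print(gen.send(None))   # same as next(gen): prints the message and returns 42
try:
    gen.send('hello')   # resumes the generator; 'hello' is printed, then StopIteration
except StopIteration:
    pass
```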
We create the second generator, and the first call to next pauses the second generator at the point of the yield from, goes into the first generator, and one is returned. The next time we call next, the string from the first generator is going to be printed — we are in the middle of the first generator — and two is going to be returned. And the next time we call next, the string from the middle of the second generator is printed, three is returned, and then it raises a StopIteration exception. Knowing the capabilities of generators, let's take a look at the definition from the beginning. Coroutines are computer program components that generalize subroutines for non-preemptive multitasking, by allowing multiple entry points for suspending and resuming execution at certain locations. It looks like a coroutine is an object which implements the methods of a generator, and so, essentially, all generators implement the coroutine interface — but generators were not meant to be used in that fashion. So with its arrival in Python 3.4, asyncio provides a useful decorator in order to solve this issue. What this decorator does is: it is used to label a function as acting as a coroutine that is meant to be used with asyncio, and asyncio requires all generators that are going to be used with it to be decorated with this decorator. So let's check out the asyncio example from the beginning, which was using Python 3.6.3. This is the example — same thing. Let's compare it with how asyncio was used back in Python 3.4, when asyncio originated. The only difference here, as you can see, is that we are not using async def, but a normal function, which is a generator, because it has yield from inside; we decorate it with asyncio.coroutine, and instead of awaiting, we use yield from.
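The two chained generators being walked through are roughly these (names and strings assumed):

```python
def first():
    yield 1
    print('in the middle of the first generator')
    yield 2

def second():
    # delegate to first(): its values are yielded as if they were ours
    yield from first()
    print('in the middle of the second generator')
    yield 3

gen = second()
print(next(gen))  # 1: second() pauses at the yield from, first() yields 1
print(next(gen))  # first() prints its message, then yields 2
print(next(gen))  # second() resumes, prints its message, then yields 3
# another next(gen) raises StopIteration
```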
And with generator support and an event loop in the form of asyncio, at the point of Python 3.4 there was enough to support asynchronous programming in Python, in the form of concurrent programming. What does concurrent programming mean? It means that with concurrent programming we write code that is supposed to be executed independently of other parts, but it all executes in a single thread — I mean, the event loop is running in a single thread. We create tasks and schedule them in the loop; each task is then monitored by that loop, and when the task is done, its result is sent back to the coroutine via the send method that we already looked at. And in Python 3.5, the types.coroutine decorator comes, and what it does is: it flags a generator as a coroutine, just like asyncio.coroutine does. And it looks like finally we have arrived. I guess all of you are here for this slide and the next one. The async keyword goes before def in order to show that a method is asynchronous, and async def is used to define a function as being a coroutine. The key thing that async def and the types.coroutine decorator do is that they tighten the definition of what a coroutine is. They take coroutines from simply being an interface to an actual type. And if you use the inspect module and use the iscoroutine function, it will only return True for the objects created by functions that are defined with async def. And this makes the distinction between a generator and a generator that is meant to be a coroutine a lot more strict. So, the next slide: await. The await expression also comes in Python 3.5. And the await keyword is only valid inside async def functions. Await operates much like yield from, but the acceptable objects for an await expression are a little bit different. When you call await on an object, that object needs to be awaitable. What does that mean?
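A sketch of this distinction (note: the `asyncio.coroutine` decorator mentioned in the talk was removed in Python 3.11, so this sketch uses `types.coroutine`, which still exists):

```python
import asyncio
import inspect
import types

@types.coroutine
def old_style():
    # pre-3.5 style: a generator flagged as a coroutine, so await accepts it
    yield from asyncio.sleep(0)

async def new_style():
    await asyncio.sleep(0)

old, new = old_style(), new_style()
print(inspect.iscoroutine(new))  # True: async def creates a real coroutine object
print(inspect.iscoroutine(old))  # False: still a generator object, despite the flag
new.close()  # avoid "coroutine was never awaited" warnings
old.close()
```

So `inspect.iscoroutine` distinguishes native coroutine objects from flagged generators, which is exactly the tightening described above.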
Well, an awaitable object is an object of a class that defines the __await__ method. And the __await__ method should return an iterator, but that iterator should not be a coroutine itself. Coroutines are awaitable objects, and they can be used on the right side of an await expression. The key thing here is that await will not accept a generator that is not flagged as a coroutine — even though generators kind of support the coroutine interface. This is the key thing, because the await expression protects us from accidentally using a generator instead of a coroutine. But there's a catch, as always: you cannot use yield from in an async def function. And the only way to return a value from an async def function, which we can call a coroutine, is by using return or await. So async def allows us to distinguish coroutines from generators, and the await expression makes sure that one does not mix and match objects that merely match the same interface — I mean, the API of coroutines — and makes it more clear that these objects are awaitable, that these objects are waiting for coroutines to finish. And the await that we saw is the switch point. This practically means: go do something else while I'm waiting for something to finish, and when it's done, come back here and continue. Async and await in general make it more clear that the code is asynchronous, so async methods are not confused with generators. And the follow-up question is: what's next? Most of all, async and await provide us an API for asynchronous programming in Python, and the cool thing is that people are using it. David Beazley made an awesome talk, and he made Curio, which is an alternative to asyncio. And from what I saw in his keynote, which I highly recommend you to watch, what he did there was based on Curio. And Nathaniel Smith made Trio, which is another library for asynchronous programming.
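An awaitable class can be sketched like this (a minimal, somewhat artificial example — real code would usually just await a coroutine or a Future; the class name is made up):

```python
import asyncio

class Nap:
    """A minimal awaitable: __await__ must return an iterator."""
    def __await__(self):
        yield        # hand control back to the event loop once
        return 42    # this becomes the result of `await Nap()`

async def main():
    return await Nap()

print(asyncio.run(main()))  # 42
```

Because `Nap` defines __await__, an instance of it is valid on the right side of an await expression, just like a coroutine.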
And the cool thing is that asyncio is a core library in Python, and it's adapting and it's evolving, and Curio and Trio are helping the asynchronous Python world to evolve. And for me, what it means is that I will have to make another talk, because as a software developer, my job is to write good code and to improve it and evolve it and support all the functionalities. This is what I do in my work project, and this is what the Python core developers are doing with the language that I'm using in order to write my project at work. And that's the definition of software development, actually. I don't know if I said it, but I'm a physics graduate — I haven't been to a software development school. When we learn new things, we apply them and we refactor our code, and while we refactor the code, we learn new things, and we repeat the process again. Learn something new, refactor, learn something new, refactor, evolve, improve. And personally, I observe how Python is evolving with great anticipation, and I really can't wait to see what's next. Thank you.

Thank you, Pavlin.

Yeah, you can find all the code snippets in this repository.

Okay, we have time for one, maybe two questions. Anyone? Here.

Okay. Hi. Great talk. If we want to call out to something outside of Python, like C++ code bound with something like PyBind, can we also use asynchronous functions, and how do we go about doing that?

Well, let me check if I understood the question correctly. You are asking how to call asynchronous code from synchronous code?

Asynchronous from synchronous, or the other way around? I'm asking how to make an asynchronous function that calls out outside of Python. It calls something outside. It could be making a request.

The thing that it calls — the asynchronous function has to call another asynchronous library. I mean, your best bet is to have a library written in an asynchronous manner.

So that's what I'm asking. How do I do that?
You have to write your library to be asynchronous.

And how do I do that?

I don't know.

Okay, any other questions?

Hi, thanks for the talk. I'm curious if there's a use case for using the generator way to do it, with the decorators, or are those just obsolete and we should always use async and await?

At the current moment, you should always use async and await. I don't have much experience with the older style of asynchronous code — I haven't written a lot of code with it — but at the moment you have async and await, and you should use that. I mean, the generators were there because they were needed, but they are not needed anymore.

Okay, one more quick question. Here.

Okay, so a similar question about use cases. I was curious, what is the use case for using yield from as opposed to using itertools.chain?

Can you repeat?

itertools.chain versus yield from — what's the use case for using yield from?

I'm not sure. I don't know. I can't answer that question.

Thanks. Okay. So let's thank