Okay, hi everyone. Let's talk a little bit about the kind of ideas we can steal from Haskell to make better libraries. So I'm George, you can find me on Twitter, and as a way to procrastinate I like learning new stuff, learning new languages. This is a picture of me during the Django Girls workshop in Paris, where I was coaching, and while I was preparing this presentation I was working on a bit of Haskell code, so this is my face when I work on Haskell. I learned a few interesting things, and I'm here to talk to you about some of the ideas I discovered.

So last year at EuroPython there was a keynote about Haskell. It was called "What Python Can Learn from Haskell". It was about Haskell the language, the strongly typed language with the huge compiler, and, well, Python has learned from Haskell in the meantime: we now have type annotations, and Guido talked about them yesterday. So we have learned from Haskell, but talking about Haskell the language is very last year, a bit outdated, and we are not going to talk about the language. We're even going to suppose that it's the exact same language as Python. Let's say Haskell is just a dialect of Python with slightly different syntax, and we don't care at all.

What is interesting to us today is Haskell the ecosystem: the community, and the kind of libraries it produces. On the left you have the Python package index, which you probably know, and on the right you have the Haskell equivalent, Hackage, the Haskell package index. So let's see how the communities differ, how the mindsets differ, and what kind of ideas we can get from that.

So one of the ideas you'll find very often in the Haskell community is the idea of design space. The idea of design space is that for a given problem you have a lot of solutions, a full space of solutions for one single problem. Some of those solutions we know, and some we don't know yet.
And one idea that is very present in the Haskell community is that it is worth exploring the design space in search of new solutions that we don't know yet, and that might be more interesting than the ones we have. Maybe those are different solutions with different trade-offs that work better in some situations, or are more general, or faster to run, or just easier to learn and to use. So it's really the idea that there are spaces left to discover. In the Zen of Python we say there should be one, and preferably only one, obvious way to do it. And it's like Haskell adds: okay, let's keep looking for that obvious way. So you'll often find new libraries for stuff that already exists, but which are different, make different assumptions and choose different trade-offs. And each of these new libraries tries to compare itself as much as possible with the existing ones. So it's not just reinventing the wheel, or doing something new because the other stuff was not invented here; it's really searching for something new, better, more interesting, or just trying.

And that is not completely foreign to the Python ecosystem, because a few years ago Kenneth Reitz released the Requests library, which is a library for making HTTP requests. There were already many HTTP client libraries at the time, but this was a new one that tried to be as easy to use as possible. He has since launched a movement called Python for Humans, where he tries to encourage all of us to follow the idea of making libraries that are as simple and easy to use as possible. And in a way that's a kind of exploration of a new part of the design space, a part where ease of use is the most important thing.

So we are now going to take a look at a few Haskell libraries, to see some parts of the design space we don't usually see in Python but that are explored in the Haskell community.
So here you have on the left the top downloaded libraries in the Python package index, and on the right the top downloaded libraries in the Haskell package index. When I said Haskell and Python are similar, I wasn't lying: you see the same kinds of packages downloaded in both cases. The most downloaded package is a JSON package, in both languages you have multiple packages trying to solve the packaging problem, and you have some HTTP client libraries. But on the Haskell side you'll find a few libraries that do not have a Python equivalent. Those are lens, attoparsec and conduit, and I'm going to talk a little bit about them.

So the first one is attoparsec. I had to use attoparsec in a little project of mine, which is a Slack bot that answers movie quotes. Someone types a line from a movie, any line, and the bot answers with the next line. My girlfriend insisted that it's not a real-life use case, so let's just call it a real-life use case. The input of the bot is subtitle files, the ones you download when you download a movie, so I really needed a way to parse those subtitles. And there is actually a library in Haskell that does that, which is called subtitleParser, and this is the full definition of the package: when you open the documentation, this is all you will see.

Just a little word about how to read this. When you see something with a double colon, like parseSrt :: Parser Subtitles, it means there is something called parseSrt whose type is what's written on the right: a Parser of Subtitles. Subtitles is defined on the right as just a list of Lines, and a Line is an object with the following properties. And all that interests me in this package is that Parser of Subtitles. I just told you what a Subtitles is, and a Parser is a kind of object that is defined in the attoparsec package. The way to use it is with a function which is also defined in attoparsec, the package where the subtitleParser types live.
That function is called parseOnly, and it takes a Parser of Subtitles and a ByteString, which is just a string of bytes, and it returns either an error message or the Subtitles I actually want. And that's it: I use the parser from this library and the function from the other library, and I'm done. To be fair, this is not exactly what the function looks like, because it's polymorphic in the type of object I want to parse: it takes any Parser and a ByteString and returns either an error message or whatever the parser is supposed to parse.

And that is not the only way to use the parser. attoparsec provides a few other ways, like incremental parsing. Incremental parsing is for when you have a very big set of data to parse and you don't want to put it all in memory before parsing it, so you feed it to the parser a little bit at a time. And I can do that with the same parser I used in the previous function. And yet another way is to use that parser as a building block for bigger parsers. There are a lot of functions I can use to do that, and here are a few examples: I can use many to take a parser and get a parser that parses many occurrences of whatever the single parser parses, or I can combine two parsers with the alternative operator to get a parser that parses either one thing or the other, trying both and giving me what matches. And all these ways to use the parser are given to me by attoparsec; the person who wrote subtitleParser didn't have to care about any of that. That person wrote a 40-line library to parse subtitles, and I get a lot of ways to use it for free.

And there is a whole lot of other parsers built on the same principle. You'll find in the package repository a lot of packages like this: a parser of TSV, or a parser of JSON values, or a parser of crontab if you want to read crontab files, or a parser of email addresses that will accept a string only if it matches an email address.
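To make the combinator idea concrete, here is a toy sketch in Python rather than Haskell. This is not attoparsec's actual API, just the shape of the idea: a parser is an ordinary value, and generic functions like many, an alternative combinator, and a run function in the style of parseOnly build everything else out of it.

```python
# A toy parser-combinator sketch (not attoparsec's real API, just the idea):
# a parser is a function from input text to either None (failure) or a
# (value, remaining_input) pair. Generic combinators then build bigger
# parsers out of small ones, like attoparsec's many and its alternative.

def char(c):
    """Parser that matches a single expected character."""
    def parse(s):
        if s.startswith(c):
            return c, s[1:]
        return None
    return parse

def many(p):
    """Parser that applies p zero or more times, collecting results."""
    def parse(s):
        results = []
        while True:
            r = p(s)
            if r is None:
                return results, s
            value, s = r
            results.append(value)
    return parse

def alt(p1, p2):
    """Parser that tries p1 and falls back to p2 (the alternative combinator)."""
    def parse(s):
        return p1(s) if p1(s) is not None else p2(s)
    return parse

def parse_only(p, s):
    """Run a parser on a whole input, in the style of attoparsec's parseOnly."""
    r = p(s)
    if r is None:
        return "error: no parse"
    value, _rest = r
    return value

ab = many(alt(char("a"), char("b")))
print(parse_only(ab, "abba!"))  # ['a', 'b', 'b', 'a']
```

The point is the same as in the talk: whoever writes one small parser (like char here) automatically gets many, alt, and parse_only for free, because they all share one interface.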
Or if you need to work with some other format, you'll probably find a parser for that too. And those are different libraries made by different people, and in every single one of those libraries there's only one thing that is useful and defined: the parser of the thing you want. And each of those libraries can be used with each of the ways to call a parser I just showed you. So in every language a good library simplifies the implementation, helps you write fewer lines of code. But in this situation, a good library also simplifies the interface: it helps you write less documentation, expose less interface, and in the end have fewer things to know before using the library. So we have a generic solution, attoparsec, the parsing solution, and a lot of specific building blocks that you can just plug in and use without learning anything new.

The second library I would like to show in our exploration of that part of the design space is conduit. conduit is a streaming library. Streaming meaning we have a long stream, a sequence, of objects, of something, and we want to handle them, but we don't want to load them all into memory while we do so. There are three concepts in conduit: the producers, which just produce a stream of values; the consumers, which just consume a stream of values and do something with it; and the conduits, which both consume on one side and produce something else on the other side. And once again you will find a lot of those conduit constructs in a lot of different libraries. For example, you can find somewhere on Hackage a function that gives you a producer from a socket, a normal standard Unix socket. And you will find a conduit that is used to decompress a stream of bytes. And a sink, meaning a consumer, that just writes whatever it gets into a file.
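These three roles can be roughly mimicked with Python generators. This is only an analogy I'm adding here, not conduit's real interface; the names are made up for the illustration:

```python
# A rough Python-generator analogy of conduit's three roles (not the real
# conduit API): a producer yields values, a conduit transforms a stream,
# and a sink consumes a stream. Pieces written independently compose as
# long as they agree on "iterable in, iterable out".

def source_list(items):
    """Producer: yields a stream of values."""
    for item in items:
        yield item

def double(stream):
    """Conduit: consumes one stream, produces another."""
    for x in stream:
        yield x * 2

def sink_sum(stream):
    """Sink: consumes the whole stream and returns a result."""
    return sum(stream)

# Plugging the parts together, like conduit's pipe operator:
result = sink_sum(double(source_list([1, 2, 3])))
print(result)  # 12
```

Each of the three parts could live in a different library written by a different person, and they would still compose, which is exactly the property being described.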
And conduit provides an operator which is a pipe-plugging operator, and you just take the different parts you have found in different libraries, plug them together, and you get this program that actually reads from the socket, decompresses the stream, and writes it into a file. And every single one of those parts was written by different people and can be found in a different library. Another conduit construct you can find is a function that just takes the parser we've seen five minutes ago and turns it into a conduit. So now, with the 40-line little subtitle-parsing library, I have a high-performance subtitle streaming library, which is not very useful, but still. And I can read from a subtitle file, parse every line as it goes, and post it straight to IRC; there is actually an IRC consumer in the package repository. Don't do that, you're going to get banned from any IRC server, but you could try. So once again we find the same thing: a general solution for streaming, which is conduit, and a lot of building blocks that all have the same interface and are exchangeable and reusable, built by a lot of different people.

And the last library I'm going to show, still in this exploration, is lens. The problem lens tries to solve is data manipulation. So here is some data we can manipulate. It's a blog post, because everyone loves examples with blog posts, whose title is "Made-up Example Considered Harmful", and it has an author and some comments. And what lens gives you is an abstraction of the idea of manipulating a piece of the data. In this case we have title, which is a lens that points to the title of a blog post. And I can use the view function with the title lens and the blog post, and it's just a getter: it returns the value pointed to by the lens. But I can also combine lenses with the dot, so the syntax is quite familiar. I have a lens that points to the author of a blog post, and a lens that points to the name of an author.
I combine them with the dot and pass them to the view function, and I get the name of the author. And I can use them as setters: just use the set function, pass it a lens, a new value, and the blog post, and it returns a new blog post with the value changed. And where it gets interesting is that lenses can also be used as getters and setters for multiple values. In this case we have comments, which is a lens that points to the list of comments, but each is not exactly a lens, and that means the resulting comments.each points to every single comment in that list. And I can still combine it with the author lens to get something called a traversal, which points to every single comment author in the blog post object; I use the toListOf function to get that list.

The interesting part is that lenses and traversals are values, so if they are values I can store them in a variable, which I call commentContents. So here is the traversal that points to the content of every single comment in the blog post. I can use toListOf on that value to get the list of comments, and I can use set to change the content of every single comment to blah blah blah, because you shouldn't read the comments anyway. And if this is a value, I can put it in a variable, I can have a function return it, and I can have a library return it, so I can put an abstraction of data manipulation in a library.

For example, there is one such library which lets you manipulate JSON with lenses. It provides just a handful of lenses: values, which points to every single value in a JSON list, and key, which takes an argument, an attribute name, and points to that attribute of a JSON object. So with the same manipulation functions provided by lens, you can now manipulate JSON directly. Another example that is provided is HTML.
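Before looking at the HTML example, the core idea, a lens as a first-class getter/setter pair that composes, can be sketched in Python. This is just an illustration; the real lens library is implemented very differently, and the dict-based key lens here is made up for the example:

```python
# A tiny sketch of the lens idea (nothing like the real lens library's
# implementation): a lens is a first-class pair of functions, a getter and
# a setter, and composing two lenses gives you a lens again.

class Lens:
    def __init__(self, get, set):
        self.get = get          # focus on a piece of a structure
        self.set = set          # return a copy with that piece replaced

    def compose(self, other):   # lens . lens, as in author.name
        return Lens(
            get=lambda s: other.get(self.get(s)),
            set=lambda s, v: self.set(s, other.set(self.get(s), v)),
        )

def view(lens, s):
    return lens.get(s)

def set_(lens, v, s):
    return lens.set(s, v)

# Hypothetical lenses into plain dicts, for illustration only:
def key(name):
    return Lens(
        get=lambda d: d[name],
        set=lambda d, v: {**d, name: v},
    )

post = {"title": "Made-up Example Considered Harmful",
        "author": {"name": "George"}}

author_name = key("author").compose(key("name"))
print(view(author_name, post))                            # George
print(set_(author_name, "Anon", post)["author"]["name"])  # Anon
print(post["author"]["name"])                             # George (unchanged)
```

Because author_name is an ordinary value, it can be stored in a variable, returned from a function, or shipped in a library, which is the property the talk emphasizes.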
So we see HTML as a tree with a lot of nodes, and we again abstract the data manipulation into objects, those lenses, which let us look into the HTML. We have a traversal, allNamed, which points to every single node in the HTML tree that has a given name, and we combine it with the contents lens, and this line is just all we need to define something that can get every single title in an HTML document. And so, one last time, we have a general solution for data manipulation and a lot of building blocks in different libraries. So that's really a theme, something we see a lot: one interface for a lot of different libraries.

So why am I telling you all that? Not just because it's nice, but also because I think borrowing, stealing ideas from other communities and other languages is a good way to produce excellent results. For example, I talked about Requests a little earlier. Requests provides a very simple interface to make HTTP requests: it provides a get function which takes a string, the address of what you want, and gives you a response object, which is exactly what you need. And wreq is a Haskell library which is heavily inspired by Requests. It provides the exact same get function with the same interface, and it combines it with lens for the data manipulation part: where in Requests you would read the response headers, in wreq you use the responseHeader lens to get the content type, but it's exactly the same principle. And this is one of the best Haskell libraries for making HTTP requests, and it's stolen shamelessly from Python, because the Python library is an excellent idea. So take ideas from other languages and libraries.

And in the other direction, there is hypothesis, which is a library for property-based testing.
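The idea behind property-based testing can be shown with a hand-rolled, stdlib-only miniature. The real hypothesis library does far more (smart generation and shrinking of counterexamples), and the helper names here are invented for the sketch; treat this only as the concept:

```python
# A hand-rolled miniature of property-based testing (the real hypothesis
# library is far more capable). Instead of one hand-picked example, we
# state a property and check it against many generated inputs.

import json
import random
import string

def random_text(rng, max_len=20):
    """Generate a random printable string (a stand-in for a strategy)."""
    return "".join(rng.choice(string.printable)
                   for _ in range(rng.randrange(max_len)))

def holds_for_all(property_fn, gen, n=200, seed=0):
    """Check a property against n generated inputs.
    Returns a counterexample, or None if the property always held."""
    rng = random.Random(seed)
    for _ in range(n):
        value = gen(rng)
        if not property_fn(value):
            return value
    return None

# The property: decoding an encoded value gives back the original.
counterexample = holds_for_all(lambda s: json.loads(json.dumps(s)) == s,
                               random_text)
print(counterexample)  # None: the round-trip property held for every input
```

With hypothesis itself you would express the same round-trip property with a decorator and a text strategy, and it would also minimize any failing input for you.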
It is inspired by a library in Haskell called QuickCheck, and the main idea is that it's a bit like unit testing, but instead of writing one example and checking your code works on that example, you just state a property: I have decode and encode functions, and I expect that if I encode something and then decode it, I get the initial value back. And hypothesis will make sure this is true for every input it can generate. It has really good generation characteristics, and it will find every corner case. This is an excellent testing library in Python, I can use it right now, it's awesome, and it was inspired by looking at what Haskell does.

So as a conclusion, to leave more time for questions: I encourage you to explore the design space, look at what exists and what doesn't exist, and make new stuff to solve existing problems, but differently. One thing we don't usually explore in Python is factoring library interfaces: make libraries that use other libraries for their interface, so that a new library is just one single little thing you don't have to learn; you can just use it with what you already know. And as a final conclusion: go learn something new, Haskell, anything that's unusual, see what part of the design space it explores and whether it could fit into Python, and come back to EuroPython next year and make a presentation about what you learned in that new language. Thank you.

Great talk. Does anybody have any questions?

It seems to me that in Python, when several libraries have the same interface, a PEP comes up that sort of standardizes that interface: for databases we have a PEP that standardizes an interface, for HTTP servers we have WSGI, and now there's asyncio, which kind of standardizes all the async stuff. Do you have any comments on that? Do you think the Haskell way is better?

The thing is that in Haskell you can just release a new library that will be that common interface, whereas in Python you have to go through the PEP process,
which means that I don't think I could create a PEP myself, but I can create a library for a common interface. The other, main reason why we do it that way in Python is that Haskell has a compiler that is able to check that the interface is respected by the library. In Python, if you get a library that doesn't have the exact interface you expect, you'll just get an error at runtime. So we need a PEP because we need a strong source of definition, a specification, of how the interface should behave.

Thank you, I was thinking about the same thing. Without a strong... I mixed them up. Without a possibility to define interfaces, or protocols, or however you call them, do you even see a possibility to do something like this in Python? Or is it just a strength of Haskell and those types of languages, and Python can't do it but can maybe do other things better?

The compiler checking types certainly helps, but it's not necessary. In fact we already have a few of those common interfaces in Python, the ones you mentioned: the database API, or WSGI. And plugins for web frameworks all conform to one single interface, and that works. So I think it is completely possible, with what we currently have in Python, to have one single interface that can be reused, as long as that interface is specified and doesn't change.

My question is kind of asking you to validate something. Haskell has this a lot: a design pattern that in other software is just a best practice is encoded in a type. Monads are one example, lenses are another example. Do you think the Python world would benefit from having a standardized way to do things, standardized patterns, always write classes in this way or something? I think that's a strong side of Haskell, so what do you think about this?

I'm not sure I completely get what you said, sorry?
It's a little bit the same answer. One of the ideas hidden behind what I talked about is finding common patterns in different stuff and giving them a common interface, which is not something we do that often in Python. Or maybe we do it, but it emerges more organically: multiple libraries converging towards a similar set of interfaces, rather than one single formalized interface.

On the theme you drew out about these generic libraries that provide a general solution and also the means of composition for the specific libraries: do you think there's something in Haskell that makes that more natural or easier to do? Is there something, like language features of Python, that we should be using more, or less, in order to make that happen?

As I said, I think the compiler telling you right away whether you have the right interface helps, but I think it's also in large part a question of mindset: that's how people do it. The tools help, sure, but it's also the community and its way of doing things.

Any more questions?

Do you use Python or Haskell for
a project nowadays? Because I don't work professionally with either Python or Haskell; most of my projects are personal projects. It's more: take one technology and build something on it, so I tend to alternate. Still, I always use Python for the small little things I need done in half an hour, and to build bigger projects, or projects I think will get bigger, the way Haskell works kind of helps to make something small and then grow it.

Is the practice in Python of being Pythonic holding the Python community back from looking into other communities for solutions to common problems that all programming languages and communities have?

Can you repeat your question?

So in Python there's this notion of code being Pythonic, and good Python code is Pythonic, people say. Is this wish to have a Pythonic solution holding us back in terms of looking for solutions in other languages that we could import back into Python?

I don't know. You have two meanings of Pythonic: a low-level Pythonic, which is about a single line, and a higher-level Pythonic, which is about the architecture of a solution, a solution we would call Pythonic, with elegant structures and elegant approaches to problems. So maybe the fact that the name is Pythonic prevents us in some way from looking outside, but no, I think you can find a solution in another language and find it just as Pythonic as if it was written in Python.

One more quick question, coming from me: did you like the talk?