Thank you, thanks for coming. So this talk is about writing domain-specific languages. I think the first question we need to answer is: what is a domain-specific language? A domain is a business domain, a problem space. So a domain-specific language is a kind of language that is more suited to solving problems in that space than a general-purpose programming language. Python is a general-purpose programming language; a domain-specific language... well, let's meet a few examples. SQL is a domain-specific language. It's much more efficient for writing a query against a relational database than constructing Python objects that represent the things you want to query and the tables you want to query from, and so on. CSS: domain-specific language. Regular expressions. Who knew regular expressions were a domain-specific language? configparser, also in the standard library, is a domain-specific language for configuring things, intended in this case so that people who don't know Python can change configuration. String formatting. You've got two domain-specific languages for free here: the format method on a string is a domain-specific language that will interpolate the things in curly brackets, and datetimes have a special __format__ method, which means you can use the strftime syntax within the curly brackets. reStructuredText: that's a domain-specific language for writing technical documentation. So what are the problems that those domain-specific languages are trying to solve? One, I think, is that the readability of code, or of systems, using those languages is going to be better than a nest of Python expressions and syntax. And as a corollary, it's going to reduce the repetition in the code; as you reduce the repetition and you reduce the amount of syntax, you're likely to make fewer errors, so the writability of the code improves.
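For instance, both of those formatting mini-languages fit in a line each (the values here are made up, just to show the syntax):

```python
from datetime import datetime

# The format-spec mini-language interpolates, aligns and rounds values...
print("{name:>10} scored {score:.1f}".format(name="duck", score=9.75))

# ...and datetime's __format__ method lets strftime syntax appear
# inside the curly brackets.
when = datetime(2015, 9, 18, 14, 30)
print("talk starts at {:%H:%M on %d %B}".format(when))
```

So a format spec after the colon is handed to the value's own __format__ method, which is how each type gets to define its own little language.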
One of the other reasons that we might want to do this is to sanitise or manipulate the data as it's loaded into memory, or the structure as it's loaded from a DSL into program memory. And then, as we saw, there's a use case where non-technical people, or people who aren't familiar with Python, are going to be manipulating these things. As Pythonistas, I think we have more constraints on what exactly we're going to want, because we want to use Python as our general-purpose implementation language, and we just want to use DSLs to simplify and improve our Python. So we need to use Python for implementation; we need to trade off where we want to inject extra expressiveness into our Python code, and sometimes we're going to use triple-quoted strings to put in a small structure; more on that later. And we're going to want to work with the kinds of patterns that are familiar to Python developers, so you don't want to surprise people by completely changing the semantics of square brackets or something, if you're working in an environment where you're going to be switching between those two contexts. When we talk about building a DSL, you're generally going to be converting some piece of text, which might be within your program code but could be from another file, into a structure in memory. That looks a bit like this. This is a made-up language that you can imagine is for a service that greps your logs and does things when a regular expression matches. In this case, I'm thinking of this as a language intended for non-Python people to write, so this is the user-facing interface of a Python system. We're simplifying the configuration of this, and removing bugs from it, by putting it into a language that more easily expresses what we want to say, rather than confusing people with Python problems. So the act of turning this plain text into a structure is parsing.
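The slide itself isn't captured in this transcript, but a sketch in the same spirit, with a completely invented syntax and a minimal parser for it, might look like this (every name and rule here is hypothetical, not the language from the talk):

```python
def parse_rules(text):
    """Parse a made-up 'watch your logs' DSL into a list of rule dicts."""
    rules, current = [], None
    for line in text.splitlines():
        if not line.strip():
            continue  # skip blank lines
        if not line.startswith((" ", "\t")):
            # An unindented "on <regex>" line starts a new rule...
            _keyword, _, pattern = line.partition(" ")
            current = {"pattern": pattern, "actions": []}
            rules.append(current)
        else:
            # ...and indented "<action> <argument>" lines attach to it.
            action, _, arg = line.strip().partition(" ")
            current["actions"].append((action, arg))
    return rules


rules = parse_rules("""\
on ERROR.*timeout
    notify ops@example.com
    restart web
""")
print(rules)
# [{'pattern': 'ERROR.*timeout',
#   'actions': [('notify', 'ops@example.com'), ('restart', 'web')]}]
```

The point is just the shape of the thing: a few lines of friendly text on the way in, a plain Python structure on the way out.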
So what we're mainly talking about with domain-specific languages is parsing, and this talk could have been called "a guide to parsing". But I think once we draw out what we want to get to with domain-specific languages, we can construct a more coherent narrative that will help us understand how to design these things and work with them. The first approach to getting a parser, I guess, is that there is one built into Python. Any code that you write in Python will be parsed and will have structure in its own way, or when it executes it will construct that structure. So we can use all of Python's dynamic tricks to construct structure in a more elegant way, but that doesn't necessarily execute with Python semantics. The first metaprogramming approach I'm going to talk about is metaclasses. You know what a class does in Python: you define it, and then the object itself is a factory for instances of that type. So here's a class called Duck, and when you call Duck it returns a Duck instance, and you can call a method on the instance. That's Python semantics. With metaclasses you can completely change what that class definition does, and what the Duck object that gets assigned to the name Duck by the class definition will do. So here's an example. This is actually something I wrote a few years ago, and I guess it's kind of similar to Scrapy. What we're going to do here is define a number of XPath expressions that will match against an HTML document, or something that has been retrieved from a URL. And if you look at the example where it's actually called, it's not even returning a Review instance: it's going to return a dictionary. Who knew you could do that in Python? A very small number of people.
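As a rough sketch of the trick, not the actual code from the slide: all the names here (Fact, ScraperMeta, ReviewScraper) are invented, and the real XPath evaluation against an lxml document is swapped for a plain dictionary lookup so the sketch runs standalone:

```python
class Fact:
    """Declares a value to extract; stands in for an XPath expression."""
    def __init__(self, key, clean=None):
        self.key = key
        self.clean = clean

    def evaluate(self, document):
        # The real thing would evaluate self.key as XPath against an
        # lxml document; here we just look it up in a plain dict.
        value = document[self.key]
        return self.clean(value) if self.clean else value


class ScraperMeta(type):
    def __new__(mcs, name, bases, namespace):
        # Runs when the class body ends: gather the declared Facts.
        cls = super().__new__(mcs, name, bases, namespace)
        cls._facts = {k: v for k, v in namespace.items() if isinstance(v, Fact)}
        return cls

    def __call__(cls, document):
        # Calling the class returns a plain dict, not an instance.
        return {k: fact.evaluate(document) for k, fact in cls._facts.items()}


class Scraper(metaclass=ScraperMeta):
    pass


class ReviewScraper(Scraper):
    title = Fact("h1")
    rating = Fact("span.rating", clean=lambda v: v.rstrip(":"))


result = ReviewScraper({"h1": "Great duck", "span.rating": "5:"})
print(result)  # {'title': 'Great duck', 'rating': '5'}
```

The class statement here is pure declaration; the metaclass decides what it means.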
OK, so this is an approach where you're creating one piece of functionality, but you're doing it in such a way that you can declaratively specify parts of the implementation, and there's even a part where you can inject custom functionality, like cleaning a value by stripping a colon. So how would we implement that? Well, first of all you need Fact; the facts are subclasses of Fact, but they basically all take an XPath expression, and given an lxml document you can then just evaluate the XPath and return the result. As for the metaclass itself, there are two phases at which it's going to inject code into the class definition. The __new__ method is called at the point where your class definition ends, at the point where the indentation goes back and the name gets assigned; in this case it's just going to collect up the instances of Fact that were defined in the class namespace into a dictionary, for easier use later. And then __call__ overloads the call, the parentheses operator if you like, on the class object itself. The default implementation will return an instance, obviously, and by changing this completely we're just returning a dictionary: this is going to evaluate all of those XPath expressions against whatever we pulled down from the URL, and then run any cleaners and so on. And then here's how you create the base class. You could assign the metaclass on every scraper, but it's somewhat nicer to create a base class and then extend from it, and that's all you need. So, moving on, here's another approach for DSLs in Python metaprogramming, and this is using context managers to construct structure. There is a library that does this; I don't know what it's called off the top of my head. I don't like this that much; it's a bit verbose and forced, isn't it? Also,
there's an implementation problem here that you have to be careful about, which is that this is not thread-safe: it requires global state. All of these functions that have been defined here are somehow storing the state that they are constructing a result document in, unless they're outputting it to a stream immediately, and that brings concerns about thread safety and race conditions into the mix. Another approach is operator overloading. This is Django's ORM. There's obviously the DSL where you keep doing double underscores and you're constructing an expression, which is frankly awful, but what I'm going to talk about here is the use of the pipe to mean OR: in Python you can't overload the logical OR operator, but you can overload the bitwise OR operator, which is the pipe, and lots of things do this. There are problems with this. This was spotted in a real code base, and there's some horrible stuff in here, frankly. It's using the ampersand operator to mean not bitwise AND but logical AND, and it's overloaded the comparison operators, so greater-than-or-equals constructs an object that is a representation of the expression; I guess it's deferring the act of doing that comparison. And we've got this amazing in-list thing: you can spell an infix operator like that in Python. It turns out that is actually a left-shift operator followed by a right-shift operator, and when you take the spaces out it looks like an infix operator, and you'd implement it something like that. But don't look at that, because: don't do it, please. The place where this really breaks down is that, as I said, we can't overload the logical AND and OR operators, we can overload the bitwise AND and OR operators, but those are different: they have different operator precedence. That means that
whereas the logical AND and OR are at the right precedence to make simple, obvious code work, the ampersand binds more tightly, if you will. I've actually seen cases where people naively wrote the top one and meant the bottom one: wrote the top one and meant what was on the previous slide, bracketed like that, and obviously that doesn't work. So that's not very readable. What are we doing here? We're aiming for readability, but we've constructed a situation where you need to put in so many brackets, and get the brackets right, that you're not really achieving the readability we set out for. Here's another example, a different but related example from our code base. The top example is how it's supposed to work: you have a table object, and accessing the name of a column on a table also returns a table, and the intention is that you can filter down the table by an expression in square brackets like that. First you compare, with an overloaded comparison operator that returns an array of the rows that you want, and then you can select using that list. But that means that when you compare a table to a value, it returns a list that is the same length as the table in number of rows. And that means we've had people writing tests where they're trying to compare the return value of a function to None; if you did "assert x is None" that would work, but the equality test passes even if you have a table.
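That test pitfall is easy to reproduce in a few lines; here's a hedged sketch, with an invented Column class standing in for the real table type:

```python
class Column:
    """Invented stand-in for the table type: comparison returns a row mask."""
    def __init__(self, values):
        self.values = values

    def __eq__(self, other):
        # Overloaded to build a per-row mask instead of returning a bool.
        return [v == other for v in self.values]


col = Column([1, None, 3])
print(col == None)   # [False, True, False]

# A non-empty list is truthy, so this assertion passes even though
# col is very much not None. "assert col is None" would fail correctly.
assert col == None
```

The comparison no longer answers the question the test is asking, and nothing warns you.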
Overloading comparison operators in a way that changes the semantics will cause problems. If you want to overload comparison operators, then make sure you return a boolean that makes sense in the way Python normally works, or otherwise construct a situation where you can't write that test and have it incorrectly pass. OK, so moving on to a way of writing DSLs where you use Python's own parser, but not inline in the code as we've just seen with overloading operators: instead, using the ast module to get the Python structure and then completely change the way it actually executes. This is maybe slightly nicer than what we saw previously. It means the operator precedence is correct, and it's not very surprising to a Python developer in how it executes. This is one of those tricks where there's a certain elegance to it, but it's not extensible, so you can't ever take it beyond what Python can do in its parser. Python has a very complicated parser, and you can abuse all of those features up to a point, and then no further. The code here is how you'd implement that: ast.parse will give you Python's parse tree for a piece of code, and then there is ast.NodeVisitor, which allows you to walk through the AST and do something with it. In this case we're returning something like an AST, but of a different type, where the semantics are SQL semantics. This one is even more insidious. It's very similar to the last thing, but it's been completely hidden within Python code. The previous example was explicitly a string, so it doesn't look like Python; it's going to be executed as Python in a certain sense, but it doesn't look like it. Over here, this looks like it's going to be executed as Python, but what this DSL
is doing is statically analysing what the code would do, for various reasons. Wow, that's horrible: exceptions don't work in it, and the behaviour is completely different to the way that Python would execute that code, which is very surprising to users. Here's another one that does a similar kind of trick, maybe a little more successfully. You pass in something that looks like a generator; what actually happens is that it gets decompiled to a structure, and then that gets turned into an SQL query that represents what the bytecode would do in evaluating that generator, but over a database. I don't really have strong feelings about that; if it works, it works. But an interesting thing to note is that decompilation, turning bytecode into a structure and then finally into an SQL expression, is very similar to parsing a DSL, because you're converting a stream of bytes or characters into a structure, and then converting that structure into whatever else you want, or evaluating it. So, in summary of Python metaprogramming tricks: there are some clever things in there, but there are a lot of surprising things, pitfalls I guess, that can bite you as you start working with them, and that won't be apparent immediately. You might define this thing and go "wow, this is great", then throw in a really complicated expression and it's like, "I hate this". There's a really good quote I saw in a blog post from Mozilla last weekend, which was that good abstractions are a continuing source of greatness, whereas bad abstractions are a continuing source of pain. So you really need to get your abstractions right, and if you accidentally pick a painful path, you can continue to suffer that pain for a long time. Of those approaches, I like metaclasses the most, because they don't suffer from those pitfalls
in the same way; it's just a way of fashioning a piece of functionality out of a Python class statement, and that's sort of the intended use of metaclasses, or one of the intended uses. So then, moving on to a different category of how we get a structure out of text, how we get a parser: there are plenty of parsers that we already have access to. JSON and YAML are widely used; configparser is intended specifically for this, but you could abuse it, you could create something completely different out of configparser. We'll see some examples of this. This is the Elasticsearch DSL. Anybody familiar with this? Is it pleasant working with it? There are loads of curly brackets in here, and it's not always clear where the curly brackets should go. My favourite bit is that you get to the end and there's an empty dictionary, because I don't even know what to put in there: you need the dictionary there, and you need exactly the nesting that you have there. It's not a very elegant thing. I guess you could take the view that for Elasticsearch this is more like a wire-level protocol than a DSL; it may not be intended for writing by a human programmer, and you might expect to wrap it completely in a nice Pythonic API. But all of the Elasticsearch documentation is written with these, so you have no choice but to engage with it, unless your API provider re-documents all of Elasticsearch using their own components, and they don't. This is an Ansible playbook. It's a real mishmash of YAML and Jinja; there are some loops in there, and this magical lookup thing. I don't know Ansible that well, but this is the reason I've not got into Ansible: this is just at the point where you'd break free of this and write something new. Puppet is good. So on the subject of YAML for DSLs, I guess this is kind of tangential, but who can spot an error in this YAML document?
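The slides aren't captured in this transcript, but documents in the same spirit as the two gotchas discussed next would look something like this (a reconstruction with invented content, showing the same failure modes):

```yaml
# A stray colon turns one list item into a dictionary:
talks:
  - Writing DSLs in Python
  - Parsing: a guide        # parsed as {"Parsing": "a guide"}, not a string

# And in YAML 1.1, some bare words resolve to booleans:
provinces:
  ON: Ontario               # the key parses as boolean true, not the string "ON"
```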
Right, the colon, yes. So, the colon in the middle of that line: as a non-technical user, perhaps I'm writing just some metadata for a thing, and I write the title as I've been taught to and I put in a colon, and suddenly I've not got a list of strings, I've got a list of some strings and some dictionaries. So that's a potential pitfall. Who can spot the bug in this one? No, that's a valid string; this will be a dictionary containing a dictionary where the values are strings. The indentation is fine. I'll tell you: Ontario is keyed with the string ON, and ON evaluates to true. Ha. So I guess, are you likely to run into that? Maybe not, but I kind of think YAML has tried to do way too much. It's a superset of JSON and it has a ton of features in there, and nobody understands it that well, because the spec isn't documented that well. I think it's more of a pseudo-readable serialisation format for more complicated stuff; it works really well as that, and it works maybe in really simple cases as a DSL. So, in summary of the off-the-shelf parsers we can get our hands on: the overriding thing is that they're not very suited to your domain, they don't really fit. You're shoehorning a structure in; just because something is off the shelf and you're able to use it doesn't mean that it's really going to fit your domain well, and as your feature set grows, you struggle more and more to shoehorn all of that into the expressiveness of those languages. So that brings us on to the last category that I'm going to cover, and I guess the biggest category, which is parsing our own DSL. This gives you complete freedom to break free from any of the existing abstractions, start afresh, and come up with a language that is completely suited to what you want to do. I've written a few DSLs in my time, and this is the approach that I take. You sit down with a blank text editor and you start thinking about concepts from other
languages, or from other things that might be familiar to your target audience, as the syntactic elements. Then, knowing your problem domain well, you start to encode some of the examples from your domain into a text file, and you iterate over it and eventually come up with a few examples of the kind of language that you want to create. Having done that, and being very proud of your language that looks really nice, you can straight away take those examples, break them into test cases, and start coding. I covered some of this already: when you are designing a DSL, you want to make it familiar to people, you want to make it work in much the same way as Python. One of the problems of the Puppet DSL is that scoping is just completely wrong; you ought to try and make things work as people expect, the principle of least astonishment. And then how you parse this language is another consideration: if you can't parse it, it's not going to work, so once you understand a bit about parsers, you can design for the parser that you can write as well. There's one thing in particular there, which is that it's very useful to embed DSLs, sometimes, in Python string literals, so you ought to have some concept of what is going to be convenient to embed in a string literal and what isn't. Obviously you've got raw strings and so on, but that's something to pay attention to; I'll come on to that a bit later. So I think the first approach that I usually take, or the simplest approach that will work for parsing your own DSL, is to do it line-wise. This is a DSL that I wrote. We have a tabular structure in my organisation, and there was a lot of code, particularly test code, written like this "before" example. There's a lot of syntax in there, a lot of quotes, and what also happens is that those two lines start to drift apart, and before
long you've got a schema defined elsewhere, and what you're appending to it is no longer clear; it's no longer clear how the fields match up. So I wrote this simple line-wise DSL, inspired more or less by the Gherkin syntax from Lettuce, but we have types in our table, so it needed types added. And then it becomes a very literal style of testing, where you say: this is my input, I'm going to run a function, and I'll get an output, and you can visually compare what the input and output are expected to be. That's a really great way of writing a test with these structures. But to draw another point from what I was saying before, about working within string literals: this is tolerant of indentation and that kind of thing, and also you can put comments inside the literal, on the ends of lines or before the lines, though not in the middle of the table, actually. So it's flexible in those situations, which allows you to lay out your code however you want. To parse something line-wise (and here I'm basically considering writing a parser from scratch), the way that I start is to write a finite state machine, effectively. You start in a state, and then consider each line in the context of the state that you're in, and each line will cause a transition to another state, or maybe output some value. That looks a bit like this, for a language that has headers and a body. There are four states there, of which two are terminal states that maybe aren't even encoded as states; they just kick you out of the parser or raise an exception. In the expect-header state, you get a header and you stay in the expect-header state, but if you get a blank line you move to the expect-body state, and then any line is body. And this is a sample implementation of this in a loop. I don't want to dwell
too much on that, but I think the thing to note is what you're switching on: first of all you're switching on the state, so you're not evaluating the line and then considering it in the context of the state you're in. I think it's better to do it the other way around, where you take a line and then switch immediately into different blocks of code depending on the current state. You can do this with a class, which is a slightly more organised way of doing the same thing: your current state is just encoded as a method, and by switching out the method, in this case the bound method that you're currently using on the instance, you can transition states that way. Now, a finite state machine is technically only powerful enough to parse regular grammars. I don't even know what that means, and I guess you don't need to know what that means, because you can start adding a stack, you can maintain state, and you can continue building up this clumsy parser for a certain amount of time, and continue to parse increasingly powerful things. You're considering one line at a time, but that doesn't mean that the structure is one line at a time: the structure can span multiple lines, and can switch states across those lines. So I guess we need to dig into that a little bit further. Why can it span multiple lines? What does it mean to parse line-wise, in the context of a bit more parsing theory? Quickly detouring into parsing theory: there is a book on this, and it is very mathematical, and I don't really suggest you read it, because there are libraries that will bake all of that mathematics down into much simpler stuff, and we'll meet some of those later. But to summarise: basic parsing is usually split into two phases. There is the lexical analysis, or tokenisation, phase, where you are given just a flat piece of text and you start breaking it into the words or symbols that make up the language. Then, given that sequence of symbols, you can start
assembling it into a tree. Lexical analysis is usually done with regular expressions: whichever regular expression matches next indicates what token type and value you have. And then for the syntax analysis there are various algorithms; that's the really mathematical bit. This is an example of the tokenize module built into Python, which provides the tokeniser for Python itself. It's rather simplified here, because it actually emits all kinds of data about each token. On an expression like x to the power of y, plus one, it will output the tokens that you see there: left parenthesis, the name x, and note that it has the type as well. So it's just the sequence of the words, and whitespace is not one of those words. And then, ostensibly consuming the output of tokenize behind the scenes, ast will convert that into a structure, so you've got a nested BinOp, where the op is a class and Name is a class, and so on. Coming back to line-wise parsers: this is all there is to distinguishing line-wise parsers from anything else, that you're just taking a line as a token, but then all of the same practices apply. So if you've ever written a line-wise parser, you're actually not that far from writing a parser that isn't line-wise. And this brings me on to a brief interlude, where I wrote a line-wise parser for a game. In October 2014 I won PyWeek, which is a week-long games programming competition, with an adventure game called Legend of Goblit, which is a kind of adventure stage play: there's a narrative, and the characters move around on a stage. It's almost defined as a stage play, so there's stage right and stage left, and it's all scripted, with a language like this. I literally sat down at the start of PyWeek, on the Sunday, and was like: I don't know how to write an adventure game, but I'll start by writing the script that I want to have. And very soon I realised that this script needed to be executable, so I added all of the features that I needed; so
choice came afterwards, but I started building a structure even before I'd written any code. It was like: this needs to evaluate and move people around the stage, and at this point I'm going to break out and have a puzzle, and so on. If we have time later I might show you that game. So, as I said, we're nearly at full parsers: if we're considering a line as a token, we've written something that can construct a whole AST out of line-wise things. But there are tools that make this even easier, so we need to understand a little about how you instruct one of these parser generators to parse an arbitrary DSL. This is the kind of thing you'll see when you're describing languages: a sequence of productions that define the grammar for a DSL, in this case a simple calculator expression language, which seems to be the de facto standard for demonstrating parsers, although it's often not that useful a thing to write in the real world. Notice that you've got recursive productions, where an expression will match an expression plus a term. That means that you can nest arbitrarily many plus signs, for example term plus term plus term and so on, and that will create a structure that is recursive on the left-hand side. And the way that this grammar has been written is such that the precedence of the plus and minus operators is correct, as in normal mathematical convention. So what does that left recursion mean? Something we need to know in order to construct a parser is associativity: if you have an expression that has multiple operators of the same precedence, whether they bracket to the left or to the right. A left-associative operator, as I say there, brackets the left-hand set of expressions, and Python is left-associative for all of its operators. OK, all but one of its operators. We need to know that, because it actually makes a difference to how our
expressions are evaluated. And the other thing that matters is operator precedence, which we've already met in a sense: if plus and asterisk had the same precedence, for example, it would construct the wrong structure. We need to know that when we write simple unbracketed code, it effectively gets bracketed, as it gets parsed, in the way that you would expect. I'm going to skip ahead a bit as I'm running low on time. This is a parser library called PLY, for Python lex-yacc; lex and yacc are well-known C parser generators, and this is a Python implementation of those. This is the tokeniser phase, which is equivalent to lex. Notice that this is a DSL: each of these is a regular expression that defines a token type, and the capital letters in the names of the things give the token type. You've also got to have a list of the tokens included in the module, and then there's some magic that turns that into a lexer. The number is slightly more complicated, because the docstring is the regular expression, and then you can provide a mapping between what we actually parsed, which is a string, and the value that we want to return as our token's value. And then PLY allows you to construct a parser, either in the same module or a different module: you just import the tokens and the lexer, and then you define the grammar productions in another DSL, as you can see in p_expression_binop, which defines what the grammar is, in very much the same kind of language as the BNF that grammars are often described in. To talk through this code a bit more: this is not constructing an AST, in this case; it's constructing a parse tree and collapsing it down to a result as it parses, so the operator has actually been pulled out and executed as it's being parsed. And that's just the rest of the parser. Again, it will pick up everything in the module namespace, and you call the
yacc.yacc function, and you get a parser. Then you use it a bit like this: parser.parse, and you give it the lexer again, and then you can evaluate an expression. Because the expression is being evaluated as the syntax tree collapses, the result of parsing is simply the result. Moving on to another library: PyParsing takes a different approach. Whereas PLY implements a couple of the really fast parsing algorithms, LALR(1) and SLR, PyParsing is recursive descent, which is more powerful but also potentially much slower. Recursive descent allows the parser to backtrack if it fails to match, and in fact it can backtrack even if it didn't fail to match: it's trying to match tokens, and you can have it look for the best possible match of those tokens and try all of the routes through the grammar, which can be incredibly expensive. My first PyParsing parser ran incredibly slowly: I had just used the wrong Or. I didn't mean Or, I meant MatchFirst. To talk you through PyParsing: there are these classes, lots of these classes, and they all have a similar API. These ones are the ones that match tokens. There's QuotedString, which means you can match, in this case, from a single quote up to the next single quote, excluding single quotes escaped with a backslash; and unquoteResults there means that when we actually get the constant, it's been converted into the string that was represented. That's actually quite a powerful little bit of functionality. And then there are also regular expression matches, and just a literal comma, and you can combine those. This is another DSL, an operator-overloading DSL, where the pipe operator means match-first: there is a class called MatchFirst, but a shortcut for writing it is to use the pipe to match any of those things. This is also defining something very similar to the grammar rules in that
The productions here are combinations of those objects assigned to names, and the thing they get assigned to is also an instance of one of these parsing classes. There's some magic around the value object, because it needs to be recursive, which means we've got to declare it first and fill in its definition later; that's where the left-shift-equals operator comes in. It's a bit ugly, and that's the problem with operator overloading, I guess: you run into these kinds of situations.

Each of those parsers allows you, in the same way we saw in PLY, to map the token as it is returned from the match. In this case you can map the constant production, which was either a string or a float or an int, and because I've handily described those in Python syntax, you can use ast.literal_eval to produce the actual Python value. A list is very similar, but because of the way I've defined a list, with values interleaved with commas as in Python, you need to exclude the leading square bracket, the commas and the trailing square bracket, which is what that slice does. Then you use the parser just by calling it on any one of those objects, since they all provide the same API, and each can parse input that matches it.

So here's another library; there are many of these, but they all take more or less the same approach, which is matching tokens and providing a grammar that defines the language. Parsley has chosen a different approach for the DSL that defines the grammar: it parses its own string, a triple-quoted string. With this I was able to rewrite the code we saw at the beginning, with all of those weird operators, into something that would parse with exactly the semantics I wanted and proper error handling, matching things like is not null, which is not something that's valid in Python. You could never write is not null.
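The recursive value grammar and the literal_eval trick described above can be sketched like this; the rule names are assumed, and a plain regex is used for strings so the quotes survive for literal_eval:

```python
# Requires the third-party "pyparsing" package.
import ast
from pyparsing import Forward, Regex, Suppress, ZeroOrMore

# Constants are written in Python syntax, so ast.literal_eval can turn
# the matched text into the real Python value.
constant = Regex(r"'[^']*'") | Regex(r'-?\d+(\.\d+)?')
constant.setParseAction(lambda t: ast.literal_eval(t[0]))

# value is recursive (a list can contain lists), so declare it now with
# Forward and fill in the definition later with <<=.
value = Forward()
list_expr = (Suppress('[') + value + ZeroOrMore(Suppress(',') + value)
             + Suppress(']'))
list_expr.setParseAction(lambda t: [t.asList()])  # wrap the tokens as one list
value <<= constant | list_expr

print(value.parseString("[1, 2, [3, 'a']]")[0])  # [1, 2, [3, 'a']]
```

Suppress plays the role of the slice mentioned above: it drops the brackets and commas from the results so only the values remain.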
Sorry, you could write is not null, but you can't write some of the SQL things, like LIKE and so on. So that version is much cleaner and avoids all of the problems we saw with operator overloading and missing brackets. I also used PLY to parse a metric definition language that's loosely based on Puppet, and that was quite useful at the time as well.

To sum up the practice of writing and working with DSLs, there are some things you generally want to have. First, the ability, having constructed an AST, to convert it back into source that represents that AST; that turns out to be very handy, not least for writing tests. You also need to work quite hard to ensure that errors come out clearly, with the line number, because it's really horrible to work with a DSL that just says there's an error somewhere in the thousand lines you've given it; that doesn't help at all. Then there are things like syntax highlighting and linting. This, for example, is how you define syntax highlighting in Vim: it's just a tokenizer, a set of regular expressions that match certain types of token, plus the colours you assign to them. I don't really want to go into Vim syntax, but there you go.

Summing up: there are tons of advantages to doing this. I've found it very useful for improving the quality of code, for making it state more literately what you mean, and for reducing bugs, but only if it's done very well. You can also use it for safety: the where-expression that you saw takes something that looks very much like SQL but completely validates it, so there's no possibility of injecting any code, and that can protect you from all kinds of vulnerabilities. On the other hand, if you encounter a project that is using tons of DSLs that you've never met before, whose meaning you don't know and which aren't very well documented, then that's a potential pitfall for new developers.
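That error-reporting advice is cheap to follow at the tokenizer level. Here's a minimal pure-Python sketch, with a hypothetical token set, that threads the line number through every token and every error:

```python
import re

# One alternation of named groups; each group name is a token type.
TOKEN_RE = re.compile(
    r'(?P<NUMBER>\d+)|(?P<NAME>[A-Za-z_]\w*)|(?P<OP>[+*])|(?P<WS>[ \t]+)')

def tokenize(source):
    for lineno, line in enumerate(source.splitlines(), start=1):
        pos = 0
        while pos < len(line):
            m = TOKEN_RE.match(line, pos)
            if m is None:
                # Say *where* it went wrong, not just that it did.
                raise SyntaxError('unexpected character %r on line %d'
                                  % (line[pos], lineno))
            pos = m.end()
            if m.lastgroup != 'WS':    # drop whitespace tokens
                yield (m.lastgroup, m.group(), lineno)
```

list(tokenize('1 + 2\nfoo * 3')) yields triples like ('NUMBER', '1', 1) and ('NAME', 'foo', 2), and an illegal character raises a SyntaxError that names its line.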
It makes it difficult for them to work with your project. Also, by sidestepping Python, if you're presenting a DSL that you're going to work in a lot, producing large amounts of code, then you're going to start missing the IDE support that you have for Python, which may be Jedi, which is brilliant, or may just be syntax highlighting or autodoc. So there you go: I encourage you to go out and do it. Thank you very much.

Two questions, I guess, mostly irrelevant. One: am I now demanding that Python sprout magic methods for overloading logical and and or? No, I'm encouraging people not to try to overload operators generally. Two: don't leave us hanging, how do you fix those YAML errors, do you use a backslash on both of them? No, you use a different way of expressing yourself in YAML: you use quotes. If you use quoted strings then the values will automatically be strings. There are other pitfalls, and again, switching to another form of expression in YAML will circumvent those, but adding quotes around both of the colon problems and the on problem would have fixed them. YAML is a superset of JSON, so you can use quoted strings, and curly brackets and square brackets for mappings and lists, and so on. In fact, when I write a Docker Compose configuration, by the end I have almost everything quoted, just in fear that something will fail.

The next question was how I feel Python compares to other languages, like Ruby or Haskell, for building these DSLs. I don't know about Haskell, but I've seen some horrible stuff in Ruby. It turns out that lots of the well-known DSLs in Ruby are written by monkey-patching the Object class, which is the root of the object hierarchy, and adding methods to it, so that those methods are just available wherever you want them. Monkey-patching built-in objects seems like a terrible idea to me, but it's highly encouraged in Ruby; in fact there's syntax in Ruby to just open up a class and stuff more things into it.
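The quoting fix can be seen directly with PyYAML (third-party), which implements the YAML 1.1 rules where a bare on is a boolean:

```python
import yaml  # third-party: PyYAML

print(yaml.safe_load('flag: on'))    # {'flag': True} - bare "on" is a boolean
print(yaml.safe_load("flag: 'on'"))  # {'flag': 'on'} - quoting keeps the string

# YAML is a superset of JSON, so the JSON-style flow syntax works too:
print(yaml.safe_load('{"flag": "on"}'))  # {'flag': 'on'}
```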
Next question: perhaps I missed it, but which module did I use for my own DSL, for my game? That was just my own parser, written line-wise with a state machine. I could show you the code for that, but it runs to quite a lot of lines. The important thing is that indentation is significant, so the parser needs to maintain the state of indentation. It's not just a line at a time: there's a tokenizer which turns each line into the amount of indentation it has, plus which of the regular expression patterns it matches, which gives the token type, and then the value of that token; and that feeds into a finite state machine.

This is a bit of a planted question, really: can we see the game, so we can see a demonstration of the DSL and what a wonderful thing it is? Can we have less light in the audience, please? Act 1... I have to click. So this is the introduction; I'll leave the middle bits for you. The music is not working, so fast forward. There's lots of dialogue telling the story, and suggestions for things to do: pick up the sock. Pick up the sock! The animation is not working normally. The skull... I can't pick up the kettle; at this point it just says look at the kettle. Again: talk to the parrot. We could play through the whole thing here, but it takes about half an hour to an hour. That's not even a thing, actually, it's just scenery, something interesting to look at. How many lines is the DSL for this game? The script looks like this, and I don't want to give too much of the plot away; the last line number is 1100. And how many placement lines?
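A sketch of that line-wise scheme, with made-up patterns since the real code isn't shown: each line becomes an (indent, token type, value) triple that a finite state machine can then consume:

```python
import re

# Hypothetical line patterns for a screenplay-style game script.
LINE_PATTERNS = [
    ('ACT',   re.compile(r'act (?P<value>\d+)$')),
    ('SAY',   re.compile(r'(?P<value>[A-Z]+): .+$')),
    ('STAGE', re.compile(r'\[(?P<value>[^\]]+)\]$')),
]

def tokenize_line(line):
    """Turn one line into (indent, token_type, value) for the state machine."""
    stripped = line.lstrip(' ')
    indent = len(line) - len(stripped)   # indentation is significant
    for token_type, pattern in LINE_PATTERNS:
        m = pattern.match(stripped)
        if m:
            return indent, token_type, m.group('value')
    raise SyntaxError('unrecognised line: %r' % line)
```

For example, tokenize_line('  GUARD: hello there') gives (2, 'SAY', 'GUARD'); the state machine downstream would use the indent and token type to decide what the line means in context.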
Oh, OK, 3000. Let's see some more of the DSL. It validates as it parses: it parses the entire script for the game at the beginning, and I think it will say on what line it runs into a problem, if it runs into one. We've got a few seconds for a last question, if you're more interested in the code. OK: there's a system of allow and deny bindings for actions that can happen in the game. In exactly the same way that you can define a script at the top level that will just play, each of these bindings, when you perform its action, will play a set of scripts, which can involve dialogue with other characters and so on. It can trigger all kinds of operations; it can trigger a character giving you something, for example, that you might need to solve a problem. OK, thank you, Dania.