Okay, go for it. Alright, hello everybody. My name is Jeffrey Olson and this talk is titled Introducing System R. It's about a lambda calculus system implemented in Rust that I built and am sharing, and this talk describes it. It's also an introduction to the lambda calculus. The first section will be mostly slideshow format: we'll look over the lambda cube as a concept and analogy and use it to explore some language features and introduce the syntax. Can everybody hear me now? Okay, great. Once again, my name is Jeffrey Olson. This talk is Introducing System R, and in the first section, like I said, we'll do an introduction to lambda calculi and System R in particular, using the lambda cube as a way to engage with that. I want to speak briefly about TAPL by Pierce, and then I'd like to do a quick overview of System R from lexing to evaluation, and also quickly touch on the extension and dialect system. The second part will be interactive. At that point I'll disengage from the slideshow and jump over to Visual Studio Code, where I have the code base and test harness set up, and we can explore System R more concretely through a series of specs: you look at the code, read it, and understand it.
Then you can answer what the assertion is. Part 3 is a quick return to the slides to wrap up and talk about the System R of tomorrow: intermediate representations, building up and down, future enhancements, and then also, of course, your questions. So with that, let's push ahead. Really quickly, about me: I've been programming since maybe the early 2000s. I've been working professionally in technology since 2007, pretty much full time as a developer since 2008 or 2009, and I've worked in several different domains. Right now I'm working as an independent contractor. Most of my domain experience is in .NET, line-of-business systems, and distributed systems, and for most of my spare time over the past decade or more I've been really interested in the Rust community. More recently I'm also interested in Wasm and other web runtime systems. So, let's dive into part 1. A quick note about the dual histories we're going to cover today. One is the history of the lambda calculus itself. I'm really not equipped to do that justice. I can engage this material from a functional perspective and share that with you in a short timeframe, but the history, the interplay between mathematicians and early computer scientists that built up the lambda calculus and continues to this day, is really beyond the scope of this talk. Suffice to say, it started with the dialogue between Alonzo Church, Haskell Curry, and a few other people starting in the 1930s with the untyped lambda calculus, and from there a series of correspondences quickly produced the simply typed lambda calculus. Then over the next several decades we get incremental improvements in the expressiveness of the system as people explore different domains. And then separate from that is the history of System R itself.
I'll dive into that in a little more detail later when we get to the code; there's a pretty succinct summary of it there. Suffice to say that through my interest in looking for documentation on lambda calculi, and then looking for practical implementations, I came across a pretty mature implementation of System F in Rust, which I took, adapted, and added several things to. At this point they're quite different and have diverged in features, and I've updated and maintained the code base so it runs on current Rust natively. Like I said, we'll dive into the history of it a bit more later. So here we have the lambda cube, which is a good analogy, though of course, like all analogies, it's not perfect. But it's useful for the purposes of this talk, for exploring the functionality of System R and how the lambda calculus builds up toward being able to do really cool things. We'll use it for that, as well as to introduce the lambda calculus and the syntax of System R. So, first things first: of course we start with the untyped lambda calculus (that red dot is not an accident, it'll make sense in a second). It's a really simple expression language, although nowadays you can look at it and call it that; back then it existed as a certain formulation of propositional logic. It starts with just two terms: abstraction and application. Abstraction, really quickly, is function definition, and application, of course, is the equivalent of function invocation.
Added to that are the concepts of alpha conversion and beta reduction. Really quickly: alpha conversion covers the things you would worry about if you cared about assembling formulas by hand and substituting letters over time. When the lambda calculus was first developed, alpha conversion and beta reduction were the mechanisms by which computation was simulated, which arises more explicitly later with the Curry-Howard isomorphism, which says that these mathematical systems are equivalent to computational systems. And there are a lot of other results where certain advanced versions of the lambda calculus are equivalent to other, more academic, theoretical calculi, providing a pathway to computation as well because of equivalence down the chain. Alpha conversion, though, concerns itself mostly with multiple scopes of definitions interacting, making sure the variable names don't conflict. We don't think that's a big deal as modern programmers, because we have languages with lexical scope, but like I said, they were figuring this out and this stuff was new. Beta reduction is what you'd call the mechanics of executing a formula: values are mechanically moved through the system, and you can reduce an entire tree of terms down to a final term. At the point where beta reduction stops producing changes, you know you have a well-formed final term, and that's equivalent to evaluation and computation. And so with just those first two things you get a pretty rich language: even though you look at it and it seems like you can't do much, you can assemble all kinds of amazing stuff with it.
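In standard notation (the usual lambda calculus notation from the literature, not System R's slash syntax), the two mechanisms look like this:

```latex
% Alpha conversion: renaming a bound variable does not change meaning.
\lambda x.\, x \;=_{\alpha}\; \lambda y.\, y

% Beta reduction: apply a function by substituting the argument
% for the bound variable in the body.
(\lambda x.\, f\, x)\; a \;\to_{\beta}\; f\, a
```

When no beta step applies anywhere in a term, the term is in normal form, which is the "final term" described above.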
Here I talk about Church numerals, which is the idea of using the untyped lambda calculus to encode numbers. From there you can build on top of it for all kinds of information encoding: representing truthy and falsy values, then predicate operations and all kinds of logic. And of course the fixed-point combinator, aka the Y combinator, is necessary for recursion within computation and functions. Separate from that, the issue of recursion in type systems arises later on, which we'll deal with in a passing way. With all these things put together, pretty quickly in the correspondence this thing called Curry's paradox arises, which says that, based on how loosey-goosey the untyped lambda calculus is, it's easy to take a Y combinator and feed it back to itself. If you don't know what the Y combinator is: it's necessary in the lambda calculus to do recursion. It's an operation, the fix operation as it's called, that you apply to a function that takes a recursive function as its first argument, and it converts that function into another function you can call without supplying that recursive function, and that implements recursion. So with Curry's paradox, using the Y combinator and feeding it to itself, you can create a computation whose reduction never terminates, equivalent to an endless loop. So, let's move on to actually showing this. This is the most basic expression in the untyped lambda calculus. You can see here we have a number of little helpers. The first, of course, is the slash, which is the lambda operator; from here on, lambdas are mostly referred to as functions, so it's helpful to think of them that way.
The second item we see is a parameter to the lambda. Lambdas, of course, take one parameter by default; currying is easily possible in the language, and I'll show that shortly. The parameter applies within the scope, and parameters of functions are always lowercase, of course. Next we have the period, which is the projection operator; within the top-level definition of a lambda, or function, the projection separates the declaration portion from the actual body. And lastly we have the actual function body, which is expression based. In this case, this function takes a single parameter named x and immediately returns it; in lambda calculus circles this is commonly known as the identity function. Pretty straightforward at this point. Is everybody following? Any questions out there? Okay. So, next example, slightly more complex: we show parentheses for grouping different sets of functionality together. The first is our identity function again, in the first position, and following it is another function that takes two parameters. It does this by currying, as you can see, with the projections and parameters following one right after another, and you just get back one function; currying occurs naturally in the syntax of the language. And you can see that all it does is return the x that's passed in. These two arranged together make up a function application: before, we showed function declaration, or abstraction, and now we're showing function (lambda) invocation, or application. The first function is the thing being invoked, and the second function is passed as a parameter, so in this case it's just an identity function that bounces back the higher-order function that's passed to it. Lastly, just to stretch this out a little before a diversion, we have here an example of the fixed-point combinator implemented in the untyped lambda calculus.
Like I said, it looks pretty hairy, and it's pretty wild to reduce; I've never done it by hand, but I've written examples and used it, and by golly it works. You pass it a function that, like I said, takes a recursive function as its first argument and then additional arguments after that, and you get back a function you can invoke that can do recursion as well as pass additional arguments around. The last thing to point out here is a free variable. Free variables: in the context of the lambda calculus, programs are either open or closed, and open means the program has free variables in it that haven't been bound, so you don't fully know their information. It doesn't matter so much in the untyped lambda calculus, but it's a thing that exists and is tackled in alpha conversion: as part of making sure there are no variable conflicts, you can also find unbound variables, and so you can determine in that process whether a program is open or not. That's good to show here because it's a thing that exists theoretically, although in practical System R usage you're going to see fully closed programs with any built-in intrinsics being used. Quick diversion to look at Wikipedia. If you wanted to just pop off this call right now, I would say go to the Lambda calculus article and read it. It's a great article, it's amazing, and it's a great resource in addition to TAPL. I learned a lot here, and in particular I've returned to it a lot when my memory has gotten fuzzy around Church numerals and how numbers and numerals can work. It's pretty nifty.
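The Church numeral encoding the article describes can be sketched with ordinary Rust closures. This is purely illustrative — it's not System R syntax and not part of its implementation; a numeral n is just the higher-order function that applies `f` to `x` exactly n times:

```rust
// A Church numeral, concretized over u64 so we can decode it:
// it takes a function f and a starting value x, and applies f n times.
type Church = Box<dyn Fn(&dyn Fn(u64) -> u64, u64) -> u64>;

fn zero() -> Church {
    Box::new(|_f: &dyn Fn(u64) -> u64, x| x) // apply f zero times
}

fn succ(n: Church) -> Church {
    // one more application of f than n performs
    Box::new(move |f: &dyn Fn(u64) -> u64, x| f(n(f, x)))
}

fn add(m: Church, n: Church) -> Church {
    // m applications of f on top of n applications
    Box::new(move |f: &dyn Fn(u64) -> u64, x| m(f, n(f, x)))
}

// Decode a numeral by instantiating f as "+1" starting from 0.
fn to_u64(n: &Church) -> u64 {
    n(&|x| x + 1, 0)
}

fn main() {
    let two = succ(succ(zero()));
    let three = succ(succ(succ(zero())));
    let five = add(two, three);
    println!("{}", to_u64(&five)); // prints 5
}
```

Multiplication, the predecessor, and the boolean encodings mentioned next can all be built the same way, as compositions of these higher-order functions.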
And then from there, like I said, you can encode numerals, then the successor function, from which you can quickly compose a third function that does regular addition. You can do multiplication and powers; with the predecessor you can implement subtraction. And then moving on, you can create truthy and falsy values and represent them pretty easily in the untyped lambda calculus, and from there you really just get a whole slew of logical operators. I'm going to show this in the next slide; it's pretty amusing what it looks like at that point, when you start mapping these in, if you add the basic idea that we're just going to map and expose these intrinsics. They're there, and they're mapped; we won't even necessarily call it a let or a declaration or anything. So with that in mind, jump back to the slideshow. Okay. I wanted to show this last code example really quick. This is an example, with everything in place, with those intrinsics specified, of what a fully fleshed-out function that does something interesting looks like. One thing that's new here that I didn't really introduce is the fix operator; like I said, that's how you implement recursion. You have the fix operator, and in the tail position is the function that's going to be converted into a recursive function, enclosed within these parentheses. And to that function, which is now the thing here in the lead position, we're passing 10 and this additional lambda here. Does anybody, off the top of their head, know what this does? Could you take a guess at it? Okay. That's funny.
What's that? Ten? Not quite. This is actually not super readable; I don't blame anybody, and it's no knock on anyone for not understanding it. This first function here is basically a recursive countdown function that will be passed a number and another function. So it is itself a recursive higher-order function, and based on the number it's passed, it will execute that function that many times as it counts down to zero. We take that thing and define it, and then we pass it 10, and then the function, which all it does is check whether x is zero. I'll take a second to try to break down and explain how this works. One of the things that's interesting to me (I don't know if this occurs to anybody else) is that it looks very Lisp-like once you start throwing in keywords; they start looking like special forms, like macros or whatever, and you realize there's a front position, so you can see how Lisp kind of falls out of this as well, with implicit and explicit connections. So, really quickly: like I said, there's fix, and then there's this function. This function takes a recursive function, so this parameter is it calling itself; this is the counter; and this is the function it's going to invoke. The first thing we have is this test, and the test has three parameters passed to it. The first one is the thing it's going to test: if it's truthy, it executes the first branch, and if it's falsy, the second. I needed to color code these parentheses because it gets tricky. So this is the first branch of the if.
Yeah, so this is the first branch, and this is the second: if the test is false, it just returns zero. Otherwise, if it's a truthy value, if c is not zero, it executes this branch, which is another if, and the first thing that does is execute the function it was passed. If that function returns a truthy value — remember, this is the function that's passed in, and all it does is check that x is not zero, so it will keep returning true — we recurse. But first we decrement; that's what this pred c is, decrementing the counter. We pass that along and recurse back to the beginning, and we keep doing this until we get to zero. It doesn't do anything useful here; in this case it would be like invoking the function for a side effect. This isn't any kind of functional operation; it's just an example of, mechanically, how you can get the system to do stuff. Also, a disclaimer: I haven't actually had a chance to debug this code, because I don't have an untyped lambda calculus interpreter on hand that matches this syntax. But a bunch of the other stuff we'll see from here on out, especially in the second section, will be executable. "Okay, why do you have two if-then-else statements? It looks like both of the if-else statements are checking that c is not zero, correct?" The first one checks that c is not zero, and then the second one executes the function that was passed in, which is the exact same check — you are correct. But really it's arbitrary functionality; this tail function could be anything. It could be loading files, or it could be invoking things for side effects. "I see, I see. So this would be like a loop that performs a task, and then if the task succeeds it does it 10 more times or something." Yeah, it demonstrates crude iteration in the untyped lambda calculus. Okay.
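The same crude-iteration idea can be sketched in Rust. This is a loose analogue, not System R code: host-language recursion stands in for the untyped Y combinator (which can't be given a simple type directly — that's the whole point of the paradox discussion above), and the recursive function receives "itself" as its first argument, just like the fix-wrapped countdown in the example:

```rust
// A fixed-point helper: f receives a way to call itself as its
// first argument, mirroring how fix feeds a function to itself.
fn fix<A, B>(f: &dyn Fn(&dyn Fn(A) -> B, A) -> B, a: A) -> B {
    f(&|x| fix(f, x), a)
}

fn main() {
    // The "task" from the example: check that the counter is nonzero.
    let task = |c: u64| c != 0;

    // Countdown: invoke task each step; recurse on pred(c) while it
    // keeps returning true, bottoming out at zero either way.
    let countdown = |rec: &dyn Fn(u64) -> u64, c: u64| -> u64 {
        if c == 0 {
            0
        } else if task(c) {
            rec(c - 1) // "pred c": decrement, then recurse
        } else {
            0
        }
    };

    println!("{}", fix(&countdown, 10)); // prints 0 after 10 steps
}
```

As in the talk's example, the interesting part isn't the return value (always zero) but the fact that the task function gets invoked once per step on the way down.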
Okay, great. So, going from there to the simply typed lambda calculus: because of Curry's paradox, which I talked about briefly earlier, Alonzo Church pretty quickly devises and introduces the concept of a type system. The terms that are expressed now have types that are associated with, but separate from, them. We introduce the concept of type constructors, the first being one for making function types, and we also introduce a fixed set of built-in types like the natural numbers and booleans, and probably we'd add literals as well. A quick example of this: in this code example, again we have the identity function, but now it has a type. The colon within the parameter separates the variable name on the left-hand side from the type on the right-hand side; so here now is the type of our x. The second example shows the arrow operator. This is a function that takes two arguments: the first is a function that takes a Nat and returns a Bool, and the second argument is a Nat; it passes those things in and returns whatever that returns. Both of these are examples of types that appear in type position, and the idea is that within type position there's a restricted dialect of terms that are allowed, and those represent the type system; they're separate from actual full general-purpose terms. A goal that we'll get to, expanding this system, is to allow arbitrary terms into type position, which leads toward dependent types. At this point, with the simply typed lambda calculus introduced, it's a good time to talk about what I call non-lambda-cube features. These are things like algebraic data types: product types, which we commonly call tuples.
Records are also there, as a kind of convenience papering over tuples. Rust has structs as well as tuples, and because of the implementation details of Rust they have specific memory representations, but they all fit within the product type category of algebraic data types. Then we have sum types: enums in Rust, or data types in Haskell. Where a product type is all of these things together, with a sum type you're saying the value is any one of multiple variants. We'll get into product and sum types pretty quickly, because they become very useful once you introduce pattern matching and the case expression. You may be familiar with it as match from other languages; it exists in pretty much all of the more strictly typed functional languages — Haskell, F#, Rust, et cetera. I'm not going to do a deep mechanical dive right this second into what pattern matching is, but I will show the elegance of it as it exists within System R; its exhaustiveness system is derived from Rust's pattern matching. And then finally, let polymorphism. If you're familiar with functional languages, the let keyword is for declaring values and setting them aside, often immediately. But in this context it's introduced as let polymorphism, and it has some structural differences that make it more of an organizing system for larger sets of code: a way to string together multiple lets as a single expression, with some final activity using all of that built-up context. It looks pretty ugly in practice, and I think that, like "goto considered harmful," there have been let-polymorphism-considered-harmful papers written.
People say you shouldn't write programs with it directly, but it's really useful if you were going to write, say, a higher-level language and then emit a lambda calculus from it; let polymorphism is a super useful target. It subsumes and is a superset of any module system you'd ever want to express, it's way more concrete and easy to reason about, and letrec and all the lets within the Lisp world can be modeled within it, so it's a really powerful organizing tool for higher-level languages. And so now we'll dive into some examples. Here, for product types, we show a product-style type constructor for tuples: this is a tuple that takes two Nats. Then we have an example of using the projection operator, the period again, to project into the tuple value and pull out the element in one of the positions, by zero-based index. In the second example we have the sum type constructor, and you can see it's pretty straightforward: curly-brace delimited, with labeled variants and a bar separator. We'll dive more into what's there, because it makes sense in the context of what comes immediately after it, which of course is the beginning of a case expression. We're operating on the var x, which is passed into this function, and then the of delimits that; after that we get the actual individual arms of the case expression, and those are delimited by the bars. Exhaustiveness will be validated during the type-checking process; suffice to say that's necessary for soundness. And within an actual individual arm, the first portion is the pattern, which does full destructuring.
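As a rough analogue (Rust syntax, not System R's), the sum type plus case expression just described corresponds closely to a Rust enum with an exhaustive match — labeled variants, bar-free but otherwise the same shape, with the compiler checking exhaustiveness the way System R's type checker does:

```rust
// A sum type with labeled variants; some carry payloads (products).
enum Shape {
    Point,           // variant with no payload
    Circle(u64),     // variant carrying a radius
    Rect(u64, u64),  // variant carrying a tuple of sides
}

// Each arm pairs a pattern with an expression; leaving a variant
// out is a compile-time error, which is the soundness guarantee
// the exhaustiveness check provides.
fn area(s: &Shape) -> u64 {
    match s {
        Shape::Point => 0,
        Shape::Circle(r) => 3 * r * r, // crude integer "pi"
        Shape::Rect(w, h) => w * h,
    }
}

fn main() {
    println!("{}", area(&Shape::Rect(3, 4))); // prints 12
}
```

The pattern in each arm also binds names (r, w, h here), just like the bindings that flow into the arm's expression in a System R case arm.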
These are things that trickled down to a lot of non-functional languages over time, and people get really excited about them, but it's really interesting and notable how elegant it is here and how this was all just built in from the ground up; it's great because it's there from the get-go. So I guess now is actually a pretty good time to really quickly dive into the implementation of patterns, just to demonstrate how they work. So, working in here: this is the top-level module for patterns, and this is the main enum for expressing a pattern as a term within System R. These variants — this is Rust code — cover all the different kinds of patterns you can have, and it's really straightforward. We have the any/wildcard pattern, also known as underscore, which always matches. (Case expression arms are checked from the first one down, and they have to be exhaustive, but ordering matters for patterns.) The next possible pattern is a literal, so any literal that's supported, and I can really quickly show that: you can see we have the unit type, booleans, the natural numbers, tags (which I'll show a little later), as well as byte literals. Any of those things can exist in pattern position. Then, most commonly, a variable binding. We can have products, and the product pattern holds a vector of patterns, so any of these other things can exist within there. We can have constructors for individual arms of a sum type, holding a string for the label and then a pattern for the actual constructor portion. And then I also mentioned that System R is built for extensibility. I'll talk quite a bit more about this later, but suffice to say that in all of these language constructs we have tokens, terms...
We have types, we have patterns — and because the system is extensible, each of these basics has to add a variant to account for extensibility. Then, when these systems are actually reified at runtime, when the actual lexing and parsing happens, there's a concrete dialect that may or may not provide extended behavior. The dialect that the System R interpreter enforces is called the bottom dialect, and all higher dialects translate down toward it. I have an example of a dialect that translates down into the bottom dialect, and then you can execute and run it, and it shows the idea of how these things can be built up. Anyway, this exists everywhere and it's pervasive throughout the system. So, back to the slideshow. Okay, so I said pattern destructuring; we talked about patterns in System R. The last bit, of course, is the big arrow, which separates the pattern from the expression: if the pattern is matched, then we evaluate the expression with any bindings that happen in that arm's pattern. Any questions? Okay. All right. Next we have an example of let polymorphism via the let keyword. Here we show the let keyword in lead position, and next is the new binding created for this let. This is a pattern, which means that, based on whatever you're storing here, we could have arbitrary patterns destructuring the information that's returned. Then an equals sign separates that binding portion from the value expression: when the let is evaluated, it evaluates the contents of the value expression and stores the result in that binding. The equals sign and the in keyword delimit the entire contents of the value expression.
And so, with those things put together, what follows after the in is what you would call the effective scope of the let expression. It's evaluated last, after the value expression has been computed and stored, and you're able to use that binding in this scope. You can see how this looks a lot like lexical scope in other languages but mechanically works a little differently. Since it's purely expression based, you can't really execute multiple things one after another without special constructs, or ways to cheat and shortcut, which you could say are not in the spirit of functional programming. And of course practical languages always have to deal with side effects, which is something that's not accounted for here. Okay, so with that out of the way, we can now move beyond the simply typed lambda calculus and talk about the next dimension of complexity, which goes in three directions. The first of those is type polymorphism, and the basic idea is that we introduce type abstraction. The same way we can do function abstraction — declare new functions and then pass values in — now we can do type abstraction at the function abstraction boundary; within this context, type polymorphism is basically integrated with and tightly coupled to functions. We can do type abstraction at the function abstraction boundary, as we'll show in a second, as well as type application at the function application boundary. What that means is: when you declare a function, you can introduce new polymorphic (generic) types, and then when you take that polymorphic function and invoke it, when you want to apply it, you have to set a concrete type, and that's type application.
Basically, this is System F. A bunch of additional things comprise a complete System F, and we'll deal with the details of the type system as they come up. So, once again, we're back to our generic identity function. We have the benefits of let polymorphism now, so we can organize it a little better and show abstraction and application together. First we have the let, and its value is a new function, which first introduces the new type; you see we're using that slash again. In the literature there is a separate uppercase lambda, which looks like a triangle, used in addition to the lowercase lambda, which is for function abstraction. So this is type abstraction, then immediately the function abstraction; those two are associated, and really the type abstraction is subordinate to the function. From there it's a normal function again, and we're using the type we introduced within the scope of that function. If the function had sub-functions that were also polymorphic and wanted to propagate that type forward, it would do that when it applied the type within the body of the function; but here we're just returning. So this function is stored in the binding i, and then we come along and do a function application, so we also have to do type application. If you're familiar with generic systems, this is the way you reify types when you do a generic invocation: say in C# or .NET you have a List&lt;T&gt;, and once you actually fulfill what that T is in the application code, that is type application. So here, once again, this is just the generic identity function; it's going to return and echo back whatever it's handed.
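In Rust terms — an analogy, not System R syntax — declaring a type parameter is the type abstraction, and the turbofish is the explicit type application that picks a concrete type at the call site:

```rust
// Type abstraction: <T> introduces a polymorphic type at the
// function abstraction boundary, just like the uppercase-lambda
// (System R's slash in type position) described above.
fn id<T>(x: T) -> T {
    x
}

fn main() {
    // Type application: ::<u64> explicitly fulfills T.
    let n = id::<u64>(42);
    // Most languages also infer the application for you.
    let s = id("hello");
    println!("{} {}", n, s); // prints "42 hello"
}
```

The inferred form is what you usually write, but the turbofish makes the type-application step visible, which is exactly what System F's syntax forces you to spell out.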
The next axis away from type polymorphism is what would be considered dependent types. The distinction between them is that with type polymorphism you have terms that take types, while with dependent types you have types that take terms — the idea that arbitrary values, or maybe a restricted set like just the natural numbers, become part of the type. The canonical example is fixed-size vectors, as in a language like Rust, where it matters because Rust actually targets a C-style memory representation. In higher-level functional languages that's usually just used as a way to enforce strictness in the type system — though of course dependent types can do a lot more than that. I'm not really equipped to explore them, and honestly that's not what I'm here to talk about. The last axis, moving from the front plane of the cube to the back plane, is generalized type constructors, and this isn't a big deal on its own. We've already shown examples of how the type system has been enriched with algebraic data types, and how multiple type constructors get introduced — but they're narrow in their context, and you can't bind new symbols with them; you have to declare them and repeat them over and over. A generalized type constructor means you have a facility similar to let — which stores values, including functions — but for declaring and setting aside types. Any language with that functionality — and many, many languages have it — has an advancement over, and separate from, System F.
But really, the cube starts to make sense when you move from the front plane to the back plane and apply the idea of generalized type constructors to both the dependently typed and the type-polymorphic systems. Once those type constructors are adapted to those respective systems, they become a lot more powerful. System F becomes System F-omega, which is really just System F with polymorphism-friendly type constructors. And from there you're pretty much at Haskell or OCaml — your typical level of complexity. The one place Haskell separates itself is the pretty intelligible way that monads let you represent state flow and mutation, as well as a full effect system. All of these things build up toward a crescendo of convergence on what is known as the calculus of constructions; Agda and Coq are refinements that exist in that space. It's fair to note, though, that if you stay on the front plane there's also a calculus that's the combination of type polymorphism and dependent types but without type constructors, and it's not really useful until you move to the fully realized system. We're actually not there today, though — we're mostly going to be where the blue dot is in the upper left corner, and then I'll probably show you an example that moves over to the purple dot. Of course, this isn't the only lambda cube.
And the concept of the cube is arbitrary — as I said at the beginning, it's an analogy, and analogies break down when over-analyzed. This isn't the totality of type systems; things go in different directions, and the goal of System R is to provide a strong foundational base to do interesting things. Really quickly, about this book — I brought it with me. I use it mostly as a reference. It's Types and Programming Languages by Benjamin Pierce. It covers pretty much everything I discussed here, up through System F, in great detail; it talks about implementations in other languages, and also gets into proving the soundness — and the edges of the soundness — of these systems. I came to it through the code base that I adapted for System R, and I realized I needed it to understand the dark corners of the code base I adopted. So I took that, in addition to the other sources I've shown. It's really good stuff. So, really quickly, now we're going to talk about how compilation happens in System R. For the sake of brevity I'm just going to run this forward to the end, and you can look at the complete chain. We have a corpus of code; it's fed into the parser subsystem, which has its own internal multi-step loop. The parser subsumes the lexer in this case: it takes in the text, feeds it to the lexer, takes tokens back, and runs a recursive process where, with the next token, it knows what kind of possibly more complex term it's entering, so it can set expectations for what the following tokens are allowed to be — and if they aren't those things, it can create parser errors. In this way we get out a fully formed term, which we can pass to the type-checking subsystem. The first thing that happens there is the de-aliasing.
Up to this point, all of the variables are stored as De Bruijn indexes, which are created in the parser subsystem. When the parser subsystem is done — kind of like affine types in Rust — we basically retire its term, pull out the contents, and use them for the next stage. We take those De Bruijn indexes and pass them on to the type system, and the first thing it does is de-alias all those variables — it puts the actual names back in. Then we can walk the entire tree of terms and validate all the types. All the type-checking subsystem is doing is walking all the terms and validating that the types of terms nested within, and the types that bubble out, all connect together. From that you have a fully validated term, which is really a tree of nested terms — like an abstract syntax tree. Once again, the Curry-Howard correspondence shows an isomorphism: these things are connected, and we can move from theoretical concepts to more concrete computer science concepts and on toward real systems. You can take that type-checked, valid term and pass it to the evaluation subsystem, which honestly all it does is alpha conversion and beta reduction, chewing right through the term until it doesn't change anymore. At that point it's fully evaluated. If an error arises along the way, we deal with it. Up to this point, you have to ask: what is this useful for? Let me just run through this quick video. "In particular, we present a sequence of refinements of algebraic effects going by a multi-prompt delimited control, generalized evidence passing, yield bubbling, and finally a monadic translation into a plain lambda calculus, which can be compiled efficiently to many target forms." This is from Xie and Leijen, 2021.
It's an awesome paper — one of many papers going back more than ten years now around Koka and effect-handler-based systems, where the type system is expanded to handle computational effects. It's another direction off the cube, where we're basically polymorphic over computation. If you're familiar with the function coloring problem, you know that effect handlers can subsume it, so it's a really powerful system. But all they're saying here is that they have a bunch of higher concepts and they know how to walk down through all of them back to a proven lambda calculus. All the proofs in the paper are just showing that these higher representations are sound as they move down, until at one point they say: and now it's just regular System F, and System F is good, so we're fine. That's the power of this idea, and it's been happening in the academy for decades, of course. But it mostly exists on paper; the concrete implementations are done in C, or in the case of Koka in Haskell, plus C and/or wasm — they reify the lambda calculi as actual computational systems. System R is operating at a higher level. To tie this together and explain why it makes sense, we have to talk about dialects and extensibility. It's pretty straightforward if you imagine the previous compilation pipeline: all of its steps are extended with a compilation extension API, where they may encounter extended types — in the lexer, in the parser, or in the type system during the type-check process — because they've been generated by previous steps.
So each step knows how to call into the extension for a given dialect and handle that, and everything up through type check stays consistent — it's one system flow, with extensions for different dialect features. And then the last step is that you have to be able to translate back down to the bottom, like the quote on the previous slide described. You have a mechanical facility to translate your higher-order lambda calculus into a lower one — the bottom dialect — and at that point you can just pass it to the eval subsystem as before. Okay, any questions at this point? All right, great. We're going to move into the interactive portion, and we'll see how it goes. To be honest, if I'm just crushing y'all, I'll cut you some slack, because this stuff is pretty inscrutable — I've stared at it for many, many hours, so of course it makes sense to me, but your mileage may vary. So, this is the code base for System R. It's a pretty straightforward Rust code base — what you'd call a workspace-based code base — so it has several sub-crates within it: libraries, executables, etc. We have a definition of the members, and then the defaults that drive the behavior of cargo itself. Within the top-level folders we have the actual implementation. Going from the top we have the dialects — actual implementations of sub-dialects; we only have one right now, and I'll show it in a bit. We have the evaluation subsystem, which actually has a testing interface that depends on System R and knows about extensions, and then the actual interpreter — the bottom-dialect eval system — which only knows about the bottom dialect; it's fit to it. So the idea is you have to translate down into the bottom dialect if you want to pass something in.
And then we have our test suite. You should just know that I'm a big fan of Cucumber/Gherkin-style tests, so we'll be exploring those for the interactive portion. Then we have the actual implementation of System R itself, broken down into the several major subsystems for the process I showed above. And last we have the xtasks, which are just extensions for cargo for certain things like generating coverage reports, plus some other stuff I have planned for the future. Okay. A good place to start is the README for this code base — it's on GitHub, of course. It's pretty brief and succinct. I kept the description of the system from the original README in the original code base; it does a pretty good job of describing what was there. I consider myself standing on the shoulders of giants, benefiting from that — but I'm extending it; I cleaned it up and added several things. And then again, here is the explanation of the system itself. But now let's go ahead and dive into the code. What I have here is one special set-aside Gherkin feature containing a set of tests that are all stubbed out. The idea is that you read the code, understand it, and then you can basically solve the test. Here I go ahead and run it — you can see it says 10 scenarios, 10 skipped. The reason is that for each of these specs, the last step is a step that doesn't actually exist, because I put garbage characters in there. As we run through and fix each one, we expect the number to go down. Starting at the top, we have scenario zero. For the most part you can ignore the steps — or you can look at them and understand that they're turning the System R machinery in the background, through the testing interface and the eval crate.
And then we validate that everything looks good, and of course we can look at the final value. This first one is a straightforward, non-generic identity function that we invoke, passing in 64. We know the resulting value of evaluating this would be, of course, 64. Very good. All right, watch — here we go, we run it. One passed. All right. Up next, scenario one, which has the generic identity function, and we're applying bool. Once again, we know the resulting eval is going to be a Boolean — it can only be one of two values; which do you think it is? Nice. Once again — at this point you guys know exactly how this works — the generic identity function applied to Nat. Okay, I'm not going to insult anybody's intelligence, but we're getting there. All right, we've got three passed, seven to go. Don't worry, it gets really hard really fast. Okay, so here now we get a case expression and algebraic data types. I think this is like a basic swizzle function, so if you understand what that is you could probably cheat. Does anybody have a crack at what this is? This is also kind of unfair, because it's not just a single thing — it reverses a tuple. Yeah, exactly — there you go. [Audience: "I've done a little bit of Rust and then C#, so I can tell from the pattern matching."] Yeah, this is just a reading exercise. Okay, so moving on. A couple of other concepts to cover that I didn't really talk about before: platform bindings. This is the concrete mechanism — an enhancement that I added — for how you'd expose, basically P/Invoke style, the way you call into the underlying system.
So right now there's a Rust API where you can supply bindings: you pass stuff into all of the parsing, type checking, and evaluation, and it does validation at each point along the way to make sure everything lines up. The API lets you do good-enough System R typing, but it's a separate type system from Rust's, of course — it's a Rust API. Let's just note that we get these two functions, nat add and nat sub. This one is actually easy — I can just tell you it's a Fibonacci function, and all we want to know is what fib of 4 is. Does anybody know? [Audience discussion about whether the sequence starts from zero, counting it up.] Okay. All right, halfway there. Okay. So, we have unfold, part one. This is something I didn't really get into — I mentioned recursion, the functional recursion that fix enables, but as you make these type systems more complex you also get the concept of recursive types. Unrestrained recursive types are unsound, and they're not allowed in a lot of languages for pretty good reasons — in Rust in particular you have to box them and set them aside so you don't have full recursion at the static level; you need that pointer indirection. We have that as well. So what we have here is the specification of a recursive type, and we have to set aside a label, because we know by nature of it being recursive that it's going to recurse back into itself at some point later. You could call it `this` or `self` if you wanted — you understand it stands for the recursive type. The actual contents of the type are a basic algebraic type: we have a nil variant, and the other is a cons — cons is the label — whose contents are a tuple of a Nat, and the item in the second position is the recursive list.
So, because we don't have type constructors in the normal bottom dialect of System R, this type is repeated over and over and over, and it's a ton of noise — I'm actually going to turn that up to 11 a little later. But really, what you want to say is that this is a car function. If you're familiar with Lisp, especially Scheme — if you've ever done The Little Schemer — you know car, cdr, cons, etc. So you just have to guess what it is, and I'll say that we create this car function, which does all this stuff with unfold. Unfold is how you actually interact with a recursive type in a case structure — this code won't compile if the unfold isn't there. Part of what it does is allow you to destructure the value: at this point, this is the actual recursive type again, this is that cons, and the second element is the recursive-type part. The unfold kind of makes it not recursive anymore. If you want to do functional recursion, which I'll show in a later example, you have to do an additional step, because the unfold unpacks it but you can't pattern match on it otherwise — the unfold is like an abstraction step that you have to insert at execution time when you evaluate a recursive type. This is just part of the noise of the language, but that's how you rationalize it. So, like I said, this function is called car, and then we know that we just pass it this cons list — so once again, it's this type. It's really hideous, but all you need to understand is that it's basic: it's either a nil, or it's a cell of a Nat and then a following item. So this is basically a linked list — or a cons cell, right, that has a car and a cdr.
And so you see that this is the car of the first cell, and then the second — the cdr — becomes basically the cons for the next one. So we have 10, 20, and then the end of the list as nil — really what this is is a linked list of three items: the first two are Nats, and the last is the nil that terminates it. So the utility of the recursive type here is basically the ability to do what you would do with a linked list in other languages. So, we're going to take this linked list and feed it into car — does anybody know what the answer is? Ten. Let's try 10. Correct. It is 10, because what this function does is return the car, which is the lead item of the list. Like I said, I'll show recursion in an example a little later, but all that happens here is we're saying: hey, you handed in this list — it's the argument to this function — and we're going to match on it. If it's a cons cell, just return the first item; so whatever you hand it, you get the car of the first cell. And if it's nil, you return zero instead. All right, very, very good. So next, let's take a break from that and once again talk about recursion with fix — we visited it briefly with Fibonacci. Here's another example of what we can do, although it's probably a little trickier to guess what this does. Does anybody have any idea? Okay, yes, that's part of it: this value means it's going to recurse down that many times. That's the first part. And we see here, if we look at this function, that Z is the recursive function — where Z is invoked, that's where the action is, where the recursion happens. Once again, we're passing in a pair.
So y.0 is the first element of what was passed in — it's these two Nats. We take that first item and decrement it; then we take the second one and basically add two to it. Then we repackage those as another tuple and pass it back to Z. So what you can say is that, for the number passed in, it's going to add two to the starting number that many times. It's pretty straightforward — it's like a fix-powered times-two, although written kind of backwards. This code isn't really meant to be elegant or nice to look at, but what's remarkable is how basically approachable it is and how it makes sense. Okay, so there we go: unfold part one, recursion with fix — oops, I did not fix this step; it didn't pass because it was still misnamed. There we go, seven passed, three to go. All right, this is the doozy. This is the big one, the big thinking one. Oh man. Don't look at that — I need to turn on word wrap. So, this is related to the first one; you could say it's an evolution of that function. It uses type recursion, which we showed in unfold part one, along with functional recursion, which we showed in our previous example as well as with Fibonacci.
And it's interesting what's here — like I said, it's completely mechanical and different from many other programming languages. What this is doing is setting aside this L function. L is a recursive function whose type is an arrow type: the first part is that recursive type — it's pretty hairy here, and it has to have surrounding parentheses — then we have the arrow, and it returns a Nat. So the parameter is a list of our recursive type. Once again the recursive type occurs many, many times here and creates a ton of noise — if you had a type constructor and could set that type aside, this would suddenly be a lot easier on the eyes. From there we take a case and dive into the list. What it does is: if the list is not nil but a cons, it then looks at the second element of the cons cell, and if that is another cons — which means the list continues — it's going to recurse back into itself. And this is the thing I mentioned earlier about unfold: once you unfold one of these types, it kind of breaks the recursiveness of it. So if I want to take this value and feed it back to the recursive function, I actually have to reconstruct it — and that's what happens here. You can see the cons of that whole noisy type again, and then I'm feeding that back into the recursive function. So really, this does the opposite of the previous one: instead of returning the first element — the car of the first cell — it's going to walk all the way down the list until it gets to the end, then back up and grab the previous element.
So it's equivalent to last, you would say, in a linked list: it finds the nil terminator and then backs up. I'll throw this one in as a freebie — 20 is the answer. Like I said, there we go: eight passed, two to go. The rest are a breeze, I promise. If you've been able to make sense of me to this point, you guys are rock stars. So next we're going to talk about tags. These are a feature that I added. If you're familiar with Erlang, you probably know them — they appear a lot, and they become especially useful once you use overloaded functions. They're called atoms in Erlang. It's part of the way Erlang sidesteps types — I should say, in its original incarnation; especially through Elixir they're introducing more type stuff, but classical Erlang is radically untyped, or you'd say very, very late-typed — so a lot of the stuff you'd otherwise do with static types and overloading, you use tags for instead. I think they're pretty neat. They don't make a lot of sense in the language right now, but there are subsequent features I want to introduce that would make great use of them — especially nominal typing, separate from the labels in algebraic data types; I want something separate from that, that you can use in function positions. So this is an example of using tags. It doesn't do a lot here — it basically just echoes the number back out, as long as it's been specified correctly. This arm would actually never get executed; the match is pretty much exhaustive as it is.
And all it's going to do is echo back the X. In the place where System R's bottom dialect is right now, tags don't make a lot of sense, but I think they're pretty neat, and this was my baby step — my first modification of the base language itself, instead of an extension over it. I wanted to make a small change, and this was part of that progression. I hope future dialects will be able to leverage tags to do interesting things. [Audience:] Just out of curiosity, it's not complaining when you do case X of X comma @foo. If you had, like, case Y of true then return Y — does it forget about what X is, or does the second instance of X mask the case instance of X? Do you lose the scope of that variable when you go into the arm? — So I can answer that for you: it shadows it, because of alpha conversion and the De Bruijn indexes. — So it shadows it; it's not like the variable gets forgotten. — Right, it literally is shadowing it. You kind of have to see the mechanics of it: like I said, this thing is really a giant clockwork machine, and that's what's interesting about lambda calculus — it's at a level where you can actually see the individual pieces. That's an example where you can watch values flow into and out of the De Bruijn indexes, and how the generated terms just reference numbers instead of actual variable names.
[Audience:] So if you had case x of (x, @foo) goes to x, and the second arm was underscore goes to x — Yeah, they return different things. In the first arm the x shadows the previous x; the second arm is in a higher scope, so it wouldn't. — Okay, just trying to see what it complains about, sorry. — Oh, look at that, it's complaining that the match isn't exhaustive — that's why I had that arm in there. Funny; I've already been through this, obviously. Fun times. That's probably a rough edge I should add to the list, because I consider this first pattern with the tag exhaustive: this is just an atom value, it can't be refined any further, you can't do anything with it — it's really just the binding. That's actually a thing I want to fix; I'm going to note it down for later. Okay. And from here I want to move on — very fast — to byte literals. This is pretty straightforward: it's just square brackets where you can pass in a comma-delimited list of values from 0 through 255, and there are your byte literals. This is funny — I actually have a more complex example with a pretty long byte string, but I truncated it for this. I don't know if anybody has the ASCII table memorized, but I'll spare you from having to do that: this is just GET, because the ASCII string was an HTTP request that I had encoded as a byte literal. This is something I wanted to add originally.
I was going to do floats and maybe even decimals as well, but I realized that with just the natural numbers and bytes, we can build the higher representations on top of those, and it makes a lot more sense to do that — so I'm going to set those aside and leave them for dialects, or something more complex. Yep. All right, so we're back to slideshow land, and we're at part three: we'll really quickly run through what's upcoming for the system and of course take more of your questions. What's in green is what we have and what I showed and talked about — oh, you know what, there's one last thing I didn't actually show, which is the dialect system. I have specs for the dialects — for type alias, which is the one I have mostly implemented. Really what it is, is a new `type` keyword, and it behaves a lot like let — like let polymorphism — but in a different namespace that starts with a dollar sign and an uppercase letter, and it lets you declare and set aside a type alias, which then applies within the following scope. All this is, is the generalized type constructor applied to type polymorphism. This is an example with the dialect in place, and there are some holes in it. Because unfold part two was so ugly, I actually wanted to redo it in the type alias dialect, and I found some places where it breaks down — since writing this dialect I've learned a lot more about the language's more advanced features, so I'm going to come back to this. But this basically gives you type constructors, like I said: it allows you to declare a type and set it aside, and you can see here we're doing type application — this is a full type, and we're fully parameterizing it.
What's also novel is that the type abstraction is lifted to the type declaration level instead of only happening within functions — it's like the way you can declare generic types in many languages. So that's a really notable thing. The dialects themselves — it's pretty wild; I'll talk a little more in a bit about what I want to do there, and there are some changes coming. But you could say that, building up from here, you could reach a place where you have a sufficiently advanced dialect — say one that does things like effect handlers, or any kind of special system you want to model — where you would just write a language whose parser and lexer spits out tokens and terms in that higher dialect. The idea is that you can build this up, and the ceiling of it is right below an actual industrial programming language, which can then be translated down but still be really powerful. I think it provides a common substrate — it's another IR, kind of the promise of the .NET bytecodes, except those are lower level and this exists at a higher level. I also think it provides the right level of abstraction to have a function-based ecosystem with modules, and then be able to stitch those together into a whole program at compilation time. But that stuff is outside the scope of System R itself. And then, building down below the base dialect — the bottom dialect — you can either evaluate it, or you can move it toward lower representations.
If you wanted to take System R bottom-dialect terms and, instead of evaluating them, turn them into wasm, or into lexical Rust code, or Rust bytecodes, or LLVM IR, or .NET IL, whatever, you need to have a better handle on memory in the language, or you have to add a bunch of functionality. For me, the easiest way to do that is to implement a linear lambda calculus dialect below the base dialect. The idea is that the bottom dialect is ignorant of memory, and as it translates down, the linear analysis lets you look at a value and ask: was this thing used multiple times? Was it passed to another function? If your values allow mutation, was this thing mutated, and how? So you get things like escape analysis, and what happens in Rust with affine types, which are a limited subset of linear types. You're getting all of that, and with that information on hand you can make pretty good decisions about what kind of bytecodes you want to generate, especially if you're targeting a more explicit, manual memory system, which is pretty cool. Really quick, about the diagram: the question marks on the bottom could be wasm or anything else. Everything in yellow obviously doesn't exist yet. The linear System R is something that I think would be really cool. The dialect system as it exists is for translating dialects of one kind into another, and the bottom dialect is itself a dialect, which means I could write a translator from it into a linear system. So if you have valid System R, you can generate valid linear System R.
And so the idea is that you get the annotations: instead of enforcing a linear type system, you just do all the linear checking and annotate all the types and values, so you can make those decisions about the lower representation. Any questions? Okay, one more slide, we're almost there. No, I can hold it to the end. Okay, I thought it was the end of the talk.

Really quick, on future enhancements, this comes with the caveat that I'm an independent contractor, so I work on projects to earn money and support myself; I do this in my spare time, it isn't anything I'm getting paid for, so I do it at the pace that I can. I want to overhaul the dialect system. I didn't really show it, but it's pretty hairy in terms of what it takes. This is the API: these are all little points in the lexer and the parser, pattern matching on all the different parts of the compilation process where I knew, hey, for implementing some extension, this is a hook-in point that I need. I kind of went about this the wrong way and need to turn it upside down. There really should just be a per-phase event entry point: at any point where I might want extensible functionality within the base lexing, parsing, and type-checking system, I should just call into it and lift an event, keyed by compilation phase, and then push all that complexity into the individual dialects. To the point: the type alias dialect weighs in at about 800 lines, with everything you need just to add that type aliasing feature. I'm hoping that by pushing the complexity into the dialects I can reorganize it and bring that number down. Of course, I also want to implement more dialects. There's a bunch of things I want; when I talk about ART fns, what the ART means is arguments, return type, and type abstraction.
Those are all things that get spelled out in most typed languages, but in System R, you'll note, functions don't ever specify a return type. That's something that's only figured out, and only matters, if you end up passing that function arrow as a parameter and it goes through an arrow check, so the type system can validate what the return type is and that the return types are consistent; otherwise it just doesn't really matter. In addition to simplification, I also have up here something I call term occlusion. That's the idea that you could set an extension up so that if specific non-extended terms occur, or even specific extended terms from some lower dialect above the bottom dialect that your dialect is aware of, you can block those from being lexed, parsed, or type checked, and either substitute them with something else or just cause a failure. In that way you could say: no more bare lambda function declarations; they all need to follow this format that exists in my new dialect. Term occlusion is what I'm calling that, and it's something I need to add so that I can block out the janky features of System R and replace them with newer versions that are more robust and easier to make sense of. And like I mentioned, function overloading along with tags: I really like those. They're sum types of functions, basically, and then at invocation time you figure out which of the summands is the path you're actually going through. If you take the function bodies out of that, you basically have what type classes or interfaces are in other languages: just a series of declarations that are overloaded. So I think that's a really interesting way to introduce nominalism.
If you've gone really deep into functional programming, you know there's this split between structural typing and nominal typing. Right now System R is mostly structural, and this would be a way to introduce more nominalism, effects, and so on. I mentioned rough edges that I want to clean up. I want to do a proper compilation pipeline; I've started on this, but it's a separate project and I'm letting it gestate while I think about it. And of course I want to add that linear System R. I'd recommend reading the Perceus paper; it's by some of the same authors I showed in that quote earlier, and it's related to Koka, an effect-based language. They have a really slick memory algorithm that uses a linear resource analysis on the code to decide whether things need to be copied, and it does a bunch of slick stuff to get down to the minimal number of allocations necessary. There's no garbage collection, even though the language doesn't have strict manual memory management or anything like that. It's pretty sweet. So I'm hoping to be able to do all that and more. Thank you, everybody, for coming out. I know it's a pretty dense topic, and I hope this was enriching for everybody. Now, your questions.

Why is it called System R? Because it's adapted from System F, and the R is for Rust, but also I have ideas for other follow-on projects that are related sound-wise, so it's tied into that; that's how I named it.

I had a question here. There was a part of the code where you had all these type declarations in the middle of the cons expression, right? Is that because you're not getting Hindley-Milner-style type inference? Or is that something you could add to the language that would help with some of that?
So you're talking about some of the stuff in here? Yeah, that's part of it. I think that really what it is, is that the labels for types don't really propagate down. That's something a more fully fleshed-out type aliasing system would store; you might dive into variance, and there'd be namespacing that way. But the point about Hindley-Milner is interesting, because three-fourths of Hindley-Milner is just what's already in the System F type-check system, reapplied for reasoning both in a lexical sense and for developer experience. So a lot of that is here, and you're definitely right that there are rough edges in the language. But also, my ultimate goal is that this isn't a day-to-day language; it's a substrate that is mostly assembled at compile time before being translated into something else. I'm hoping it'll be a lingua franca for a bunch of different systems and different languages.

Okay, thank you. Any other questions, comments? Let me stop recording. Thanks again, everybody. Thank you very much.