Thank you, Brooklyn, for accepting our invitation to speak at this conference at such short notice. Brooklyn is the co-founder and CTO of a company called Fission, where they're building the next generation of web dev tools. There's some interesting stuff I was looking at on their website in terms of how you can really leverage edge computing for your web applications. She also founded the Vancouver functional programming community, the meetup over there, so she's joining us today from Vancouver, Canada, on the West Coast, where it's pretty early for her; I appreciate her waking up early to do this talk. She's also the author of several Elixir libraries; some of you might recognize her work from Witchcraft and Exceptional. She's certainly a big proponent of Web3, having contributed to several standards, the WebNative File System being at least one that I know of. It's a real honor to have someone like Brooklyn with us who's done so much for the Elixir community, for the functional programming community, and now for the Web3 space as well. And her topic is something a lot of you will really enjoy: it's called "BEAM to the Future: Old Ideas Made New". So without much delay, thanks again, Brooklyn, and over to you.

Thanks for the intro. Let's get started with "BEAM to the Future: Old Ideas Made New"; this is the 2022 edition. I was asked to fill in after another speaker wasn't able to make it, so I'm giving a reprise of a talk I gave in 2020 in San Francisco, literally days before everything shut down for the pandemic.
At the time I was literally wondering, will I even be able to get back home? So here I am, bookending it on the other side, now that things are slowly starting to open up again. Here we are in 2022. I tend to start every section of my talks, and this is no exception, with an orienting quote. The theme for this entire talk is that in order to keep moving forward, sometimes we have to look back at older ideas, ideas that maybe didn't make it through the whole evolutionary process because the context was different, or maybe just truly by chance we took this path instead of that path. So we need to take a look back, and look around, to make sure that we're going in the right direction. There are quite a few of these epigrams from Alan Perlis, like "so many good ideas are never heard from again."

I just got a very lovely introduction, but I'll give the quick run-through here again. My name is Brooklyn Zelenka; I'm everywhere on the internet as expede. I'm the CTO at Fission, where we're working on the next generation of what are sometimes called post-server or edge apps, and our goal, which will not make me very popular in this room I'm sure, is to make backends and DevOps completely obsolete. The good news is it's still very early technology, so your jobs are safe for now. My background is in programming language theory, virtual machines, and distributed systems. I do a bunch of standards work as well, with the Decentralized Identity Foundation and the UCAN distributed auth working group. I was an Ethereum core developer focused on the virtual machine for a couple of years, Multiformats, a few others. I founded the Vancouver functional programming meetup, and a BEAM one as well.
And yes, I'm mostly known in this community for Witchcraft, Algae, Quark, and a few others that bring ideas from Haskell into Elixir, as well as Exceptional and a few more. Exceptional is an error-handling library that makes it slightly easier to work with errors, with more context than plain tagged tuples.

This is a talk about the past, and yet the theme of this conference is the future of Erlang and Elixir. So why look at the past, why look at different things? Well, I don't believe that the current state of the BEAM is the be-all and end-all of how programming works. If we're going to continue to make progress, ship new versions, add new features, and be more productive, we need to look around and find new ideas, which is a little bit difficult because the BEAM already does so much. Everybody else right now is looking at us and saying, well, actors actually seem pretty great, maybe we should do that too. As a side note, we're exploring actors inside of Wasm at Fission currently, because it's just such a great model for doing concurrency, which is so important these days. So the core question is: how do we move forward as an ecosystem? Cross-language, cross-paradigm; this isn't just Erlang and Elixir, this is the BEAM, and we have lots of languages: Alpaca, Erlog, LFE, and a whole bunch of others. If we don't know what the design space is, it's going to be pretty difficult to make progress. Another thing to keep in mind any time you're designing anything: you have a starting point, and then you're doing a breadth-first search in the design space. It's really easy to get trapped in a local maximum: we've climbed to this point, and all of the directions around us seem worse, but we actually want to find even better things elsewhere.
Sometimes to get there, you have to go through the rough, unpleasant portions, ideas that maybe aren't as good or didn't survive for some reason, and integrating them won't be as polished until you make it all the way up to the top of the next peak. So in broad strokes we're going to talk about three main things: breaking out of linear thinking about the fundamental type of computation, new kinds of modularity for the BEAM, and declarative embedded DSLs; ideas that are literally decades old but that we haven't really seen so much in this ecosystem.

So, in the beginning. Large projects can take a really, really long time to complete. Cathedrals can take a century to finish, and we haven't even been programming for a century. Imagine a programming project that took a hundred years and hundreds of thousands of people to complete; with things like open source, the total number of aggregate hours in parallel is already very high. Imagine where we're going to go with this in two or three hundred years. So this is really the beginning of programming. We have to pick an arbitrary point, and one of the earliest things that looks even remotely like programming to us is Fortran. That was in the era of punch cards, and to give a sense of the era: Elvis was on the radio, Elvis was literally the hot new guy. Erlang shows up about 29 years later. The machines now look more like what we'd recognize, some Apple computers, and the music is also a lot more modern; everyone's listening to Madonna, and a little later that year, I'm born. Not to make anyone here feel old, but just to give you a sense of the time scale we're talking about: it's not "what's the latest release version", it's the course of decades, because here we are 35 or 36 years later at this event.
A computing device now literally fits in your pocket. And enough time has passed since Erlang was created that I've completely lost touch: Glass Animals is the big new thing, number one on the Billboard chart this week, and I've literally never heard of them. Again, just to give you a sense of the scale we're talking about.

So, paradigms, and the shifts between them. Douglas Crockford, a well-known figure in the JavaScript community, says that it's really difficult to distinguish a new paradigm from a really bad idea, because the shiny new thing gets judged from inside the old paradigm. I think this extends to all facets and aspects of programming. A lot of people came to Elixir because they were in Ruby, but it's actually a bit of a bait and switch, because Elixir is definitely not Ruby; it's very, very different. For so long people were just trying to find the newest object-oriented language, and the pitch was: here's something simple, familiar syntax to bring you in, and now you have a better way of working; it's a genuinely new paradigm. We've also been building with this idea of slowly maintaining the same paradigm: we've been building LAMP-style apps for 30 years, give or take. That's a web server and a database sitting on a Linux box somewhere. And we're at the point now where we say, well, we need a higher degree of parallelism, we need to auto-scale, all this stuff, so we're going to use containers, which is really taking the idea of "we don't want to change anything" as far as possible: the fundamental architecture stays the same, and we just ship your machine to the cloud.
And what happens then is you've taken some of this complexity and said, well, handling the scaling and all these things is somebody else's problem. We haven't actually gotten rid of it; we've made it somebody else's problem. I still have to deal with this towering stack of technologies, and we still have to train people on all of these things. I haven't actually simplified the picture at all; we're trying to do the same thing, just in more contexts. Larry Tesler, who sadly passed away a couple of years ago, had a huge impact on the industry; I'm not sure if you've ever used copy and paste, but he invented it. He has this idea, Tesler's Law: every application has an inherent amount of complexity that cannot be removed, only moved around. You can reduce things up to a certain point, and then the remainder becomes the programmer's problem, or the user's, or you bake it right into a framework or language; you can't actually get rid of it, you can only move it from place to place. So you can either re-architect how apps work, or you can stick them in containers and make it ops' problem. Keep that in mind. With this history of trying to do the same things in the same general picture for 30 years now, what if there were some different paths we could have taken in this breadth-first search of the design space? We're going to look at a few of them today: array-based languages, the applicative model, the Standard ML module system (which is mostly what that language is known for), and natural-language and business languages. Because if we had taken a few different branches of this tree, the world would look very different today. I'm not saying it would necessarily be better; it would just be different. I'm going to start with something that's normally quite alien, but I was informed there's a large contingent at this conference for it. So, this is a relatively famous program.
It's a one-liner that does quite a lot. It's written in APL, and normally when I ask "who here knows APL?", about four hands go up. But I gather there's a big APL contingent at Functional Conf, which is amazing; I think it's a really lovely language. This is Conway's Game of Life. There's a grid, and for every dot on it: if it's a white dot, there's nothing there; if it's a black dot, it's alive. There are some basic rules about how, depending on what's in its neighbourhood, a cell should either reproduce, stay still, or die. And you get all of this emergent behaviour from literally this one line of code, something that in, say, Elixir would take a couple of dozen lines at minimum. Why is it so compact? Well, Iverson, the creator of APL, has a really lovely Turing Award lecture, and papers and so on, about how the syntax and the entire orientation of the language are built around arrays, which makes things very easy to express in short sentences, especially if you're working on matrices, which a lot of computing is.

So how does this relate to Elixir? We often hear that actors are also very simple, small building blocks that we build up from. But the problem with actors is that reasoning about dynamic organisms can be really hard. Sure, an organism is "just" a bunch of chemical reactions, but there are entire fields of biology about how organisms actually work, and a lot of other fields come out of that too. We get emergent behaviour whether we want it or not. For some reason I sometimes hear people saying lately that composition is great because it gets rid of emergent behaviour. It does not; you get emergent behaviour all over the place.
Even if the pieces are understood and known, when you put simple things together, sometimes stuff you didn't expect will happen. When you're looking at a system like this, you need to be running a simulation in your head of everything that's going to happen just to get up to a level of abstraction above it, which is a lot of cognitive load. And yes, we can abstract some of this away; we have things like Broadway in Elixir, and a lot of other languages have things like arrows. But that's not working at this level; that's trying to abstract up and away from it.

So what's the advantage of doing things with arrays? These are shared-nothing architectures, and when you share nothing you get better performance, because performance drops off when you have data contention. Some of you have seen this diagram before: as we add more processors, more parallelism, Amdahl's Law very famously says you don't get the perfectly linear red line at a 45-degree angle; you get this purple line with some drop-off. The law we actually see in practice, the Universal Scalability Law, adds an incoherence or data-contention penalty: if I have to wait for data, or I get data out of sequence and don't know how to handle it, or I do handle it but have to do reconciliation, then past a certain point adding more parallelism actually loses performance. So there's a sweet spot at the top of the curve, and where it sits is directly related to how much the data depends on other data. For just mapping over an array or a list, we get an essentially straight line, minus the overhead of actually spawning those processes. And this is exactly that: this is Witchcraft's async map, and you'll notice it looks a whole lot like a regular map.
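As a rough sketch of the same shared-nothing parallel map, here's a version using only the standard library's `Task.async_stream` (Witchcraft's own operator syntax differs; the module name here is made up for illustration):

```elixir
# Parallel map over a shared-nothing list: each element is processed in its
# own process, so per-element delays overlap instead of adding up.
defmodule ParMap do
  def async_map(enum, fun) do
    enum
    |> Task.async_stream(fun, max_concurrency: System.schedulers_online() * 4)
    |> Enum.map(fn {:ok, result} -> result end)
  end
end

# Sequentially, 100 elements x 50 ms of sleep would take 5 seconds;
# in parallel the sleeps overlap.
results =
  ParMap.async_map(1..100, fn n ->
    Process.sleep(50)
    n * n
  end)
```

`Task.async_stream` preserves input order by default, so the result reads exactly like `Enum.map`, which is the point: because no element depends on another, parallelism is a drop-in interpretation.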
In this example, as a simple demonstration, we sleep the process for 500 milliseconds and then actually do some computation, and we can run a huge list of these in under a second. We have a bunch of benchmarks in Witchcraft that are not surprising: if you run a plain map over these, it takes literally minutes, and here it's essentially instant. This works because we have a shared-nothing architecture: a list is essentially an array where no element depends on any other. This is great for embarrassingly parallel problems, really big things that break up nicely on a grid. You could do a lot with this in macros, doing analysis at compile time: based on the inputs, should this run sequentially or in parallel? Should we chunk it? All of that without having to think about the exact data flow through the program; you literally say async map and you're done. When people want granular control, you trade something off; going back to Tesler's Law, you make it harder to reason about. And if you have impurity, as the functional programmers know, that's also hard to reason about, because now parts of the system are outside of your control.

Extending this idea a little further: what if we had table-oriented programming? It just happens to have the same letters as OTP, rearranged: TOP. Say we want to express something like this table with name, username, and city, just a few of the folks I work with. Here's a schema we can represent it with, in the naive view, and here's some of the data we might have: just a list of structs.
But this might not be particularly efficient, and even if we only want a few columns, it might be awkward to grab that data, because we're going to iterate over absolutely everything. What we do get, because the data in each of these records doesn't depend on the others, is that we can run multiple computations over it: I can have several functions in an array that I pass this data through, because we have zero dependencies and we know it all lives in a grid. Each of these bits of data might be just a single field; if you're familiar with Datalog or triple stores, this is essentially what's happening there. You run these multiple functions over it, and you can slice it any which way you like. What does that actually look like? This is again from Witchcraft. Here we're taking addition and multiplication, providing them with first arguments one, two, and three, and second arguments four, five, and six: so add one and four, two and four, three and four; multiply one and four, two and four, three and four; and so on. With some nice use of newlines you get this grid at the end: the matrix you'd get back from running multiple computations over the same data. The other way of looking at this is as if you had Ecto everywhere. Relational algebra is super powerful and very well understood, and a lot of ideas from databases turn out to be extremely useful in general computation, for example software transactional memory: the ability to roll back changes if you crash in the middle, or an expectation breaks, or somebody else wrote to that variable. Ecto is super powerful for talking about data, transforming data, and validating data.
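Witchcraft spells this with its applicative operators, but the shape is easy to sketch in plain Elixir (simplified here to a single second argument of 4 rather than the three in the slide): apply every function in one list to every value in another, and you get the grid back.

```elixir
# Run multiple computations over the same data: two functions applied to
# three values produce a 2x3 grid of results.
add = fn x -> x + 4 end
mul = fn x -> x * 4 end

grid =
  for f <- [add, mul] do
    for x <- [1, 2, 3], do: f.(x)
  end

# grid == [[5, 6, 7], [4, 8, 12]]
```

Because no cell of the grid depends on any other, every one of these applications could also be evaluated in parallel without changing the result.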
What if we just added Ecto to everything? What if we extended it out into the rest of programming and the rest of the language?

Let's talk about composition and modularity. As Perlis put it: symmetry is a complexity-reducing concept; seek it everywhere. And: you can't communicate complexity, only an awareness of it. Functional programmers obviously know what composition means. It breaks up into a few categories. We have things like higher-order functions; we don't actually have a composition operator in Elixir, but we do have function application, which is equivalent as soon as you add abstraction. These are kinds of modularity. We also have laws, things like commutativity and associativity, which say that the order we do things in, A then B versus B then A, gives equivalent results. That gives us another kind of symmetry: the ability to say that these different arrangements are interchangeable, whether I've pulled out a higher-order function or I have the freedom to reorder my applications.

So let's look at composition in a couple of different dimensions. Composition in the data dimension is pretty straightforward: these are structs, this is a tree; we've been composing data together forever. The function dimension is what we're usually familiar with: I take some data, run it through f, and later run it through g. It's still the same general picture of boxes and arrows, but now the boxes are functions instead of data. And in this model we can also talk about data flow: fork my data out across multiple streams and merge them together again later. We'll look at a concrete example of that in a couple of slides.
Then there's composition of capabilities, as in protocols and higher-order functions, where we say functions are just data with a hole in them, and we complete that hole by adding in some last little bit of context. We can even see things like map or reduce as general evaluation strategies: here's how to complete the strategy, I want you to add one to everything in whatever the structure is. In Elixir, Enum works on lists, but there's nothing saying it has to be that way. You can write functions, again as in Witchcraft, where you maintain the structure of the data when you do a map, so you get back the same shape, say a binary tree.

And finally, the one I keep coming back to lately: execution symmetry. A program can be developed on a sequential platform even if it's meant to run on a parallel platform, as long as we're aware of its algebraic properties. If we have properties like associativity or commutativity, then the order in which we do things doesn't matter, so a sequential run is a perfectly valid interpretation of the program, which makes it much easier to test. As a picture: time flows in the downward direction, and we have some events. Event one happens, then we fork the process, and two things happen, including the blue one on the right. That blue event happens sometime in the middle, which means we have a bunch of different timelines, different executions, that are actually equivalent: all three of these straight lines are valid interpretations of this graph. The major difference is that blue can move around anywhere, as long as it stays between the beginning and end nodes.
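A rough sketch of execution symmetry in plain Elixir (the `Flow` module name is made up for illustration): two forked branches share nothing, so any interleaving of them, including a fully sequential run, yields the same answer.

```elixir
# Fork two independent branches, then join. Because the branches share
# nothing, every interleaving is equivalent to the sequential version.
defmodule Flow do
  def run(input) do
    left  = Task.async(fn -> input * 2 end)   # one timeline
    right = Task.async(fn -> input + 10 end)  # another timeline
    Task.await(left) + Task.await(right)      # merge point
  end

  # The sequential interpretation of the same graph.
  def run_sequential(input), do: input * 2 + (input + 10)
end
```

Since addition is commutative and the branches are independent, `run/1` and `run_sequential/1` always agree, which is exactly what makes the parallel program testable on a sequential platform.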
And as long as the orange event comes in before the pink node, we can move these around however we want. We can take advantage of this in things like explicit data flow. Here's a similar picture, except time now moves from left to right, just to make everything fit. In code, we're moving along, then we split into different tracks for the data, and then bring them back together. Here's some actual concrete code that does this, now colour-coded so you can see the correspondence. If you play around with the whitespace and the newlines, you can almost lay the code out to match the picture, and now we have this higher level of abstraction, a picture of the data flow; humans are very good at dealing in pictures.

Next: how modular are modules and libraries? Which feels like a really meta question. Higher-order modules are a concept that's been around for decades, literally 30 or 40 years, but they just haven't made it into the mainstream much. A quick story: I was at a consultancy many years ago, and we had subcontracted the majority of the web development portion of a project to another company, and it was the first Phoenix app they had ever written. The deadline came, and it just wasn't ready. So it got handed back to us, we had about ten days to finish, and we asked ourselves, how do we finish this thing fast enough? We realized it was actually a pretty straightforward CRUD app: what if we wrote some macros that would generate everything? The schema, the types, the controller, the templates, everything, with the ability to override individual functions so you don't just get the default one from the macro. And we needed the ability to swap in different backends or different queries, for example, depending on the context it was running in.
So we created these higher-order modules. Here's the module spec, where we pass arguments to the module itself; as long as it has a `using` clause, we can do all of the construction and imports directly there. In the same way that you have a higher-order function, or a protocol, or a behaviour, this is passing arguments to the module itself to say: here's how you're going to construct yourself, here are the dependencies being passed in. This also gives you hot-swappable dependencies at runtime, not just at compile time. It's very modular: you don't have to fork a project if you want a different dependency. If you want a different JSON renderer, you just pass it into the module and it works.

So how do we even extend the module system in Elixir? Shocker: it all comes back to macros again. This is from TypeClass, which is part of the Witchcraft suite, where we go beyond protocols: if you want to implement protocol X, you need to have protocol Y already implemented, otherwise we fail to compile, because that's part of the guarantees of the system. If you want the dreaded Monad, you need to already have, for example, Functor, and it does that checking at compile time. It's actually not that much code, something like 60 lines. We can do all of this checking because macros aren't just for metaprogramming; they're compile-time functions, so you can use them for literally anything that happens at compile time. Higher-order modules love behaviours, because what we're saying is: this module implements this interface, so when you pass it in, you can rely on whatever that interface promised.
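The module names below are hypothetical, but a minimal sketch of a higher-order module in Elixir looks like this: the `__using__` macro receives arguments, so the caller injects its dependencies at the point of `use` instead of the library hard-coding them.

```elixir
# A "higher-order module": Renderer takes its encoder as an argument
# rather than depending on one concrete implementation.
defmodule Renderer do
  defmacro __using__(opts) do
    encoder = Keyword.get(opts, :encoder, Renderer.Default)

    quote do
      @encoder unquote(encoder)
      def render(data), do: @encoder.encode(data)
      # Callers may override render/1 with their own definition.
      defoverridable render: 1
    end
  end
end

defmodule Renderer.Default do
  def encode(data), do: inspect(data)
end

defmodule MyView do
  use Renderer, encoder: Renderer.Default
end
```

Swapping in a different JSON encoder is then just `use Renderer, encoder: SomeOtherEncoder`, no fork required; `defoverridable` gives the per-function override behaviour described above.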
So if I ask the module for its schema, I can then call functions on that. Again, this feels like a really nice overlap with where the BEAM and BEAM languages already are; we just haven't really used it.

And finally, the last section: declarative embedded DSLs. Hoare said that the highest goal of programming language design is to enable good ideas to be elegantly expressed, and I could not agree more; the limits of your language limit what you can think about, and having expressive languages is a big part of why we're interested in functional programming. There's an argument that everything is a DSL to some degree, and I'd actually agree: even assembly language is a DSL for programming specific, specialized hardware, and your applications are DSLs for end users, very narrow, very specific, and extremely powerful; there's always this trade-off between generality and power. So there's an entire spectrum, all the way from high-level paradigms, functional or object-oriented or patterns, down through what we normally think of as languages, to frameworks and applications. We can construct our own little mini languages, which is what we normally mean by a DSL and how I'll use the term in this section. As a counter-example: even with what a lot of people call a DSL, with the do-blocks and special-looking syntax, it's all valid Elixir. That's not our working definition of DSL here; that's just regular Elixir, just functions talking to each other. Yes, there are some language-like things about it, but fundamentally it's Elixir; we're not doing anything language-like with it. This, on the other hand, is a DSL: a language for constructing sum and product types, structs and variants; sums we don't actually have in Elixir at all until we add them in Algae.
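Algae's `defsum`/`defdata` macros generate all of this for you; as a rough hand-rolled sketch (module names invented for illustration), a sum type in plain Elixir is just a closed set of tagged structs:

```elixir
# A hand-rolled sum type: a traffic light is exactly one of three variants,
# never all of them at once.
defmodule Light.Red do
  defstruct []
end

defmodule Light.Yellow do
  defstruct []
end

defmodule Light.Green do
  defstruct []
end

defmodule Light do
  @type t :: %Light.Red{} | %Light.Yellow{} | %Light.Green{}

  # Exhaustive pattern match over the variants.
  def next(%Light.Red{}), do: %Light.Green{}
  def next(%Light.Green{}), do: %Light.Yellow{}
  def next(%Light.Yellow{}), do: %Light.Red{}
end
```

The `defsum` DSL collapses this boilerplate into a few declarative lines and also derives typespecs and default values, which is exactly the appeal.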
The canonical example: if you want to express the state of a traffic light, you have red, yellow, and green, and I want to say my state is one of these three, not all of them. That's `defsum`. And then we have `defdata`, which is even nestable, so you can express large nested structures; on the right here is maybe a player character in a video game. Instead of spelling out a defstruct with all the fields by hand, we put in default values, generate typespecs, all of that, using this DSL, and sums and products mix together; you can snap them together.

So, what about languages for our business logic? What can we learn from COBOL? Business logic, common business-oriented language, obviously. For those who've only heard the name but don't know what COBOL is: it's very widely used, for example in banking systems that have been around for a while. If you can write it, and you enjoy writing it, it's good work to get; it pays extremely well, because these large systems really depend on it, and vanishingly few people still write COBOL. It looks a lot like this, where the idea is to make it read more like English: the thought was, what if business people were actually able to read the logic as well? With mixed success. But `MOVE SPACES TO WS-USER-FULL-NAME` and the like kind of read like a story, which is actually something we want in a lot of code.

Speaking of stories, here's a big one related to COBOL. Around 2017 I was at a security token issuance company, which is to say stocks and bonds on the blockchain, doing cross-border securities. Fully regulated, and no one wanted to go to jail.
So they hired me to come in and write a programming language that would be readable by lawyers, so non-developers; be formally verifiable; and have static analysis built right into the compiler. We came up with what I like to call the unholy union of COBOL and Haskell. This is a slight simplification, but it really looks basically like this: we can say, hey, this conforms to the following rules, the BC Securities Commission rules, the US SEC, and the Korea Exchange rules, and if the following code doesn't match those, then fail at compile time. Nobody goes to jail, and a lawyer can read this. I've heard from people who've written a little DSL like this and given business teams the ability to configure an application, not in a WYSIWYG admin panel, but literally in code, because it's more powerful. It makes it easier for the devs to ship things; they don't have to worry about the UI layer. And if you have a simple enough, clear language, it actually empowers the non-developers. I think as time goes on we're going to see a real blurring here. Programming is the most powerful tool we've ever built, and we need to find ways to bring more people into it. It doesn't necessarily have to be in text, but the ability to compose new ideas, not just click through a panel, genuinely new ideas and new combinations, is extremely powerful. The upside of all of this is that it's really great for communicating with domain experts who know more than you do. I'm not a lawyer, but we wrote this thing with lawyers, and they could read it: fabulous. These DSLs work well because they form an algebra; they can be pulled apart and put back together really easily. You can make them correct by construction, so it's literally impossible to break things unless you make a syntax error, which gets caught at compile time.
We can check for all kinds of properties at compile time or at runtime. But we have a problem: inflexibility. This is the quick and dirty way that you normally see, right: this is just using the built-in AST, a shallow embedding, and what we can represent is a little bit limited. So this is that same thing, part of the standard library in Algae. Here's a binary search tree, right: we have two options in this sum, we can have an empty node, or we can have a node with some data and left and right branches. Here's how that looks when we actually type it out and create a tree. And that immediately gets expanded to a data structure you can use directly, right, so there's actually no layer in between here; we're literally just writing Elixir in this specialized module, so Ecto, Algae, all of that works in there. A deep embedding is what I sometimes like to call a "better AST", right, or a more specific one. It's three steps: build up your data, the game plan; transform that game plan; and then tear it down, actually run it, interpret it. And so we have this nice loop, this cycle, and then you can layer more cycles on top of that. So that data flow that I showed earlier, right, the one in the diagram, with the splitting and joining, again we can represent purely as data. And now, if we want to, for example, test this, it's really, really simple, right: we look at the data structure, we don't have to run it. We can just say, well, is the result of my syntax what I expected it to be? Yes, and I can just look at it statically, right; it doesn't have to be "well, what's the output?". Literally, here's what will get executed. We can clean up the syntax a little bit, make it a little more friendly, and these are equivalent, so that you're not looking at this deeply nested data structure, right; this will then build up that.
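The build / transform / interpret loop above can be sketched in a few lines. This is my own minimal illustration of a deep embedding, not code from the talk's slides; the plan is just a list of tagged tuples, so we can rewrite it and inspect it without ever running it:

```elixir
# Deep embedding sketch: the "game plan" is plain data, a list of
# tagged tuples, built first and only later torn down by an interpreter.
defmodule Plan do
  # Build: construct the plan as data.
  def build, do: [{:move, :north}, {:wait, 2}, {:move, :east}]

  # Transform: rewrite the plan without executing anything.
  def double_waits(plan) do
    Enum.map(plan, fn
      {:wait, n} -> {:wait, n * 2}
      step -> step
    end)
  end

  # Tear down: hand each step to whatever interpreter the caller supplies.
  def run(plan, interpreter), do: Enum.map(plan, interpreter)
end
```

Testing it is exactly the "look at the data statically" move: assert on the transformed plan itself, no interpreter needed.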
But the fundamental thing that we're working with is this view, not that view, right: we have our own specialized AST that's really specific to the problem. You have this extra layer in between, right, but it's way more powerful and flexible; you can hand it off to different interpreters, and we'll look at that in a second. You get to control the precise vocabulary down to what you need, so we're not working in a Turing-complete language where, you know, anything can possibly happen inside, side effects and all this other stuff, right; we're saying no, in our case it's just data flow, nothing else. It's much simpler to debug than a running function, because you can just look at it, literally print it out: here's the game plan that we're going to be executing. You can do things like time-travel debugging by stopping after every step and saying, oh, that was the state there, really easy. And you're not locked into one canonical implementation, like the limitations of a protocol: if you implement a protocol for integers, then that is the only way integers can get interpreted, right, or for lists, that's the only way lists can get interpreted. With this approach, you can pass it to any other function, higher-order functions, and have it tear down that structure and interpret it in a new, unique way, at compile time or at runtime. So here, using the game analogy again from before, we have a few different things happening, right: we have movement, go north; we have waiting; we have stuff happening on the console, so we're going to set caps and do some printing; we have an NPC concept, a dog, that's, you know, presumably going to walk around and do these things. And we express that, and it gets baked down either into a straight-line program.
So, just to say, these are the things that are going to happen in sequence, and then maybe we'll loop; or we can do something more complex with our custom AST and say there are going to be branching points and decisions along the way as well. And we can make this feel a lot like GenServer, saying that as we're tearing down the structure, as we're going through, we're going to handle things that are text-like with handle_text, and that can, you know, look at different things, or blink, or all these different things, and replace them with actual functions. And we can have it run differently in production and in tests, right: maybe in tests I don't want it to actually start blinking text, I want it to log out "and now I did some blinking text". And the bottom clause there is a catch-all, right: if I can't match anything, I'm just going to return back what I have. Same with the gameplay stuff, right: buy_item, apply_power_up, and handle_movement, and they're decomposed; they don't have to live in one giant tree, they can be in totally separate modules, they can be in different packages. And you can compose them together just with a pipe, and we're going to run down each of these as we're interpreting that data structure and actually run some functions. We have this one extra line here at the bottom, right, that says how to actually interpret it: we're going to map the interpreter over this and then run it. So we can get just the data structure, and then we can separately turn it into actions. So: more flexible than protocols, really easy and simple to think about, just not baked into the language, right; protocols require things to be canonical, there's this one way of doing things, this one implementation for lists.
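A minimal sketch of that prod-versus-test split (my own illustration; the handler and tag names are invented, not from the slides): the same plan data gets torn down by two different interpreters, and the test one ends with the catch-all clause that hands back whatever it can't match.

```elixir
# Two interpreters over the same plan data, chosen at the edge of the
# system. Handler and tag names are invented for illustration.
defmodule Interp do
  # Tear down a plan by mapping a handler over each step.
  def run(plan, handler), do: Enum.map(plan, handler)

  # Production-style handler: wrap text in ANSI blink escape codes.
  def prod({:blink, text}), do: "\e[5m" <> text <> "\e[0m"
  def prod(step), do: step

  # Test-style handler: no terminal effects, just a readable log line.
  def test({:blink, text}), do: "blinked: " <> text
  def test(step), do: step # catch-all: return back what we can't match
end
```

In tests you pass `&Interp.test/1`, in production `&Interp.prod/1`, and the plan itself never changes, which is exactly what a canonical protocol implementation wouldn't let you do.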
So: libraries of well-defined mini-languages, DSLs. And even when you ship an interpreter, have the developer, not the language, right, not the library writer, provide the interpreter, and you can do things like different implementations, different back ends, in tests or in prod, which makes it trivial to mock; you don't even have to worry about running the computation at all, you can just look at the plan. To wrap up: let's make some new mistakes, right. Here are five problems for the next 30 years of the BEAM, hopefully some of them sooner than 30 years, but we're about 35 years on since it started in the '80s, so let's get into some new trouble for the next couple of decades. Five things I think we should focus on. One: Wasm, right, client and edge BEAM. We can always be relevant in the actual networking space, but there is a strong move to Wasm and WASI serverless functions, and if we want to continue to be relevant we need to have some sort of story about those. Two: we need to lower the barrier to code, so low-code/no-code, which is still code, right, bringing everybody else along with us. Everybody will be a programmer in the future; 30 years from now everybody will have some exposure to this. Three: better correctness tools. The fact that there are data breaches, from individual applications all the way up to really, really scary things like flight control systems and everything in between, means we need better static and dynamic analysis tools, formal methods. We need to put more time into things like Dialyzer and get them even more mature. Four: automatic parallelization, evaluated dynamically. Humans are really bad at this; machines are really good at detecting these things. So let's hand this off to the machine.
We're doing things at a really coarse granularity right now on the BEAM. And five: mobile agents. Today we move data to compute, do some computation over there, and keep the results. Instead, we're going to suspend computation partway through and ship it around on the internet, or just say, hey, I want that function and I'm going to run it locally here, and even provide a proof that it was done correctly, using techniques we now have, called SNARKs. Mobile agents are going to be extremely important, not in 2022, but sort of 2025 and onwards, as that becomes more and more mature. So we need to have some story for moving literally suspended computations around, which works really well with that deep embedded DSL direction. And for final parting thoughts, again from Perlis's epigrams on programming: dealing with failure is easy, work hard to improve; dealing with success is also easy, you've solved the wrong problem. So, let's work hard to improve. Thank you. Thanks, Brooklyn, that was awesome. I know we slightly overshot, but I didn't want to interrupt your flow; I think you did a great job. Thank you so much for stitching all these things together, and I'm glad you talked a bit about APL and some of the other stuff, because I'm sure people in the audience would be excited; there is a growing APL community here, so thanks for doing that.