Hey everyone, my name is Ryan Lemmer. I was here in 2014 for the first Functional Conf, and this is my first time back, so it's great to be back. At that time I did a code jugalbandi, the first code jugalbandi, with Dhaval Dalal, and incidentally today is the second one: he's doing one with Morten Kromberg, so be sure to catch that, it's going to be great.

My talk today is about Haskell and making sense of the type system. It's a talk about my experience of going from what I would call core Haskell, or Haskell 98, to the kind of Haskell you will find in modern code bases, and there's quite a gap. I came across Haskell about ten years ago. Someone showed me a few lines of code and asked me if I knew what language it was, and I got hooked, and read a few intro books. It gave me a good education in functional programming; it's a great education in functional programming, and I really recommend it for that. But then you kind of hit the doldrums. There's a big, vast Sahara desert between the functional programming and core type system of Haskell 98 and the kinds of types, and uses of the language, that you will see in modern code.

I haven't used Haskell for longer than about a six-month period at a time, so it's like an old friend that I don't get to see often enough, but it's been long enough to look around it a bit. So: I was stuck, and in 2015 I wrote this book. It's a little book, somewhere between a little book and a long tutorial, layered from Haskell 98 into more modern aspects of the language. I finished it almost to the day two years ago, and I only read into it again recently, for this talk, and it kind of still stands.
I wrote it when I was two years more ignorant than I am now about these things, and I would recommend it if you find yourself in that gap between Haskell 98 and wanting to make sense of the type systems, or uses of types, that you see out in the wild. When I read it now, there were a few things that made me raise my eyebrows a little, things I would say differently now, but I can assure you nothing in it will scar you for life. So I would still recommend it.

Today's talk is pitched at a dual level: Haskell is not a broadly used language, so I've tried to pitch it so that, if you don't know Haskell at all, you can hopefully still make some sense of it.

In all statically typed languages you get this fact: types classify terms. Terms are values, and types are families of values; you could look at it like that. This is all pretty obvious. Now, kinds of types. This is the type I'm going to carry around a little bit in this talk: the Maybe type. Maybe Something is a way to express the optionality of a term, so a Maybe Char could be Just a Char, or Nothing. The Haskell type system is amazing. If you haven't seen it before, if you're used to Java or other statically typed languages, it will blow your mind, although ten years later it will blow your mind less; those languages have really drawn in a lot of the great things about Haskell, some of them.

But still, in Haskell, types are not first-class. You can say x equals "hello" (and this is true for most statically typed languages), but you can't say x equals the type String; the compiler doesn't know what you mean. You can do equality on terms, but you can't say "if x equals the type String". You can pass values to functions, but you can't pass types to functions, and you can't return types from functions.
And so, while the functional aspect of Haskell is very high-level, in the sense that functions are first-class, types are not. The question then is: what classifies types? I'm going to dip in and out of some code as we go along. There's the Maybe type, and I've rewritten it there as Maybe a, the generic type: it's either Nothing or Just something. I can ask for the type of Just "some string", and the interpreter says it's a Maybe String (String is the alias for a list of Char). But I can't ask what the type of Maybe is. It doesn't make sense, because types don't have types, in a literal sense, here. But I can ask for the kind of a thing. If I ask for the kind of Integer, it will just say star, which just means "type", any type. The kind of String is just a type, star. The kind of Maybe String is star. That's quite different from the term level: if I ask for the kind of Maybe, well, that's star to star. Now it's saying Maybe is a type that takes a type. It's a type constructor: if I give it a type, then it returns a type. So Maybe is not a type, technically; it's a type constructor, but Maybe String is a type.

If I look at type classes in Haskell: type classes are similar to interfaces. Here I've got the type class Show, which defines, essentially, the toString interface for any type, and if I ask for the kind of Show, it says star to Constraint: Show takes any type and returns a constraint. So kinds classify types, to some extent, in a very blunt way. I cannot, for instance, say Maybe Maybe, because Maybe takes one type, but Maybe's kind is a higher-order kind. So while the kind system is very blunt, it just has these arity specifications.
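Since the slides aren't reproduced here, a small sketch of the term/type/kind layers; the `:kind` results in the comments are what GHCi reports, and `Perhaps` is a hypothetical clone of `Maybe`, used so its definition is visible:

```haskell
-- Perhaps is a stand-in clone of Maybe, so both levels are visible.
data Perhaps a = Nah | Yep a deriving (Show, Eq)

-- What GHCi's :kind reports:
--   :kind Int            *                (a plain type)
--   :kind Perhaps        * -> *           (a type constructor)
--   :kind Perhaps Char   *                (applied, a type again)
--   :kind Show           * -> Constraint  (classes make constraints)

-- At the term level, for contrast: constructors classify values.
example :: Perhaps Char
example = Yep 'x'
```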
Star is just a single type; star to star is a type that takes a type; you can get types that take multiple types; and then there's Constraint. These are, mainly, the only three kinds that the Haskell type system can distinguish. So we can already start seeing cracks in the facade here. The kind system is untyped (the language gets a bit tricky here); you could also say it's not kind-safe, it's only arity-safe. It only speaks of how many: is it a single type, is it something that takes a type, or is it a constraint?

So I want to show you an example that I'm going to carry right through this talk. And... okay, this is what happens when you do dry runs of the code before you speak; I warn you against that. I just want to show you the list type that you all know. That is the algebraic data type form of a list: a list is the empty list, or it's a cons of something onto another list. And so there is a term of a list. I want to do something a bit more interesting: I want to create a sized list, a list that has its size in the type. So here's a data type, Vector n a; the n is going to be a type that describes the size of this vector. At this point it's not going to help me much: if I ask what the type of this term is, it will just say it's a Vector n String, and so will this one. So n is still not a very useful thing. I want to do something like this: I want to say Nil has the type of a vector of size zero, it's an empty list. But I can't do that, because zero is a term, not a type. So I need a type, and the common way to do this is something called Peano numbering. It's an idea from the 19th century, actually; it's just a recursive way of describing natural numbers.
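As a sketch of this first, unchecked stage (the names are mine, not the slides'): Peano numbers as empty data types, and a vector whose size index is a phantom type that the compiler does not yet police:

```haskell
{-# LANGUAGE EmptyDataDecls #-}

-- Empty data types standing in for Peano numbers: no terms, only types.
data Zero
data Succ n

-- n is a phantom type: it labels the size but constrains nothing.
newtype Vector n a = Vector [a] deriving Show

nil :: Vector Zero a
nil = Vector []

-- Nothing stops nonsense indices: the annotation below is accepted,
-- even though Succ Bool is meaningless as a size.
bogus :: Vector (Succ Bool) Char
bogus = Vector "uh-oh"
```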
You start with zero; one is the successor of zero, two is the successor of that, and so on. So these are two algebraic data types, Zero and Succ, that we're going to use now in our Vector type. So I can say Nil is a Vector of type Zero, and this one is a vector of size one. We read it like this: successor of zero is one, successor of successor of zero is two; I'll just speak like that from now on. But this is still quite wild and rough, because I can say successor of False and the compiler is perfectly happy with that; it will show me Succ Bool. So how do we get more precision with this type?

That's where generalized algebraic data types come to the rescue, and really it's a very simple idea, but it's got profound implications, and it's used very broadly; you literally cannot leave the house without GADTs if you're working in Haskell. So instead of the algebraic data type I showed you earlier, we now have refined types for this. I can say Nil has the type of a vector of size zero, and, this is a bit more interesting, Cons takes a value of a type and a vector of size n, and it returns a vector of size n plus one. Which is quite amazing, because now, if I ask what the type of this is, the interpreter can tell me this is a sized list of size one, and here's one of size two. So as you're using the data constructor Cons, it's building up the type label; it's keeping track of the size of your list. And now I can't do what I did before.
Earlier I faked the size of the Nil list, and now I can't, because the compiler will tell me: I wanted a type of this but I got a type of that; it should be Zero, but I got a Bool. So things are getting a bit more kind-safe, type-safe.

And now I can do some things with this type. I can write a more refined zip: I can say that if I zip two vectors of the same size, I get one of the same size, and in fact, if I then try to zip things of different sizes together, I will get a type error. It says: I expected this type, a vector of two Chars, but I got a vector of one Char. So the type system is now catching a whole lot of things that you would typically have to catch at run time, or have preconditions and postconditions for. Similarly, if I want to write the tail of a vector, I can now say in the type that tail only makes sense for something with a size of at least one; successor of n is at least one, because it's either one or more. And in fact, if I try to define a clause for the Nil case of the vector (that's what we got the error for earlier), I'll get a type error: I couldn't match this, I'm looking for at least one here, but you're giving me zero; this function doesn't work for a vector of size zero. So that's quite interesting: the type system can start doing stuff for us that is just not possible otherwise, well, not possible without actual term-level code.

So that's all good; we've solved Vector a bit. But with successor I can still botch it: I can still shove any type in there, and the compiler will be perfectly happy with that. So let's just look at what we've done there. In the ADT, the algebraic data type, that n was free: I could just put any type in there, so it wasn't very type-safe, and similarly I could shove any junk into successor, and the compiler couldn't really help me. With a generalized algebraic data type,
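The GADT stage described above can be sketched like this (again with my own names, not the slides'); the refined constructor types are what make the safe tail and zip possible:

```haskell
{-# LANGUAGE GADTs #-}

data Zero
data Succ n

-- Each constructor now fixes the size index in its result type.
data Vec n a where
  VNil  :: Vec Zero a
  VCons :: a -> Vec n a -> Vec (Succ n) a

-- tail accepts only non-empty vectors; a VNil clause would not type-check.
vtail :: Vec (Succ n) a -> Vec n a
vtail (VCons _ xs) = xs

-- zip demands equal sizes; mismatched sizes are compile-time errors.
vzip :: Vec n a -> Vec n b -> Vec n (a, b)
vzip VNil         VNil         = VNil
vzip (VCons x xs) (VCons y ys) = VCons (x, y) (vzip xs ys)

-- Forget the index to inspect the contents.
vtoList :: Vec n a -> [a]
vtoList VNil         = []
vtoList (VCons x xs) = x : vtoList xs
```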
I've essentially taken this, our vector with the constructors Nil and Cons, and I've connected each constructor with a type: Cons is at least size one, Nil is always size zero. So I've constrained that n in Vector n by using generalized algebraic data types.

But what if I wanted to do something now like append? I append a list of size n and one of size m, and the resulting type should then be n plus m. But I can't use regular plus, because regular plus is defined at the term level. What I want to write, conceptually, is something like this: I want to write add, which is a pretty straightforward Haskell definition of plus. But what would the signature of that type-level function look like? How you do that in Haskell is with something called type families. I'm going to use the words "type families" and "type functions" interchangeably, but I will show you why they are called type families. So there are our Zero and Succ, and the way you would write a type family (a type function, you can think of it as, for now) is in this pretty awkward syntax: type family Add, and then the clauses of what would be term-level definitions, but on our type instances. Other than that syntax on the left side, it looks pretty much like a function. That only came in 2008, actually, around when I met Haskell, and in 2014, pretty recently, when I was here last, this closed type families syntax came, so we can write a function like this. There are quite good reasons to use either form for different reasons, but I'm going to use this closed syntax from now on. So now I can ask things like: what is the kind of Add, the Add function? And it says it's something that takes a type and another type and returns a type. What is the kind of Add Zero?
Now that just takes one type, because one type has essentially been bound, so it looks like currying at the type level, and I can do that. And then I must just show you something very trivial, but actually quite interesting if you haven't seen it before. If I ask about one plus one, it just says: well, that's a type, star. But there's a nice way to ask the interpreter to actually do the type-level computation for me, and lo and behold, there we go: one plus one equals two. This is type-level arithmetic; it's happening at the type level, at compile time. Martin Thompson spoke about efficiency earlier, by the way, and when we talk about type-level computation, we're talking about extreme inefficiency. I'm going to ignore that for this talk; it has actually great applications, but not for high-performance computing.

So let's use this type function in an append. There's my generalized algebraic data type, and I've got an append function there that uses this type-level Add function. It's interesting to note here that, again, the function definition is completely oblivious to what's happening above it at the type level. You just define append like you normally would, but at the type level, through the generalized algebraic data type constructors, it's working out (it can't do it any other way) what the correct type is, and it's doing this type-level arithmetic in the background.

That's all cool, but when I add, I can add junk. I can say Add Int Bool, and let me just show you: if I ask it to resolve that, it just says, if you add Int and Bool you get Add Int Bool. That's fun, and it shows the difference between type families and type functions. So, just to recap: we can do proper type arithmetic here, but we can do whatever we like with the types.
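A sketch of the closed type family and the size-tracking append, under the same hypothetical names as before:

```haskell
{-# LANGUAGE GADTs, TypeFamilies #-}

data Zero
data Succ n

-- A closed type family: type-level addition on the (still unkinded)
-- Peano types. Each clause becomes an equality axiom the compiler
-- uses to rewrite types.
type family Add n m where
  Add Zero     m = m
  Add (Succ n) m = Succ (Add n m)

data Vec n a where
  VNil  :: Vec Zero a
  VCons :: a -> Vec n a -> Vec (Succ n) a

-- The definition is the ordinary list append; the type level tracks
-- the size with Add, clause for clause.
append :: Vec n a -> Vec m a -> Vec (Add n m) a
append VNil         ys = ys
append (VCons x xs) ys = VCons x (append xs ys)

vtoList :: Vec n a -> [a]
vtoList VNil         = []
vtoList (VCons x xs) = x : vtoList xs
```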
There's no kind safety. When we added junk types, the compiler didn't complain, because what's essentially happening is that type families are not actually functions; they are equality axioms. If I write a clause like Add Zero n equals n, it's essentially compiled to an equality axiom, and it's saying: if I see Add Zero n, I can reduce it, I can simplify it, to the right-hand side. And that's why, when you add Int and Bool, it just leaves it as that: it has no way of simplifying it. So this type arithmetic, while it looks like function application, is actually a play of solving equations at the compiler level.

So with our generalized algebraic data types we solved the kind-safety issue with the n in Vector, but we still have this other problem: with successor I can still put anything in there. So it would be nice if I could define a unifying natural number type that unifies Zero and Succ; then I could say those two are natural numbers of kind Nat, and then I could start constraining things more. And how you do it: when I saw this first, I thought, well, that's pretty crazy, but let me tell you about it; it's a good hack, or I don't know if it's a hack or a great technique. What you do is define a regular type, data Nat, with terms Zero and Succ, and then you promote it. You shove it all up a level, you lift the whole thing up a level: Nat becomes a kind, and Zero and Succ become types. And if you want to distinguish between types and terms, you can always use an apostrophe, like this, to make it even clearer what's going on. I can also promote Boolean: data Bool is just True or False, and I can promote that to kind Bool and types 'True and 'False. So why on earth would I want to do that? Let me show you some applications of that. Type promotion is... I haven't mentioned these things at all: these are language extensions, and language extensions are compiler-specific.
You won't find them in the language spec at all, and that's part of what makes it a bit bewildering to go from Haskell 98 into real-world Haskell: language extensions are broadly used, but they're not defined and documented in the same way as the Haskell 98 core, and the only way to really learn about them properly is to read the papers. Or that book I wrote, although that just covers a small number of them.

So there's our Nat data type, a regular data type with terms Zero and Succ. But now, if I ask the type of the term Zero, it will say: well, that's a natural number. And if I ask the kind of the type 'Zero ('Zero is now the lifted term Zero), it will say: that's Nat, a natural number. So now my type-level function is using these promoted types. I don't even actually have to put the apostrophes in; the compiler can completely generally figure out whether you mean the type or the term. And now, if I try that junk Add, I get a type error. It says: I expected kind Nat, but you gave me something else; you can only give me types of kind Nat, which are the types 'Zero and 'Succ. So now we're getting more type safety; kind safety, sorry. And the Bool example I was showing you: now I can define a type-level function that asks, is this type 'Zero? And for the type 'Zero, the answer is the type 'True. So we've got all the regular things we know, happening now at a higher level. And now I can have increased kind safety here with Vec n: I can be explicit and say that n, I mean it to be of the natural number kind, so I'm constraining it to the two types 'Zero and 'Succ.

That's all great, but there are still limitations, because if I wanted to write a length function for this vector, I just can't do this.
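A sketch of the promoted version: with the DataKinds extension, one declaration gives both the term-level Nat and a kind Nat containing exactly the types 'Zero and 'Succ, so a junk index is now a kind error:

```haskell
{-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

-- One declaration; DataKinds also promotes it to a kind Nat whose
-- only types are 'Zero and 'Succ.
data Nat = Zero | Succ Nat

-- The index is annotated with kind Nat, so Vec ('Succ Bool) a
-- would now be rejected as a kind error.
data Vec (n :: Nat) a where
  VNil  :: Vec 'Zero a
  VCons :: a -> Vec n a -> Vec ('Succ n) a

two :: Vec ('Succ ('Succ 'Zero)) Char
two = VCons 'h' (VCons 'i' VNil)

vtoList :: Vec n a -> [a]
vtoList VNil         = []
vtoList (VCons x xs) = x : vtoList xs
```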
I can't say Vec n a gives n, even though that's exactly what you would expect: just take the n out of that type and give it to me as the result. I can't do that, because you can't return types from functions, raw types. But I can do this: I can say, well, okay, return a natural number; Nat is now a type in this context. But this doesn't help me much, because I can put a bug in the definition there and say the length of Nil is one, and the compiler is totally cool with that, because it's not connecting these two dots. And that's really what dependently typed programming is about: I want this type to depend on that type; I want types to inform later types, downstream in a type signature.

So let's soldier on and see what that might look like. With our type promotion of Nat, I've got kind safety: in Vector n a the n is kind-safe, and Zero and Succ are unified by this Nat kind, all good. But we still don't have dependent types, so I can't have the length function I showed you. Another example: if I take just the normal replicate function that we use quite generally at the term level, give it a number and a term, it will give you a list with that number of copies of that term. I can't do this: I can't say, give me something that takes a natural number n and returns a Vector n a. Either way, I can't have this dependency. And the way you hack it, and here the air is going to get a little bit thin if you don't know Haskell, but I just want to give you the gist of the idea: after we've promoted this type, we actually mirror it again at a lower level. We write a regular type to wrap the types of that kind. We have regular terms; the S is for singleton, because this is called the singleton pattern. And so we bind regular terms to types, using generalized algebraic data types again.
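To make the gap concrete, here is roughly the bug being described: a length that returns a plain term-level Nat type-checks even when it lies, because nothing ties the result to the index n (definitions repeated from the earlier sketches):

```haskell
{-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

data Nat = Zero | Succ Nat deriving (Show, Eq)

data Vec (n :: Nat) a where
  VNil  :: Vec 'Zero a
  VCons :: a -> Vec n a -> Vec ('Succ n) a

-- This compiles, yet the first clause is a lie: the term-level result
-- has no connection to the type-level index n, so the compiler cannot
-- catch the bug.
vlengthWrong :: Vec n a -> Nat
vlengthWrong VNil         = Succ Zero   -- bug: empty vector "has length one"
vlengthWrong (VCons _ xs) = Succ (vlengthWrong xs)
```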
We're binding terms to types. And singletons at first look like a terrible hack, and the more you look at it, it's very ingenious the way it uses the type system and so on, but it's a hack. So let me show you what that looks like; this is our last little code excursion. There's our natural number type, and I'm using DataKinds, so everything that can be promoted will be promoted. There's my type function Add and my Vector type, and here's the singleton type: it connects the term SZero to the type 'Zero, and SSucc to 'Succ. And now I can write my length function like this: I can return a wrapped type with a singleton, and it's easy to define. And now, if I try to define the length of Nil as one, I will get a type error, because I've given the type system a way to let one type depend on another; now it is connecting the dots. My replicate function, similarly: I can now take a wrapped natural number and return a vector of that size. We've got dependent typing. And something to note here: on the left-hand side, while I'm pattern matching on terms, those terms, because of the singleton, map isomorphically to the types, and so when I pattern match at the term level I'm actually pattern matching at the type level. So this is dependent pattern matching; faked, but it works. And I just want to show you this type signature here: if I ask for the type of the partially applied function there, replicate of zero, it will say it's going to take whatever type, but whatever you return is going to be of size zero.

Singletons: I've now just written one by hand, but there is a singletons library, and it uses Template Haskell, metaprogramming in Haskell. It uses a template splice there, so you can think of it like a macro, and in that macro I'm just defining regular term-level stuff: the natural number type you saw earlier and a regular add function. And this will
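A hand-rolled sketch of the singleton pattern as described: SNat mirrors each type of kind Nat with exactly one term, which is what lets length return the index and replicate consume it:

```haskell
{-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

data Nat = Zero | Succ Nat

data Vec (n :: Nat) a where
  VNil  :: Vec 'Zero a
  VCons :: a -> Vec n a -> Vec ('Succ n) a

-- The singleton: exactly one term per type of kind Nat, so matching
-- on the term also refines the type.
data SNat (n :: Nat) where
  SZero :: SNat 'Zero
  SSucc :: SNat n -> SNat ('Succ n)

-- length now returns the index itself; a clause like
-- vlength VNil = SSucc SZero would be a type error.
vlength :: Vec n a -> SNat n
vlength VNil         = SZero
vlength (VCons _ xs) = SSucc (vlength xs)

-- replicate: the size of the result depends on the argument's value.
vreplicate :: SNat n -> a -> Vec n a
vreplicate SZero     _ = VNil
vreplicate (SSucc n) x = VCons x (vreplicate n x)

-- Helpers to observe the results at the term level.
snatToInt :: SNat n -> Int
snatToInt SZero     = 0
snatToInt (SSucc n) = 1 + snatToInt n

vtoList :: Vec n a -> [a]
vtoList VNil         = []
vtoList (VCons x xs) = x : vtoList xs
```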
actually all be promoted: there will be an add function at the term level, there will be a singleton version of it with the wrapped kinds, and there will be a type-level function Add. So it will generate all of that, and it will generate the singletons and so on. There's our length function now, and the subtle difference there is that we use Sing instead of SNat. But the singletons library does a whole lot of other things for you, and it starts making more interesting things possible.

Because think about the length of a vector: why should I even define that, when it's in the type? That's the whole point of a sized vector: if I give you a sized vector, I shouldn't even have to implement length; the type system should implement it for me, because in this definition I'm just walking the vector and doing it the hard way. So how would you do it? Here I just want to show you something really crazy; don't take it in, my real point is: look at how messy and syntactically noisy this is. I can actually ask the compiler to just grab the number out of the type, and this is the syntax for that. But now I'm still working with singleton types, wrapped natural numbers, and so I can simplify some more with some fancy footwork in the singletons library.
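For the "grab the number out of the type" step, GHC's built-in type-level naturals (GHC.TypeLits) give a short illustration; KnownNat plays roughly the role the singleton machinery plays above:

```haskell
{-# LANGUAGE DataKinds #-}

import Data.Proxy   (Proxy (..))
import GHC.TypeLits (natVal)

-- The 3 exists only in the type; natVal reflects it down to a term.
three :: Integer
three = natVal (Proxy :: Proxy 3)
```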
I just want to show you how messy it gets, with that slide, and that's the end of our code. So faking it gets pretty messy, and it would be nice, you could easily imagine something more like this, where I just cleanly ask the compiler to grab that n out of there and return it. Haskell is absolutely heading in that direction; in fact, there are experimental extensions now that go well towards this, though it's still razor's edge at this point. In 2015, a guy, Richard Eisenberg, under the supervision of Stephanie Weirich, who is a big Haskell player, wrote a thesis that actually shows exactly how to make Haskell more dependently typed. There's a lot of work to be done, and it's already somewhat later than I expected after I read that, but it's expected that in Haskell 9 things will get more like that, and we will really be freed from hacking with singletons.

But why would we bother with all this type-level programming? There are a lot of practical implications. Append you saw, but also look at indexing into something.
We've all written that a million times. It would be nice to constrain the type of the number you pass in, so that you don't get out-of-bounds errors; well, you will get an out-of-bounds error if it's out of bounds, but the type system will prevent it, so you don't have to put preconditions in your code. Transposing matrices: it would be nice to say m by n goes to n by m. And if I take something from a list, it would be nice to be more precise about what the sizes of those vectors are. Another thing that I haven't gone into at all here is printf. The printf function is famously difficult in a typed language like Haskell, because the format string informs the types, and even the arity, of the function, so you can't do that at all with regular Haskell. That becomes possible with dependently typed programming. Also, there's this very pervasive thing where we have to write multiple versions of functions for multiple arities; that kind of thing can get nicely unified with dependently typed programming.

So, Idris. If you've ever seen Idris, you might have thought: why are you going to all this crazy trouble? Because in Idris, that's all you have to do. Everything, all the jazz I spoke about, is a non-event in Idris.
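The bounds-safe indexing mentioned above is usually sketched with a "Fin" type (the construction appears in Idris's standard library; the Haskell rendering and names here are mine): Fin n has exactly n values, so an index can never be out of range:

```haskell
{-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

data Nat = Zero | Succ Nat

data Vec (n :: Nat) a where
  VNil  :: Vec 'Zero a
  VCons :: a -> Vec n a -> Vec ('Succ n) a

-- Fin n: the natural numbers strictly below n; Fin 'Zero has no values.
data Fin (n :: Nat) where
  FZ :: Fin ('Succ n)
  FS :: Fin n -> Fin ('Succ n)

-- Total: no VNil clause is needed (or possible), because no index into
-- an empty vector exists. Out-of-bounds access cannot even be written.
index :: Vec n a -> Fin n -> a
index (VCons x _)  FZ     = x
index (VCons _ xs) (FS i) = index xs i
```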
I can just write types in terms, and term-level functions over types; I can do that, because types and terms are all unified in Idris. Idris is a Haskell-like language, and in fact, how you install Idris is as a Haskell package. So Idris is kind of this beautiful, shining, polished gem of where Haskell is heading, although Haskell will never quite reach it.

There are other dependently typed programming languages, fully dependently typed. Coq is the oldest one, from 1984; very old, it precedes Haskell by a good few years. And Coq, by the way, won the ACM Software System Award in 2013, almost thirty years after it was born. Other winners of that award are the TCP protocol, Unix, the TeX typesetting system, the World Wide Web; so it's in good company, and I find it very interesting that thirty years later, it seems that the time has come for this in a different way. It's been around, and it's been used in academia for a long time, but there is a crossover feeling in the air; you can really feel it's beginning to happen now.

Idris, Coq and Agda are what you call total languages, and I'll explain that now; Idris is partially total. Basically, you move from regular programming into what we would call proof systems, so totality is worth mentioning. The moment we have type-level functions, we have compile-time computation, and the moment you're computing, you have the halting problem: you might have things that recurse and never terminate. So with dependently typed programming you move the Turing halting problem from run time to compile time, and this issue of termination is a very big thing. How Coq and Agda deal with that is that their type systems are so sophisticated that the compiler can enforce totality: you cannot write recursive functions that will not terminate. And how does that work?
Well, what it does is it tries to prove that if you do have recursion, every step converges, and so the whole thing will converge. Obviously that limits the generality of the language; that's why we don't use Coq and Agda for everyday programming. Idris is somewhere in the middle: Idris can prove totality for a large class of functions, which is very amazing; I never knew that was even possible before I came across these languages.

The other issue that's really worth looking at here is the Curry-Howard correspondence, if you've heard of it. It's the idea that the type level and the term level are isomorphic: everything you can define in a lambda calculus, you can define at the type level, and vice versa. Haskell Curry actually discovered it first; well, him and Howard, who saw it in different aspects, about thirty years apart. Curry-Howard really refers to this family of isomorphisms between terms and types. And another way of looking at it is that whenever you write a program with a type, the program is the proof of its type: the type is the theorem, the program is its proof, if it type-checks.

With dependently typed programming, it's worth thinking about how this relates to testing. Testing shows the presence of errors; proofs show the absence of errors. So that sounds great, and you would think, wow, we don't need to test. But in practice it's going to be somewhere in the middle; I think there are large classes of things that we shouldn't be testing, that we should be proving with type systems. By the way, Morten and I were talking earlier, and he was saying, you know, the only thing types are good for is to prevent you from getting anything done. And I know why he says that. I'm not a type junkie, but I kind of fell in love with Haskell types.
I fell out of love with other type systems; C# types, I horribly fell out of love with. Then I saw the Haskell type system and thought, wow, this is a type system I could live with, and then I kind of reached its limitations and got sobered up about that. And I must say, now, with dependently typed programming, I'm getting a bit drunk again, and I'm thinking, wow, maybe now is the time for types, finally. So if you have criticisms about types, as I think anyone has who has worked with them, it's worth looking at dependently typed programming. It's a new paradigm that actually shifts the whole thing; it's a whole new ballgame again, and all your old criticisms, you need to kind of update them, to keep your arguments fresh.

Okay, so, a quick historical sweep of that. Haskell 98 was actually finalized, I think, in 2000, but documented in '98. GADTs, 2003; type families, 2008; type promotion, very recent, 2012; singletons, also 2012; type promotion went through a few iterations, the last big iteration when I was here last, very recent; and this dependently typed thesis was written at the end of 2015, I think that's when it was published. So here we are. I'm being kind of conservative here; maybe Haskell gets more dependent before then. Haskell will never have totality like Idris does. And then you might wonder, well, why would you not just switch to Idris? Haskell is just a huge ecosystem; it's a working language. Idris is at this point much more limited in its scope. But I would say, for the same reason as you should look at Haskell for the most concise expression of functional programming,
I Think Idris has the most concise expression of dependently type programming if you want to learn dependency type programming in Haskell you will work much harder than just taking Idris in although Unfortunately, I think you cannot learn Idris without knowing Haskell a bit because it's the same syntax and then I wonder if let's say another 10 years or so if Static typing and dependent typing aren't going to become synonymous because dependent typing even partial is so much more powerful if you feel it if you experience it you will Have the same experience as you do when you feel first-class functions for the first time. It's that same feeling I get It's it's it's like a tectonic shift and So, you know the more I learn the more I think this is this is what I feel about software. I Think we're just beginning. I mean APL Erlang here. These are different universes different laws of physics And we're just about to begin So that's my story. Thank you any questions Complain Yeah, yeah So I don't know actor well enough, but I'll tell you this if you read which I'm which I think we spoke about the other day It sounded like you you have gone into Idris somewhat Idris has taken in Agda. So he's read that last big Right of Agda was 2007 and so it's about then five Plus years later that Edwin Brady took in all of that and so my sense of it is that it's a it's a Modernization, but it's partial. So as you say you can fake partiality in Agda also. So I Think also what? So there's that it's more modern. 
It's newer, which by no means means it's better or anything, but I think it will have learned from some of those mistakes. And partiality is built in; you don't have to bend over backwards to get partiality. I think partiality instead of totality is very important in practical programming, because full proof systems have their uses, but they're not generally useful for most of the things we do. So I think that making partiality first-class is maybe a big new thing, or not a new thing, but a better twist on it, perhaps, for the kinds of things we would want to do with it. And the other thing I always find interesting about languages is that there's more to it than the syntax and the language itself, because languages come with their ideas. So Clojure didn't invent immutability, but it expresses it very nicely, and I think Idris expresses type-driven development very nicely; it's a very strong statement of that. So I factor that into whether you should go with this one or that one; that's what drives my decisions. Cool, anything else? I hope I didn't put you entirely off dependently typed programming; check it out when you want to learn types. Cool, thank you.