All right, hello everyone, I'm George Wilson. I work at the Queensland Functional Programming Lab, which is in Australia, under Data61. I am here today to talk to you about laws. Yay! All right, cool. So, I'm going to tell you all about laws today, and I'm going to be using Haskell. In Haskell, we use type classes as our primary means of abstraction. And one really interesting thing that we do in Haskell, which I don't see in a lot of other communities and which I think is really valuable, is that we attach laws to our type classes. These are invariants or properties that any instance of the type class must obey. Laws give us a lot of power, a lot of advantages, when we use our abstractions or think about our abstractions. So today I want to tell you all about laws and why I think they're so amazing. Quite often you'll go to a forum about Haskell and they'll tell you about some amazing type class. They'll say, oh my goodness, you have never seen anything as brilliant as Comonad or Traversable or whatever the thing of the day is. And they'll say, all right, this is Comonad, it can do all this stuff. It's got these laws, don't worry about them too much. Oh my goodness, you can do this stuff, right? This talk is intended to be the opposite: I will do nothing but worry about laws. So let's get right into it. This is Monoid. It's a type class that's quite prevalent in Haskell. It's in the base library, so as soon as you start writing Haskell, you have access to it. Monoid is a type class with two members. One of the members is the operator (<>), "less than, greater than", which I will pronounce "append". What append does is take two things of type m and combine them in some way to produce a third. So append is good for smashing stuff together, basically. And then we have mempty: M for monoid, and then "empty". I'll generally just pronounce it "empty" because that's easier to say.
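In code, the class just described looks roughly like this. It's a simplified sketch: in today's base library, (<>) actually lives in the Semigroup superclass of Monoid, but the two members are exactly the ones named above.

```haskell
import Prelude hiding (Monoid, mempty, (<>))

-- A simplified sketch of the Monoid class as presented in the talk
-- (modern base splits (<>) out into the Semigroup superclass):
class Monoid m where
  (<>)   :: m -> m -> m   -- "append": combine two values of type m
  mempty :: m             -- "empty": the neutral element
```

The Prelude hiding line is only there so this sketch doesn't clash with the real class from base.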
And this is what it means to be a monoid, except there are also laws that you must obey to be a monoid. These are the three laws: the left-identity law, the right-identity law, and associativity. The left-identity law says that if you append empty onto something, let's call it y, the result must be equal to y. So appending empty doesn't do anything. We call empty the identity element, or the neutral element; there are different names for it. The right-identity law is fairly similar, except flipped around: if you append empty on the right-hand side, that should also do nothing. So empty, appended on either side, changes nothing. Then we have associativity. Associativity says that if you're going to combine three things, call them x, y, and z, then combining x with y first and then combining that result with z should give you the same answer as combining x with the result of combining y with z. Another way of thinking about that is that we can add and remove parentheses wherever we want and we won't change the result. We won't change the program. All right. Let's look at some examples of monoids just to make sure we know what's going on. Addition is a great example of a monoid. Here we can make a data type called Sum, and Sum simply wraps up an integer. We can give a monoid instance for Sum that says: if I've got two integers and I want to combine them, I combine them with addition; and if I need an empty integer, the empty integer with respect to addition is zero. Now let's look at our laws in terms of addition. The left-identity law, in terms of addition, says that if you add zero to y, that's the same as y; the right-identity law says that if we have x and we add zero, we get x. So we can look at some examples: if we add zero to five, we get five; if we add zero on the other side of seven, we get seven. All right, so we're looking good for left and right identity.
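A sketch of that Sum type and its instance, written against today's base library (where the append half goes in a Semigroup instance; Data.Monoid ships a more general version of this under the same name):

```haskell
-- An integer-wrapping newtype whose monoid is addition:
newtype Sum = Sum Int deriving (Eq, Show)

instance Semigroup Sum where
  Sum x <> Sum y = Sum (x + y)   -- append is addition

instance Monoid Sum where
  mempty = Sum 0                 -- zero is the neutral element

-- Spot-checking the laws:
--   mempty <> Sum 5  evaluates to  Sum 5   (left identity)
--   Sum 7 <> mempty  evaluates to  Sum 7   (right identity)
```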
So say we're going to add three, four and five. There are two different ways we could go about doing it: we could put the parentheses to the left or to the right. And I think parentheses are a pretty bad way of looking at associativity. When we're talking about associativity, even I've done it, I've said, oh, we can put parentheses in. What I really want you to think about with associativity is trees of computation. Associativity says I can restructure this tree. So we can draw some trees. Here are two different ways we can evaluate this expression: either the left-associated way or the right-associated way. If we add three to four, we get seven, and then if we add five to that, we get twelve. Or, associating the other way, we could add four to five to get nine, and add three to that to get twelve. So this is the associative property at work. Okay, so we've looked at some integers. Another excellent example of a monoid that I really love is list. Lists are a monoid. The empty list is the empty with respect to the list monoid. And if we have a left list and we want to append it to a right list, what we do is pattern match on the left list. If the left list is empty, then the result is the right list. If the left list is not empty, it has a head and a tail, so we cons the head onto the recursive result of appending the tail onto the right list. What this will do is walk through the left list, and once it gets to the end, it will stick the right list there, and we've appended the lists. It's defined recursively over the recursive structure of lists. And this also turns out to obey the laws: left identity, right identity and associativity. If we append the empty list, that's the same as not doing any appending at all. And it turns out that this works in an associative way as well, so we can restructure the tree of computation.
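The list append just described might be rendered like this; appendList is my own name for the sketch, since the Prelude already calls this function (++):

```haskell
-- Walk the left list, cons-ing as we go; when we hit the end,
-- graft the right list on. Linear in the length of the left list.
appendList :: [a] -> [a] -> [a]
appendList []     ys = ys                    -- empty left list: result is the right list
appendList (x:xs) ys = x : appendList xs ys  -- cons the head onto the recursive append
```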
So list is an excellent example of a monoid. And you're like, yeah, well, that's cool, George. Sure, you can write Monoid and you can have some examples of it, but it's not a very useful abstraction unless we can write some functions in terms of it, right? We tend to abstract over things so that we can work purely in terms of the abstraction without referring to the details of the concretion. So let's write a function that works just with monoids. This function is called mconcat. And mconcat says: if you've got a list of stuff, and the stuff happens to be monoids, I can take the whole list and smash it all together into a single value, and I can do that with the monoid operation. Let's look at how this is defined. We take in our list and we pattern match on it. If it's the empty list, we give back the empty from our monoid. That's actually the only thing we can do, which I think is very cool, but that's another talk. Anyway, if the list is not empty, then it has a head and a tail, so we take the head and we append it onto the recursive result of mconcat on the tail. So we walk through our list, appending as we go, and we end up with a single final result that is the aggregate of the list in whatever our monoid happens to be. This is a really useful function and I use it all the time in my day job. Maybe you do as well. It's a really good one as far as functions go. Here's what an example of using it might look like. We've got one, two, three and four in a list, and we can add them all together, and the evaluation will look something like this: we add one to the result of adding two to the result of adding three to the result of adding four to zero, and then we collapse the recursion. This turns out to be right-associated, because the structure of lists is recursive to the right. But we don't have to worry about that, because we've got associativity. We get ten.
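Here is the definition as described; I'll call the sketch mconcat' so it doesn't shadow the real mconcat from base:

```haskell
-- Fold a list of monoidal values down to a single value,
-- appending as we go. Right-associated, following the list's shape.
mconcat' :: Monoid m => [m] -> m
mconcat' []     = mempty            -- nothing to combine: empty is the only thing we can return
mconcat' (x:xs) = x <> mconcat' xs  -- append the head onto the combined tail
```

For example, mconcat' on a list of lists concatenates them all.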
So you say, all right, that's cool, George. I believe that Monoid is a useful abstraction, because we have some instances of it and we can write some functions in terms of it. But have we actually used the laws? Why do the laws matter? I thought this talk was about laws. Well, don't worry, it is. I want to show you why the laws matter by giving you a counterexample, something that's not a monoid. And my counterexample is subtraction. Subtraction does not form a monoid. We could make a data type that wraps up a subtraction, and then we could give it a broken, evil monoid instance that breaks the laws. It says: if you want to combine two numbers, you subtract one from the other, and the empty subtraction is zero, I suppose. But once we start actually looking at whether this follows the laws, it turns out that it doesn't. It breaks two out of three of them. Left identity, expressed in subtraction, would look like this: zero minus y is y. Well, that turns out not to be true, because negative five is not the same as five. So we don't have left identity; we've broken this law. We do, sort of incidentally, have right identity. And as for associativity, there are two ways we could parenthesize this, but parentheses are not a very good way to visualize associativity, so let's draw some trees. These trees will give us very different answers. If we subtract five from eleven, we get six, and if we subtract two from that, we get four. But if we first subtract two from five, we get three, and if we subtract that from eleven, we get eight. And eight is not the same as four. So we don't have associativity for subtraction. All right, well, maybe we can still get some use out of it. Maybe we could still call mconcat to do a bunch of subtractions for us. This could work, right? Well, let's look at the definition of mconcat. Oh. Yeah, right. I forgot, right?
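The counterexample, written out. Note that this instance compiles perfectly well; the type system cannot see that the laws fail:

```haskell
-- A deliberately law-breaking "monoid": append is subtraction.
newtype Subtract = Subtract Int deriving (Eq, Show)

instance Semigroup Subtract where
  Subtract x <> Subtract y = Subtract (x - y)

instance Monoid Subtract where
  mempty = Subtract 0

-- Left identity fails:  mempty <> Subtract 5  is  Subtract (-5), not Subtract 5
-- Associativity fails:  (11 - 5) - 2  is  4,  but  11 - (5 - 2)  is  8
```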
Earlier on, I mentioned that mconcat is recursive to the right, because it just naturally follows the recursive structure of lists, which are recursive to the right. That's a real shame, because when I subtract, I actually want things associated to the left, or I get the wrong answer. That's a huge catastrophe. I'm going to get four instead of eight, or something even more wrong than that. Maybe I'll get a negative when I wanted a positive, and then how do I have negative bananas? I don't know what happened here. Well, you associated to the right when you should have associated to the left. You did not have the associative property, so you couldn't just reassociate and expect the same answer. So this is pretty bad. We can't use mconcat, because it associates the wrong way. Maybe we could still salvage this, right? Maybe we can still figure out how to get some use out of it. One option is we could call this one mconcatR, and then we could also have mconcatL. So we could have two different versions: one which associates all the way to the left, and one which associates all the way to the right. This is something we could do. The details of the bottom one aren't too important, but please believe me that it associates everything to the left. But my big problem with this is that the whole point of having Monoid was so that we didn't really have to care whether things were associated to the left or the right. We've broken our abstraction. When I came to use subtraction with my monoid, I had to stop and think: does mconcat go to the left or the right? And now I have to have two functions instead of one, because of this one trivial example of subtraction. So if I didn't have these laws, then as I found new instances of my abstraction, I would have to keep coming along and changing the members, because now I need two instead of one; maybe I'll find an even more broken example, and then I'll need to change something else.
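The two variants might look like this (the names are from the talk's hypothetical, not from any library; the whole point is that we shouldn't need both):

```haskell
-- Right-associated: x1 <> (x2 <> (x3 <> mempty))
mconcatR :: Monoid m => [m] -> m
mconcatR = foldr (<>) mempty

-- Left-associated: ((mempty <> x1) <> x2) <> x3
mconcatL :: Monoid m => [m] -> m
mconcatL = foldl (<>) mempty
```

For a lawful monoid these two always agree; only for a law-breaker like subtraction do they differ.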
So then, as I accumulate these instances, my abstraction gets watered down into this mush of: I don't know what it means anymore. If that makes any sense. Besides which, mconcatR and mconcatL are strictly weaker versions of foldr and foldl from the standard library. These functions are for consuming lists, and one of them is left-associated and the other is right-associated. And these are actually better than the mconcatL and mconcatR I showed you before, because they have more general types, so you can use them in more circumstances. If you really need your subtraction associated to the left, you can simply use foldl. You don't have to go and make a fake monoid instance that breaks the laws, and then ruin everything by having two of every function. So I would prefer this to having a broken monoid instance that broke the laws. Our abstraction fell apart as soon as we didn't have the laws. We had to water it down and split things in half, and then we had to think about how each one is defined again, and I don't want to worry about it. The laws help me not worry about things. I can trade worrying about other things for worrying about laws, and I find that's nearly always an excellent trade. Perhaps always. So let's look at some more examples. We'll use some laws for good; we won't look at broken law things anymore. Let's write a greeting function. This function takes a list of characters and produces a list of characters, because in Haskell, strings are lists of characters. And as we know from a previous slide, lists are monoids, so we can get some monoid stuff happening. greeting takes a name and produces the string "hello, name, how are you". But this greeting is for a Lisp programmer, so we surround it with parentheses so that they can read it. If you're playing conference bingo, you can tick off the Lisp joke.
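A sketch of that greeting (the exact wording on the slide may differ slightly):

```haskell
-- Build the greeting by appending string pieces with the
-- list monoid's (<>), which for strings is just (++):
greeting :: String -> String
greeting name = "(" <> "hello " <> name <> " how are you" <> ")"
```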
Even despite the very short and terse nature of this function, I demand to refactor it, because that's the sort of person I am. I'm going to split out a useful helper function to do the parenthesizing for me, because I've got one parenthesis all the way over on the left and another parenthesis all the way over on the right, and if my boss walks up to me next week and says, we're greeting Clojure programmers now, I need those to be square brackets instead, then I have to remember to go into both places and update them. So I'd really like to factor this out so that the parentheses are right next to each other. What I'll do is write a little helper function. This helper function is called between, and it says: given an opening and a closing and something to go in the middle, simply append the opening, then the middle, then the closing. Now I can refactor, and I can say: between parentheses, "hello, name, how are you". I find this more readable; maybe you don't, and that's all right. But what I really like is that I've factored out this little function, between. between is a very useful function; I can reuse it all over the place. Watch, I'll reuse it: I can say between parentheses, between "hello" and "how are you", put the name. Maybe you might argue this is refactoring too far, but I did reuse between; no one can tell me that I can't. What's really amazing and mind-blowing and powerful about this example is that here I was in the world of strings, strings before me in every direction, and I was working with strings, thinking very stringy thoughts, and I broke out this little helper function. The little helper function was very useful with my strings, but it actually works with any monoid, just for free. I didn't even have to think about monoids: Haskell's type inference will infer that this function works for any monoid. That's just unbelievable; that's really incredible. I can use this for any monoid, even though I was deep in the string mines. All right, I find
that so wonderful: I can pull out little helper functions, and Haskell tells me, this is much more useful than you even thought. It's really exciting, and it happens all the time, though usually I'm not doing string stuff. But did you notice that when we refactored, when we pulled out the between function, we changed the associativity of what was going on? To talk about laws, right: originally everything was associated all the way to the right, because that's what the fixity declaration of the append operator says. But when we pulled out between, and we thought about the two parentheses and what goes between them, we actually ended up with this computation: we had the parentheses, and then in the middle we had this other little subtree. So just by refactoring our strings, we changed the associativity of this expression. I argue that you already use associativity all the time, and you might not even know it. Knowing about associativity, you'll be able to use it even more, and refactor and reuse even more. So I kind of feel like my job is already done: I don't even have to convince you that associativity is good, because you already love associativity. And if you've only just learned about associativity, now you can go and do it even more. So I think laws let us refactor and reuse more. That refactoring, even though it was subtle, only worked because of the associativity of concatenating lists. And we were able to reuse that function, which was really cool as well: it works for any monoid, and any monoid will also allow us to restructure things in that way. We just got that for free. I think that's really cool. But I've kept something quiet until now, and maybe some of you have noticed, if you know a lot about lists and monoids and stuff: laws don't really say anything about performance. I can put parentheses in and take them out as much as I'd like, and I will get the same answer, but I might get it next week instead of
tomorrow. Laws don't tell us anything about performance; they just tell us that the program will give the same answer. They don't say anything about when. In particular, I want to walk you through an example where choosing one associativity over the other has a bad outcome. In the case of appending lists, it turns out that lists really like to be appended in a right-associated way, with all the parentheses to the right. In a case like this, where we have a left-associated list append, it's not a very happy situation to be in, because it's a bit slower. I'll explain why. We have three lists here: 1, 2, 3, nil is one list; 4, 5, 6, nil is another list; and 7, 8, 9, nil is the third. We're going to append these three together, and we're going to do so starting from the left. First we'll append [1, 2, 3] to [4, 5, 6]: we walk through the left list, 1, 2, 3, we get to the end, and we stick the right list there. All right, so we've done half the work; we've done one of the two append operations we need to do, and we can garbage collect the first list. Now let's append [1, 2, 3, 4, 5, 6] onto [7, 8, 9]: we walk through the left list, 1, 2, 3, 4, 5, 6, and then we get to 7, 8, 9. The problem here is that we repeated work. We had already walked through 1, 2, 3, but then we had to walk through 1, 2, 3 again in our second append. List append takes linear time in the length of the left list, but when we left-associate our list appends, we're building up a bigger and bigger left list as we go, so each append gets more and more expensive. And that really matters if we're appending a lot of lists together. I mean, this isn't even a lot of lists, this will still be fast enough, but imagine I had so many lists I couldn't fit them all on a slide, and they were all very long, and then I really, really couldn't fit them on a slide. Then having left-associated appends is a disaster: it'd be n squared, as I said. But there is a solution to this. There is a solution. You might be pretty down on associativity at the moment. You're saying, George, come on: I can refactor and restructure all my code, but it may or may not get slower? That's not very cool; that's not really on. But as laws take away, laws give as well. So I'm going to show you a data structure called DList. For the problem of left-associated list appends, DList is the ultimate sledgehammer. The way it works is this: a DList is sort of like a list. In fact, we can convert between them; we can convert a regular list into a DList, and it's constant time to do so. The way we do that is we suspend the computation of appending that list: we say, I'm going to append this list to some other stuff, later. And then, if we're done doing DList stuff and we want to go back to a regular list, we just call it with the empty list. A DList is a function, so we get the function out of the DList, call it on the empty list, and it gets us back our original list. This takes linear time in the length of the list. And DList, much like list, has a monoid instance, and what it does is combine these suspended list appends. It does that simply with function composition: we just glue the two functions together. We've got one function that says, I'm going to append my list to some stuff, and another function that says, I'm going to append my list to some stuff, and we glue them together, so we've got a combined suspended list of appends to one day do later. The amazing trick about DList is that no matter how you associate your DList appends, when you say, go back to a real list, it will right-associate everything and then do the ideal linear-time walk over the lists. It does this with pure black magic; I've linked a blog post about how it actually works, but for now, please believe me. And this thing obeys some laws; there are some really cool properties
that these functions I've just shown you have. For example, toList is a left inverse of fromList. What that means is: if I have a list and I call fromList, I turn it into a DList, and if I then go back to a regular list, it's the same list. I always get the same list if I just go back and forth. That's a really useful property. We also have another useful property: fromList is a monoid homomorphism. And you say, George, what on earth is a monoid homomorphism? I'm going to tell you what it is. fromList is a function between two monoids: it goes from list, which is a monoid, to DList, which is a monoid. A monoid homomorphism goes between two monoids, and it obeys some laws. If we call fromList on the empty list, we should get the empty DList, according to DList's monoid instance. Another way to put that is that we preserve identity: if you put the identity through the transformation, you end up at the identity on the other side. The other amazing thing that fromList obeys, as a monoid homomorphism, is this property that says: when you append x to y and then turn that whole thing into a DList, you get exactly the same thing as if you had turned x into a DList, turned y into a DList, and then appended the two DLists. More informally, what this lets you do is your monoid stuff either before or after translating into DList, and it turns out that doing it after is the fast way. So if we have some collection of list appends, what we can do is use equational reasoning. We can work through with our laws, and we can say: all right, if I put a toList and a fromList out the front, I won't change the answer. So I can do that; this is the same answer, by that law; I haven't changed what the program does. And then, by the monoid homomorphism property, whenever I have a fromList of an expression which is a collection of appends, I can replace that with fromList on each of the constituent parts of that
append, and then do the appending in the world of DLists. So I can do that, and now I've got fromList on all my lists to turn them into DLists; I append them as DLists; and then, right at the end, I call toList to turn the whole thing back into a list, and it does so with right-associated appends, so it goes at the ideal performance, which is linear time. So if we have lots of things: we fromList them all, we append them all, and then, whether our appends were left- or right-associated, we still get the best performance. I think that's pretty cool. This trick only works because list is a monoid, DList is a monoid, and the fromList function is a monoid homomorphism. We took advantage of five or six different laws to get this performance benefit. I'll show you a diagram as well. Left-associated appends are n squared: that's bad. What we do instead is call fromList. fromList is constant time, but we have to call it once for each of our lists, so that's n. Then we're in the world of DLists and we do all our appends; we have n appends to do, so that takes n time. Then we call toList, which takes n, because we have to walk over the entire structure of the list. And if you know anything about computational complexity theory, you'll know that n plus n plus n is n, right? And so we beat n squared: we got the same answer, and we got the best performance. It's awesome. What's really amazing is that the same DList trick, with a few modifications, can be made to work for any monoid. This is not just about lists; this will work for any monoid you've got lying around. So if you've got a monoid that is way faster when it's right-associated, or something, you can hit it with this hammer, and then everything will just be right-associated, however you actually associate your code. Same thing if you need it left-associated for performance reasons: hit it with this hammer again, in a slightly different way, and you can get everything to be left-associated instead. It's just incredible. It works for any monoid
because: laws. There was nothing list-specific about what we did. That's really incredible, so I use this trick all the time. In fact, not only does this work for other monoids, there are structures similar to DList that work for other abstractions than monoids. There's a whole family of these tricks that say: it's faster if it's right-associated, so just right-associate everything for me. It's amazing that we can just do this; it's really cool. And it only works because: laws. So I'll give a sort of made-up definition here: optimization is altering the program to get the same answer more efficiently. I think this is a fairly reasonable definition. If you alter the program and then get a different answer, it wouldn't really be fair to call that optimizing; you're writing a different program. So optimizing is when we get the same answer more efficiently, and laws are all about changing a program in such a way that you get the same answer. So naturally, laws can give rise to optimizations, as we've just seen. And I love taking advantage of these things all the time; it's really cool. So you're like: wow, this is pretty cool stuff, George. I can refactor more, I can reuse my code more, I can even optimize my code more. So I'm pretty sold on laws; they're pretty incredible. But what kind of horrible tragedies will befall us without laws? I mean, I'm all sold on laws, but I'm still not scared enough of no laws. So I'm going to do that now: I'm going to show you a horrible, terrible world without laws, one that makes me very sad. There's a type class in Haskell called Default, and Default has one member, called def, and it is a thing of type a. So we can give some defaults here: the default list is the empty list; the default Int is 0. And there are no laws on this type class. There's nothing that tells us what these members are supposed to mean, or how they're supposed to relate to each other. In fact, there's only one member, so we can't really relate it to anything
else, but that's sort of beside the point. I can't really disagree that the empty list is the default list, given that type signature, but I take big issue with 0 being the default Int. You would only think 0 is the default Int if you did addition all day. If the only thing you ever do when you come across an Int is say, oh, an Int: addition, right? Then you think 0 is the default Int. But if you prefer multiplication, say you work in the product department, then 1 would be your idea of a default Int. And if we're going to try and say that 0 is the default Int, let's not even get started on division; it won't go well. So I really take issue with this idea that there can be a default with no laws to ground it to anything else. Monoid has mempty, and mempty has the same type as the def from Default, but the left-identity and right-identity laws tell us how empty relates to our operation, our append. We've got this relationship between the two members of the class that gives the class an overall meaning and a capacity to be understood, whereas Default does not have this. It's pretty uncool. But maybe we could salvage it, right? It's not an abstraction if it doesn't have laws, but maybe that's okay; maybe we could still write a function that works in terms of it. I've tried to write one here. It's called orDefault, and it says: if I've got a Maybe a, and there's a default a, I can give you an a. The way I'll do that is by pattern matching on the Maybe: if the Maybe has an a inside, I'll give you that one; if the Maybe does not have an a inside, I will give you the default a. This is a function defined in terms of the Default type class, and it sort of works; you could use it. But I argue there is, in my view, a strictly better function, called orElse. orElse says: give me an a and a Maybe a, and I'll give you an a. It does a
similar thing: it pattern matches on the Maybe. If the Maybe is Just an a, it gives you that a; if the Maybe is Nothing, it gives you the a that you gave it first. The difference is that orElse lets you provide your own default; it doesn't mention the Default type class anywhere. And I think this is quite good, because say, for example, I want to multiply some things. Then orDefault wouldn't be very useful, because I would get zeroes everywhere, and zeroes aren't very good friends with multiplication. Since def doesn't have any actual meaning, I would much prefer to be able to provide my own default at every use site, independently. So I think this function is better, and I think Default doesn't actually give me anything. Maybe you could write some other functions that work in terms of Default, but I don't know what they are. This is from the data-default package on Hackage. It's a household name of a package; people are using it. There's a competitor to data-default that I actually find a lot funnier, which is called acme-default. "acme" is what you put on your Haskell package name if it's a joke package. Whereas data-default is a class for types with a default value, acme-default is a class for types with a distinguished, aesthetically pleasing value. So let's look at some instances. It defines exactly the same type class, but it gives wildly different instances. Here we go: the default Int64 is negative one. The reason this library is actually a lot better than the actual default library is that this one documents why each thing is the default. The default Int64 is negative one because that's the biggest negative number. I argue that's just about as good as picking 0 because it's the identity for addition, unless you're living in an addition world. The default Bool is False, and it's False because the library author asked his friend: hey, do you have a favourite boolean? And his friend said, no. And he said: ah, "no", that's False. There's an issue open
on the repository at the moment, and I really hope we can get a pull request and get this one merged: some wonderful person has suggested that a good default String might be the entire text of Moby Dick. I think that's as good a string as any, really. Anyway, we've had a bit of fun with this section. I don't want to be too harsh on data-default. It has a job: its job is to overload a name. It overloads the name; it lets you use def in lots of different circumstances. It does overload a name. I find it kind of ironic that def is two characters longer than 0, so we've cost ourselves three times as much typing by overloading that name, but that's beside the point. So I think now you're all really sold on laws, and really terrified of no laws, hopefully. And you're saying: all right, George, I'm going to use all these amazing abstractions that have laws, like Monoid, but when I make my own instances, how could I possibly know whether I obey the laws? If I make an instance that doesn't obey the laws, it could be a very subtle cause of bugs, and I think I've seen some subtle bugs caused by an instance that was wrong in a really specific case. So it's important to determine that you obey the laws if you're going to write an instance of a class that has laws. There's a really excellent way to do that, which is to use the property-based testing library QuickCheck, which has another library on top of it called checkers. QuickCheck is for property-based testing, and checkers is a library full of pre-prepared test cases for all the different laws: for the monoid laws, for the functor laws, for the monad laws, or whatever other laws you've got. For example, it's got this thing called monoid, and monoid is a batch of tests that tests all the monoid laws. There's the same for functor; there are lots of different abstractions that have laws. I use checkers all the time to check my laws, and sometimes I'm wrong and I have to fix things. I think it's really valuable to
be able to do that, to test the laws. It would be much better to write a proof that the laws hold, but that's difficult in Haskell: we don't really have the capacity to build a proof in next to the instance, or we could only with tremendous difficulty, whereas testing is really lightweight and works pretty well, so I really like using Checkers. So for example, if we took our broken subtraction monoid that doesn't actually work, and say we wanted to test that, and we also wanted to test addition, we would define a main in our test suite that simply says: run the monoid tests for addition, and run the monoid tests for subtraction. We'd really have to write only a little bit more code than I've got on the screen to get it working, and then we can run this, and it will say: I tried 1,500 times to break the laws for addition and I could not. That gives some reasonable confidence. It could still be broken, but we have 1,500 automatically generated cases where it isn't. Whereas for subtraction, it was quite quickly able to tell that left identity and associativity don't hold. So I really like these libraries for testing my laws; I do it all the time. Sometimes the laws are kind of obviously true, like in a really simple addition case, but when I've got some big complicated structure that I've built up by composing all these other structures, and I want to get just the right monoid that does the thing I want, it's really helpful to be able to do this. So I think laws give rise to useful functions that we otherwise couldn't have, they allow us to refactor our code in ways that we otherwise wouldn't be able to, and in some cases they can help us optimize our code. Okay, so today I've spent a lot of time talking about associativity and monoids. There are so many other wonderful structures that have associativity, even associativity that is shaped a bit differently. Like, monads have an associativity
property, but it looks a little different. And it's not just associativity: there's commutativity, there's idempotency, there's distributivity, and lots more things I couldn't fit on a slide or couldn't remember. You know, Functor has laws of identity and composition that don't even fit this shape, because Functor has a different shape. There are just so many laws out there, and they're wonderful, and you should go and learn all about them and use them to help write better code and have more fun. So the key point that I want to drive home is that laws are the difference between an overloaded name and a real, true abstraction. We want to write in terms of our abstractions without relying on the concrete details, and laws really get us there in a way we otherwise wouldn't. Thanks, that's the end of the talk, thanks so much. We have time for a couple of questions. Yeah? So Default, it has the value zero, because otherwise, if you put laws on it, it would become the monoid identity, yeah? Yeah, I don't think there are any laws that you could put on Default. Yeah, but that's why, like you said, you're not happy with the value zero for Default. Yeah, so I was trying more to illustrate the point that I'm not happy with the concept of having a type class for it; it wasn't exactly an attack on zero. So with your monoid homomorphism, you had, like, fromList of empty is empty? Yep. I was thinking back to Tony's thing that he's been going on about today and yesterday, which is: reverse of empty is empty, and the second one is reverse of x append y is reverse of y append reverse of x. That's almost a monoid homomorphism, but not quite. Why is that? Does it not deserve...?
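Before the answer, the property being asked about can be written down and spot-checked directly. Here is a small sketch, assuming only Dual from Data.Monoid in base (Dual flips the arguments of an underlying monoid):

```haskell
import Data.Monoid (Dual (..))

-- Tony's two properties for reverse:
--   reverse mempty   == mempty
--   reverse (x <> y) == reverse y <> reverse x
-- The flipped right-hand side is exactly what Dual captures:
--   Dual a <> Dual b == Dual (b <> a)
main :: IO ()
main = do
  print (reverse ([] :: [Int]) == [])
  print (reverse (([1, 2] <> [3, 4]) :: [Int]) == reverse [3, 4] <> reverse [1, 2])
  -- viewed this way, reverse is a monoid homomorphism from [a] to Dual [a]:
  print (Dual (reverse (([1, 2] <> [3, 4]) :: [Int]))
           == Dual (reverse [1, 2]) <> Dual (reverse [3, 4]))
```

All three checks print True; the Dual wrapper makes the "flipped but consistent" behaviour that the question is getting at into an honest homomorphism.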
Because, instead of... so you've got fromList x append fromList y, and that's fromList of x append y, but reverse flips those. But I think it flips it in a consistent way, so I think it's a monoid homomorphism. Cool, thank you. Yeah, there's a wrapper in base in Haskell called Dual, or Rev or something, and it gives you a monoid that's the other monoid but with the arguments flipped. I think that monoid has some relationship to the reverse function. Yeah, I think it is one, in a fast and loose way, going by Tony's talk; hopefully we all watched Tony's talk. Okay, yeah, there's a question in the back. So the question is more around property-based testing. In this example of Sum and zero, the specific type is Int, so the tools that you mentioned will do well, but what if my monoid type is a type that I defined, like a newtype or something else? How good are those tools at verifying the monoid laws and the different laws? Just a question based on that. Yeah, okay, so you're saying if it's not Int, if you've got a... what did you say, sorry, about the type? A custom type. Do I have to spend time creating test fixtures and different things, or do the tools you mentioned handle that inherently? I mean, how easy is that property-based testing when you have custom types? Yeah, that's a really good question. So with this library QuickCheck, and another one I really like called Hedgehog that Mark gave a talk about earlier on, they give you a set of generators, and generators are these random generators for ints and strings and Bools and lists and whatever. What these libraries do is give you a way to combine these generators into bigger ones. So your type might be a Person, which has a name and an age and an address, you know, a couple of strings and an int or whatever. You can use these primitive generators, like the string generator and the int generator, to build
up a Person generator that you could then use to test the laws, yeah. Cool. So the one I mentioned in the talk was QuickCheck, and the other one that I like a bit more is called Hedgehog, but testing the laws in Hedgehog is something we don't quite have working yet, I think. There have been a few attempts at it, but they haven't gotten up to the level that Checkers is at. So I've got a Haskell project where I've got a great big Hedgehog test suite that does most of my tests, and then I've got this tiny little QuickCheck test suite that only runs Checkers, and that's my solution at the moment. I'd like to see testing of laws for Hedgehog. The problem comes in when you're testing the functor laws: you need to be able to randomly generate functions, and that's much more difficult in Hedgehog, because of the way Hedgehog is implemented, than it is in QuickCheck. I think we can do it now, though; one of my colleagues was having a go at that, so maybe soon we'll be testing all our laws with Hedgehog and all this will be out of date. I think that answers the question. So if we try to create our own type class, then we would need to come up with our own laws that it satisfies. Do we have some guidelines for that? How do we make up our own laws so that we can put them into QuickCheck and verify them? Yeah, that's a really excellent question. So if I'm going to make my own type class, what laws should I give it, or how should I determine what laws to give it, or even whether to give it laws? Say, for example, it builds on another type class; like, if you make a specialization of monoid, then you'd certainly hope that it's still a monoid. I think a lot of type classes aren't like that, though. I think a lot of type classes are of the same kind as the Default thing, where they just overload a name so that you can use that name in lots of different places, and often you can't give it laws in that case. Yeah, I don't know what else to say. The sort of parable of the evil of the
Default type class was to drive home a point, and not necessarily to say that you should never have a type class with no laws. I suppose it sort of comes with experience. You can kind of say: I'm using this, and I think most of these sorts of things happen to look commutative or associative or whatever, so I think that's the right fit for whatever I'm doing. Sort of look at the problem you're trying to solve and what properties you might need. I guess how I answer that question for myself is: I usually don't write many type classes. I tend to import lots of type classes from other libraries, and then they already have laws that were thought up by people other than me, people who got to think about the maths for the thirty-odd years before I was born, so I use theirs. If I could just follow on from that question a little bit, about whether, when we write type classes, they should have laws, or we should look for laws, and so on. I think a really useful litmus test to answer that question is this: all type classes can be written as a data type, including monoid. So, even just in your head, or writing the actual code, write it as a data type, and then ask the question: if you can't come up with laws, is the data type a better way to do it? So take Default, for example. As a data type it's called Identity, and the code is data Identity a = Identity a; it just wraps up an a. I think in that case I agree with George about the whole evilness of Default and so on. Should that have been a data type that just gets explicitly passed in, instead of a type class? I think that one's a really obvious case where, as a data type, it's much better, because type classes like that don't have laws, just a lot of pain. Thank you so much, George.
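The closing suggestion, replacing the law-free class with a data type whose value gets passed explicitly, can be sketched in a few lines. This is an illustration, not code from the talk, and withDefault is a made-up helper name:

```haskell
-- A law-free class like Default collapses to a plain data type:
-- wrap the value up and pass it explicitly instead of overloading `def`.
data Identity a = Identity a

runIdentity :: Identity a -> a
runIdentity (Identity a) = a

-- a hypothetical helper: take the default as an ordinary argument
withDefault :: Identity a -> Maybe a -> a
withDefault (Identity d) Nothing  = d
withDefault _            (Just x) = x

main :: IO ()
main = do
  print (withDefault (Identity (0 :: Int)) Nothing)   -- falls back to 0
  print (withDefault (Identity (0 :: Int)) (Just 5))  -- uses the given 5
  print (runIdentity (Identity "hello"))
```

Because the default is now just a value in scope, the caller chooses it per call site, and there is no instance whose lawfulness needs defending.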