Okay, we're all set. Let's talk about transducers. First, some credits, because none of this material is originally mine. Transducers were invented by Rich Hickey, who introduced them in Clojure. Kevin wrote the JavaScript tutorials that a lot of this material is based on. And Erik Meijer's work on the duality of push and pull informs the second part of this talk. As for me, I am mostly an enthusiast software engineer, that's it. I like programming languages and compilers; I worked on the V8 team at Google because of this, and I worked on JIT compilers for a long time. I started out as a JavaScript hater, a full hater: I preferred hacking on JIT compiler code, machine code with no stack traces, instead of dealing with that horrible mess that was web development. But eventually I changed my mind, I embraced JavaScript, and now I have my own programming language that compiles to JavaScript. Let's go on.

Transducers. They are composable algorithmic transformations: something that takes values and gives you other values. And they have nice properties. They are decoupled from input and output, so they are purely functional, and they compose directly, like Lego building blocks: you can just snap them together and they don't disturb each other. They don't really care where the input comes from, and they don't create intermediate aggregates. They are quite nice.

What does it take to understand them? Well, you need to understand the concept of a stepper function, then the concept of a transformer object, then the reduce transformation, and then you have transducers, and you can use the new transduce transformation. That is the roadmap to understanding them.

First of all, the stepper function. A stepper is a function that takes an initial value and an item, an element, and gives you a new value to use as the next initial value. Let's get familiar with this notation.
This is a type, Stepper, which depends on two types, and it is described as a function that takes the initial value and the element, and gives you the new value. This kind of notation to describe types will be used a lot here. A stepper is the typical argument to a reduce transformation. Let's see what this means. We can have a function, sum, that takes the current result and an item and gives us the sum. And we have another function, multiply. If we have an input array [2, 3, 4], we can invoke reduce passing sum and 1, and reduce will iteratively call the stepper with the accumulated value and every element, so it will perform this computation. If we instead pass multiply, we get this other result. So I think you get the point. That is the stepper.

Now, the concept of a transformer. A transformer is a bit more complex, but in the end it's like a generalized stepper. It is three functions: an object with an init function that gives the initial value, then the transformation, the stepper, and then a result function that takes the last computed value and applies any final transformation you want.

Let's play with it a bit. First of all, we have a couple of utility functions: one that always returns 1, and identity, which returns whatever you pass to it. Then we can build this function that we call transformer, which essentially just builds the object: it returns an object with these three properties. That's it. Now, like before, we have the input [2, 3, 4], and if we build the transformer passing 1 and sum, we can pass the pieces to reduce, and it still works. No big deal, but we did it in simple steps. At this point, we can talk about reduce and tweak it for our purposes.
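As a minimal sketch of the stepper idea described above (using JavaScript's built-in Array.prototype.reduce as a stand-in for the reduce transformation; the names sum and mult match the ones in the talk):

```javascript
// A stepper takes the accumulated value and the next item,
// and returns the new accumulated value.
const sum = (result, item) => result + item;
const mult = (result, item) => result * item;

const input = [2, 3, 4];

// Array.prototype.reduce calls the stepper once per element:
// ((1 + 2) + 3) + 4
const total = input.reduce(sum, 1);     // 10

// ((1 * 2) * 3) * 4
const product = input.reduce(mult, 1);  // 24
```

The stepper knows nothing about arrays or iteration; it only combines one accumulated value with one item.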
So reduce, for us, is a transformation that takes either a transformer or a stepper (we want to be able to deal with both), an initial value, and an input sequence, and gives us the result. Playing with reduce a bit: first we build this wrapper, which takes either a stepper or a transformer. If it is a function, it assumes it is a stepper and wraps it into a transformer; otherwise it returns the object itself. This just simplifies things. At this point we can define reduce as a function that takes the transformer, wraps it first to be sure it actually is a transformer, then performs the actual reduction (I'm not going to implement that here, for simplicity), and finally invokes result. With this reduce function we can do the same things as before, just to prove that it works: with sum and 1 it does the computation, with multiply and 1 it does it, and if we pass not the functions but the transformers, it still does it.

Reduce can also copy from one array into another, and this is a nice thing to understand. Take a stepper called append: it is a stepper because it takes two values, the accumulated value and an item, and what it does is push the item into the value. If we call reduce passing append as the transformation, starting with an empty array, we get a copy of the input. This is interesting because it shows that reduce is actually the most general transformation we can think of: every other transformation can be implemented in terms of a reduce in some way, even a plain copy.

Now that we have seen reduce, we are ready to talk about transducers. A transducer is a function that takes a transformation and gives us another transformation: a function that turns a transformer into another transformer. Let that sink in a bit, because it seems like wordplay: it transforms transformers.
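The wrapper and the tweaked reduce described above can be sketched like this (a toy implementation, not the one from the slides):

```javascript
// A transformer generalizes a stepper into three functions:
// init (initial value), step (the stepper), result (final touch).
const transformer = (init, step) => ({
  init: () => init,
  step,
  result: (value) => value,
});

// wrap accepts either a bare stepper function or a full
// transformer object, and always returns a transformer.
const wrap = (xf) =>
  typeof xf === 'function' ? transformer(undefined, xf) : xf;

// reduce: wrap, fold the input with step, then apply result.
const reduce = (xf, init, input) => {
  const t = wrap(xf);
  const value = input.reduce((acc, item) => t.step(acc, item), init);
  return t.result(value);
};

const sum = (r, x) => r + x;
const append = (arr, x) => { arr.push(x); return arr; };

reduce(sum, 1, [2, 3, 4]);     // 10
reduce(append, [], [2, 3, 4]); // a plain copy: [2, 3, 4]
```

The append call shows the point made above: even a plain copy is just a reduce with the right stepper.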
Actually, the term transducer comes from "transform" plus "reducer". Okay, let's see. The first thing we'll do is a transducer that adds one, and then we'll combine it with append. We have this function, plusOne; it only adds one. Then the transducer, transducerPlusOne, is a function that takes a transformer (as we said, because it needs to transform the transformer) and returns another transformer. This new transformer uses as its init function the one of the argument transformer, and also the result function of the argument transformer. But its step is different: first it applies plusOne to the item, and then it invokes the original stepper, passing the obtained value. Here we see the combination; let's see what it means. We have this transducer and we can use it: we can make a transformer that is the combination of this transducer and append. Let's run it step by step, invoking the steps one at a time. We pass the initial value, the empty array, and 2, and it does 2 plus 1, append. Then we pass 3, and it does 3 plus 1, append. Then we pass 4, and it does 4 plus 1, append, and then we get the final result. So it combined the transformations and performed both of them. And we can combine the same transducer with sum instead, passing another kind of transformation: we pass 0 as the initial value, then 2, and we have 2 plus 1 summed with 0; we pass 3, and it is 3 plus 1 summed with the result; we pass 4, and it's 4 plus 1 again, and this is the result. What we have seen is that we can actually combine transformations. These are toy examples, but they are useful for the concepts, of course.
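The step-by-step combination described above can be sketched as follows (the names transducerPlusOne, appendXf, and sumXf are just illustrative choices):

```javascript
const plusOne = (x) => x + 1;

// A transducer: a function from transformer to transformer.
// init and result are delegated; step applies plusOne first.
const transducerPlusOne = (xf) => ({
  init: () => xf.init(),
  step: (value, item) => xf.step(value, plusOne(item)),
  result: (value) => xf.result(value),
});

// A transformer that appends into an array.
const appendXf = {
  init: () => [],
  step: (arr, x) => { arr.push(x); return arr; },
  result: (x) => x,
};

// Combine them and invoke the steps one at a time.
const xf = transducerPlusOne(appendXf);
let v = xf.init();        // []
v = xf.step(v, 2);        // [3]
v = xf.step(v, 3);        // [3, 4]
v = xf.step(v, 4);        // [3, 4, 5]
const out = xf.result(v); // [3, 4, 5]

// The same transducer combined with a sum transformer instead:
const sumXf = { init: () => 0, step: (r, x) => r + x, result: (x) => x };
const xf2 = transducerPlusOne(sumXf);
const total = xf2.result(
  [2, 3, 4].reduce((a, x) => xf2.step(a, x), xf2.init())
); // (2+1) + (3+1) + (4+1) = 12
```

Note that transducerPlusOne never mentions arrays or numbers as results; it only decorates whatever step it is given.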
So the generalized transformation, transduce, is a function that takes a transducer (the transformation), a stepper (the final transformation used to build the result), an initial value, and an input sequence, and gives us the result. Now let's play with transduce. First we actually implement it, and in this toy example it's really simple. It's just a function that takes the arguments (the transducer, the stepper, the initial value, and the input), wraps the stepper so that we have a transformer, combines the transducer and the wrapped stepper, and then just invokes reduce. And that's it.

Now, let's build another transducer, one that everybody probably knows: map. To be a transducer, map takes a function f, the transforming function, and is itself a function that transforms a transformer. You see, it takes the transformer and returns another transformer, and the key point is the stepper: it applies f to the item to get the mapped value, and passes the mapped value into the next step, whatever that is. This is the map function as a transducer. And now we can make it work. We have the usual input [2, 3, 4], and if we say transduce, combining map and plusOne, and use append to build the result, you see: 3, 4, 5. To every value it applied plusOne, it mapped, and then it did append at every step. But we can combine map and plusOne while doing the sum, starting with 0, and we have the sum of each element plus one, of course. Or we can do it with multiply, it's the same. Or we can have plusTwo and invoke it with plusTwo, and it just does the computation. So it's combinations of transformations, as I was saying.
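A toy sketch of transduce and of map-as-a-transducer, along the lines described above:

```javascript
// map as a transducer: f is applied to each item before it
// reaches the next step in the chain.
const map = (f) => (xf) => ({
  init: () => xf.init(),
  step: (value, item) => xf.step(value, f(item)),
  result: (value) => xf.result(value),
});

// Wrap a bare stepper into a trivial transformer.
const wrap = (stepper) => ({
  init: () => undefined,
  step: stepper,
  result: (x) => x,
});

// transduce: combine the transducer with the (wrapped) stepper,
// then hand everything to a plain reduce.
const transduce = (transducer, stepper, init, input) => {
  const xf = transducer(wrap(stepper));
  const value = input.reduce((acc, item) => xf.step(acc, item), init);
  return xf.result(value);
};

const plusOne = (x) => x + 1;
const sum = (r, x) => r + x;
const append = (arr, x) => { arr.push(x); return arr; };

transduce(map(plusOne), append, [], [2, 3, 4]); // [3, 4, 5]
transduce(map(plusOne), sum, 0, [2, 3, 4]);     // 12
```

The same map(plusOne) transducer works unchanged with append or with sum; only the final stepper decides what the result looks like.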
Now, it would be useful to compose them a bit more directly, so we have this function that composes two functions: it takes one function and another function, and returns the combined function, which invokes one and feeds the result into the other. With this composition we can write code like this: transduce, mapping compose of plusOne and plusTwo (which means plusThree in the end), sum everything, and it does it.

What we have shown at this point is that transformations can be composed. I can write very simple, elementary, stateless, pure functions, and they can be composed together, and each function only deals with its own values; it doesn't care about the rest. And the nice thing is that intermediate steppers do not create intermediate results. If you were using something like Lodash or Underscore and you kept doing map, map, map, map, each map would do a full map and create an intermediate array. Here, they don't: they just do what they need to do. And this is important for performance, by the way. Only the final transformer actually deals with building the full result.

Okay, those were toy examples; let's do something a bit more concrete. Unfortunately the talk is short, so I'll skip a lot of things, but you'll get the idea. Load is a function to load modules; I need to write it this way in Light Table, don't worry about it. So we load Ramda, which is a utility module for functional programming. Our goal is to start with this input, which looks like an Apache log, where each line is literally a line from an Apache log. As output, we want this text: one line for each GET request that was requesting something that was not a static file. It's some kind of processing, and we want the lines in this format: IP address visited the full URL.
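Going back to composition for a moment, the compose helper described above is tiny; a sketch:

```javascript
// compose(f, g) builds a function that applies g first, then f.
const compose = (f, g) => (x) => f(g(x));

const plusOne = (x) => x + 1;
const plusTwo = (x) => x + 2;

// compose(plusOne, plusTwo) means plusThree in the end:
const plusThree = compose(plusOne, plusTwo);
plusThree(2); // 5
```

Since transducers are themselves plain functions from transformer to transformer, the same compose can also glue transducers together, which is what the log-processing pipeline below relies on.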
So it's a bit of a transformation. First of all, we need to implement the filter, and we do that with regular expressions: we test that there is a GET, we test that there is a GET but not of something static, and we combine them together. These are Ramda utility functions that give you other functions; they use currying heavily (currying will likely be explained in the last talk of the conference, so take them as they are). You get a filter out of this, and you can use the filter and it filters. Then we want a transformation, "to IP and request": we start with the log line and transform it into a pair of the IP and the request, the string "GET something". Again, these are Ramda utilities; you get this function, you can use it with map, and you get the result. Then we have a couple more transformations: if we have this string, which is the request, we want to turn it into the URL, and we have a function that does it. And then the final transformation that's useful: you start with the pair of the IP and the request, and you get to the pair of the IP and the URL. Again, this is a simple function; you build it with Ramda and it works.

The meat of the example is this. You can build this object, parseLog, which is the transducer, by composing the functions: the filter one, the mapping with the first transformation, the mapping with the other transformation, then a mapping joining the elements into a string with the right string in the middle, and another mapping that adds the newline at the end. These are all the transformations of a pipeline, and invoking compose like this, we get the transducer. Then you can say transduce the input using parseLog, with add to combine everything into the string, and out is what we wanted. We can also do it with the shortcut form, into, which omits a step, but this is it. Now, a funny thing. This was Ramda.
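The shape of the pipeline described above can be sketched in plain JavaScript. This is a compressed stand-in, not the slide code: the log-line format, the regular expressions, and the helper names (isGet, toIpAndRequest, parseLog) are all assumptions, the filter/map transducers replace the Ramda ones, and the IP-and-request pair is extracted in a single map instead of two:

```javascript
// filter as a transducer: forward only items that pass pred.
const filter = (pred) => (xf) => ({
  init: () => xf.init(),
  step: (value, item) => (pred(item) ? xf.step(value, item) : value),
  result: (value) => xf.result(value),
});

// map as a transducer, as before.
const map = (f) => (xf) => ({
  init: () => xf.init(),
  step: (value, item) => xf.step(value, f(item)),
  result: (value) => xf.result(value),
});

// Variadic compose: for transducers, the first one listed
// is the first to see each item.
const compose = (...fns) => (x) => fns.reduceRight((acc, f) => f(acc), x);

const wrap = (stepper) =>
  ({ init: () => undefined, step: stepper, result: (v) => v });

const transduce = (transducer, stepper, init, input) => {
  const xf = transducer(wrap(stepper));
  return xf.result(input.reduce((a, i) => xf.step(a, i), init));
};

// Hypothetical log lines and helpers (the real slides use Ramda).
const isGet = (line) => /"GET /.test(line);
const isNotStatic = (line) => !/"GET \/static\//.test(line);
const toIpAndRequest = (line) =>
  [line.split(' ')[0], line.match(/"GET ([^ ]+)/)[1]];

const parseLog = compose(
  filter((line) => isGet(line) && isNotStatic(line)),
  map(toIpAndRequest),
  map(([ip, url]) => `${ip} visited ${url}`),
  map((s) => s + '\n'),
);

const add = (acc, s) => acc + s;

const input = [
  '10.0.0.1 - - "GET /home HTTP/1.1" 200',
  '10.0.0.2 - - "GET /static/app.css HTTP/1.1" 200',
  '10.0.0.3 - - "POST /login HTTP/1.1" 200',
];

transduce(parseLog, add, '', input); // '10.0.0.1 visited /home\n'
```

Each log line goes through the whole filter-then-map pipeline as a single step; no intermediate arrays of filtered or mapped lines are ever built.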
We built this with Ramda. Now we can load another transducer implementation, the original one, and invoke its own version of transduce, and it still works. Essentially you build code with one library and use it with another library, and they are totally compatible, because the protocol is so simple that it's very easy to make interoperable.

Okay. And what about this: the transformation was synchronous. I started with an array and ended up with text. But what if you wanted async? And what does an asynchronous computation even mean? Many say it's about push and pull. That can be true, but I don't fully buy it. We'll see why, but first we must see something about push and pull.

Push and pull are dual. What does dual mean? I don't want to do algebraic theory or anything fancy; it's similar to De Morgan's laws for "or" and "and". Here it means that if you want the dual of a function, you reverse the direction of every arrow. Let's start with something simple: getter and setter. A getter is a function that takes nothing and gives you a value; when you invoke a getter, you pull a value out of something. A setter is a function that takes a value and gives you nothing; when you invoke a setter, you push a value into something. Remember them, because you will find them everywhere in the next slides. So this is easy: push and pull are dual. You can see they are dual because the function definitions are the same with the arrow reversed.

Now, iterable and observable. In JavaScript, an iterable is an object with a particular property, and you can think of that property as a getter: you invoke it and you get an iterator. An observable is something that has the subscribe method, and you can think of it as a setter: you push an observer into it. So if iterator and observer were dual, these two would be dual. Now let's look at iterator and observer. An iterator is like a getter.
When you invoke next, the getter, three things can happen: you can get done, you can get a value, or you can get an error, because it throws. So it's a getter that can give you one of three things. An observer is like the combination of three setters: it can accept the push of either the next value, or completion, or an error. This is the definition of observer. So what do they have in common? They both tell you what to do when you have the next value, the completion, or an error; but one of them is push and the other is pull. So we can really see that they are the same thing with the arrows reversed: they are dual.

And here is the good news: transducers don't care. When you define a transducer, you just define a function: it takes a value, it gives you a value. So they work in push contexts and pull contexts, unmodified. Let's see. We are still in the same file, and what we do now is load the Reactive framework, which works with observables. We create an observable from the input array, and we transduce using the parser, the same parseLog object we defined before with Ramda. We pass it to the Reactive framework, then we reduce to combine all the elements, and we subscribe, getting the out that we wanted. And of course, out is what we wanted.

But let's say we like Kefir, another library that uses the observable pattern a lot and is asynchronous. We load it and do the same thing. The function calls are a bit different, the names, but we create the stream with sequentially, passing the input, and we transduce, reduce to combine everything, and onValue, so we subscribe, we run it, and out is what we wanted.

But for added fun, we can use Node.js streams. Here it's a bit tricky because I'm using streams that work on strings, so I'm loading several modules to have streams that work on strings. string-stream can read a string as a stream, split can split a stream.
transduce-stream can turn a transducer into a reader/writer stream, and stream-dump gives a writable stream for a string. But after I do that, I can just pipe into this, pipe into this, pipe into this: this is plain Node.js streams working. I run this pipeline and out is what we wanted. And keep in mind that this is actually asynchronous in the usual sense: if I were reading a huge file, it would perform the processing one chunk at a time, without creating intermediate aggregates at all. So this is nice, good news. So far so good.

But what about really asynchronous transformations? Because a push API, as I said, looks async, but when you trigger it, everything actually happens synchronously. In the code that I showed you, every time we get a new log line, the transducer performs all the steps of the pipeline, and all those steps, if the code is written like this, happen synchronously. This is why I was saying that yes, push is asynchronous, but only up to a certain point. You really need to understand what you're doing here.

So what construct is really asynchronous in, let's say, idiomatic JavaScript? Because we are JavaScript developers, and we should understand how the language works. Well, the answer is promises. A promise is like two setters combined: it can either resolve, so you push a value into it and it resolves, or it can reject, so you push an error into it and it rejects. It represents, as we know, a future value, and it models latency, the difference in time between the moment the value is available and the moment the code is ready to handle it. The value could be ready later, but I want to write the code that should handle it now, or even vice versa. And it's different from an observable, because an observable, technically, is like a sequence of values, while a promise is just a single value.
One could use a library based on observables to model a stream of single values, but it can be awkward. An asynchronous sequence modelled with promises would be something like this: an async sequence is the promise of an async sequence element, where an async sequence element is something from which you can get either the value, or the fact that it's done, or an error; but when you ask for the next thing, it gives you another promise. So the idea is, it's like a linked list: every time you do a step, you get another promise; you can check its value, of course, but every time you want to get to the next one, you get another promise. In this way, you really model a sequence. It is, again, like push and pull, but in async style, with promises to model latency. So, getter and setter: an asynchronous getter is a getter that gives you a promise, and an asynchronous setter is a setter that accepts a promise. And if we had patience, we could write the full signatures of iterator and observer: they are like an iterator that gives promises and an observer that expects promises.

And the nice thing is that Ramda and other transducer libraries support this model already. What this means is: if we go back to this piece of code and look at this composition, instead of doing it with a pure, plain function composition, you can invoke something like composeAsync, and what that does is take every step and insert, in the middle, some middleware code that checks whether the value was actually a promise. If the value was a promise, it invokes then on it, passing the other piece of the computation, chaining it. Which means that you can structure your code in the same way, but this time you can have a fully asynchronous pipeline, because, well, you get it.
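The promise-aware composition described above can be sketched like this (composeAsync and the helpers are hypothetical names for this sketch, not a specific library's API):

```javascript
// Compose two steps, inserting promise-aware middleware: if the
// first step returns a promise, chain the second with .then;
// otherwise call it synchronously.
const composeAsync2 = (f, g) => (x) => {
  const value = g(x);
  return value && typeof value.then === 'function'
    ? value.then(f)
    : f(value);
};

// Variadic version: the rightmost function runs first.
const composeAsync = (...fns) => fns.reduce(composeAsync2);

const delayedPlusOne = (x) =>
  new Promise((resolve) => setTimeout(() => resolve(x + 1), 10));
const double = (x) => x * 2;

// Synchronous values flow through unchanged:
composeAsync(double, (x) => x + 1)(2); // 6

// A promise anywhere in the middle makes the rest of the
// pipeline wait for it:
composeAsync(double, delayedPlusOne)(2).then((v) => {
  // v is 6, computed after the 10 ms delay
});
```

The pipeline code looks the same either way; only the composition operator knows about promises.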
Every step could wait for the previous step. The takeaway from all of this is that transducers model composable and reusable transformations. You can define small, very testable functions, because each function is small and stateless: when you test it, you just pass values and get values, it's very simple. And several modern libraries support them: the Reactive framework, Kefir, Ramda, and so on. Since transducers are so nice and work well, these libraries say: why not use them? And again, I want to stress one thing: when you do this composition, the full assembly of the result is done only at the end. If you use some other library like Underscore or Lodash, every time you invoke map it does the full transformation and creates the full collection, and that makes a lot of difference. Push and pull are dual: this is something that is good to know. Not many people pay attention to it, but when you write code you should think: am I using push? Am I using pull? Sometimes it's almost the same. And I think the idiomatic way to represent a single asynchronous value is promises.

That's all I had to say. I've probably been a bit fast because I was afraid of finishing too late. I know it was a lot of content, but this means you can ask questions. So, tell me.

I was just wondering, if you do it async, could you use it with Web Workers? Is there a way in Ramda to do that?

Okay, I have never mixed transducers with Web Workers. There is an issue there: Web Workers are very, very separated from the rest of the browser. The only thing they can do is copy a full value, so the result of a Web Worker is essentially copied into the browser. There's no way to share things.
So I don't think it would be possible to do this kind of composition crossing the boundary between a Web Worker and the regular browser loop. That said, you could implement a Web Worker that needs to do a pipeline of things with a transducer. That would work, and it would be a good way to structure your code, but it would stay inside the Web Worker.

Hi, thanks for the talk. There's a lot to digest there, but I was just wondering if you could give any tips on where you've seen transducers used. Or is it too fundamental a programming concept to give examples of areas where it's useful to use transducers?

Well, I haven't seen them used much in libraries or, let's say, pieces of application code, because they are fairly new. I know they are used a lot in the Clojure and ClojureScript communities, because those communities are more receptive to these things: they were born there, and there they are used more. In JavaScript, I've seen them introduced in libraries, so I'm starting to use them, but I haven't seen them used around much. My advice, if you want to understand this more and follow my slides, is to read the tutorials by Kevin; you'll find the links in my slides, and they are the best explanation of transducers I have ever found. You could listen to the talk by Rich Hickey, the inventor, but it's a lot of Clojure, and if you don't understand Clojure it can be hard: good for the concepts, bad for the language if you don't know it. Kevin did a lot of work, and you'll recognize that a lot of Kevin's material is in my slides.

All right, thanks for that. Are these lazily evaluated, and what would happen on an infinite stream?

Sorry, I didn't hear well. Are they lazily evaluated, and what would happen with an infinite stream of data? What would happen if...?
If you're parsing an infinite stream of data. You had a log file example there, but what if that log file was infinite, always rolling? Are these functions lazily evaluated?

Okay, about evaluation: all those evaluations happened in Light Table, and Light Table just does evaluation like a...

No, no, I mean about the data processing. You were saying that with a regular map function there's an array computed in the middle, right? With these, the value is computed at the end; but is it lazy evaluation?

Okay, now I get the question, and the difference is whether you use compose or the asynchronous version of compose. If you use compose, the evaluation is not lazy for each step. Essentially, a transducer composes all the transformations into one single transformation, and this transformation happens for every item. Every time it takes an item — and it doesn't matter if it was pull or push: if it was pull, you've got a loop and you take items; if it was push, the loop is somewhere else and it pushes items — every item goes through the full pipeline. If instead you use the asynchronous version of compose, the one that introduces promise-handling code in the middle, then each step could return a promise, and the evaluation of the pipeline becomes lazy. So by default, what is lazy is the sequence: you've got a sequence, and to each element you apply a pipeline. The sequence is lazy; the pipeline is not.

I think, unfortunately, that's all we have time for on that answer. It's a very good question. A big round of applause once more, please. Thank you. Okay.