You have seen how to design new blocks in Scicos, how to construct complex diagrams, and so on. What I'm going to talk about here are some details of the formalism that may not be obvious when you start using Scicos. So I'm just going to go over Scicos very quickly again: the different features, the graphical editor. As I said before, Scicos provides a hierarchical graphical editor for the construction of block diagrams. Many blocks are available, as we saw yesterday, in different palettes. And of course new blocks can be written in C, even in Fortran, and in Scilab. We didn't have time to write new blocks yesterday, but in the documentation you have a number of examples of how to design new blocks, how to run them, how to include them in palettes, and even how to create new palettes. What we didn't really see is that behind all of this there is a compiler: not a compiler in the sense of the C language, where you generate executable code, but a compiler that generates scheduling tables which are used during simulation. All the scheduling is done at the compilation phase so that the simulation can run as fast as possible. And it is the result of the compilation which is used for code generation. The simulator uses these scheduling tables and other information provided by the compiler to run simulations. In fact, when you save a Scicos diagram which has been compiled, you also save the result of the compilation. You can also go and examine the scheduling tables. Most users never have to do that, but sometimes you may want to change the sequence of execution, because the sequence is not unique and you may have preferences; you can do that. And the simulator is hybrid, continuous time and discrete time, and that makes it pretty complicated. It's not just a matter of finding one sequence of executions: there are many different cases that come up. There are different events.
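To make the idea of a scheduling table concrete, here is a minimal, hypothetical sketch in Python. It is not the real Scicos data structure: it only illustrates the principle that ordering decisions are made once, at "compile" time, by topologically sorting the blocks, so that the simulation loop can simply walk a precomputed list. The block names and the `compile_schedule` helper are invented for illustration.

```python
# Illustrative sketch (not Scicos itself): a "scheduling table" is a
# precomputed, topologically ordered list of block functions, built once
# at compile time so every event can reuse it.
from collections import deque

def compile_schedule(blocks, links):
    """Topologically sort blocks so every block runs after its inputs."""
    indeg = {b: 0 for b in blocks}
    succ = {b: [] for b in blocks}
    for src, dst in links:
        succ[src].append(dst)
        indeg[dst] += 1
    order = []
    queue = deque(b for b in blocks if indeg[b] == 0)
    while queue:
        b = queue.popleft()
        order.append(b)
        for n in succ[b]:
            indeg[n] -= 1
            if indeg[n] == 0:
                queue.append(n)
    return order  # the "scheduling table" for this event source

# A tiny diagram: generator -> gain -> scope
schedule = compile_schedule(
    ["generator", "gain", "scope"],
    [("generator", "gain"), ("gain", "scope")])
print(schedule)  # a valid execution order, computed once, reused every event
```

The point of the sketch is only that the expensive part (finding a valid order) happens once, while the simulation merely replays the table.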
There is continuous time and so on. And for each case, there is a sequence of executions which is computed and stored at compilation. And of course everything has to work with the numerical solvers which are built into Scicos. The code generator also uses these scheduling tables. So what you have in Scicos is a simulator that runs, in some sense, interpreted: it uses the scheduling tables to call the block functions in order to advance time. So it is partially compiled. When you do code generation, you go beyond that by writing out explicitly the calling sequences to these block functions and generating standalone C code. The scheduling tables are used during this phase. Now, these subsystems could also include continuous-time components, but in that case fixed-step solvers are used in the generated code; otherwise there is no way to guarantee real-time applications. A very important point in using Scicos is that you have access to Scilab functions. We saw a number of examples yesterday where having access to Scilab functions is very useful: designing a filter, post-processing the result of a simulation, and so on. Scicos is really a Scilab toolbox. It's no more than that, because it's almost entirely written in Scilab as far as the editor is concerned, and the simulator and compiler parts are linked with Scilab. So it's just a toolbox like any other toolbox. Now, a Scicos user often needs Scilab functions, as I said before, not only in designing and post-processing the result of the simulation, but even in the construction of blocks. For example, when you want to construct a filter block, you can use the filter-design functions of Scilab in the function defining the block itself. So the fact that a block can be defined in Scilab is very important. In Simulink, for example, the blocks are defined in the C language. This is a difference with Simulink in particular.
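The difference between the interpreted simulator and generated code can be sketched in a few lines. This is an analogy only, with made-up block functions: the simulator walks the scheduling table at run time, while code generation writes the calling sequence out explicitly, removing the dispatch loop.

```python
# Analogy (not real Scicos code): interpreting a scheduling table vs.
# "generating code" where the calls are written out explicitly.
def gain(x):
    return 2 * x

def offset(x):
    return x + 1

table = [gain, offset]        # the compiled scheduling table

def interpreted(x):
    for f in table:           # simulator: walk the table at run time
        x = f(x)
    return x

def generated(x):
    return offset(gain(x))    # generated code: calling sequence written out

print(interpreted(3), generated(3))   # both compute the same result
```

In the real tool the generated artifact is standalone C, but the relationship between table and emitted calls is the same.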
And also, as I mentioned before, you can run Scicos simulations in batch mode from Scilab and do, for example, Monte Carlo simulations or optimization and so on. The integration of Scicos and Scilab is very strong, even stronger than that of Matlab and Simulink, and this has some advantages and some disadvantages. The editor being written completely in Scilab makes it completely customizable. Menus can be changed. The icon of a block is just a standard Scilab graphics command, so you can use any graphics command: you can draw a 3D plot inside the icon of a block, because it's just an existing Scilab graphics function. And you can change the shape of the blocks, and all of this is very easy to do. It's very easy to debug because Scilab is an interpreted language. And when Scilab is ported to a new platform (for example, there is a new version for the Mac), then Scicos works immediately on that platform; you don't have to do any extra work for it. Also, a Scicos model is a Scilab data structure. It's a large data structure, but it is a standard Scilab data structure, so you can manipulate it as you manipulate any Scilab structure: examine parameters within it, change something in it if you want, nothing special. There are some disadvantages. Scilab is an interpreted language, so with very large models things can sometimes slow down. But since the speed when you're using the editor is limited pretty much by the user moving the mouse around (unless you have somebody very, very fast), that is not a problem, especially with today's computers. Probably the most important disadvantage is that Scilab has very limited GUI capabilities. There are many things that you cannot do in Scilab, and that's why Scicos doesn't look like similar products, where you have standard features and things look nicer than what you have in Scicos.
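The batch-mode Monte Carlo idea mentioned above can be sketched generically. In Scilab you would call the Scicos simulation from a script; here `simulate` is a stand-in for that call, not a real API, and the parameter values are invented. The pattern is only: run the same model many times with randomized inputs and aggregate the results.

```python
# Hedged sketch of batch-mode Monte Carlo: `simulate` stands in for a call
# to the Scicos simulation from a script; it is NOT a real Scicos function.
import random
import statistics

def simulate(gain, seed):
    """Stand-in for one simulation run: returns a scalar 'measurement'."""
    rng = random.Random(seed)
    # pretend the model output is the gain perturbed by 10% Gaussian noise
    return gain * (1.0 + 0.1 * rng.gauss(0, 1))

runs = [simulate(gain=2.0, seed=k) for k in range(1000)]
mean = statistics.mean(runs)
print(mean)   # close to the nominal value 2.0
```

The same loop shape works for parameter sweeps or wrapping the run inside an optimizer.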
That's because in Scicos we rely on Scilab GUIs; we don't use any additional GUIs. To work around certain problems we use Tcl/Tk, which is interfaced with Scilab, but that's not a satisfactory solution. This may change in the future. Now, the formalism in Scicos is very particular in that Scicos deals only with the reactive part of the system. It considers all the blocks as black boxes: Scicos doesn't know what's inside a block. It considers the definition of the reactive system as an interconnection of black boxes. That makes it very simple, because Scicos doesn't have to worry about the particular function within a block. If I want to compute the sine or the cosine, or perform, say, x times y plus 3, it is not the Scicos language that does these operations; it's a block. A block is a function, and Scicos deals with these functions at a higher level. The only thing Scicos does is make sure that these functions are called correctly, in the right order, and it only knows certain properties of these blocks, not what's inside. That's very different from, for example, Modelica, where you write the full system in the Modelica language. Modelica is a very complex language because of that; Scicos is fairly simple. The only difficulty is synchronism, the properties of synchronous languages, and the fact that we combine continuous time and discrete time. As I mentioned before, for a block, which is really a black box, you have a simulation function, usually written in C, and in Scicos we make the assumption that the computation done by this function is instantaneous. That's the basic assumption in synchronous languages: there is no time; it's instantaneous. In reality it's not, because it depends on your computer, but the formalism assumes it is. And the other assumption is that everybody has access to a unique, universal time.
Okay, these are the two basic assumptions in synchronous languages, and Scicos makes them as well. Now, a Scicos block, as we saw yesterday afternoon, has two types of inputs and outputs: regular inputs and outputs, which are used to transmit data between blocks, and activation inputs and outputs, which are usually placed on top and at the bottom of the block. This is also somewhat different from Simulink, in that the activations are explicitly drawn on the diagram. Let's take a look at this example. Here I have a delay block. Now, this delay block is not a delay as we usually know it, not a delay on a signal, but a block that, when it is activated on its input, generates an activation signal on its output with a certain delay. Here the delay is three units of time, so three units of time later an event is generated here. This event is fed back to the block itself, which means that the block is activated every three units of time. So here we have a sequence of events with period three, and this sequence is used to activate three other blocks: a random number generator, a delay, and the scope. The scope displays the output of the random number generator and a delayed version of it. I'll just show you the result of the simulation: we have the random numbers here and the delayed version here. The second signal is just a replica of the first one, delayed by three units of time. Now, what happens if we want to have more than one source of events? This block here is a particular block; it's not actually a very special block, it's a standard block, but it does something special: when it receives an activation, it creates an activation with a delay on one of its outputs. In fact, it generates an activation n minus one times on one output and once on the other.
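The self-feedback of the event-delay block can be mimicked with a small event queue. This is an illustrative sketch, not Scicos internals: each firing at time t schedules the next firing at t + 3, so the single loop produces a periodic clock.

```python
# Illustrative sketch: an event-delay block whose output event is wired back
# to its own activation input, producing a period-3 clock.
import heapq

DELAY = 3.0
events = [(0.0, "delay")]   # initial activation of the delay block
fired = []                  # times at which the clock fires

while events and events[0][0] <= 12.0:
    t, block = heapq.heappop(events)
    if block == "delay":
        fired.append(t)
        # output event DELAY units later, fed back to this block's input;
        # in the diagram this same event also activates the generator,
        # the signal delay, and the scope
        heapq.heappush(events, (t + DELAY, "delay"))

print(fired)  # [0.0, 3.0, 6.0, 9.0, 12.0]
```

The priority queue stands in for the simulator's agenda of pending events.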
So the sequence of events coming out of this block is more frequent on one output than on the other. This operator here, the plus, is the union of events. So what happens here is that both events come back and activate the block again. We have a situation like the previous case, where we had a delay, except that the feedback goes once through one of the outputs and n minus one times through the other. So here we generate, in fact, two clocks: one which is fast, this clock here, and one which is slow. The fast clock is used to generate random numbers and to run the display, and the slow clock is used to run the memory. This means that once out of n times we memorize the result of the random number generator, but the result is displayed every time. We can see the result of the simulation; there's a lot of explanation if you want to look at it yourself. In this case n is three. This is the random signal, which is displayed, and here we store the random number, keep it constant for three units of time, and start again. So the 1/z block here is activated three times less often than the other two blocks. This is one way to do multi-frequency activation. But it is not a good way to do it. The reason is that we have two different sources of activations which are independent; that means that, a priori, they are not synchronized. The correct way to do subsampling is to use two special blocks, the if-then-else and the switch. Let me give you an example of that. So I want to do the same thing. No, not the same thing; a similar thing. I have my delay block here, which creates a clock as before, which runs the random number generator and the scope. And I have the delay here, the memory. But I don't want to run the memory every time, only under a certain condition. The condition I use in this case is this: I want to run the memory block.
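The multi-frequency setup can be sketched as a simple clock divider: out of every n activations, most go to the fast output and one to the slow output. This is an illustration of the counting idea only, not the real block.

```python
# Illustrative sketch of deriving a slow clock from a fast one by counting:
# the slow output fires once for every N ticks.  In the diagram, the fast
# clock runs the random generator and the scope; the slow clock runs the
# 1/z memory block.
N = 3
fast = list(range(12))                        # 12 ticks of the fast clock
slow = [t for t in fast if t % N == N - 1]    # every N-th tick

print(fast)
print(slow)   # the memory block is activated N times less often
```

As the talk warns, building this from two independent delay loops would give two unsynchronized clocks; the counter makes the ratio explicit here only because both lists are derived from the same tick sequence.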
That means I want to memorize the output of the random number generator only if the value of the random number is positive. Now I use this very special block, the if-then-else block. In reality it's not a standard Scicos block: it looks like a block, we manipulate it as a block, but it's really part of the language, a keyword in the language if you like, like if-then-else in the C language, and it is handled during compilation. What it does is take the input activation that it receives here and route it to one or the other of its outputs. I insist on this point: it reroutes it; it does not generate a new activation. It reroutes the activation it receives to one of its outputs. That means the activation that comes out here is the same as the one here; it's not a new one. So this is a completely synchronous diagram, and the only independent source of activation is this one. So what happens here is that we activate the memory only if the random number is positive. That means that when I simulate, these are the random numbers, but only the positive ones are copied into the memory. When we get a negative one, the memory doesn't move; it just keeps its previous value until a positive one comes along. So that's one way to do conditional subsampling. This signal here is less frequent than this one, not by a constant factor, but conditionally, depending on the sign of a value. Now, is Scicos data flow? It is not. Scicos is not data flow, even though it looks like it; the Scicos formalism is event-driven. As I said, at every stage we specify which event activates which block. But there is an inheritance mechanism built into Scicos to make it look like a data-flow environment, because that's so much more convenient in many situations. Let's look at this example.
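The conditional subsampling described above can be sketched in a few lines. This is an analogy, not Scicos code: the single clock ticks every step, the if-then-else merely reroutes that same activation, and the memory block is updated only on the "then" branch, holding its value otherwise. The seed and tick count are arbitrary.

```python
# Illustrative sketch of conditional subsampling: one clock, one
# if-then-else that REROUTES the activation (no new event source),
# a memory (1/z) updated only when the random value is positive.
import random

rng = random.Random(42)
memory = 0.0            # initial state of the memory block
trace = []              # (random value, memory value) at each tick

for _ in range(8):      # 8 ticks of the single event clock
    x = rng.uniform(-1, 1)      # random number generator block
    if x > 0:
        memory = x              # "then" branch: activation routed to memory
    # "else" branch: memory simply keeps its previous value
    trace.append((x, memory))

for x, m in trace:
    print(round(x, 2), round(m, 2))
```

Note the asymmetry: the memory's clock is a subset of the generator's clock, selected by a data-dependent condition, exactly the "less frequent but not by a constant factor" behavior described above.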
If we just stick to what I told you, when you look at this diagram you say: okay, here (I should say that this is a super block which contains my delay with the feedback, so it's just a super block that generates a train of events) the train of events is generated, the square wave is generated, the memory is activated. But this cosine, well, it's never activated; it should do nothing. The way it works is that when a block doesn't have an activation input port, then by convention we assume that it receives its activation from its regular input. That means it inherits its activation from the block generating its input, in this case the square wave. So the cosine actually receives the same activation, this activation here. The inheritance mechanism is very simple when we have just one regular input, because then we have one activation. But when we have subsampling, when we are in a multi-frequency environment, the situation is a bit more complex. So I come back here: as I said, here we only have one clock, so everything runs on the same clock thanks to inheritance. But if we go back to my previous example, where we had two clocks, two independent activation sources, what happens then? Before seeing that, I need to show you that it is possible to have a block with more than one activation input. Here I have the selector block. What does the selector do? The selector says: my first input is copied to the output if I am activated through my first activation input. So if I'm activated through this one, I copy this value to the output. On the other hand, if I'm activated through the second activation input, then the value of the second input, the constant one here, is copied to the output. So this block must know the way by which it has been activated, because it uses this information to do different things.
And notice that the if-then-else block here is not activated; that may look strange, but it isn't really, because of the rule I stated previously: it inherits its activation from the random number generator. And the constant one here also looks funny, because it is not activated, but it doesn't even have an input. Well, that's fine, because it's a constant: it is never activated; this block is only initialized. That's almost the definition of a constant. So the result looks like this. I go back here. When the random number is positive, the activation goes through the then port, so the output is the second value, the one which is copied here; when it's negative, it goes through the else port. That means that when the random number is positive we get one, and when it's negative we get the actual value. Everything negative here is copied here; all the positive values have become one. So I go back to an example where we have two independent sources of events. I have three blocks here that don't have activation inputs, so they necessarily inherit. What happens in this case, since I have two different sources? Well, for this scope there is no problem, because it inherits from this one, that means from this activation here. So that's the diagram I have. In the compilation phase (I should say the pre-compilation phase) the missing activations are added, and this is the way they are added. The sum has two inputs, so the block gets two activation inputs, and it inherits each one from the corresponding regular input. So each activation input here inherits the activation of the block that generated the corresponding regular input. And this block here comes after a block having two activation inputs, but it only has one activation input, because it only has one regular input, and that is just the union of the two. That makes sense, because when does the value change here? Well, the value could change either when this activation fires or when the other one does.
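The union rule for inherited clocks is easy to state in code. A purely illustrative sketch, with invented activation times: a block with no activation input inherits the union of the clocks of the blocks feeding its regular inputs, because its output can change whenever any input changes.

```python
# Illustrative sketch of the inheritance rule: a sum block fed by two
# sources inherits the UNION of their activation times.
clock_a = {0.0, 2.0, 4.0, 6.0}      # activation times of the first source
clock_b = {0.0, 3.0, 6.0}           # activation times of the second source

sum_clock = clock_a | clock_b        # the sum block inherits the union
print(sorted(sum_clock))             # [0.0, 2.0, 3.0, 4.0, 6.0]
```

Note that at times 0.0 and 6.0 both inputs fire simultaneously, which is exactly the synchronized case the if-then-else/switch construction preserves and independent event sources do not.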
So this block must be activated at the union of the times when these blocks are activated. So that rule makes sense. But I should say that this is not part of the Scicos formalism; it is a facility provided to make diagrams look simpler. You don't want to systematically construct the explicit diagram, because this one is easier to construct, but you must know exactly what happens. Now, in this case it really doesn't matter, because the sum doesn't take into account the way by which it is activated: the sum is always the sum. You might want to write a fancy sum that knows the first value hasn't changed and the second has; maybe there is an algorithm where that matters, maybe not really for the sum, but in some cases it makes a difference, and you should be aware of that. Let me just go back to this diagram to show you that the three scopes run on different clocks: this one runs on this activation, this one on this activation, and this one on the union of the two. So the times where the scopes are activated are exclusive between this one and this one, and this one is the union of the two. We see that the blocks are not activated at the same rates. Now, what about continuous time? So far we have only talked about discrete-time blocks. A block can be declared always active. That means the block, in theory, is always active. It doesn't mean it always produces outputs; in a computer that doesn't make sense. In fact, it produces outputs only when necessary. Now, if we wanted to be consistent, we would have a clock which represents all the time, always, and connect it to the activation input port of any block which is always active. But again, for the sake of making diagrams look nicer and to avoid having to draw red links all over the place, instead of doing that we just declare always-active as a block property, and the rest works by inheritance. Let's take a look at this diagram.
Well, in this diagram we have a sinusoid generator, an absolute value, an integrator, and a scope. Again, if we just consider what I've said so far, this sinusoid generator must be a constant: it is not activated, and it has no regular input, so it cannot inherit. But it's not constant. Why? Because a block property has been set for this block saying that it is always active. The absolute value, on the other hand, is not declared always active, but it inherits the always-active property from this block. So you should think of always-active as a link connected to this block coming from a fictitious clock which represents all the time. The absolute value inherits it, so it also works all the time, and then we have the integrator. The integrator can also inherit always-active from the absolute value. So what we have here is a sinusoid, the absolute value of the sinusoid, and the integral of the absolute value of the sinusoid, which is displayed. The result looks like this: we have the absolute value of the sine and the integral. There's one point I should mention. I said the integrator is not necessarily declared always active in this case; it doesn't have to be, because it works by inheritance. In fact, in the first version of Scicos, that's what we did: we didn't declare the integrator always active. What's the point? But then, on some examples, we had problems. For example, somebody tried to integrate a constant. If you don't declare the integrator always active, it doesn't inherit from anybody, so it never works, and the integral of the constant came out zero. Which is not good, unless the constant is zero, of course. So we added the always-active property to the integrator as well. Now, what I said about these very special blocks, the if-then-else and the switch, is also valid for continuous-time activations.
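The behavior of this diagram can be checked numerically. A minimal sketch, using a simple fixed-step rule of the kind used in generated code (not the variable-step solvers of the simulator): over one period [0, 2π] the exact integral of |sin t| is 4.

```python
# Numerical check of the diagram: integrate |sin t| over one period with a
# fixed-step (forward Euler) rule.  Exact answer: integral of |sin t| on
# [0, 2*pi] is 4.
import math

dt = 1e-4
t = 0.0
integral = 0.0
while t < 2 * math.pi:
    integral += abs(math.sin(t)) * dt   # always-active absolute-value block
    t += dt

print(integral)   # close to 4.0
```

The step size is arbitrary; a real fixed-step code generator would pick it from the real-time constraints.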
That means I can take a sinusoid generator, which is always active, and put its output through the if-then-else block. If the output is positive, the activation goes to the first port of the selector; that means I copy the sine value here. If it's not positive, then I activate the second port, which means that the output comes from the second input. And what is the second input? Well, the second input is simply a copy of the sine multiplied by minus one. So when the sine is positive, it's the sine; when it's negative, it's minus the sine. But that's exactly the absolute value of the sine, which I integrate. So that's exactly the same thing I did before, and in fact the simulation gives the same result. But there is one difference, and it's an important point. The sample-and-hold block here, which is just a copy, is activated only when the sine is negative. The multiplication inherits from the sample-and-hold and from the constant, but the constant is never activated, so it is essentially activated by the sample-and-hold. That means the multiplication is done only when sin(t) is negative; otherwise it's not done. In fact, it can be seen here: here the signal is not active at all. The multiplication is not done all the time; it's only done when it's necessary. Now, for a multiplication, who cares? But suppose that instead of a multiplication we had a function which took 25 minutes to compute, a very complex operation. Being able to say that I don't do this operation at all (it's not that I don't use the result; I really don't do it) means I can just turn off this part of my diagram when the computation is not needed. That's very important, especially when you do code generation. You want high-performance code; you want to be able to say: okay, this part is not working now, I activate this one, now I activate that one, and so on. So there is an interaction between the continuous-time and discrete-time dynamics.
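The point that the inactive branch is genuinely not executed (not merely executed and discarded) can be demonstrated with a call counter. An illustrative sketch with invented names: the negation stands in for the costly block, and it runs only on the samples where sin(t) is negative.

```python
# Illustrative sketch: the "else" branch (here, a negation standing in for
# an expensive block) is only executed when its clock is active, i.e. when
# sin(t) is negative.  The counter proves the branch is really skipped.
import math

calls = 0

def expensive_negate(x):
    global calls
    calls += 1          # count how often the branch actually runs
    return -x

samples = [math.sin(0.1 * k) for k in range(100)]
absval = []
for s in samples:
    if s >= 0:
        absval.append(s)                     # "then": pass the value through
    else:
        absval.append(expensive_negate(s))   # "else": runs only here

print(calls, len(samples))   # the branch ran on only part of the samples
```

The result is still |sin t| everywhere, but the costly block was invoked only where needed, which is the property that matters for generated real-time code.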
From what we have seen here, we don't see the full interaction: we see the continuous time running by itself, and then discrete time which either samples it or creates an input to the continuous world. But actually there are deeper interactions. I don't have time to get into all the possibilities, but discrete-time events can change the state of a continuous-time differential equation, for example, and continuous-time signals can generate discrete events using zero-crossing blocks. So there are many different ways they can interact. To show you a typical example with these zero-crossing blocks: here I have a continuous-time signal; when the signal crosses zero, an event is generated and the relay switches one of the inputs. Again, I don't want to get into detail, just show you the typical signals that we get. Here we have a signal which is sometimes discrete, because it's constant over some period of time, then it's continuous, then it's discrete again. A Scicos signal is a hybrid signal: sometimes it's discrete, sometimes it's continuous; you cannot distinguish the two. Just one example I wanted to run; it is actually a Scilab demo. To run the demos, you just go into the demo part, choose Scicos, and you have a number of demos. This is not a very important example; it just shows the interaction between continuous and discrete. Here we have a continuous system, because the dynamics of the balls are second-order differential equations; each ball is governed by a second-order differential equation. There are zero crossings which detect whether two balls have hit each other; there are on the order of n squared zero crossings, because there are n squared ways for the balls to hit each other. When they hit, an event is generated, and the continuous dynamics of the balls are changed, because after a collision the positions don't change but the speeds do.
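The hybrid mechanism of the balls demo can be sketched in one dimension, drastically simplified: continuous dynamics (constant-velocity motion here, rather than second-order ODEs), a zero crossing on the gap between two balls, and a discrete event that modifies the continuous state: for an elastic collision of equal masses the velocities swap while the positions are unchanged. All values are invented for illustration.

```python
# Illustrative 1-D sketch of the hybrid pattern: zero crossing on the gap
# between two balls triggers a discrete event that changes the continuous
# state (velocities swap, positions kept).
x1, v1 = 0.0, 1.0     # ball 1: position, velocity (moving right)
x2, v2 = 4.0, -1.0    # ball 2: approaching from the right
dt = 0.001
collisions = 0

for _ in range(5000):                  # simulate 5 seconds
    prev_gap = x2 - x1
    x1 += v1 * dt                      # continuous dynamics (fixed step)
    x2 += v2 * dt
    if prev_gap > 0 and x2 - x1 <= 0:  # zero crossing: balls touch
        v1, v2 = v2, v1                # event: swap velocities, keep positions
        collisions += 1

print(collisions, x1, x2)   # one collision; balls moving apart afterwards
```

A real zero-crossing solver would locate the crossing time precisely and restart integration there; the fixed step here only shows the event/state-change structure.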
That's exactly how a collision changes the situation. This is a really nice hybrid example, where we have continuous time generating discrete events (the collision detection) and discrete events changing the continuous time, because after a collision the continuous state is modified based on the events which have been generated. Thank you very much. I'll stop here.