Thank you. I'm leading the development of the SPARK technology at AdaCore. AdaCore is a free software provider of development environments. In particular, we develop the GNAT compiler for Ada, which is part of GCC. At least that's how we started 20 years ago; today that's still half of our activity, and the other half is all the other tools that you need when you develop critical software. I am going to talk about SPARK. SPARK is a technology to prove properties of programs. And when I mention proof, I'm sure that some of you may have very bad memories coming back to your mind of a vicious math prof in high school. If that happens to you, just relax, because with SPARK, the tool does the math, and you do the code. So here is a satellite view of the technology. SPARK takes programs in the Ada programming language, turns them into an intermediate verification language called Why3, and uses the Why3 verification platform to generate logical, mathematical formulas that are proved by these things, which are automatic provers: Alt-Ergo, CVC4, Z3. All of these pieces are free software developed by separate groups, which we integrate and to which we contribute. So let's see now on an example how this works. I'm assuming that you know Tetris. Whatever your program, there are typically two kinds of properties that you may want to verify. First, you want to check that the program doesn't go wild, that it stays within reasonable bounds; that's what we call program integrity. And then you want to ensure that the program does something good; that's what we call functionality. So for Tetris, we'd like to show that all the data that is read is properly initialized, a typical source of bugs, and that there are no language-defined errors or exceptions, so for example division by zero, buffer overflow, et cetera. That's for the program integrity.
Then we'd like to show that the flow of data in the program is correct, that data is accessed correctly with respect to the specification; that complete lines are removed, and that's quite important because that's how you score points; and that the falling piece doesn't overlap with the pieces that have already fallen and doesn't go outside the board, because it's really annoying when a piece goes somewhere you cannot see it anymore. That's for the functionality. So let's see how you express that in Ada. In Ada, there are very rich features to express data types. So for example, a cell here, you can express it with an enumeration: either empty or one of the well-known shapes of Tetris. For example, if you tilt your head to the right a bit, you will see that this one is a J. A shape is any cell that is not the empty one, so a subtype of cell. And a three-shape is also a subtype of cell, for those shapes that fit within a square bounding box of side three; that's all the shapes except I and O, because they are of size four and two respectively. So that's for the cell. Then a piece is a record made of a shape, a direction, which is itself an enumeration, and then a pair of coordinates for the top-left cell of the square bounding box, which defines the position of the piece. And we have the current piece falling, which is a global variable called Cur_Piece. Finally, the board is just a matrix: the board is an array of lines, where a line is an array of cells. And we have the board here as a global variable. The API of Tetris is quite simple. You have five possible actions that you can apply to the piece: you can move left, you can move right, you can move down, and you can turn clockwise and counterclockwise. And I won't do any funny gymnastics here. So there's a procedure Do_Action that applies an action and tells you if it was successful. For example, if I'm already completely on the left, I won't be able to move further left.
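The declarations described above can be sketched in Ada roughly as follows. This is my reconstruction from the talk, not the exact slide code; in particular, the board dimensions and coordinate ranges are assumed values for the sketch:

```ada
--  Board dimensions (assumed values for this sketch)
X_Size : constant := 10;
Y_Size : constant := 50;
subtype X_Coord is Integer range 0 .. X_Size - 1;
subtype Y_Coord is Integer range 0 .. Y_Size - 1;

--  A cell is either empty or one of the seven Tetris shapes
type Cell is (Empty, I, O, J, L, S, T, Z);
subtype Shape is Cell range I .. Z;        --  any non-empty cell
subtype Three_Shape is Cell range J .. Z;  --  fits a 3 x 3 bounding box

type Direction is (North, East, South, West);

--  A piece: shape, orientation, and top-left corner of its bounding box
type Piece is record
   S    : Shape;
   D    : Direction;
   X, Y : Integer;
end record;

Cur_Piece : Piece;

--  The board: an array of lines, a line being an array of cells
type Line is array (X_Coord) of Cell;
type Board is array (Y_Coord) of Line;

Cur_Board : Board;
```

Subtypes like Shape and Three_Shape carve out contiguous slices of the Cell enumeration, which is what lets the later analysis reason about which shapes can appear where.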
That's what the procedure tells the caller. And there's a procedure to include the piece in the board after it has fallen, and one to delete the complete lines to score points. And you can see here in this main function essentially a main loop with a sub-loop where these various pieces of the API are called. Nothing surprising. So let's see how we now analyze this code. We could do it on the command line or in any of the IDEs that we support. For example, in one of the IDEs, you go to the SPARK menu and you click on 'Examine File'. This starts by generating a bunch of useful information and then calls the data and information flow analysis. If it returns without any message, that means that there are no reads of uninitialized data in your program. That's the case here, so quite good. Let's go further. Let's state the actual accesses to global variables in the API. For example, the procedure Do_Action will read the global variable that represents the board, it will read the current piece, and it may update the current piece, because it will change its direction or its location. So we specify it with this contract here, this Global contract, saying that Cur_Board should be an input and Cur_Piece should be an input-output. And we do this for all the API; really easy here. And when the analysis, re-clicking on 'Examine File', returns without any message, we now know that the implementation of the code respects all the data flows declared in the specification. So let's go further. Now let's click in the SPARK menu on 'Prove File'. Again, after generating some useful information, it does the same flow analysis and then goes to proof, calling the provers that I mentioned before. This time, if you get no message, you get the guarantee that there are no runtime errors in your program: no division by zero, no buffer overflow, which may have an impact on security.
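The Global contract described above can be sketched like this (the subprogram and variable names are my reconstruction; the aspect syntax is standard SPARK):

```ada
type Action is
  (Move_Left, Move_Right, Move_Down, Turn_Clockwise, Turn_Counter_Clockwise);

--  The Global aspect declares exactly which global variables the
--  procedure reads (Input) and both reads and writes (In_Out);
--  flow analysis then checks the body against this declaration.
procedure Do_Action (A : Action; Success : out Boolean) with
  Global => (Input  => Cur_Board,
             In_Out => Cur_Piece);
```

If the body ever wrote Cur_Board, or read a global not listed here, 'Examine File' would report the mismatch between code and contract.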
Here, unfortunately, we get six messages about possible buffer overflows and four messages about possible violations of data ranges. In fact, this is expected, because as in many programs, the API is not supposed to be called in any order at any time during the lifetime of the game. So you need to specify the precondition. A precondition, which we specify with the aspect Pre, states when you are allowed to call a procedure. And here it says that for calling this procedure, Include_Piece_In_Board, the current piece has to be within bounds. 'Within bounds', we express easily in Ada with what we call an expression function: a function whose body is simply an expression, so you can think of it as a bit of functional programming. Here it discriminates on the shape of the piece and, depending on the shape, it does checks that rely on another expression function, which states that a pair of coordinates is within the bounds of the board. And with these preconditions on two functions, we get no messages. So we are sure that the whole game logic has no runtime errors. Quite good. So let's go one step further and let's express the rich properties that I talked about at the beginning: the fact that there are no complete lines, and that there's no overlap. For example, 'no complete lines' for the board can be expressed as an expression function like before, saying that for all Y coordinates, the line on the board at this coordinate is not complete. And the fact that a line is complete is also expressed as an expression function here, saying that for all cells on the line, the line at this location, that's an array access in Ada, is not the empty one. So that's really easy. Notice here that I have attached this annotation Ghost. That means that this function is only meant for verification; it's a ghost function. It will be stripped out of the final binary when we build it. 'No overlap' is similar.
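The ghost expression functions described above can be sketched as follows, reusing the type names from the earlier sketch (function names are my reconstruction):

```ada
--  A line is complete when none of its cells is empty
function Is_Complete_Line (L : Line) return Boolean is
  (for all X in X_Coord => L (X) /= Empty)
with Ghost;

--  The board has no complete lines left
function No_Complete_Lines (B : Board) return Boolean is
  (for all Y in Y_Coord => not Is_Complete_Line (B (Y)))
with Ghost;

--  A pair of coordinates lies on the board; helper used by the
--  per-shape 'within bounds' check mentioned in the talk
function Is_Within_Boundaries (X, Y : Integer) return Boolean is
  (X in X_Coord and then Y in Y_Coord);
```

Because the bodies are single expressions, the prover can use them directly as definitions, and the Ghost aspect guarantees they have no effect on, and cost nothing in, the final binary.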
So with a richer case expression here that discriminates on the shape of the piece, but that's equivalent. Well, these properties don't hold always; they hold in certain parts of the program. And for this, we need to express the underlying state automaton of the program. For Tetris, that's really simple. First, the piece is falling here. Then the piece is blocked. Then the piece is included in the board, so there's no current piece anymore. And then, hopefully, some full lines are removed and you score points, and we get the board where the full lines have been removed. So that's four states, and it loops until you lose. We can express that in ghost code in Ada. We can define a ghost type State that lists these four possible states in an enumeration, and a global variable Cur_State that stores the current state; that's also a ghost variable. And then a function Valid_Configuration discriminates on the value of the current state. So, for example, for the first two states, we want to ensure that the current piece is not overlapping with the current board. Afterward, it doesn't matter, because there's no current piece anymore. And for the last state, after the clean, we want that there are no complete lines. So the two functions we defined before, we call them here, as part of Valid_Configuration, and that depends on the current state. That's, again, a ghost function. And finally, we can use these specification functions in the contracts for our API. So Include_Piece_In_Board should have a contract here that states the precondition, so in which cases you can call it, and the postcondition, which states what this service guarantees to the caller. The precondition here says that Include_Piece_In_Board should be called when the current state is that the piece is blocked, and we have a valid configuration. So this is something that is maintained.
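Putting the state automaton and the contract together, a sketch might look like this. It assumes a No_Overlap ghost function (the 'no overlap' property mentioned earlier) and the No_Complete_Lines function from before; the exact state names are my reconstruction:

```ada
--  The four states of the game loop, as ghost code
type State is
  (Piece_Falling, Piece_Blocked, Board_Before_Clean, Board_After_Clean)
with Ghost;

Cur_State : State with Ghost;

--  Which property must hold depends on where we are in the loop
function Valid_Configuration return Boolean is
  (case Cur_State is
     when Piece_Falling | Piece_Blocked =>
       No_Overlap (Cur_Board, Cur_Piece),
     when Board_Before_Clean => True,
     when Board_After_Clean  => No_Complete_Lines (Cur_Board))
with Ghost;

--  Pre says when the caller may call us; Post says what we guarantee
procedure Include_Piece_In_Board with
  Global => (Input  => Cur_Piece,
             In_Out => (Cur_Board, Cur_State)),
  Pre  => Cur_State = Piece_Blocked and then Valid_Configuration,
  Post => Cur_State = Board_Before_Clean and then Valid_Configuration;
```

Since Valid_Configuration appears in both Pre and Post of every API service, it acts as the program invariant the talk describes: each service may assume it on entry and must re-establish it on exit.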
This valid configuration is an invariant of the program. And the procedure should return with the current state being 'board before clean' here, and still with a valid configuration. So let's now re-run 'Prove File' on this code. It does the same; there's no message. Well, you can get the list of things that have been proved if you select the right switch, but otherwise it returns without any message. So you've proved that your code, which is here, well, less than 200 lines of code, completely implements the specification, the rich specification that we've chosen. How hard was it? Because this kind of tool can sometimes run for hours and eat gigabytes of memory, so that can be a concern. Here, thanks to the modular way the analysis is performed, function by function, it's fully proved at what we call level zero, out of four levels. The detail is that only one automatic prover is called, we give it very little time for each individual proof, one second, and we don't split the work for it: each time we give it one full thing to prove. And it's proved in 11 seconds on one core; it goes down to four seconds on multi-core. Of course, on such a small example, that doesn't really matter, but on bigger examples, yes: this is full automation and full parallelism, since things are proved independently. We have compiled this code, this proven Tetris, so you've seen now what I mean by proved. Originally for this Atmel SAM4S Xplained Pro board, and there's a small display here, that's really just to play with it. And with my colleague, Tristan Gingold, who presented yesterday on 64-bit bare-metal programming on the Raspberry Pi 3, we did the drivers and the BSP for this board. Then another colleague ported it to his Pebble Time watch. Another colleague ported it to the Unity game platform. And the last one we did is this Arduboy game platform that I have here.
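From the command line, the level-0 run described above would look something like the following. The flag names come from the gnatprove command-line interface (--level, --timeout, and -j for parallel cores); the project file name is hypothetical:

```
# One prover, 1 second per proof obligation, all available cores
gnatprove -P tetris.gpr --level=0 --timeout=1 -j0
```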
This is interesting because we don't have, we don't develop an Ada compiler for this AVR 8-bit processor that I have here. So we compiled the SPARK code to C, and then used this C code on this platform. You can find the full story behind this project at blog.adacore.com, including the source code. And you can download the tool set at libre.adacore.com. I have many more links on the page of the talk, with extensive documentation and a free online e-learning course. There's a university book on SPARK. Most of our users are big industrial projects. So, for example, if you fly to the U.K., you're using SPARK unknowingly, because the air traffic control in the U.K. is written in SPARK: the tool that air traffic controllers use for routing planes and detecting conflicts. But there are also free software groups that use SPARK. For example, the Muen separation kernel, which is developed at the University of Applied Sciences in Rapperswil, Switzerland, is done in SPARK. So, thank you for your attention. Thank you very much, Yannick. I think there's no time for questions; there's hardly one minute left. So, if you have any questions,