Okay, so I'm also here for a lightning talk of 10 minutes, so I will bombard you with some slides. It's about what I learned from my retro microcontroller project and my experiences with RTL programming, which were not always as I would like them to be.

What I did is I had the crazy idea to make my own custom ASIC chip. People say it's impossible, but I don't think it is. I was planning to make a chip with open source cores, with a Motorola 68000, a Z80, and a MOS 6502 on it. I combined them because from a logistics point of view it's better to put them on one chip than to have three different chips. It has some features like 4K of on-chip RAM, I/Os, and maybe some peripherals. But I tried to limit the features, so that when the chip comes back I know it works and I don't end up with a non-working chip, because then all the money is lost. I also planned to have some Arduino-type boards that would be compatible with these instruction sets. I'm not going into detail here; if you want to know more about this retro microcontroller, you can go back to the Retro Computing devroom of last year, where I explained it in full detail.

I did a crowdfunding campaign on Crowd Supply, and as you can see I didn't reach my goal: I only got $4,000 of $22,000. It's a pity, but the nice thing is of course that no money has been spent. I didn't have to invest anything, so I now know that I need to work on more things. While doing this, I also followed several online forums to see what people thought about it. What I tried to do is combine retro computing with Arduino, and it seems that this combination doesn't really make anybody want to go for it. The Arduino guy says: yeah, it costs more than an Arduino and it has fewer features. They just want to run their sketches; they don't care if it's a Z80 or a 6502. Of course I'm exaggerating, I'm putting it in black and white, but that gets the message across.
Then you have the open source guys, where several people said: yeah, it's nice what you're doing, but I'm not really looking for an Arduino board. And then you have the retro computing guy, who says: yeah, but you don't have a memory bus on your chip, so I can't do anything with it; I want to put it on a board and hook up my own peripherals. That's because I only planned I/Os.

So my next point is why I wanted to limit the features: to be able to get the chip first-time right. And the reason you want to limit the features, in my opinion, is that the RTL languages, VHDL and Verilog, are actually not the right tools for the job.

What are the main RTL flaws I see? For example: a clock is treated like any other digital signal, only we look at the rising edge and the falling edge, but you still define it as a logic signal. Also, one of the first things you have to learn is to know what is being inferred when you write something. If you write an if statement, it can become either a flip-flop or a mux. In itself that's not the problem; the problem is that you have to know, at the moment you write it, which one it will be. If you had the right abstraction level, you should not have to care: you should only think in if statements, not in flip-flops or muxes. To me that is one of the fundamental problems we have now with RTL.

Then you have this crazy idea that you have your language, and only a subset of it is synthesizable; the rest is not. Coming from the programming side, I could not understand why this is the case. In the end I learned that these languages evolved from trying to simulate reality; they were developed for simulation first. Also, the blocking versus non-blocking assignments in Verilog are an indication to me that it's not the right abstraction level.
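Two of these complaints can be made concrete with a small pure-Python sketch. This is not any real HDL, just an illustration of the semantics: an if statement without an else implicitly remembers its old value (so synthesis must infer storage rather than a mux), and blocking versus non-blocking assignments give different results for the same-looking swap.

```python
# Pure-Python sketch of Verilog semantics (illustration only, not real HDL).

def incomplete_if(sel, a, q_old):
    """An 'if' without an 'else': when sel is false, the output keeps its
    old value, so synthesis infers storage (a latch/flip-flop), not a mux."""
    if sel:
        return a
    return q_old  # implicit memory of the previous value

def clock_edge_nonblocking(regs):
    """Non-blocking (<=): all right-hand sides are sampled first, then
    every register is updated at once, like real flip-flops on a clock edge."""
    nxt = {"a": regs["b"], "b": regs["a"]}  # sample the old values
    regs.update(nxt)                        # commit simultaneously
    return regs

def clock_edge_blocking(regs):
    """Blocking (=): each assignment takes effect immediately, so the
    second statement already sees the updated value of the first."""
    regs["a"] = regs["b"]   # a gets b's value...
    regs["b"] = regs["a"]   # ...and b now copies the NEW a
    return regs

print(clock_edge_nonblocking({"a": 1, "b": 2}))  # {'a': 2, 'b': 1}, a real swap
print(clock_edge_blocking({"a": 1, "b": 2}))     # {'a': 2, 'b': 2}, the swap is lost
```

The two clock-edge functions are textually almost identical, which is exactly the point: in Verilog you have to know which semantics you are invoking.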
Everything is parallel, but then a single assignment can block everything, which is not right to me. And then you have FPGAs versus ASICs: if you have something working on an FPGA, it doesn't mean it will work as an ASIC. And then there is "read the f***ing language reference manual": if you go on the forums, it's "these are professional tools for professionals", and you get bombarded with "in section 3.5.4.1 of the language reference manual it says this". If you don't know the language reference manual by heart, you are not a good RTL writer; some people really think like that, because it's a commercial tool.

Now some proposed solutions. Of course, the languages have evolved over time; they have added some nice features like combinational processes, generate statements, records and interfaces. But in my humble opinion they don't fix the fundamental problem; to me it's more lipstick on a pig than a true solution. You also have TL-Verilog (Transaction-Level Verilog), which is promoted by Redwood EDA. They extend Verilog to make writing pipelines easier, but again, this doesn't solve the problem for me. Then you have SystemC TLM, which is event-driven, with transaction-level modeling as an example. But when I look at how a NAND gate is implemented in SystemC, it doesn't really give the impression of any improvement over the RTL languages. It's already some time ago that I looked into this, but to me, going from the transaction-level model to an event-based model has to be done manually; I don't know of good automated tools, or they are commercial, as far as I know. Then you have SystemC/C++ for electronic system level design, that is, high-level synthesis. Most of the good tools there are proprietary: Vivado HLS, Catapult, and Synopsys also has its own. I would love to see more development on PandA Bambu, which is actually an open source tool that does this. I also need to investigate GAUT more.
But for that one you need to register before you can download, which for an open source guy is already a first hurdle to take. I think high-level synthesis would be nice to have, but it's not a good fit for implementing something like a retro microcontroller.

Then you have MyHDL, which is an HDL with Python syntax, but it's based on the same event-driven principles as RTL. In my mind, that's not going to solve it, but the author, Jan Decaluwe, has a different opinion: he really wants his language to be event-driven.

I found three languages that are, to me, going in the right direction. You have Chisel; people who know RISC-V probably know about it, since the Berkeley guys developed the Rocket core for RISC-V in Chisel. And then you have SpinalHDL, which is a similar language. I tried to get into both. For me, SpinalHDL seems to be a little bit more intuitive than Chisel, but that may just be me, of course. They're both based on Scala, and that was one of the problems I had: I wanted to go further with SpinalHDL, but once you get an error, you really have to know Scala as well to fix that error. It's nice up to a point, but it's something extra you need to know. In the end I decided to go for Migen, which is a Python library that does the same thing. It is under heavy development: they have now stopped development of Migen, the first version, and are working on nMigen. I'm a Python guy, so for me it seems to be a nice fit, but all three languages I've given here are, I think, a good thing.

And I want to go to the last slide. This is good, this is a first step, but I want more. To me, one of the big things still missing is that you need to be able to compile and debug at the level you are writing in. If you do Python, you debug in Python. If you do Java, you debug in Java.
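The kind of source-level debugging I mean can be sketched in plain Python. This is not the actual Migen or nMigen API, just an illustration of why simulating at the level you write in is pleasant: testbench checks become ordinary assertions, and a failure gives you a normal Python traceback (or a pdb session) instead of a waveform viewer.

```python
# Sketch of Python-level hardware simulation (NOT the Migen/nMigen API):
# a tiny cycle-based model of a wrapping 4-bit counter.

class Counter:
    def __init__(self, width=4):
        self.width = width
        self.value = 0  # the "register"

    def tick(self):
        """One clock cycle: increment and wrap, like a hardware counter."""
        self.value = (self.value + 1) % (1 << self.width)

c = Counter(width=4)
for cycle in range(20):
    c.tick()
    # Ordinary Python assertions act as testbench checks.
    assert 0 <= c.value < 16

print(c.value)  # 20 ticks of a 4-bit counter -> 20 % 16 = 4
```

Here a violated assertion stops the simulation at the exact cycle with a Python stack trace, which is the debugging experience I would like to have for hardware in general.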
For these tools, most of the time what they do is generate Verilog or other RTL, and then you have to do the debugging at the RTL level. I think FIRRTL, the lower-level language underneath Chisel, lets you debug at the FIRRTL level, and Migen can simulate at the Python level with some restrictions. But to me, that is one of the things to solve to get everybody doing hardware development. Thank you.

The question was how I compare the licenses. Actually I restricted myself to things which are open source, so either permissive or, what is it called, copyleft-type licenses; I prefer copyleft-type licenses. The only problem was maybe with SystemC: in the beginning there was some issue with its license, but that was solved by moving to the Apache license or so.

Any other questions?

Can I say some words about the gadgets I'm using for giving the presentation? It says "Pi" on the case; it's just a Raspberry Pi, and some wireless keyboards. This is my AV system at home where I watch movies.

I don't know how to summarize the next question exactly. He says that a lot of investment has been done in the commercial tools; shouldn't we stick to those and jump on that bandwagon, or is being different a good idea? The thing is, I think they didn't innovate in the right direction, because they always had to keep up with Moore's law. They always had to run along with everybody to get their tools ready for the next process node, and they never had time to reflect on how to make the language and the structure of everything better. I think the open source guys can do that; they can do it at home. So we should have both in parallel, and we should have interoperability. That's no problem; I think that's possible.

Thank you, everyone.