I'd like to start with a few questions for you. How many of you have attended the webinar series on OpenFOAM? How many of you know how to use OpenFOAM? 50%? How many of you have never tried to run an OpenFOAM case? 5%? 25? And how many of you don't have experience doing multi-physics simulation of nuclear reactors? You all, okay, you. I can assume the others have done some level of modeling and simulation involving neutronics and thermal hydraulics. And maybe the last one, how many of you have experience with CFD? Okay, thank you. So for those of you who have attended the OpenFOAM webinar series, this is going to be very similar, almost the same as the first lecture of that webinar. But even if the slides are the same, that doesn't mean the lecture has to be the same. This is in person, so you are more than welcome to interrupt me and we can make it interactive. If you don't understand something, if something is not clear or you want an additional clarification, please raise your hand and stop me. This is the advantage of being here in person: I see you and you can stop me. So this first lecture, I can do this here, right? Maybe, maybe not. So this first lecture is going to be a general introduction to OpenFOAM, mainly examples of how people have used OpenFOAM in the past, because this will give you an idea of how you can use it for your own purposes. That is the part I will discuss. The second part Stefan will give, and it will be more about how to approach a new problem and the lessons learned in the last 15 to 20 years of use of OpenFOAM. Something I forgot to mention, I'm not sure I can go back, can I go back with this? You can only go forward. I have to double click, apparently. So it will be Stefan and myself presenting this, but I need to acknowledge the substantial contributions to the preparation of the slides for the webinar.
So OpenFOAM, what OpenFOAM is: if you go on the web and you look for OpenFOAM, you'll see that it is described as an open-source CFD toolbox. It is, and it has capabilities that mirror those of commercial CFD. It's free to use, you don't have to pay, and you have access to the source code. And it has a large user base; arguably it's the most successful open-source CFD tool that you'll find out there. The user base is really big, we speak about 10 to 20,000 engineers and scientists worldwide, way beyond nuclear. OpenFOAM is used in aeronautics, automotive, maritime engineering. You will find fellow FOAMers pretty much everywhere you go. Despite being called a CFD toolbox, luckily OpenFOAM is more than that. OpenFOAM stands for Open-source Field Operation And Manipulation. You may understand from that that it's not just a tool, it's a library. And it's actually a very good, very large, very well organized, HPC-scalable C++ library for the solution of partial differential equations. With OpenFOAM you can solve whatever partial differential equation you want. And it's not just one of those libraries that allow you to discretize something. You look into OpenFOAM and you will quickly realize that it's a large library. It's more than a million lines of code, and in there you find a lot of functionalities: ODE solvers, projection algorithms, mesh search (you have a very efficient octree mesh search), finite area methods. There's a lot in there. Pretty much everything you want, you will find in there. The other interesting thing is that it's object oriented, with a very high-level, fail-safe API. What do I mean by that? You know, if you want to solve a partial differential equation, this is the way we write the partial differential equation in mathematics, right, or engineering. This is the turbulent kinetic energy equation. I took it because there are a lot of operators in there.
So you have a time derivative, you have a divergence, you have a Laplacian. And you have a source. And this is the way you write it in engineering mathematics. The nice thing about OpenFOAM is that the whole programming is made so that, at least at the very high level, you can mimic the same equation. So you don't have to program your own discretization; you can literally say: for a derivative over time, it's going to be fvm::ddt. A divergence is going to be fvm::div. A Laplacian is going to be fvm::laplacian. What's fvm? Finite volume method, right? So you're saying: I want a derivative over time, I want a divergence, I want these things to be discretized using finite volumes. Actually, I think fvm stands for finite volume matrix. You will also find fvc, which stands for finite volume calculus. It depends on whether you want the term to go into the matrix, implicitly, or to be computed explicitly as a source term. So the nice thing in general about OpenFOAM is that at the very high level there is an obvious correspondence between equation and programming. If you go deeper down in OpenFOAM it gets more complicated, but still, it's object oriented and it has been professionally programmed since the beginning. You will very rarely see things like the use of acronyms, which is not good practice. Most of the time the variables have names that are understandable. If you have an alpha for a porosity, you will see alpha. You will not see things like al or alf; you will see alpha, or if you have a porosity, most of the time you will see porosity and not p. It's very easy to understand, and it's object oriented, which means all pieces of the programming are pretty much independent. You can touch one part of the code without affecting all the rest, and you can understand that when you have one million lines of code that is pretty much necessary. You cannot have a monolithic piece of code. So it's a library, but it's still a CFD toolbox, and this is nice.
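To make that correspondence concrete, here is a schematic sketch of how the turbulent kinetic energy equation looks inside an OpenFOAM-style solver. This is not code taken from the lecture's slides: the names (k_, phi_, DkEff(), G, epsilon_) follow common OpenFOAM conventions but are assumptions here, and the fragment needs the surrounding OpenFOAM library, so read it as pseudocode in OpenFOAM syntax rather than a standalone program.

```cpp
// Turbulent kinetic energy equation, written almost as in mathematics.
// fvm:: terms go into the matrix (implicit), terms on the right-hand
// side are explicit sources.
fvScalarMatrix kEqn
(
    fvm::ddt(k_)                    // time derivative, d(k)/dt
  + fvm::div(phi_, k_)              // convection by the face flux phi
  - fvm::laplacian(DkEff(), k_)     // diffusion, effective diffusivity
 ==
    G                               // production (explicit source)
  - fvm::Sp(epsilon_/k_, k_)        // dissipation, implicit sink
);
kEqn.solve();
```

Each line maps one-to-one to an operator in the mathematical equation, which is exactly the high-level API the lecture describes.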
It means you can find numerical libraries out there that allow you to solve whatever you want, except that if you want to solve whatever you want, you have to build the whole thing yourself. The nice thing about OpenFOAM is that it comes with probably 50 solvers already available, professionally developed, tested, validated, verified, for a lot of things, and most of the time it's thermal hydraulics, because it's a CFD toolbox. Which is a good thing, because you will quickly realize that CFD, numerically speaking, is one of the most complicated things you may want to solve. So having that part already taken care of, both for single phase and two phase, and for very different formulations of two phase, is a very good feature. But there's more than that. You'll find in there electromagnetism, Monte Carlo, stress analysis, even finance. So you find a number of solvers, and as I said before, we have 10 to 20,000 engineers working with this, which means you will find solvers developed by the community, a lot of them. I mention one here which I think is very relevant: it's called solids4foam, developed at University College Dublin. It is a very large set of solvers and a very big library for the solution of general non-linear mechanics, which, as you understand from the previous slides, you don't have much of in OpenFOAM itself, but here you have a very good library that has been developed for more than a decade and that you can use to do thermal mechanics. I also mentioned OpenFOAM has a lot of functionalities. Essentially, if you want to do multi-physics, you want a library that allows you to pull in functionalities instead of developing new ones all the time. Developing new functionalities can be extremely time consuming and may require expertise that you don't necessarily have. With OpenFOAM you find most of what you need for multi-physics; in particular you will find mesh-to-mesh projections, which are strictly necessary if you want to do multi-physics.
You will find dynamic meshes, which can be very useful in certain applications; for instance, in sodium fast reactors, if you want to simulate expansion of your core due to thermal deformations, it's good to have the possibility to move your mesh, or if you want to do fuel behavior, and Alessandro will mention that, it's good to have the possibility to move a mesh. You have ODE solvers, which can be useful if you want to do point kinetics, for instance. You have the finite area method, which can be useful in some cases; for instance, if you do fuel behavior and you want to simulate the fluid along the surface without having to simulate full CFD, you can use a finite area method. You have Monte Carlo, you have Lagrangian particle tracking, and this is just mentioning a few. Once again, in one million lines you find a lot of functionalities. And again, you have the community out there. So you will find functionalities that have been developed by the community, on top of what is distributed with OpenFOAM. The obvious project to mention here is foam-extend, which is a very large sibling project of OpenFOAM where you can find a lot of libraries. Years ago I started working on reduced order modeling, and I found in there a library for proper orthogonal decomposition, so I didn't have to develop anything; it was there, and that spared me months of work. So if you put everything I said together, you realize that you have a very large set of solvers and functionalities. You have a modular code structure with a high-level API and object-oriented programming. Something I haven't mentioned, but you also have quality control: the ESI version of OpenFOAM is developed according to ISO 9001. And you have state-of-the-art numerics. Something else I haven't mentioned: OpenFOAM scales pretty much like any other CFD toolbox out there. So pretty well, but not extremely well. If you want to go to exascale we have a problem, but this is not an OpenFOAM problem, this is a CFD problem.
Most of the algorithms we use these days are not ready for exascale, especially finite volumes. Many people are working on that, but we cannot say that we can easily do exascale with finite volumes these days, nor with finite elements for that matter. Most of the algorithms that have been developed in the last 50 years do not vectorize well, which means you cannot use them easily on GPUs. You have to modify the algorithms, and that takes a while; it requires basic research in computational science. Now, before giving the word to Stefan, I will give you some examples of things that have been done in the past. I have to say that a lot of this content has been taken from a paper which I believe is open access and that you can find on ScienceDirect, in Nuclear Engineering and Design. The co-authors are myself, Ivor Clifford, and Stefan. So you have three out of the four co-authors here if you have questions. So, a little bit of history of the use of OpenFOAM, at least as I know it; maybe there are things I'm not aware of, in which case, if you are aware of them, let me know. The first activity I'm aware of on the use of OpenFOAM in nuclear engineering dates back to the early 2000s in South Africa, during the PBMR project. They quickly realized that it was difficult to do PBMR analysis using legacy codes with structured meshes, mostly prepared for single-heterogeneity cores and not double-heterogeneity cores. So they decided to develop a tool, and they decided to use OpenFOAM. Unfortunately, the PBMR project died out, hopefully not because of OpenFOAM. But the use of OpenFOAM for multi-physics survived. It was brought to the US by Ivor Clifford, actually, who did his PhD at Penn State and kept developing models for HTGRs, prismatic in this case. But it's only between, let's say, 2010 and 2015 that we start seeing a widespread use of OpenFOAM in the nuclear community.
And I think the community that drove that expansion was the molten salt reactor community. At the same time we saw work on SFRs and FHRs and various activities on modeling of advanced reactors. Between 2015 and 2020 and until now, we started seeing some persistent development. So not just PhD studies that died out after the student finished his PhD, but research groups that committed to maintaining, developing, verifying and validating a specific tool. I'm aware of at least three tools out there that have been maintained and developed, and still are, for more than four, five, six years. And all three of them will be presented in this workshop. One is GeN-Foam, for reactor multi-physics. One is OFFBEAT, for fuel behavior; I should say for fuel behavior and general non-linear thermal mechanics, because OFFBEAT, and Alessandro will probably mention this, you can use for simulating a vessel or simulating a graphite core. It's pure non-linear thermal mechanics, and the fact that we use it for fuel behavior is just a very specific application of non-linear thermal mechanics. And the third is containmentFoam, for containment flows, and Stefan will discuss that, I think on Thursday, right? So I will not spend much time on these tools, because we will present them extensively in the next few days. I will spend some words on the work that led to tools like GeN-Foam and OFFBEAT. And this was the PBMR project. As I said, the purpose back then was to develop something for PBMRs, for HTGRs. And the idea was to have something that was 3D, unstructured-mesh, parallelized, extensible. And one of the key questions at that time, do we have some water? Yeah, better. So one of the key questions at that time was: can we use a library like OpenFOAM for neutronics? In 2023, you may find that obvious. In 2003, it was not.
I mean, you have to go back to 2003 and realize that tools at that time were mainly nodal codes, specifically made for neutronics, with structured meshes, using very specific methodologies like nodal methods optimized for square or hexagonal geometries. And all of a sudden, they wanted to do unstructured meshes with finite volumes, and segregated, meaning not everything coupled in the same matrix. It was not obvious at all. Luckily, the answer to the question "can we do that" was yes. And it was actually fairly straightforward. First of all, because the implementation was straightforward thanks to OpenFOAM. It's like a ddt of the flux minus a Laplacian of the flux, you repeat that a certain number of times, and you have multi-group. And then you exchange terms between the groups and you have coupled neutronics. So the implementation was fairly straightforward, and the solution turned out to be fairly, or actually very, robust. It converged; they pretty much didn't have problems. The answer was positive: it was possible to do neutronics with OpenFOAM, and that was 20 years ago. After that, another thing that was done, more like 12 or 13 years ago instead of 20, was the use of OpenFOAM for something that is closer to OpenFOAM: thermal hydraulics. But OpenFOAM at that time was mainly CFD, so fine-mesh CFD, RANS or LES; well, in 2010 probably mainly RANS. And this was, again, work from Ivor Clifford, and it has really been a precursor for OpenFOAM in nuclear; it paved the way for many of the activities that have been done with OpenFOAM since. What he wanted to do was to develop a multi-scale solver for prismatic HTRs. You know, HTRs tend to be multi-scale by nature. You have a very large core with prismatic elements. Inside the prismatic elements, you have fuel and graphite, and inside the fuel, you have TRISO particles. It can be very complicated. He reconstructed the whole thing. At that time, most people were doing very simple homogenization. And he decided, no, I want to do more than that.
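The "ddt of the flux minus a Laplacian of the flux" just described can be sketched in OpenFOAM style. This is an illustrative reconstruction, not the actual PBMR-era code: the field and coefficient names (oneOverV, flux, D, sigmaA, nuSigmaF, keff) are assumptions, and the fragment only compiles inside an OpenFOAM solver, so treat it as pseudocode in OpenFOAM syntax.

```cpp
// One energy group of a neutron diffusion solver, OpenFOAM style.
fvScalarMatrix fluxEqn
(
    fvm::ddt(oneOverV, flux)      // (1/v) * d(phi)/dt
  - fvm::laplacian(D, flux)       // leakage, -div(D grad phi)
  + fvm::Sp(sigmaA, flux)         // absorption, implicit sink
 ==
    (nuSigmaF/keff)*flux          // fission source (explicit)
);
fluxEqn.solve();
// For multi-group: repeat this per group and add explicit
// scattering and fission-spectrum terms coupling the group fluxes,
// solved segregated, i.e. not all in one matrix.
```

The point is that each physical term maps directly onto one fvm/fvc operator, which is why the implementation was straightforward even in 2003.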
And he decided to do real multi-scale using reduced order models. You will find people who, three or four years ago, proposed this kind of technique. But keep in mind, this was done in 2011, way before other people started doing it. And why was it possible? Well, because Ivor is a brilliant person, and because OpenFOAM helps a lot. You already have CFD solvers, and going from CFD to a coarse-mesh porous medium, you just have to add a porosity and some source terms. You can easily tailor equations to add source and sink terms. Sorry about that. And you have a lot of functionalities. One of these functionalities was proper orthogonal decomposition. So if you want to do multi-scale, and you want to do it in a rigorous way, in a way that the lowest scale can be solved in a time that is reasonable for the highest scale, well, you may want to use reduced order modeling. And one of the ways to do reduced order modeling is proper orthogonal decomposition. There was a library for that. So you can understand how having a library that already provides you with the CFD solvers and with a lot of functionalities can make your life easy. You don't have to redevelop everything from scratch, because trust me, if you want to do this kind of thing starting from scratch, or from a library that just provides some level of discretization, this is a multi-person project over multiple years, not a PhD thesis. Now I would like to open a little parenthesis, on something that is going to be useful for the coming days and for understanding how to use OpenFOAM. Imagine you want to do, again, imagine you want to do thermal hydraulics of a reactor core. You can do CFD, right? Very few people do that. Why?
Because it's very computationally expensive, and it's not really worth it, because most of the time you have a very repetitive geometry, and it turns out that correlations for pressure drops or the Nusselt number can give you better numbers than CFD, especially if you have things like subcooled nucleate boiling. Doing that with CFD is hard; if you have a good correlation, you're going to have good results. So CFD can be used, but it's not often used in our field. What we know people use very often is system codes and sub-channel codes, right? Now, if you look at what people do when they use finite volume or finite element libraries, they use something called porous medium approaches. You look at what they do in MOOSE, it's porous medium. You look at what we do in OpenFOAM, it's porous medium. Why is that? Well, first of all, you can find the derivation in books, Todreas and Kazimi I think. You can prove that sub-channel and system codes are a specialized version of the porous medium approach. So porous medium is actually a generalized version of sub-channel codes and system codes. How do you obtain the porous medium equations? Well, you start from Navier-Stokes. You do volume averaging, I will not do it here today. You do a volume averaging of your equations. And what you find is the same equations, the Navier-Stokes equations, with an addition, which is a porosity term, which is just telling you that not all the volume is occupied by your fluid, only a fraction of it. And that is described by this gamma, which is the porosity. The other thing you find is that there are additional terms, F and Q, that essentially describe the interaction of your fluid with the structures that you are not describing anymore. They describe the pressure drops and the heat transfer with the structure. So you find pretty much the same equations.
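For reference, the volume-averaged equations being described look roughly like the following. The exact notation is an assumption on my part (the lecture only names gamma, F and Q); see Todreas and Kazimi for the actual derivation.

```latex
% Volume-averaged (porous-medium) balances, with porosity \gamma and the
% structure-interaction terms \mathbf{F}_s (drag / pressure drop) and
% Q_s (heat transfer to the unresolved structures):
\begin{aligned}
\frac{\partial (\gamma\rho)}{\partial t}
  + \nabla\cdot(\gamma\rho\,\mathbf{u}) &= 0 \\[4pt]
\frac{\partial (\gamma\rho\,\mathbf{u})}{\partial t}
  + \nabla\cdot(\gamma\rho\,\mathbf{u}\otimes\mathbf{u})
  &= -\gamma\nabla p
   + \nabla\cdot(\gamma\boldsymbol{\tau})
   + \gamma\rho\,\mathbf{g}
   + \mathbf{F}_s \\[4pt]
\frac{\partial (\gamma\rho e)}{\partial t}
  + \nabla\cdot(\gamma\rho e\,\mathbf{u})
  &= \nabla\cdot(\gamma k \nabla T) + Q_s
\end{aligned}
```

Note that for $\gamma \to 1$ and $\mathbf{F}_s, Q_s \to 0$ these reduce term by term to the standard Navier-Stokes equations, which is exactly the fallback-to-CFD behavior in the plenum discussed next.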
You can prove that you can use these equations to do sub-channel analysis if you want, or system codes if you want. The nice thing is that you're using the same set of equations you were using for CFD. So when you use tools like OpenFOAM or MOOSE, this is a very nice set of equations you can implement in your tool: you reuse the capabilities of the tool, but you are able to do typical nuclear approaches like sub-channel or system-level analysis. It's all in the same set of equations. And another nice thing about these equations is that if you go from a core that you are treating as a porous medium out into a plenum, the porosity goes to one, the source terms go to zero, and you fall back immediately to a standard CFD approach. So you do not need to couple core and plenum through a separate interface, which always blows up very easily. The tool does everything itself: in the same matrix you have the core and the plenum, no problem with stability, no problem of numerical diffusivity. You can solve a whole primary circuit on the same mesh. Now, the reason I said all that is partly to introduce something we will use in the next days, and partly because I was speaking about OpenFOAM and how easy it is to implement an equation in OpenFOAM. But I also wanted to use this to give you a warning. Imagine you are solving the momentum equation. You have something like this, and you may be tempted to say, okay, I can solve it: I have a time derivative, I put a ddt; I have a divergence, I put a div; I have a Laplacian, I put a laplacian; I have a gradient, I put a grad. Well, if you do that, it will never solve. You can trust me, or you can try it: it will never solve. The way you actually solve it is this one. The ddt pretty much remains a ddt, but then you have corrections for continuity errors.
You have a rearrangement of the divergence to separate the diagonal and off-diagonal terms. And you have a rearrangement of the pressure contribution so that it mimics a staggered grid. This is not meant to discourage you. This is meant to tell you: look, if you want to use OpenFOAM or MOOSE or whatever library to do multi-physics, you will have to know your problem. And you will have to know, or do the research and realize, that when people do CFD, they don't solve the equations as I showed you before; they solve the equations like this. You go into Ansys, you will find this. You go into CFX, you will find this. The reason is that historically people have worked on this, and they found out that this is the way to solve this equation in an accurate and stable way. So having a library that allows you to throw in an equation doesn't mean you can throw in an equation and get a solution. Unfortunately, you will have to do some research, figure out how to feed the equations to the library, and then solve them. I think it's important to tell you this, because otherwise you will enter a loop of frustration where you try to feed in the equation and say, OpenFOAM doesn't work. Then you will use MOOSE, and you will say, MOOSE doesn't work. And then you will end up using system codes, and hopefully they will work, but most of the time they have the same problem. So just be careful; know that you have to understand what you are doing if you want to obtain solutions. So it comes down to familiarity with the problem. The nice thing is that OpenFOAM will often help you out, because for a situation like the one before, you already have the CFD solver, so you pretty much have the whole thing. The only things missing are the porosity and the additional terms, F and Q, that simulate the interaction with the sub-scale structures. Most of the work is done; you just have to add a few terms.
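To show what "the way they actually solve it" looks like in practice, here is a schematic sketch of a standard segregated pressure-velocity step (SIMPLE/PISO style) as commonly written in OpenFOAM solvers. The names (U, phi, p, nu, rAU, HbyA) follow widespread OpenFOAM conventions but are assumptions here, and the fragment requires the OpenFOAM library, so read it as pseudocode in OpenFOAM syntax.

```cpp
// Momentum predictor: note the convection term uses the face flux phi,
// not U*U, so the matrix diagonal and off-diagonal parts can be split.
fvVectorMatrix UEqn
(
    fvm::ddt(U)
  + fvm::div(phi, U)
  - fvm::laplacian(nu, U)
);
solve(UEqn == -fvc::grad(p));

// Pressure equation: rebuild the face flux from the momentum matrix
// (a Rhie-Chow-like interpolation that "fakes" a staggered grid and
// kills pressure-velocity decoupling on the collocated mesh).
volScalarField rAU(1.0/UEqn.A());        // inverse diagonal coefficients
volVectorField HbyA(rAU*UEqn.H());       // off-diagonal + source part
surfaceScalarField phiHbyA(fvc::flux(HbyA));

fvScalarMatrix pEqn
(
    fvm::laplacian(rAU, p) == fvc::div(phiHbyA)
);
pEqn.solve();

phi = phiHbyA - pEqn.flux();             // conservative face flux
U   = HbyA - rAU*fvc::grad(p);           // velocity corrector
```

Compare this with the naive term-by-term transcription: the equations are the same, but the splitting into predictor and pressure correction is what makes the solution stable and accurate.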
And again, it's OpenFOAM, so you will have contributions from the community. You look at what the community has done, and you will find out that someone made GeN-Foam. You look at GeN-Foam, and you will find this set of equations already available. Do you have to use GeN-Foam then? You can take this thing and bring it into your own solver, if you want to develop your own solver, and use it. And with this, I'm not suggesting to redevelop the same solver each time. Honestly, I've seen the diffusion solver in OpenFOAM developed at least five or six different times, and you can find at least three or four publications out there about diffusion in OpenFOAM. So please don't do that. If you find a solver out there, use it. And use your time to improve it and give it back to the community, so that we can all advance in the same direction instead of competing on the same solver because we think that maybe we can do it better. Maybe you can do it better. But 95% of your time will be spent catching up with what other people have already done, which is, I believe, a waste of our time. So if you find something out there, my suggestion is: if it's not publicly available, get in touch with the developer; if it is available, maybe still shoot an email to the developer, say you want to use it, and start from there. You will realize that, although it may feel simple to redevelop your own thing, it's going to take much more time. Nicoleta was speaking about best practices in open-source nuclear code development. We have been working on that for several years now. We have interviewed several experts, and one interesting fact we found is that if you start developing a tool, you may feel like you will finish it within a PhD; you most likely will not. The usual timeframe that came out from people developing open-source solvers, MOOSE, SALOME, is five years plus five years, more or less.
You will need five years to develop your tool and make people aware that your tool exists. And it takes five more years to consolidate it and make it enter international projects. And that is for tools that have been successful, because you also have to consider that maybe four out of five die after a PhD, not because they were not good, but simply because there was no continuity in the institution. Maybe the institution doesn't have the interest or the money to make the project survive. So think carefully before starting a new thing from scratch, and my suggestion is always: if you can, use what's out there and try to improve it, and if you can, give it back to the community so that we can all go in the same direction. So, closing this parenthesis, I'm slow. I wanted to mention a few other examples of use of OpenFOAM. I mentioned before MSR modeling as probably the field that brought OpenFOAM to the attention of the community. There is a good reason for that: when I speak about MSRs, at the beginning it was mainly molten salt fast reactors, because that was really the case where we could not use legacy tools. There's no way. I mean, if you want to solve this thing and you don't do CFD, you are not solving this thing, simple as that. Because without CFD you will not be able to simulate things like flow detachment. You will have a flow field that is very approximate. And when you have a flow field that is approximate and you have a very high power density in your fluid, it means your temperature field will be approximate. This is a reactor; this is actually the reactor that we will use for the hands-on sessions this week. This is a reactor where, if you fail in simulating things like the flow detachment at the wall, and you fail in predicting possible recirculation regions, it means you will fail in predicting that your vessel is melting. That's a failure, right? You don't want to have that.
So people, since 2011 or 2012 pretty much, realized: we need CFD. And if you need CFD, you cannot use legacy codes like TRACE. But if you have a molten salt reactor, you don't only need CFD; you also want to do reactor analysis, neutronics. And this is very tightly coupled, because it's a moving fluid. The heat is generated directly in the fluid, so there is no delay in the heating. Plus you have precursors, delayed neutron precursors, that move around. And well, OpenFOAM was very good for that, because you already had very good CFD tools, RANS, LES, you had both of them. Even DNS if you want, but that could be a bit of an overkill. And adding diffusion, as I said before, and one group honestly is good enough for transients, is just one equation, as simple as that. And if you want to add precursors, you have a relatively simple equation. A derivative over time. An Sp, which is a fancy way in OpenFOAM to say source: you have lambda, the decay constant, multiplied by the precursor concentration. You have the neutron source that comes from neutronics. You have a divergence, and why a divergence? Because these precursors are moving around, so you have to feed the solver with the velocity coming from CFD and move the precursors according to this velocity. And you have a Laplacian, why a Laplacian? Well, because these precursors diffuse. So in any species transport you will always find at least a divergence, most of the time a derivative over time, and a Laplacian if you cannot neglect diffusion. Sometimes, like in this case, a source. Why a source? Well, because precursors are created by fission. And you can understand that it was not a difficult equation to implement. Once you know the equation, implementing it in OpenFOAM was not difficult. So doing MSFR analysis with OpenFOAM was kind of a low-hanging fruit. And with that I don't want to say that what people did was easy.
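Putting those terms together, one family of delayed neutron precursors might be sketched like this in OpenFOAM style. The names (Ci, phi, DCi, betai, lambdai, fissionSource) are illustrative assumptions, not code from any of the tools mentioned, and the fragment needs the OpenFOAM library around it, so take it as pseudocode in OpenFOAM syntax.

```cpp
// One delayed-neutron precursor family, advected by the CFD velocity.
fvScalarMatrix CiEqn
(
    fvm::ddt(Ci)                 // derivative over time
  + fvm::div(phi, Ci)            // precursors moved by the flow (phi from CFD)
  - fvm::laplacian(DCi, Ci)      // precursors diffuse
 ==
    betai*fissionSource          // created by fission (from neutronics)
  - fvm::Sp(lambdai, Ci)         // radioactive decay, implicit sink
);
CiEqn.solve();
```

Exactly as the lecture says: a species transport equation with a divergence, a time derivative, a Laplacian, a source, and a decay sink.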
They had to have the intuition about using OpenFOAM for that. And again, it was back in 2011. People were used to legacy codes. It was very normal to use TRACE and RELAP, and if you spoke about OpenFOAM, people were like, what's that? It required people to do research and realize there is a tool out there that you can use, of very good quality, even if it was very little used in nuclear. So I must give all the credit to those people who realized we could use it. And once they realized that, it was for them a very low-hanging fruit to get very nice simulations that at that time were surprising, because people were absolutely not used to this kind of simulation. It was, whoa. I will skip this one. You can do more advanced things with OpenFOAM. You have things like radiative heat transfer. This is a simulation, sorry again, this is a simulation of the dump tanks in molten salt reactors. In a molten salt reactor you can dump your salt into criticality-safe tanks and cool them. And this was a simulation of natural convection in the tanks and radiative heat transfer towards the outside of the hot room. So these are things you can do with OpenFOAM, which is pretty cool. And again, it comes from the fact that you have a large library, and you have in there things like radiative heat transfer. If you had to develop that from scratch, it's a long way to go. A very cool one, also, is this one. Also in this case, it was probably the first time people did this: a direct coupling between the discrete element method and Serpent. This was a simulation of the approach to criticality. What you see here is slow, but you will see it: it's the k-effective increasing with the loading of the pebbles. So it was a direct coupling between DEM, the discrete element method, in OpenFOAM and Serpent. It was done at UC Berkeley in 2015. And again, you will see people doing this today, but this was done eight years ago. So keep in mind the time scale. And once again, why was it so early?
Well, because OpenFOAM gave everything. It gave the discrete element method. And, luckily, the Serpent people developed a multi-physics interface for OpenFOAM. That's the advantage of having a large community: most of the time you find the functionalities. Oh, it's not, okay. Then I have three slides, which I will mostly skip, because we will discuss these tools later this week. One is about GeN-Foam. These are the tools I mentioned: starting from 2015, more or less, people started to put together OpenFOAM-based applications that survived, that are maintained and regularly updated by a research group or an institution. GeN-Foam, as far as I'm aware, was the first attempt at a general multi-physics solver for reactor analysis. It has been used for several things; this is a multi-physics simulation of reactor experiments. This is core flowering in a sodium fast reactor. You remember before I mentioned dynamic meshes, mesh deformations? This is something you can do with OpenFOAM: literally deform your mesh to simulate reactivity feedbacks. You can do two-phase flow, thanks to OpenFOAM. You can do very strange reactors, like the Argonaut reactors. They will discuss GeN-Foam more later this week. OFFBEAT, same thing. OFFBEAT was born three or four years later than GeN-Foam, which means that into OFFBEAT we threw everything we had learned. So in my perspective, it's a better quality code. It's also a bit smaller. It was born for fuel behavior; we quickly realized, as I said before, that you can do more than that. And to give you an idea of how powerful and useful OpenFOAM has been, this went from a very strange idea of using finite volumes for thermal mechanics to a multidimensional solver for fuel behavior that is now included in several Euratom projects. It's actually very central in the OperaHPC project that was recently funded by Euratom. And that happened in five years. So it was very, very quick.
This five plus five years rule, maybe a little bit less for OFFBEAT. Why was it so quick? Well, several benefits came out of using OpenFOAM. Probably Alessandra can comment more on that, because Alessandra has been the lead developer of OFFBEAT. She was the PhD student that created it and brought it to success, and she is now the lead developer and maintainer of the code. But definitely something that helped is the functionalities you have there. Like, you have the cladding and you have the fuel. How do you couple them if you want to use unstructured meshes? Well, you need some magic in between, right? Well, the magic exists in OpenFOAM. It's called AMI, the arbitrary mesh interface. So you don't have to develop that from scratch; if you had to develop that from scratch, you would probably need another PhD. So again, having functionalities like that is extremely helpful. And I mentioned before these 10,000 to 20,000 engineers working with OpenFOAM. Well, that helped a lot, because we found out very soon that solids4foam existed, this library maintained by University College Dublin, and that was instrumental to being able to develop OFFBEAT in a very quick and timely manner. Last one. I'm not sure I'm qualified to comment on this, so I think I can let Stefan discuss it, and I'm giving him the rest of the lecture today. Thank you, Carlo. Yeah, so the last tool, which again we will detail much more on Thursday, is containmentFOAM. It started as a tool to analyze gas mixing in a large dry containment. We were using pretty much the standard functionality that OpenFOAM was giving us, plus some tailoring towards the conditions that we were expecting. And from that we learned that we need to take care of more physics, like phase change, condensation phenomena, thermal radiation. So we put in a Monte Carlo transport solver. We understood that we also have to consider the interaction of safety systems or technical components with the flow.
The interaction of the operators, the activation of safety systems and so on. So all this led to a library which is, I think, still based a lot on the standard functionalities of CFD, but also contains a lot of extra features. And again, here we benefited a lot, as you learn from what's available in OpenFOAM: all the solvers and models which we could take as a blueprint for starting to develop our own functionalities, turbulence models, the particle transport library that helped us with the photon transport in the Monte Carlo solver, the general parallelization of OpenFOAM that allows us to run these kinds of, let's say, large problems as you see here, the Battelle Model Containment with roughly 650 cubic meters and, I think, something like five million cells. So I will take Carlo's conclusion here. I think it's a pretty enthusiastic conclusion that I'm completely sharing. With a bit of ingenuity and imagination you can do pretty much everything with OpenFOAM. Of course there are a few buts, but in the end, if you spend some time, you will discover that nearly everything is possible. To detail on this, I will try to answer a bunch of questions in the next 45 minutes that probably come up in your minds. The most prominent one: what is the effort of doing this? But also: how can I approach such a problem? Which kind of competences do I need? What will be the quality of the results that I'm obtaining? And also, what about the license-related issues? To address the first question, how to approach the problem, we will consider a kind of hypothetical reactor. It's a multi-physics problem, so we will solve for thermo-fluid dynamics, thermo-mechanics and neutron physics. So we have a monolithic block core here with some coolant channels, similar to what you expect from one of the microreactors that are coming up recently. It has a lower plenum, an upper plenum, and a reflector around it.
And we want to model the thermal hydraulics coupled with the neutron kinetics here, so a classical reactor dynamics problem. The way to approach the problem is to split it up into different physics and domains. So we will have, of course, a coolant domain, where we are mostly interested in the heat transfer from the solid structures to the coolant and then the transport of the heat by the coolant. We will have the solid structures, so the fuel and also the moderator and reflector materials. And we will have, of course, a neutronic domain, which covers both of them to some extent to consider the neutron balance. So we will take those and, of course, have to create a mesh for all of them. And when it comes to the fluid and the solid structures, Carlo already mentioned, we can of course go for the brute-force way, discretize everything, resolve the fine geometry, all the coolant tubes, but of course this would be far too expensive for engineering analyses. So the major approach here is to go for a porous-medium model, meaning we homogenize the fluid and also the structures within a cell. So we have cells which share the fluid and the structure to some extent. And for the neutronics, of course, we will again need a mesh which overlaps both domains. We will define the fields that we have to consider. So on the neutronic domain, this is primarily the fluxes and the power that we are solving for, and the cross-sections and the delayed neutron precursors, which we input. In the coolant domain, we will solve for pressure, velocity, temperature, maybe also turbulent quantities, turbulent kinetic energy, eddy frequency, and so on. And as input, we give the thermophysical properties. And in the solid structures, we solve for the temperature, and we input the thermophysical properties again. So what you see here, mesh plus fields, this gives us more or less the data we have to handle. And we also add the equations we want to solve.
So for the neutronic domain, we simply consider neutron diffusion and the decay and production of precursors for the delayed neutrons. On the fluid domain, it will be the RANS equations, or unsteady RANS equations. That's it for the fluid. And on the solid domain, we will simply have a heat conduction equation. Both are, of course, either in a resolved or a porous-media formulation. So with this, we already have a kind of base structure for our problem, on which we can build the solver. So we will set up three classes: one class mimicking the neutron physics, another one for the coolant, and the last one for the solid structures. All of these classes have inputs, primarily the temperatures. So for the neutronics, we need the temperatures in the coolant as well as in the structure to compute the cross-sections and get the neutron flux. In the coolant class, we will consider the reactor power as a heat source, but we will also have the heat transfer from the solid structures, which is the fuel material itself, which will transfer the heat to the coolant. And with all this, we get the output variables that we are after: the reactor power, the coolant temperatures, or the coolant heat-up in the core, and the solid temperatures. Of course, the meshes, or the domains, are partly overlapping, but they will not share the same mesh. So we will use some functionality in OpenFOAM, the mesh-to-mesh mapping, that allows us to transfer information from one class to another using some interpolation schemes, for example. So all this is there, the mesh-to-mesh mapping, and we can simply utilize it for our problem. With this, we have a kind of blueprint of such a problem. In reality, it's a bit more complicated than what we lined out. For example, if we are in a porous-media formulation, of course, we have to find some way to distribute the power between the solids and the fluids. So here, we have to find a way to do it.
And also, we cannot simply map temperatures between the domains, but we somehow have to find the heat fluxes to conserve the energy between the different domains. So in the end, this is something to think about. It's definitely doable, and of course, the available features and solvers will help us with how to do it. In the end, we will also have to define some nested subclasses that give us all the information that we need, things like cross-sections, thermophysical properties, heat transfer correlations, pressure drop correlations, and so on. Once we have these three classes, we can integrate them in a solver algorithm. So in the end, we create some instances of the neutronics class, the solid class, the coolant class. We proceed with a time loop where we solve all the domains of physics one by one, in a segregated manner. For some of them, we can have some kind of inner iterations, for example, here, to converge on the solid and fluid temperature and the heat transfer between those domains. And we can also have some outer iterations within a time step to converge on all the physics. And with this, we can advance our problem in time. So this is the general approach of how to do it. And you will see, pretty much, this is what we have in all the tools that you will see this week. Let's briefly talk about the license. So OpenFOAM is free in terms of the license. It's distributed under the GNU General Public License, version 3, which is a strong copyleft license. For those of you who don't know much about this, it actually means that you're free to do with the software whatever you want. The only limitation is that the license will automatically extend to any kind of derivative work, and you cannot change that. So whenever you develop something based on OpenFOAM, it will automatically become licensed under the GPL version 3.
And that means whenever you want to share it with some colleague, it's not possible to just give an executable; you will also have to share the source code. This is something which is pretty advantageous for collaborative work, for science, academia, for teaching, since you can just build on something that others provided you. You can avoid duplication of work, in particular for the very basic functionalities. But in the end, this can also be a bit of a burden for commercial players in the field, who may not be willing to put much investment, in terms of money or manpower, into the maintenance and development of such tools. So here, a positive example, of course: ESI is one of the maintainers of OpenFOAM. They spend a lot of effort on further developing and maintaining this toolbox. But other companies are also using OpenFOAM and partly release features that they develop to the community, report during the workshops, and by this contribute to OpenFOAM. The workflow in OpenFOAM is pretty much what you know from many modern engineering software packages, like FEM tools, for example. So we start by defining our problem in terms of the geometry we want to solve for, the computational domain we are considering. Then we discretize this domain into a mesh which has, let's say, an internal mesh, the fields we are solving for, and a boundary mesh which holds our boundary conditions. We proceed with the setup, where we give boundary conditions to the boundaries and initial values to the internal field. We select the models we want to include, the model equations we want to solve, the numerical schemes and methods and parameters for our solution. And once we are done with this, we can dump things on a computer or cluster infrastructure, get the solution, and proceed with post-processing and evaluation, and most likely repeat the steps. So, to talk a bit about downsides: there's no graphical user interface, at least no official generic graphical user interface for OpenFOAM.
And this simply inherits from the fact that OpenFOAM can be easily customized for many things, and as soon as you do this, all those new features will not be reflected at all in the user interface. So you will see that there are some specific GUIs for various functionalities. For example, there's a commercial one maintained by ESI. I will also show you on Thursday that we do develop a GUI for containmentFOAM to ease setup processes, but this is something that is very specific for a particular purpose and not, let's say, a generic way to get hands-on with OpenFOAM. When it comes to geometry creation and meshing, but also post-processing, we heavily rely on non-OpenFOAM tools or codes. So for example, for the geometry, one could use open software like FreeCAD to create the geometry. For the meshing, we can use Gmsh or cfMesh or snappyHexMesh, which is shipped with OpenFOAM. Post-processing will most likely be ParaView, but there are also converters that give you possibilities to do some command-line post-processing or export data in VTK and use any other kind of post-processing tool. Generally, my feeling is that when it comes to geometry and meshing, many people still rely heavily on proprietary tools, since they are much more powerful and easy to use than what we see in the open-source domain. So here, let's say, there are a lot of powerful meshers that can simplify your workflow, that can ease the meshing procedure and in the end speed up your work a lot, and many people are still using them. There's one point which I personally do not see as a drawback: OpenFOAM only runs under Linux. There were some attempts to do it under Windows, but most likely you will run it in Linux, using the Windows Subsystem for Linux or a native Linux distribution to install and run it. And also, if you go to high-performance computing infrastructures, all of them run under Linux, and you will have more or less the same environment to do it.
It will allow you to do nice scripting around OpenFOAM, to optimize repetitive workflows, to do parametric studies easily. And I would not see it as a downside, but of course, those people who are used to running things under Windows, with a nice graphical user interface, will have to learn a bit how to do things in the terminal and in Linux. Something that's still not really optimal in OpenFOAM is the documentation. There have been some attempts in the community, in particular driven by ESI, to consolidate all this. There's a nice tutorial wiki, some nice training series that you can follow to discover the various aspects of OpenFOAM. There's a book, of course, that has been written by the OpenFOAM Foundation. There's another book, which is I think with Springer, on the finite volume method with OpenFOAM. Pretty good books, but in the end, what you will see whenever you try to approach a problem is that there will be a gap here and there that you have to fill yourself. There's no hotline you can call, but you can go to CFD Online or those user forums, post your questions or search the entries, and try to find a solution to that specific question that you are having. So all in all, OpenFOAM has a quite steep learning curve. I remember myself when I started six or seven years ago. It was pretty hard to get a handle on it, so many questions coming up at the same time, and it was very hard to answer them. But what I can really recommend to you is: do not try to use it as a black box, just taking a tutorial, modifying a line and making it your own case. Of course, starting from a tutorial is the way to go, but try to understand what is specified, what these things mean, and whether they are applicable to your particular case or not. And then try to learn from similar cases and other tutorials what the options are and which one fits best to the problem. The advantages are at hand, and Carlo has shown a lot of them. OpenFOAM is transparent. There is nothing hidden inside.
If you want to see what a specific functionality does, you can simply open the source code. You can see if the equations fit what you expect, and if not, you can just adapt it, recompile, and use this functionality for your own purposes. So, to conclude: it's a very nice way of integrating application and development together in one workflow. Carlo already mentioned a bit about the structure of the base library in OpenFOAM. It's a very complete library holding all the things we need for solving matrices: discretization, solvers for linear systems, dense matrix algebra. You can solve ordinary differential equations. There's a lot of functionality for handling meshes, for the import of meshes, for deformation and manipulation of meshes, the mesh-to-mesh projection, octree-based mesh search. We have a Lagrangian library for particle transport. We have Monte Carlo methods. Even things for reduced-order modeling, like proper orthogonal decomposition. It's not there in every distribution of OpenFOAM, but it's in foam-extend and can be ported to other ones. There's functionality for coupling applications in OpenFOAM itself. If you want to go for conjugate heat transfer, for example, you can couple a fluid solver with a structure energy equation solver. There's functionality for coupling OpenFOAM to external codes, for example, a file-based coupler in the OpenFOAM ESI version. There are also third-party projects like preCICE, by the Technical University of Munich, which have, let's say, designed a kind of generic code-coupling adapter, where you just have to define an interface to the third-party code, use the adapter for OpenFOAM and the preCICE coupling scheme to bring things together. And there's much more to explore. Just take your time to walk through the source directory, the applications directory in OpenFOAM, and you will find a lot of useful things. OpenFOAM is object-oriented, with data encapsulation and a multi-level API.
So all this has already been presented. OpenFOAM, as a general CFD code, relies on the finite volume method, which actually means that we take our computational domain and split it up into a finite number of small volumes. It's a quite flexible approach, so we can more or less discretize any kind of geometry. It's scalable; we can use as many computational cells as needed. It's pretty intuitive for an engineer: as you see here, we have our solution quantities, like the pressure, the temperature, whatever, on the cell centers, and we have values on the faces that describe the fluxes from one cell into another. We integrate the conservation equations over the finite volume. This conservative formulation is ideal for convection-driven problems like CFD, and as far as I know, it's the method most widely used for CFD in academic codes, open-source codes, and also commercial codes. It's still quite okay for diffusion problems, thermo-mechanics, neutron diffusion. And I think one benefit here is that each of those finite volume cells just has a small number of neighboring cells, so we get sparse, diagonally dominant matrices which can be easily inverted and allow for a fast and efficient solution. On the downside, meshing is, I would say, an art of its own. Here we have to produce good quality meshes, good quality in terms of a number of criteria: non-orthogonality, skewness, aspect ratio, growth ratios and so on. All these factors will determine whether the solution is converging or diverging, and whether it gives us accurate results or a lot of numerical diffusion or instabilities. It's maximum second-order accurate in space, so if we want higher accuracy, in the end this comes at the cost of refining the mesh.
It has first-order elements with flat faces, so if we have curved surfaces, again we need to compensate for this with a higher resolution. And last but not least (this is not a con, this is general for any kind of computing software), we have to be familiar with the concepts behind it. So we have to know about solving partial differential equations, reading the geometry, meshing it, discretizing things and obtaining a linear solution. So OpenFOAM, or generally the finite volume method, allows us to use unstructured meshes, which can have a kind of arbitrary geometry. We can go from tetrahedrons and hexahedrons even up to polyhedrons with any number of faces, which allows us to handle any kind of geometry that we have in, let's say, non-standard, non-traditional reactor designs and components. So what you see here on the right side is a sodium reactor core, where we have hexagonal fuel elements, and on the left side something a bit between a spherical and a cylindrical core. These sorts of things can be done without the limitations we would have from Cartesian or cylindrical coordinates. Generally, since it's finite volume, all cells are three-dimensional, so it's not so easy to do a 1D or 2D mesh in OpenFOAM. There are some tricks to do it, basically working on the boundary conditions. But in some applications where we want to do some 1D calculations, something like having a thin gap to bridge or computing a pipe, we have to develop case-by-case solutions for that. Using unstructured meshes is, let's say, really flexible, but on the other hand it has a bit higher computational footprint than the mentioned Cartesian or cylindrical meshes. In those structured meshes we have a mesh index which we can sweep through and always identify the neighboring cells of our present cell.
In these unstructured meshes, we don't know in advance how many neighbors a cell has, so what we have to do is store a connectivity matrix that gives us this information, and this will of course require some extra effort in storing the information, calculating the information and so on. OpenFOAM uses an operator-splitting approach, which actually means that we solve a separate matrix for each equation and then use fixed-point iteration to converge. And we have some coupling terms in the equations, for example the density, that we may treat explicitly. This has a number of advantages and disadvantages. You can just look up the literature; you will find tons of papers which will probably always show the optimal solution for a specific case. But what you can conclude here is, let's say, on the positive side, of course, it's easier to precondition a single matrix for a single equation, to choose the right solution method if it is just one equation. We don't have to do all things at once, so we can take them one by one, and this makes it much easier to develop and debug things, and we can just focus on a single equation at a time. On the downside, this may be less robust for strongly coupled or strongly non-linear problems. If you, for example, consider shocks, these sorts of things may work better in a coupled solver, and sometimes we will have to use some numerical tricks to get a stable solution. But on the other hand, by having smaller matrices, the requirements on the computational side are lower. All these things will probably depend on the case that you want to run. There have been some attempts in OpenFOAM to also use block-coupled solvers; I think in the foam-extend project they managed to solve for temperature even in coupled regions in a single matrix. But still, the work in OpenFOAM heavily relies on operator splitting.
OpenFOAM is parallelized using the domain decomposition approach of the Message Passing Interface. That actually means we split up our computational mesh into mesh partitions, and in between these partitions we have processor patches where we exchange the solution. This can scale up to thousands of CPU cores, simply following a rule of thumb saying 20,000 to 30,000 cells per core scales well. If we go below, the performance decreases; it's still possible to do it. And going up, of course, we will have to accept longer computation times. This means for a small problem, of course, we cannot just throw 1,000 cores at it and get it solved efficiently; we somehow have to adapt the parallelization to the problem size itself. There are some bottlenecks which are common to most FEM and FVM solvers, just related to the data structure. I don't want to go into much detail, but in the end this leads to a lot of data exchange between the core and the memory, so let's say this will be a bottleneck for the speedup, and also the I/O will be a bit limiting for larger problems. OpenFOAM has its own way of storing data in these dictionary files, as you will see later. This is pretty flexible for setting up a case and reading the case information, but when it comes to large runs with a lot of cores, collecting this data and writing it into these file structures is less efficient.
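For reference, the domain decomposition described above is driven by a small dictionary in the case's `system` directory. The values below are purely illustrative: 64 subdomains for a mesh of roughly 1.6 million cells follows the 20,000 to 30,000 cells-per-core rule of thumb, and the `scotch` method assumes your build includes the Scotch graph partitioner (FoamFile header omitted):

```
// system/decomposeParDict (illustrative values)
numberOfSubdomains  64;      // ~1.6M cells / 64 cores ≈ 25 000 cells per core
method              scotch;  // graph partitioner; needs no geometric input
```

One would then run `decomposePar` to split the case and launch the solver with something like `mpirun -np 64 <solver> -parallel`, where the solver name depends on the application.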
There have been some approaches to add the HDF5 format to OpenFOAM to improve this, and also in the high-performance computing technical committee at OpenFOAM there are a lot of people working together to overcome these limitations: to, let's say, provide an interface to linear algebra solvers (for example, there's some work done by NVIDIA to offload some of the matrix solutions to GPUs), and there's also a European Horizon 2020 project called exaFOAM which tries to bring OpenFOAM to massively parallel computing architectures. So these approaches and attempts try to overcome the limitations that we have here, and if you look at the releases of OpenFOAM, you will here and there find some of the outcomes of these projects already implemented in the standard distribution. Last but not least, let's talk a bit about computational requirements. Remember the rule of thumb: something like 20,000 to 30,000 mesh cells per CPU core can be done. That means if we go for a standard two-dimensional simulation, which typically has on the order of 100,000 cells, we can do that on a workstation with 6 to 16 cores; it's easily possible. If we go for 3D cases, which can range between 100,000 and several million cells, we can easily use a medium-sized cluster. And if we look, for example, at coarse-mesh with neutron diffusion, these kinds of porous-media approaches, then for a full core we will have a few hundred thousand to a few million cells, which is something that can be solved on a workstation, desktop computer, or laptop, depending on the computation time that we can accept. For the run time, of course, it's much easier to obtain steady-state solutions; if we can scale up the problem to, let's say, the optimal number of CPU cores, this can be in between minutes and a few hours. If we go for long-running time-dependent problems, this can easily be a week. To give you some idea of the analyses we are doing with containmentFOAM: typically our validation cases range from one hour to two hours of transient time, and this is
something we can do within days to a week. Therefore, let's say, a full-scale technical application, where we have huge meshes and also have to consider long transient times like several hours, can easily go up to weeks or even months of computing time. For the memory, you have to remember we store a number of fields for each cell. Typically, for a RANS simulation, this is something like 10 fields: pressure, temperature, the velocity, turbulent quantities. So if we multiply this by the number of cells, we come up with something like a gigabyte for one million cells. If we do something more multi-physics, like discrete ordinates methods, where we have to store the information for the solid angles, then this can easily be multiplied by an order of magnitude or two, so we easily get a number of 10 to 100 gigabytes per million cells. So with this, I am at the end of this lecture. I hope we showed you a bit of our enthusiasm for OpenFOAM. You learned how to approach a problem with this toolbox, you got some ideas of what you can do, what the pros and cons are. I hope you also saw that it's not always a low-hanging fruit. In the end, what you pay with is time: you have to prepare yourself to understand not just the problem you want to solve, but also the tools that you can use to do it. And when you do that, in the end many things become possible. Thank you. Any questions to our lecturers? Please go ahead. Please use the microphone, we have microphones, so people online can also hear your questions. I just want to know if there is any tutorial or solver which has the neutronics equations already there, to solve for neutronics results like flux or something, already existing in OpenFOAM? Because in the presentation, at some point it says solve neutronics, then solve solid, and then solve coolant equations. So within OpenFOAM as distributed, you will not get any neutronics. You will have to look at contributions from the community. I believe the only neutronics-capable
solver that is open access is GenFoam, which is the solver that has been developed at EPFL, PSI, and now Texas A&M. The solver is there, it is accessible. Can I actually go online with this computer? Yeah, let me try something. This is something I will present more later this week, but I still want to answer your question. So if you search for GenFoam, you will immediately find it, open access, so you can find it there. This is a community contribution, so it is nothing but a very complicated application of OpenFOAM. Inside you find the source, you find documentation, and you find the tutorials. Now, the tutorials, you have to understand, are OpenFOAM style, meaning it is a bit more commented than basic OpenFOAM; in basic OpenFOAM you will find essentially no comments, and they hope you will understand. Since GenFoam is a complicated tool, we have documentation: we have a description, we have a Doxygen documentation that guides you through GenFoam, that will tell you how to compile, pre-process, run, post-process, and will give you basic information about neutronics and thermal hydraulics. You will get information about how it works. And if you go to, where was I, tutorials, you will find a number of tutorials. I'm thinking about one that makes sense for neutronics, probably small ESFR. This is a 3D European Sodium Fast Reactor, and you will find the description of the tutorial and an Allrun that will tell you what's going on. There is basic information. There is no video tutorial on how to do that; this is something we will do on Wednesday, we will go through how to use this thing. The general thing about using community-driven developments is that we always suggest, first of all, learn how to use OpenFOAM. And even though the resources are a bit scattered, they are there. There is significant documentation online, it's just a bit spread out, but if you spend some time you will find it. And in this webinar series that we did for the IAEA we provide the general introduction to
OpenFOAM, and especially we provide an introduction to a lot of resources that you can use. So the overall idea of that webinar series was to give keywords and references to start using OpenFOAM. Since this documentation is a bit scattered, we felt it's important to give people a place where they can find all these websites and links and keywords to search for. So if you look at the IAEA online course, there is a lot about that. Once you know how to use OpenFOAM, GenFoam is nothing but a complicated application of OpenFOAM. The disadvantage is that it's complicated. There are documentation and tutorials that you normally do not find with OpenFOAM solvers; we actually spent some time commenting all the tutorials, explaining how to approach them, providing them with Allrun files. An Allrun is a bash script, but it tells you all the steps, like creating a mesh, running a solver, changing this dictionary, and these kinds of things. A video tutorial might be coming soon, but it's not there yet. But there are significant resources available to introduce you to the thing, and in the coming days I will get back to these in much more detail. Did that answer your question? Okay, thank you. We have a question from an online participant from Algeria, and here we have Abdul Ghani born in Nani. Could you, you are muted, unmute your microphone, Abdul Ghani? Yes, just, can you hear me? Yes, please go ahead with your question. Thank you. My question is basically around the idea that OpenFOAM is most often used for very complicated and complex simulations based on mathematical formulas, which might need, I mean, a lot of computation. So the question might be about the validity of working with OpenFOAM with a single core, or sequential programming, instead of parallel programming. I imagine that for reactor analysis and so on, it's a very complicated task that needs many computational, I mean, yeah, computational resources. So does working with simple sequential programming lead to, I mean, designing the sort of state of
the art stuff that we might need in the field, or just a sort of simple stuff, until we use parallel programming? Is that somehow clear? I mean, there is the complexity of the equations and the mathematical formulations that we use to design the reactor and simulate its behavior. You have been talking about MPI, which is the Message Passing Interface, about high-performance computing, about the NVIDIA involvement in this. So, working in a simple way, with your PC, with your traditional way of doing programming: is it beneficial? To do stuff that is beneficial to the field, to real advancement, do we need high-performance stuff, GPUs and parallel programming? Or, instead of spending time trying to parallelize and buying new GPUs and clusters, should we just put more effort into the traditional sequential program? Is the traditional way of doing it really valid to do something beneficial for the field? Because I imagine it is a very complicated task that needs a lot of computation.

We understand, thank you. I think Stefan can answer, because he is the parallel one. We can... it's OK, I mean, we can both give our perspective. It is not an obvious question that you ask. I can tell you what I see people doing around me, and there is a general consensus that if we want to license advanced reactors, we might need tools that go beyond the legacy tools. This is the reason why in Europe, and now in the US, we have been developing OpenFOAM-based tools, and this is the reason why the US has invested significant money in developing MOOSE. So I think there is a consensus in the community that these tools are important. That does not mean that you cannot do research without them. First of all, even with these tools you do not necessarily need HPC resources. Look at the tutorials in GeN-Foam: we have a 3D reactor analysis in there, and you can run it
on a laptop, no problem at all; it is going to run in minutes on a laptop. So you can do the equivalent of what was done with system tools using OpenFOAM. I mean, the nice thing about these new tools is that you can pretty much decide how computationally expensive you want to be. If you want to stay coarse-mesh, using correlations, you are absolutely free to do that, and most of the time you will get solutions that are comparable to a CFD solution. Take a reactor core, take a fast reactor: you can do a CFD solution, or you can do a coarse-mesh simulation with correlations, and most likely the regulator is going to prefer the correlations and the porous-medium approach and will not like the CFD. So I would say that going for high-fidelity, HPC, parallel simulation is not a must. We just know that it can be useful, and we do not want to find ourselves in five years licensing reactors that we cannot simulate, so we need to keep on progressing. But to me that does not mean you cannot use traditional approaches, either in legacy tools or in these new tools. Traditional approaches are valid, especially when they come with correlations that have been verified, validated and improved over several years. That is my very personal perspective. Stefan?

I can also give a personal view on that, since I am actually not an expert in all this high-performance programming. In the end we chose OpenFOAM as a basis for our development since it gave us most of the things that we need, and with this we also left the decision of how the details in the base libraries are done to the community that is behind OpenFOAM. I think that as soon as new techniques become, let's say, settled in the community, the OpenFOAM releases will reflect it and things will develop in this direction. For the moment, I would say it is state of the art and a solid basis to build on. That is the, let's say, trivial reason for approaching it this way.

If I can add just one thing: I think you touch on something that I actually forgot to mention, something that we
probably implied all the time, which is that if you use OpenFOAM you do not need to be an expert in every numerical part. Most likely you do not even need to know how to discretize, or how to solve a linear system. The more you know, the more proficient you will be and the more research you will be able to do using the tool. But the fact that it is object-oriented also means that it is a multi-layer library: you can stay at the very top and just throw in equations, not knowing anything about the discretization and solution. Or, the deeper you go... you might be an expert in linear system solution but not in nuclear engineering, and you will still be able to touch those classes and not touch the rest. The way it has been programmed, and that is the whole benefit of object-oriented programming, is that you are not required to touch all of it. You touch what you can, you touch what you know, and the rest you trust. You have to have a basic knowledge, of course: when you solve something, if you have no idea what a linear system is, it is not going to be easy. But you do not need to be an expert on that. So I think this is an advantage of these new libraries: they allow you to focus on what you know.

Frankly, I doubt... if I may comment on the point that you don't have to be an expert: well, you don't need to be a mathematician. But to continue with parallelization versus, you know, sequential programming: usually you do not need to calculate only one transient, you need many, many variations. So what if you run, say, 64 transients with different initial and boundary conditions at once, one per CPU, instead of trying to parallelize one? It is more effective.

It is very effective. It is just that sometimes you have no choice, right? Sometimes you want to run one very complicated simulation. There is always a trade-off. But of course, it allows you to do that. OK, thank you. We have a question, a question up there somewhere
here? I said no, over there? OK, please. Then you.

Hi. Which cross-section library do you use for neutron diffusion?

Anything. We have used ENDF, we have used JEFF; you can use whatever you want.

And how do you prepare the cross-sections, if you need multi-group cross-sections?

Historically we have used Serpent a lot. We are trying to use OpenMC more and more, for obvious reasons, because we are speaking about open source here. I know people that have used DRAGON for that. So there is a whole set of tools; we do not recommend anything specific. In general, we have routines to translate the output of OpenMC and of Serpent into OpenFOAM, GeN-Foam-readable input, so you will find utilities for those two tools, but it is fairly quick to do it for any tool you want.

OK, thank you.

Thank you for the presentation. I would like to ask about the data projection between the different physics: are the meshes the same for all of them, or are they different from each other?

It depends on what you want; you can do both. I can tell you that in GeN-Foam they are different: we have a different mesh for neutronics, thermal-hydraulics and thermal-mechanics. They can be the same, but sometimes they are not, so we decided to let the user use different meshes. Sometimes it is because of different refinements, sometimes because you do not want to solve the same region: imagine you do thermal-hydraulics and neutronics; you want to do the entire primary circuit, but you do not want to solve the heat exchanger for neutronics, right? So it is different, and OpenFOAM has two or three different algorithms to do mesh-to-mesh projection. The one that you use almost all the time is a conservative, volume-weighted algorithm: essentially it projects fields from one mesh to the other making sure you do not lose power, or pretty much any extensive quantity.

I understand. So when you calculate the diffusion coefficient, for example, the diffusion parameters in the neutronics calculation, do you use any average or
interpolation method for any specific volume for the neutronics?

Again, if we speak about OpenFOAM in general, you can do pretty much whatever you want. If we speak about GeN-Foam, we let the user decide: if you want to provide an adjoint flux for weighting, you can provide it, and the tool will do the weighting for you. You can even calculate the adjoint yourself, because we have an adjoint solver in OpenFOAM. It is completely feasible; in GeN-Foam it is implemented.

Thank you so much.

OK, we have a question in the chat, which says: is the thermomechanical class of GeN-Foam a subset of OFFBEAT? If yes, are there any plans to merge OFFBEAT with GeN-Foam for multi-physics analysis?

This is an incredibly good and timely discussion. We are discussing with Alessandro to find a way and a path forward to merge the libraries of GeN-Foam and OFFBEAT. There is an obvious reason. I will show you; I think we still have some time, and this can be interesting. If you look at the source code of GeN-Foam, you will see classes in there: you have things like neutronics, thermal-hydraulics, thermal-mechanics. If you look at the main, the main is pretty much inside something called solve, and if you look at the solve, it mostly looks like this: if you are solving fluid mechanics, thermal-hydraulics, correct fluid mechanics; if you are at the final iteration, thermal-hydraulics, interpolate coupling fields; thermal-hydraulics, correct energy; and then you have things like thermal-mechanics, interpolate; thermal-mechanics, correct; neutronics, deform mesh. Here you see the object orientation, right? You see that, in principle, you have a library and you have classes; for instance, you have a class that is called neutronics, and this class has functions that deform the mesh and solve for neutronics. So, the whole structure, if you look at it: this is GeN-Foam, and GeN-Foam is pretty much 200 lines; what is below are classes, are functionalities. Once you embrace this approach of object
orientation, you realize that having in there something additional that says offbeat.initialize and offbeat.solve is kind of obvious. Why don't we do that? We have the time to do it. So yes, the objective is to try to merge the two libraries, having separate classes that can do the whole thing. We will probably try to get rid of the thermal-mechanics solver of GeN-Foam, which is very primitive, and have a single library for all the functionalities. So we will have neutronics, we will have thermal-mechanics done with OFFBEAT, we will have thermal-hydraulics, we will have multi-physics control. Pretty much the idea is to have a situation where GeN-Foam and OFFBEAT are nothing but two complex examples of applications that you can build using that multi-physics library. So yes, the idea is to merge the two things and have something that is easier for us to maintain, more obvious to use for everybody, easier to document, and more sustainable overall as a solver.

Oh yeah, good question. I think we can accept one last question before we go for lunch.

Yes, I have a short one: is there any list of all the papers published using these tools, GeN-Foam, OFFBEAT and the ContainerFoam?

That is a good question. I don't think it is there anymore; at some point, years ago, we started a list of publications, and I don't know where it is anymore, actually. Just to give a half answer: if you go to the single tools on the GitLab, you will find publication-related pages, and there should be a bibliography; you will find eleven of them there, and there are more. If you go to OFFBEAT, they probably have the same thing, so you will find publications inside, at least the essential ones. These eleven are the ones that essentially constitute a theory manual: they include all the theory behind it, or 90% of it, so there is enough to cover the basics. It is probably a good idea for us to start creating a list. We started one years ago in the frame of another initiative, the OpenFOAM Special Interest
Group for nuclear, we started compiling a list of publications. I think it is still there, but I am not sure; let me check. It is a good question. Also, as Stefan suggested, on the ONCORE site we can make a list of publications, also related publications; that could be a good place, actually. Just out of curiosity... I am not sure it is still there. Oh, maybe... no: there used to be a publication list here, and it is not there anymore, sorry about that. It is a good idea, and I think we will try to make it happen.

So now we have just come to the lunch break, I believe, correct me if I am wrong, and then we start at 1:20, one hour later, with a practical introduction to OpenFOAM with Stefan. I believe lunch is in the same place where we had the coffee, in the cafeteria.
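The ~200-line top-level solve loop described in the answer about GeN-Foam's structure can be sketched schematically. The following is a hypothetical Python mock-up, not real GeN-Foam code: GeN-Foam itself is written in C++ on top of OpenFOAM, and the real class and method names differ. The placeholder bodies here just record which step ran, to illustrate how a thin top level drives separate physics classes through outer coupling iterations.

```python
# Hypothetical mock-up of the kind of object-oriented multi-physics solve loop
# described in the transcript; names follow the spoken description, not the
# actual GeN-Foam source. Each physics class only records which step it ran.

class ThermalHydraulics:
    def __init__(self, log): self.log = log
    def correct_fluid_mechanics(self): self.log.append("TH: fluid mechanics")
    def interpolate_coupling_fields(self): self.log.append("TH: interpolate coupling fields")
    def correct_energy(self): self.log.append("TH: energy")

class ThermalMechanics:
    def __init__(self, log): self.log = log
    def interpolate(self): self.log.append("TM: interpolate")
    def correct(self): self.log.append("TM: correct")

class Neutronics:
    def __init__(self, log): self.log = log
    def deform_mesh(self): self.log.append("N: deform mesh")
    def correct(self): self.log.append("N: solve")

def solve(n_outer=3):
    """One time step: outer coupling iterations over the physics classes."""
    log = []
    th, tm, n = ThermalHydraulics(log), ThermalMechanics(log), Neutronics(log)
    for i in range(n_outer):
        th.correct_fluid_mechanics()
        if i == n_outer - 1:  # final iteration: close the multi-physics coupling
            th.interpolate_coupling_fields()
            th.correct_energy()
            tm.interpolate()
            tm.correct()
            n.deform_mesh()
            n.correct()
    return log
```

Calling `solve()` returns the ordered list of coupling steps. In the real solver each "correct" call would assemble and solve the corresponding equations on its own mesh, and the "interpolate" steps would do the conservative mesh-to-mesh projection discussed in the earlier question.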