Welcome, everybody. I'm going to give us a few minutes for everyone to enter the room. We've got a very busy webinar today; we're expecting around 500 people. And to find out what everyone's background is, we're going to have an introductory poll. Erin's going to tee up a poll. So, what is your professional background? The answers are: utility, research student, consultancy company, or an instrumentation provider. We've got a second question there: when did you first hear about digital twins? More than five years ago, two to five years ago, less than two years ago, or is this a very new subject for you, and you've only just heard about it today and are hoping to learn more? So we'll give it a couple more minutes for everyone to settle down and answer the poll. Do answer the poll if you can, and I'll start. So welcome to this webinar. My name is Oliver Grievson, and I'm an Associate Director at the engineering consultancy AtkinsRéalis, as well as a Visiting Professor at the University of Exeter and the Chair of the IWA Digital Water Program. I brought this webinar together today to really have a discussion on what digital twins are. And I've got three fantastic panelists: Pilar Conejos of Idrica, who was responsible for the digital twin in Valencia, which really gave me my introduction to what a digital twin was; Wim Audenaert of AM-Team, who's a fellow panelist on the IWA Digital Water Program along with Pilar, and an absolute master of digital twins over in Belgium; and, it goes without saying, James Ballard of Severn Trent Water, where lots of interesting digital twin things are being brought into practice right now. I'll let James tell you all about it in a little bit. There are some formalities that I have to say first.
This webinar is being recorded and will be made available online on the IWA Connect Plus platform and the IWA network website, and I'm sure it will get shared on all sorts of social media as well. The speakers are responsible for securing copyright permissions for any work they present of which they are not the legal copyright holder. The opinions, hypotheses, conclusions or recommendations contained in the presentations and other materials are the sole responsibility of the speakers and don't necessarily reflect the IWA's opinion; they may well, of course, be their own personal opinions as well. So those are the things I have to say. For the webinar, there is a chat box, so do use the chat box. There's also a Q&A box that the presenters will be using to answer your questions, and there is a Q&A session at the end. All attendees' microphones will be muted, and don't bother raising hands, because we can't look at them. There are 221 people in the webinar right now. I'm going to be moderating today, and as I said, Pilar, Wim and James are going to be speaking. We've got a really, really interesting 40 minutes to an hour ahead of us. So our first speaker is Pilar Conejos. I would say about five or six years ago, maybe a little bit more, I thought, what's this digital twin nonsense? I wasn't convinced by it, and I'd had conversations at various conferences. It was only when I went to see the work that Pilar did at Global Omnium, where she worked at the time before she moved over to Idrica, that the light bulb moment came over my head. And I went, wow, so this is what a digital twin is, and this is what it can do. So without further ado, I'm going to introduce Pilar Conejos of Idrica. Pilar is the digital twin manager at Idrica, and she has been working on digital twins for a lot longer than, guess what, the name digital twin has existed. So Pilar, do you want to take it away?

So thanks, Oliver, for your kind introduction.
And thank you all for attending this webinar. I'm going to speak today about digital twins for water distribution networks. In my presentation, first of all, I will start with a quick introduction to the concept of a digital twin and how it can be applied to a water distribution network. Next, we'll see a real case example of the application of this solution or technology. And finally, I'll say something about the future and the potential of digital twins. So I'll start with the definition of a digital twin. There are really a lot of definitions of digital twins, but the one I like the most is the definition we can see on the slide: a digital twin is a virtual copy of a real system that represents its behavior continuously and serves as a basis for experimentation. That means we can try new ideas or new changes in the virtual system before making the decision in the real system, and this is the way we can minimize risk, time and, finally, costs. And here in the definition, I think we can see the key thing that differentiates a digital twin from all other solutions: the capability to reproduce the behavior continuously. This is the reason why we are going to need simulation models when we build a digital twin. The concept of the digital twin was consolidated by Dr. Grieves in 2003, and it was first applied in the industrial field. But of course, it can be applied in a city management context and, specifically, to a water distribution network. It's true that today most of the deployments are in the industrial field, because it was first applied in that sector, but now we are seeing some early adopters in the water sector. And the expectations are high, because according to this research, digital twin implementations are going to increase by 56% in the coming years across all sectors. There is also an increasing interest in developing hydraulic models.
For example, according to this graph, in 2027 the investment in hydraulic models, water and wastewater, is going to be double that of 2020. I think this is closely related to the interest in developing digital twins for water distribution networks, because, as we are going to see later, most of them use hydraulic models as their simulation models. But first of all: now we know the concept of a digital twin, but why develop a digital twin for a water distribution network? We all know that water networks are complex systems, thousands of kilometers in length and interconnected. They work in a variable environment where things can change suddenly, climate change and water scarcity impact directly on these systems, and they are an essential service: we have to keep providing water to the population. So there is a strong need to manage these systems in a resilient, secure way, and digital twins can help us a lot to do it. Because, according to the definition, with a digital twin we can monitor the whole system, we can analyze its behavior, we can also simulate the behavior of the system under other conditions, and finally we can have the necessary insights to improve the real system. Because in the end, this is the goal: improve the physical, real system. For developing a digital twin, we need some components, and here we can see the three main ones. First of all, we need data, real data. We need models also, because, as I said before, a digital twin has to simulate the behavior of the network, so we need models to reproduce this behavior. And finally, we need analytics in order to have the necessary insights to improve the physical system. Let's see the three components. Regarding data, I think today most utilities have the necessary data, because most of them, or at least some of them, have different sources of data, like GIS, sensors, SCADA, even smart meters, or asset and maintenance management systems.
That doesn't mean we are going to need all the sources of data the first time we deploy a digital twin. Some are absolutely necessary, and the others are good to have; if we don't have them at first, we can ingest more data later. Regarding models, there are two kinds of models we can use. We have data-driven models, based on AI, machine learning and so on, and we have physics-based models. In the case of water distribution networks, physics-based models like hydraulic models work very well to reproduce the behavior of the network, because in fact we have been using them for a long time. But it's true that we have used these models for planning the network, and we know these hydraulic models are a simplification of reality. So for these models to be part of a digital twin, they have to accomplish several things: they have to be continually developed, they have to be calibrated, and they have to be continuously updated. These models have to be able to represent the behavior of the network continuously, and this is the reason why we have to connect these traditional models with real data, in order to keep them updated with the real conditions. Here in this slide, we can see the differences between having a traditional hydraulic model that we have used for planning and having a hydraulic model connected with real data to be part of a digital twin. In the traditional way, we are used to developing a hydraulic model by taking a specific data set in order to reproduce a specific situation of the network, like the current situation. We take this data, we calibrate the model, and we manage to reproduce the current situation with high accuracy. But what happens after that? A lot of changes can happen in the network: assets out of service, new assets, new set points, and so on.
And if this model is not updated, the level of calibration gets lower and lower, and the model is no longer able to reproduce the current conditions of the network. If we connect this model with the data, what we can do is maintain the level of calibration, the accuracy of this model, because it can reproduce all the changes that happen in the network. And finally, we need analytics. We have to add some algorithms on top so that, using the data provided by sensors and also by the hydraulic model, we can have the necessary insights in order to make decisions in the real system. And here we can have different maturity levels of digital twin. We can have a descriptive digital twin that reproduces the behavior of the network. We can go further and have a diagnostic digital twin that is able to answer questions like why things are happening. We can keep going forward and have a predictive digital twin that tells us what is likely to happen in the future. And finally, we can have a prescriptive digital twin that tells us directly the actions we have to take at any time. So regarding water distribution networks, we can use the digital twin for making decisions towards different objectives. We can use it for planning, in order to have an optimal network design, and we can also use it for supporting day-to-day operations: anomaly detection, early response to emergencies, energy optimization, leak location, and so on. So we can cover a lot of objectives regarding water distribution networks. Now we know what a digital twin is, and we also know what a digital twin is not, which is also important. A digital twin is much more than a monitoring system like SCADA; it's much more than a digital representation, like a GIS or a BIM; it's much more than a hydraulic model built with a static dataset.
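To make the contrast between a static model and a continuously connected one concrete, here is a minimal Python sketch of the idea of checking a hydraulic model against live sensor data; the pressures, the tolerance and the status labels are all invented for illustration, not taken from any real system.

```python
# Toy sketch of keeping a hydraulic model honest against live sensor data.
# All pressures (m head) and the tolerance are illustrative values.

def mean_abs_error(predicted, measured):
    """Average gap between model output and sensor readings."""
    return sum(abs(p - m) for p, m in zip(predicted, measured)) / len(predicted)

def calibration_status(predicted, measured, tolerance=2.0):
    """'calibrated' while the model still tracks the network, else 'needs update'."""
    return "calibrated" if mean_abs_error(predicted, measured) <= tolerance else "needs update"

model_heads = [41.0, 38.5, 44.2]    # model predictions at three sensor sites
fresh_model = [40.6, 38.9, 43.8]    # readings just after calibration
after_changes = [35.1, 33.0, 39.5]  # readings after new set points, a closed valve...

print(calibration_status(model_heads, fresh_model))    # -> calibrated
print(calibration_status(model_heads, after_changes))  # -> needs update
```

In a real digital twin this loop runs continuously, and rather than just flagging drift, the model's boundary conditions (demands, set points, valve statuses) are updated from the live data so the accuracy is maintained rather than lost.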
And now we are going to see a real-life example of the application of a digital twin to a water distribution network: in this case, Valencia's digital twin. This digital twin has been supporting the daily operation for over 15 years, so it has really been a long journey, and we have a great deal of experience from operating it. We really started very early, because we came up very early with the idea of connecting the hydraulic model with SCADA in order to run simulations in real time. In fact, preparing this webinar, I found that in 2006 we published our first paper about that. Since then, we have kept working to improve that idea until reaching our current situation, where we have today's digital twin. And I think the key to our success has been the combination of different stakeholders: research, utility knowledge, and also digitalization. But why did we start so early? Because at that time we had several challenges to face in the water distribution network, like water scarcity, population growth, infrastructure near its maximum capacity, and also key people in the company near retirement. So we realized that we needed a system or a platform that could help us to make decisions, to plan the new infrastructure, to improve the decision-making process in the day-to-day operation, above all under emergency conditions, and also to make the generational change easier. And here we can see the three key dates that were really important for us. In 2007, we achieved the connection of the hydraulic model with SCADA. In 2012, we started the digitalization process of the whole company, so many more sensors and applications were deployed. And in 2018, we moved onto a smart, data-centric platform where we integrated all the information and connected it with the hydraulic model in order to have today's digital twin.
And here we can see the main figures of our digital twin. Our digital twin in Valencia covers a network 900 kilometers in length. It contains all the regulating elements, pumps and valves, and is connected in real time with 600 sensors, mainly pressure sensors and flow meters. And as a result, with these 600 sensors, we are calculating, simulating, what's happening at 10,000 points of the network in real time. These are called virtual sensors, or soft sensors, and this is something great, because we know what happens everywhere, at 10,000 points of the network, while having physical sensors at only 600 points. And this is the reason why we say that with a digital twin we can know what is happening where we are not measuring. Here we can see the main uses so far in the daily operation. It has been really useful, because we can run simulations in real time, at any past time, under the current conditions or under any other what-if scenarios. We can also forecast the simulated behavior for the next 24 hours. These capabilities have really helped us to make decisions under emergency conditions, and also, as I said before, to estimate values at points we are not metering, because with 600 sensors we know what is happening at 10,000 points of the network. And regarding planning, it has been very useful as well: we have planned all the new infrastructure with this digital twin, because we have assessed the network requirements, we have designed the new infrastructure, we have verified its behavior and, very important, we have determined the commissioning stages in order to affect as few people as possible. And this is where we are. So what's next? I think that a digital twin is a journey full of opportunities. It's true that there are several challenges we have to face, like, for example, the definition of clear business objectives.
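The virtual-sensor idea mentioned earlier (600 physical sensors yielding 10,000 computed points) ultimately rests on the hydraulic model propagating measured heads through the pipe network. A toy single-pipe version, using the standard Hazen-Williams headloss formula with invented pipe data and readings, might look like this:

```python
# Toy "virtual sensor": one physical pressure sensor upstream, one estimated
# (unmeasured) node downstream. Pipe data and readings are invented.

def hazen_williams_headloss(q_m3s, length_m, diam_m, c=130.0):
    """Friction loss (m) along one pipe, SI form of the Hazen-Williams formula."""
    return 10.67 * length_m * q_m3s ** 1.852 / (c ** 1.852 * diam_m ** 4.87)

def virtual_sensor_head(measured_head_m, q_m3s, length_m, diam_m, c=130.0):
    """Head at the unmeasured downstream node = measured head minus friction loss."""
    return measured_head_m - hazen_williams_headloss(q_m3s, length_m, diam_m, c)

# 45 m head measured upstream, 50 L/s through 800 m of 300 mm pipe.
estimate = virtual_sensor_head(45.0, q_m3s=0.05, length_m=800.0, diam_m=0.3)
print(round(estimate, 2))  # roughly 43.6 m at the "virtual" node
```

A real network solver does this simultaneously across thousands of pipes and nodes, with demands and controls as boundary conditions; this is only the single-pipe intuition behind knowing what happens where you are not measuring.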
We have seen that a digital twin can be used for different objectives, but it's very important to start by focusing on some of them. We have to see what our challenges are, and we have to start developing the digital twin focused on these objectives. After that, once it works, we can keep adding more and more objectives. Another challenge is the development and calibration of a hydraulic model that runs in real time; this is not something easy. Another challenge is data silos and data quality. Most utilities have data, and have different sources of data, but it resides in different silos and it's difficult to get access to it. A digital twin has to concentrate all the data and put it under the same umbrella, so we have to have access to all of it. And finally, people engagement and the adoption of this new technology. In the end, a digital twin is a new way of working, and it's very important for us that people adopt this technology from the beginning. Regarding adoption, I think a digital twin is a tool that empowers and enriches people's work, and implementing one really requires a new, innovative culture. As I said before, data quality is key. It's a challenge that the data resides in different silos, but it's also a challenge to maintain this quality. So we think it's very important to prevent quality problems at the source: it's very important to have a good selection of sensors, good maintenance of sensors, and good protocols for registering information. After that, we are going to need data cleaning algorithms, but they are the last option; it's much better if we can preserve the quality from the start. And finally, keep it simple and focus on your main challenges. A digital twin can be used for a lot of objectives, but we have to start with some of them. Regarding the future, I think that in the future we'll see different digital twins interconnected.
I have spoken in this presentation about digital twins for water distribution networks, but I think we can develop a digital twin for every phase of the water cycle, and they could be interconnected, because the outputs of some of them are the inputs for others. We can even integrate a digital twin in a smart city context where, for example, every infrastructure will have its own digital twin and they will be interconnected. So we would have a holistic vision of the whole system, and we could also give an active role to the citizens. Integrating a digital twin in a smart city can also bring a lot of benefits for the water digital twin itself, because we can open two-way communication channels between citizens and the utility. Citizens can make a more responsible use of water, because they can have access to their consumption and they can see the impact of their actions. And finally, we can adapt the infrastructure to the needs of the city if the water digital twin is integrated in the smart city. To finish, I think digital twins have arrived in the water sector to stay. Developing and maintaining a digital twin of a water distribution network is today an objective for most utilities, and the good news is that the data and tools available nowadays make it possible. This is the reason why I think the best is yet to come, because this is a journey full of opportunities. So thank you all for your attention.

So thank you, Pilar. That was absolutely wonderful, and so much information in such a short time. We do have some questions coming in, and I'm going to let you handle them as we go along. If you're putting questions in the chat, we can't actually answer them there, so do put them in the Q&A box, not the chat box. And I think we'll move on. Thank you very much for that, Pilar; there'll be more questions in the Q&A section at the end.
Our next speaker is Wim Audenaert, who gave the most wonderful keynote speech at the Digital Water Summit last November. For those of you who like the world of digital, do come to the next Digital Water Summit, which will be in November. Back in November, Wim talked about not only being technology ready, but also being market ready, which is something I've seen many, many times over my past 20 years in the water industry. Wim Audenaert runs a company called AM-Team over in Belgium, helping people with digital twins and their application around the world. He's the CEO and co-founder, as well as being on the Digital Water Program steering committee, and he's always very lively and has lots to say. So I'll let Wim take over, and we'll see, over the next 15 to 20 minutes, the application of digital twins on the treatment side. Wim, do you want to take over?

Sure. Thanks, Oliver, for this introduction. Also, yeah, thanks everyone for being here. How great is it to see how much attention digital twins get? We heard from Pilar that they are here to stay. Well, we also believe they are here to stay, and the coming decade is very promising; we will see a lot of things happening. What we have not seen happening too much up until now is digital twins on the treatment side. By that I mean within the fence of a drinking water treatment plant, a wastewater treatment plant or a reuse treatment plant. And yet the potential, if you think of it, is huge. Pilar has been highlighting a very mature area, digital twins on the network side. In my presentation, I really want to bring us to the treatment side and show a couple of examples of what digital twins can do there. Good. Let's look at the hype, or let's say a fraction of the hype, and the reality of it. What this is showing is the use of the phrase digital twin as a function of time, and you can see that from 2014 something happened.
So 10 years ago, people literally started talking a lot about digital twins. It came with the rise of artificial intelligence; it coincided with a couple of things. But you can see something is happening, right? This is more than exponential; this is almost a vertical curve. So this is the use of the term on the internet and in books as a function of time. Now the real question is: how will this translate into adoption in the water industry? Because this is digital twins all over the place, in all industries. The real question is how it will be applied in the water industry and how it will manifest in value. So let's first start with why. Why are digital twins more relevant today than, let's say, 10 years ago? Well, it is driven, of course, by the availability of tools, awareness of AI, having very good computers, things like that. But this is not enough. As Oliver also said, pushing a technology is not enough; you need the needs, and these are the mega trends in the water industry. I will not go over all of them; you can screenshot this slide if you want to, or of course look at the recording. But we have really mega challenges coming. Climate is one of them: climate change mitigation and adaptation. Effluent requirements that get more stringent, the mega trend of reuse, et cetera, et cetera. Another one, a human dimension, is the aging of the workforce. We have a couple of customers that are building digital twins to improve the process side, but also to store the knowledge of very experienced people who are about to retire. So many, many needs. So I believe the value will drive the adoption. It is not really the hype, or everybody talking about it; it is really proving the value. So, okay, this was the why of digital twins. How can we build them? The fundamental core of a digital twin is a model of some kind, a model that runs on data, data of a plant or a system.
And what is often not very well understood is that there are two big families of models. We have the data-driven models, and most people are familiar with the names artificial intelligence or statistical models; but not many people are familiar with mechanistic models, or at least not on the digital twin side. And in between you have hybrid models. So the focus of my presentation will mainly be on the mechanistic models and on turning these into digital twins. This is the philosophy we try to apply at AM-Team, the knowledge-based approach. Our first goal is always to start with the why. Why would we build a digital twin? Because this creates the business case, and I will come back to the business case later. Once you know the why, our philosophy is to start with knowledge-driven digital twins as much as you can. If you have mechanistic models at your disposal, to give one example the activated sludge models, or the many chemical models being used in the drinking water field, they can provide value in the short term. Why? Because they have prior knowledge, so you can start applying them tomorrow and then, of course, improve them. This is the step we typically take, and of course you can nicely complement them with data-driven approaches. In some cases data-driven first is the way to go, but we try to really give priority to knowledge-based digital twins, because this learning dimension especially is very, very powerful. So this brings us to the first case example. I have chosen to give four case examples, not in great depth; the goal of these four cases is to give you an idea of the potential applications on the treatment side. There are two drinking water treatment examples and two wastewater examples.
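Before the cases, the two model families just described can be made tangible with a minimal Python sketch; the rate constant, contact time and data points below are invented for illustration, not from any real plant.

```python
import math

# 1) Mechanistic model: encodes prior process knowledge (here, first-order
#    decay over a contact time), so it can predict before any plant data exist.
def mechanistic_effluent(c_in, k_per_min=0.12, t_min=20.0):
    """Effluent concentration: C = C_in * exp(-k * t)."""
    return c_in * math.exp(-k_per_min * t_min)

# 2) Data-driven model: no built-in knowledge, must first be fitted to history.
def fit_line(xs, ys):
    """Least-squares slope and intercept through observed (in, out) pairs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

print(round(mechanistic_effluent(10.0), 3))                   # usable on day one
print(fit_line([4.0, 6.0, 8.0, 10.0], [0.4, 0.6, 0.8, 1.0]))  # needs history first
```

A hybrid model, the in-between family mentioned above, would typically use the mechanistic prediction as a backbone and fit a data-driven correction to its residuals once plant data accumulate.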
The first one relates to tackling climate change impacts, because drinking water plants, especially surface water treatment plants but also groundwater plants, are facing issues when it comes to climate. This brings us to the Netherlands, in northwestern Europe. If we zoom in a little on the area of Amsterdam, you have a very big lake, the IJssel Lake, and you can see that there are two basins here, two intake basins for drinking water. Now, the Netherlands is rainy; Belgium is rainy. But still we have droughts, and our summers are getting drier, which means the chloride concentrations are also going up drastically. So we built a digital twin of the full plant: not only the intake basins, but also the treatment side, to assess the impact of climate change. What if the chloride concentrations, for example, were to increase to a certain level within 10 years? What would be the impact on the treatment, and can we take certain steps? The essential goal of this project was to turn data into information, and information is something you can act upon. So we built the virtual world, which has a lot of data coming in: you have a model, and the model creates data that you cannot easily measure, or creates future scenarios. And then you have the real world, where you use that information to make better decisions. We started with a mechanistic approach; we even did full 3D modeling of this basin. So this is possible at this scale, and these are very huge basins, millions of cubic meters. You can already see that, for example, the mixing in this basin is not very good: you have a kind of short-circuiting from inlet to outlet, and it's not optimal. Now, if you don't know this, it's very hard to build an accurate digital twin, so it's very good to have some mechanistic understanding. The next thing we did was build a hydraulic real-time model, taking this reality of mixing into account.
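A drastically simplified version of such a basin model can illustrate why the mixing assumption matters. In the sketch below, the 3D short-circuiting is reduced to a single bypass fraction, and the flow, volume and chloride concentrations are invented for illustration.

```python
# Toy chloride balance for an intake basin after a step increase at the intake.
# A fraction of the inflow short-circuits straight to the outlet; the rest
# passes through the well-mixed volume. Daily Euler steps; all numbers invented.

def outlet_chloride(days, f_bypass, q_m3d=1.0e5, v_m3=2.0e6, c0=60.0, c_in=120.0):
    c_tank = c0  # mg/L in the mixed part of the basin
    for _ in range(days):
        c_tank += (q_m3d * (1.0 - f_bypass) / v_m3) * (c_in - c_tank)  # dt = 1 day
    return f_bypass * c_in + (1.0 - f_bypass) * c_tank

well_mixed = outlet_chloride(days=5, f_bypass=0.0)     # ideal mixing assumption
short_circuit = outlet_chloride(days=5, f_bypass=0.3)  # part of the flow bypasses
print(round(well_mixed, 1), round(short_circuit, 1))   # bypass responds faster
```

The point is the same as in the talk: with short-circuiting, the outlet tracks the intake much faster than an ideally mixed basin would suggest, so neglecting the hydraulics misrepresents the data.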
And it gave quite a good representation of the data. The blue dots here are the outgoing chloride concentration, and the orange line is the prediction of the model over a five-year time span. Now, if you were to neglect these hydraulics, if you were to assume this basin is very well mixed, you would end up with the gray line, and this is not a very good representation of the data. So you clearly see that adding mechanistic knowledge can really help us, either through mechanistic models or even data-driven approaches. Now, what was the end goal here? Well, to run plenty of scenarios. For example: how long does it take until the chloride concentrations at the outlet change? When should we close the basins? What if we recycle certain streams? Can we reconfigure the intake? Things like that. These were scenarios that were run with the model, and the model is actually still being used right now. Now, according to the definition, this is not yet a real digital twin, because there is not yet a real-time coupling with the real plant. We are building the digital twin; it is already being used, but the next step, of course, is the real-time coupling. So this is, let's say, a digital twin in progress. A second example, in a drinking water treatment plant, is really at the treatment level: maximizing the energy and chemical savings, and also maximizing the effluent quality. This is another case in the Netherlands: the Dunea case. Dunea is a drinking water company in the Netherlands. On the left you can see the real world. It's an advanced treatment that treats surface water and removes chemicals of emerging concern. It is quite a unique process, because you have an ozone process with a downstream UV/peroxide unit. It's advanced treatment, but these treatment trains are, of course, becoming more and more popular to remove micropollutants or to enable reuse. So we built a digital twin; that's what you can see on the right.
It's running with our model AMOZONE, a chemical model we have been developing, and this runs in the software Sumo. We ran plenty of scenarios, and then you can see, of course, how to optimally configure these two treatment steps. Because, on the one hand, ozone is operationally cheaper, less expensive, while UV/peroxide is more expensive in terms of chemicals and energy. But both remove different types of compounds, and ozone has a by-product issue. So how do you tackle this? Well, if you have the digital twin, you can of course try to match the data, and that's what we did. We built the digital twin; this is five years of bromate data. Bromate is a by-product of ozonation that you really want to keep below certain levels. This orange line, without any recalibration of the model, is able to really catch the big dynamics. You can see around day 900 there is some mismatch, but in that region they also changed the UV sensor, so it can also be a data issue there. But yeah, we were very happy with this prediction. And the same with micropollutants: these are, let's say, a set of 20 different chemicals, and we were quite happy with how the model could predict individual micropollutants. So this was the first validation and calibration of the model. But then, of course, the most interesting stuff comes, and Pilar already mentioned it: the scenario capability. It is not only real-time monitoring, but also running what-if scenarios. For example: what if we controlled our ozone dose differently, or the UV dose differently? Orange is the dose applied in reality, both in the ozone system and the UV system. And if you used a virtual controller, let's say model-based control, you could actually end up with quite significant energy savings. Again, this digital twin is not yet being applied in practice.
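As an illustration of what such a what-if dose scan looks like in code, here is a toy grid search over ozone/UV dose pairs. The removal rates, cost weights, bromate coefficient and limits are all invented for illustration; they are not Dunea's numbers or AMOZONE's chemistry.

```python
import math

# Toy what-if scan: find the cheapest ozone/UV dose pair that still meets a
# removal target while keeping the by-product (bromate) below a limit.
# Every coefficient here is invented for illustration.

def removal(o3, uv):
    """Combined first-order removal from both treatment steps."""
    return 1.0 - math.exp(-0.30 * o3 - 0.15 * uv)

def cost(o3, uv):
    """UV assumed three times as expensive per dose unit."""
    return 1.0 * o3 + 3.0 * uv

def bromate(o3):
    """By-product grows with ozone dose."""
    return 0.8 * o3

doses = [i * 0.5 for i in range(21)]  # 0.0 .. 10.0 in steps of 0.5
feasible = [(o3, uv) for o3 in doses for uv in doses
            if removal(o3, uv) >= 0.90 and bromate(o3) <= 4.0]
best = min(feasible, key=lambda pair: cost(*pair))
print(best, round(cost(*best), 1))
```

The structure mirrors the trade-off in the talk: cheap ozone is pushed as far as the by-product constraint allows, and expensive UV covers the rest of the removal target. A model-based controller essentially repeats this search continuously against live conditions.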
But just during the building phase, these scenarios already led to operational measures that were implemented. The next stage, which we are still working on, is bringing it live together with the real plant, and ultimately having this model optimize the UV dose in real time. So this gives you an idea of how these models, or digital twins, can help you optimize the interaction between different treatment processes. Another interesting feature is the soft sensor feature of a digital twin. This means you predict what you cannot measure based on what you can measure. I will give you one example: the micropollutants. These are chemicals at concentrations of nanogram to microgram levels; there is no sensor that can measure them in real time. But it is really cool if you have a model or digital twin that can predict them in real time, based only on other data: pH, UV transmittance, temperature. These are things you can measure. Dynamic micropollutant concentrations are typically not recorded, because grab samples are taken. So yeah, a very interesting feature. Now we go to the wastewater field, and we zoom in on the monitoring and control of an advanced treatment. In Europe, especially driven by the update of the Urban Wastewater Treatment Directive, a lot of micropollutants need to be removed. And not only in Europe: also in the United States, or in other countries where reuse is going on, turning effluents into drinking water, for example, you need these chemicals to be removed. So together with Waterschap De Dommel, we actually built half of the digital twin first. A twin, as everybody knows, needs two children: a baby brother and a baby sister. Well, the virtual twin baby was born first, and the real twin baby was born second. What do I mean by this?
Well, we first built the virtual full-scale plant to assess the removal, the performance of the plant, the opex, the operational costs, things like that. And we used existing data from the operational biological treatment plant. We fed that to the virtual full-scale plant. That full-scale plant was not built yet; it was going to be built. And that virtual plant already gave good predictions or assessments on by-products, costs, and micropollutant removal. In the second stage, the real plant was designed using the same model. And once it is commissioned, because right now it is being constructed, the digital twin will be coupled in real time. Then the second twin baby will be born. We will do a real-time coupling of the pre-existing digital twin half, let's say, with the real plant. And this is supposed to give them very fast insights on ozone, chemicals, by-products, et cetera. So again, things you cannot really measure. In the first phase, we would go to monitoring, so aiding the process operation. And in the second stage, once confidence in the digital twin has been built, we will go to advanced process control. So the model will really, let's say, intervene in the process in real time. Okay. This brings us to the final case example in wastewater. It's another important topic: nitrous oxide emissions. I'm giving kind of an outlook here on where we are going. So generally, we start again with a mechanistic understanding of the full plant. You can see the top view of this wastewater treatment plant here on the bottom of the screen. Red indicates high N2O emissions and dark blue is low N2O emissions. What I just wanted to show here is that without this mechanistic understanding, it would be very hard to accurately describe N2O emissions, because they are very, very local. They happen locally in a bioreactor. So this is why we start here. 
We like to start from mechanistic understanding, and then we translate these models again to a digital twin model. The first version, again, is the offline use, and the second stage is the online use. They can be different versions of the same model, and they also have different purposes. The first one has a plant optimization purpose and an insight and learning purpose. The second one has more of an operational purpose. So this is really very promising, because nitrous oxide is really causing a lot of emissions, and at the global scale right now, the focus on it is growing dramatically. In the end, this is where we want to go: models that dynamically predict N2O production or emissions. That's what you can see on the y-axis; in this case, this is the liquid-phase concentration as a function of time. And then you can test mitigation measures, for example improving the carbon dosing, improving the aeration, and you can try to get the emissions down in real time. Our aim ultimately here is to use soft sensing as much as possible, to keep the process as lean as possible. Okay, I know I didn't go too deep into the cases, but I just wanted to give you a good overview of the different applications at the treatment side. This brings us to the conclusion. We like to call what we focus on the 5E framework. Typically, people are focusing on the operational side: improve the effluent quality, improve the process efficiency, lower the costs, lower the emissions, for example. But I think we should not underestimate the human dimension in the future. I see people in the future using way more new tools themselves, which will lead to a very drastic shift in how we do things. So as educational tools, enablement tools or planning tools, digital twins will have drastic value, and also at the operational side, to juggle the balls of effluent quality, efficiency and emissions. Okay. Thank you for your attention. 
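[Editor's aside.] The soft-sensor idea described above, predicting an unmeasured quantity such as micropollutant removal from signals that are measured online (pH, UV transmittance, temperature), can be sketched as a simple regression. This is a minimal, illustrative sketch only, not the actual AM-Team model: all numbers, coefficients and variable names here are hypothetical, and a real soft sensor would sit on top of a calibrated mechanistic or data-driven plant model.

```python
# Minimal soft-sensor sketch (illustrative only): fit a linear model that
# estimates an unmeasurable quantity (micropollutant removal) from signals
# that ARE measured online: pH, UV transmittance, temperature.
# All numbers and names below are hypothetical.

def fit_linear(X, y):
    """Ordinary least squares via the normal equations, pure stdlib.
    X: list of feature rows, each with a leading 1.0 for the intercept."""
    n = len(X[0])
    # Build A = X^T X and b = X^T y
    A = [[sum(r[i] * r[j] for r in X) for j in range(n)] for i in range(n)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(n)]
    # Gaussian elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * n
    for r in reversed(range(n)):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, n))) / A[r][r]
    return beta

# Hypothetical historical data: (pH, UV transmittance %, temperature C)
readings = [(7.1, 82.0, 12.0), (7.4, 78.0, 14.0), (6.9, 85.0, 11.0),
            (7.8, 74.0, 16.0), (7.2, 80.0, 13.0), (7.5, 76.0, 15.0)]
# Matching lab measurements of removal (%) -- here generated from a made-up
# linear relationship so the fit can be checked.
removal = [0.5 * uvt - 1.2 * t + 3.0 * ph + 10.0 for ph, uvt, t in readings]

X = [[1.0, ph, uvt, t] for ph, uvt, t in readings]
beta = fit_linear(X, removal)

def soft_sensor(ph, uvt, t):
    """Real-time estimate of the unmeasurable quantity from online signals."""
    return beta[0] + beta[1] * ph + beta[2] * uvt + beta[3] * t
```

In practice the relationship is nonlinear and the model would be the calibrated plant model, but the interface is the same: measured signals in, an estimate of the unmeasurable quantity out, at whatever frequency the sensors deliver.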
So if you want to connect with me, feel free to scan this code. Fantastic. Thank you very much, Wim. Very interesting, and there are lots and lots of questions in the Q&A for you, so I will let you crack on and start to answer some of those. Our last speaker today before the Q&A session is James Ballard from Severn Trent Water. Now, James has been doing a lot on digital twins and IoT and huge amounts on technology for a number of years now, and I think James has got some quite interesting things to talk to us about. So, James, I'll let you take over. Brilliant. Thank you very much. Right. Shall I wait for control of the slides? Fantastic. Right. So, yes. Hello, everyone. I can't see you all, but I'm imagining there's quite a few; I think there's around 600. So, super excited to be here, and thank you for coming on to listen. The previous two sessions are really the kind of inspiration that I've been reading about and hearing about for at least the last three or four years. I've been in Severn Trent for 10 years, and now I'm responsible for the digital twin implementation strategy. So, taking this inspiration, and I'm sold. We need them. Absolutely fundamental to the success of the water sector. We absolutely need them. How? How do we go about deploying these things? That's what I'm going to talk to you a bit about. But for those on the call that don't know Severn Trent, I'll do a very quick summary of the company. We are one of the largest water companies in the UK water sector. We have around about 1,000 treatment works: we have 18 large water treatment works, and the 1,000 are the wastewater treatment works. We've got around about 100,000 kilometres of sewer pipes and 50,000 kilometres of water pipes. What else have we got? Oh, yes. And we're also one of the oldest water companies in the UK. 
So, if you think about when the infrastructure was put in: Birmingham, one of the large cities in the Midlands, a lot of the infrastructure was put in around 1876. So, we have some old stuff that we have to work with, and that all forms part of the challenge of how we deploy digital twins. So, a couple of the key themes that I wanted to bring out over the next 10 to 15 minutes. You've heard some of this already. We've talked about mechanistic versus data-driven from Wim, and I'm going to bring it up again, because this is one of the key questions that we have to answer when we think about deploying. Insights: how far can we go? Do we start small? Do we build out? Do you go for everything? Data quality: I'm going to really touch upon data quality, because that is absolutely essential. Technology stack: we haven't really covered the technology stack, so I'm going to say a few words on that as well, in the spirit of how we make this real. And I'm then going to bring it to life with a couple of examples of what we're doing in Severn Trent to really deep dive into digital twins and start to figure them out. So, I took this image from the IWA website. I actually quite liked it; it kind of sets the scene. Again, we've had a flavour of this from the previous presentations. You've got your infrastructure, be it your pipes or your treatment works. You've got physical sensors. You've then got the data feeding the digital twin, and the twin can be mechanistic, hydraulic-based or process-based, or data-driven; that's where your data science comes in. We've then got control systems and we've got users. I'm going to jump to the control systems first of all. If we think about our treatment works, we've got a lot of control loops going on there; it's quite a complex range of processes. And a control loop, for anyone that's not familiar with them: think about the heating in your house. 
So, as the temperature cools down, your thermostat will kick in: it will read the temperature and it will turn on your heating, and the house heats up. It will then read the temperature again and turn off the heating. That is a control loop, an on-off control loop. And then you get more complex types of control loop. Typically, on a site you can expect to see somewhere around 100 control loops. Some of our larger wastewater treatment plants have upwards of 150 control loops, all interacting with each other to keep the processes going, from the inlet of the treatment works, through all of the processes, primary, secondary, the ASPs that we've heard about, and then out to the watercourse. And of course, it's similar on the clean water side. Control loops in general are designed to keep processes stable. They're not really designed to optimize processes; they are there to keep processes stable. We as users want to optimize, because we want to get the most benefit for our customers and the environment. So, we absolutely need to be reducing energy. We absolutely need to be improving final effluent quality, for example. And we need to make sure that we improve our impact on society and the environment, so things like greenhouse gases also come into it, like N2O. They're the key drivers. The question is how? So, think about this user sat in a chair down there. We are now asking them to not only deliver the best quality drinking water to our customers' taps, but also the best final effluent to the rivers. We're asking them to do it with minimal energy, minimal chemical use; net zero emissions is one of the drivers I'm going to come on to, so minimal greenhouse gas impact on the environment. You can see how complex this is starting to become, and that's why we need to start implementing digital twins. 
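[Editor's aside.] The thermostat analogy above can be sketched in a few lines. This is a minimal illustration of an on-off ("bang-bang") control loop with a deadband, not anything from a real Severn Trent site; the setpoint, deadband and heating/loss rates are made-up numbers.

```python
# Minimal sketch of the on/off control loop described above, using the
# thermostat analogy. A deadband (hysteresis) stops the heater from
# chattering on and off right at the setpoint. All numbers are hypothetical.

def thermostat_step(temp, heater_on, setpoint=20.0, deadband=0.5):
    """One read-decide cycle: heat below the band, stop above it."""
    if temp < setpoint - deadband:
        return True            # too cold: turn the heater on
    if temp > setpoint + deadband:
        return False           # too warm: turn the heater off
    return heater_on           # inside the deadband: keep the current state

def simulate(steps=48, temp=15.0):
    """Toy house model: the heater adds heat, the house always loses some."""
    heater_on, trace = False, []
    for _ in range(steps):
        heater_on = thermostat_step(temp, heater_on)
        temp += (1.2 if heater_on else 0.0) - 0.4  # heating gain vs. loss
        trace.append(round(temp, 2))
    return trace

trace = simulate()  # temperature settles into a band around the setpoint
```

Notice the loop only keeps the temperature stable inside a band; it says nothing about energy use. That is exactly the point being made here: stability, not optimization, is what classic control loops give you, and the optimization layer is where the digital twin comes in.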
It's becoming really, really complex, and we need to assist our operators and help them make the right decisions. Okay. So, as a system then, what you see in front of you: we definitely want to apply this across source to tap. For us, we're landlocked, so it's not the sea: it's the rivers through to the tap, and then from drains back to the river, or the sea if you're one of the other water companies. That's generally the system that we've got, and we want to implement this end to end. So, data then. We've talked about data, and data fundamentally underpins everything. The six categories you see above are really how we're thinking about data in Severn Trent. First of all, we need to know what we've got, so a data catalogue is absolutely essential. We need to know our asset data, our telemetry data, what data we've got, where the gaps are. We want standards around that data. We need to know absolutely everything about that data: metadata. So, data catalogue, number one essential. Number two, we need to be able to acquire that data. Think about some of the new and upcoming IoT-type sensing. That's really, really good, but how do you then combine it with traditional sensing, where we've got sensors on plants and sites coming through traditional SCADA systems? How can we make sure we're also ingesting IoT data, and where does that play a role? Okay, analytics and diagnostics. We've talked about digital twins in the simulation world, and we talked a bit about optimizing, so I'm not going to repeat that. But there's that bit around analytics and diagnostics: where do you do all the heavy crunching behind the scenes? How do we feed that into simulations? And how do we then use the simulations to actually optimize the plant? Do the optimizations come in the form of recommendations? 
Does it automatically control our systems? So, yeah, that's that one. Visualization and reporting. We've got to think about what everybody wants to see, from the CEO, so for us, Liv: what does she want to see? Does she want a kind of real-time view of the situation across Severn Trent? All the way through to the field workers: what do they want to see when they're out and about maintaining our assets at two in the morning, when they've been called out to the middle of a field in the pouring rain? What is it they want to see? What reporting do they want to have? And finally, alerts and alarms. This is a big, big topic for me; it's close to my heart. Alarms are very reactive. Historically, we have a lot of alarms and we're always reacting to them. Maybe a customer's rung in and said, I can see a problem. Maybe a telemetry alarm is triggered, and our network control centre has to respond to thousands of alarms every day. What we want to do is feed them proactive information in the form of alerts. So, how can we embed this diagram and everything on it in order to provide as much proactive information as possible, and reduce the amount of alarms that we get? Right, a couple of requirements, and then I'm going to go on to some use cases. Scalability. This is something that's really key that we've got to consider when we start to deploy digital twins on our sites and in our networks. How do we make sure it's scalable? So, think about one use case and one application, and we've heard a couple of them: we've seen the network with Idrica, we've seen the treatment works with Wim. So, we've got to think about how we can make this scalable. 
Do we have to start again every time we want to apply it to a new site, or is there a way we can create an approach such that we can actually scale the majority of the work we've already done through to another site, another catchment, or another water company? So, yeah, scalability. Baseline performance, calibration and validation. This is where we've got to start thinking about machine learning and data-driven approaches versus mechanistic. One of the key things I've seen over the last couple of years with machine learning is that it's really, really good and it delivers some huge benefits. One of the key benefits we see is that it can detect drift from a baseline. So, machine learning techniques can monitor and identify when something's going out of range and drifting out of the norm; perhaps a level in the sewer is drifting out of its normal range. But then you've got to question the 'normal' part. How do we make sure the normal is what you'd expect it to be, and what the system was designed for in the first place? Normal might not actually be what you want, but it is normal for your operations. What I'm trying to say is, maybe it's rubbish, but it's just normally rubbish. So, we need to figure that out. Again, a question of how we combine mechanistic and data-driven: which one do you start with, and then how do you enhance it? Soft sensing, we've touched upon that a bit as well. We are expecting billions more data points to come into Severn Trent each year in the future, for sure that's going to happen. But how do we make sure we are getting the right data, and where can we start to use soft sensing to reduce the amount of physical sensors we need out there? Because every physical sensor is going to need maintenance, it's going to need calibration, and it could produce erroneous data. 
So, if we can start to accurately infer some of that data, then brilliant. And last but not least, AI and ML; the machine learning part comes into this as well. So, these are a lot of the considerations we've got for how we deploy this on site. Okay, two examples then. These are two Ofwat award-winning projects. The first one we won about two years ago, and that is AIoT, artificial intelligence of things. That is where we are working with around 10 or 11 partners. Some of them are Rockwell, Microsoft, Aikawa and the University of Exeter, as well as water companies, and there are more that I've forgotten to mention, many on the call. Oh, Ofwat, by the way: Ofwat is our regulator. They hold an innovation competition, yearly at the moment, where all the water companies in the UK sector get together to create projects that really push the boundaries. And one of the winning projects a couple of years ago was this AIoT. So, briefly, what we wanted to do was to really test artificial intelligence, and therefore, obviously, it's a data-driven approach. We wanted to see how far this could go in order to benefit customers and the environment with respect to flooding and pollution. Obviously, with some of the challenges we've got, climate change, et cetera, we really need to be targeting flooding, pollutions, and overflow spills as well, of course. So, AI: how can it help us? Well, we are probably about halfway through the project now, I think it's fair to say, and we have created a sort of AI brain, which is made up of a few data-driven models. One is to predict the inflow into the catchment. The second one is then to simulate what happens in the catchment. By happens, I mean: how does it flow through the pipes, and how does it go from one pumping station to another? 
Pumping stations, by the way, are how we push the flow through the waste network. So, typically, it follows the flow in the middle of the screen here, from the customer property, plus runoff as well, combined, and then we pump it with pumping stations through into the wastewater treatment works. What we wanted to do was optimize pumping stations. So, ahead of a storm, how can we empty any storage tanks and the pumping stations' wet wells themselves? And during a storm, how can we make sure that we're not actually overloading the system? Typically, it rains quite heavily in part of the catchment; it doesn't hit the whole catchment at once. It will rain in a particular area and sort of track across. And as it tracks across, you'll get all the pumping stations turning on at the same time, because they have relatively simple on-off control systems: they fill up and they all turn on, and then you get hydraulically overwhelmed networks. So, we're using these three models: the inflow, the simulation of how it runs through, and then finally the one I didn't talk about, the optimizer. The optimizer is a particle swarm optimization, for anyone that is into data science. That's a data-driven approach to figure out the best combination of pumping. So, do you turn these three pumping stations on together, but actually hold some flow back at this other pumping station and not pump that one yet? Do you stagger them a bit: pump a bit here, stop, pump a bit there? Essentially, it's looking at the system of pumping stations, as opposed to individual pumping stations. If you can control that better, can you reduce flooding, pollutions and spills? So far we've seen some really promising results. Initial simulations have shown that just by staggering the pumping of those pumping stations, we're seeing a 20% reduction in spill volume, just through a data-driven approach. 
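[Editor's aside.] For readers curious about the particle swarm idea, here is a minimal, self-contained sketch: a toy cost function that penalizes more than one pumping station running at once (a crude stand-in for hydraulic overload), plus a small penalty for delaying pumping, minimized with a basic particle swarm optimizer. Everything here, station count, pulse durations, PSO parameters, is hypothetical and not taken from the AIoT project.

```python
# Toy "stagger the pumps" optimization with a basic particle swarm optimizer
# (PSO). Each of 3 hypothetical stations pumps for 2 hours starting at a
# delay we choose; running more than one at once stands in for overload.
import random

random.seed(42)  # deterministic toy run

def spill_cost(delays, run_time=2.0, horizon=8.0, dt=0.1):
    """Hours during which more than one station pumps simultaneously,
    plus a tiny penalty for waiting (we prefer pumping early)."""
    overlap = 0.0
    for k in range(round(horizon / dt)):
        t = k * dt
        running = sum(1 for d in delays if d <= t < d + run_time)
        if running > 1:
            overlap += dt
    return overlap + 0.01 * sum(delays)

def pso(cost, dim=3, n_particles=20, iters=60, lo=0.0, hi=6.0,
        w=0.7, c1=1.5, c2=1.5):
    """Textbook PSO: each particle tracks its personal best; the swarm
    tracks a global best; velocities blend inertia and both attractions."""
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_cost = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest, gbest_cost

best_delays, best_cost = pso(spill_cost)  # staggered starts beat "all at once"
```

The point of the toy is the structure: a simulator (`spill_cost`) scores a candidate pumping schedule, and the swarm searches the schedule space, treating the stations as one system rather than optimizing each station individually.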
But there are some challenges. When we think about where it goes, what do we ultimately want to achieve with AIoT? Well, we want to do a lot more than just optimize the pumping stations. We want to be able to proactively prepare the network: look at asset health and condition, look at the pumping station status. We want to be able to warn our customers and get some good notifications out there. During the storm, we want to manage the whole process, where the AI brain takes into account the status and health of the assets and all of the customer information, with sort of real-time calibration and validation. And post-storm, retrain the models and then do any remedial works required, or trigger some interventions. So, that's where it's going. We've seen some really good initial simulations, mainly very heavily data-driven approaches, and there's a question now of how we get to this vision. I think that's where we're going to start looking more heavily into the data side of things, and more heavily into mechanistic models. Okay, second example. This is another Ofwat project we've got: the Net Zero Hub. It's our net zero programme, to create a net zero emissions wastewater treatment works at one of our largest sites in our patch. What we've got is a big combination of physical technologies. There are covers over the ASP lanes to capture N2O, the nitrous oxide Wim was talking about, which is around 300 times more potent than carbon dioxide; it's really important to capture that coming out of the ASPs. And we've got a range of other physical technologies. But really, we want to know how to get the best out of these technologies together, and then we want to be able to optimize the whole site, considering greenhouse gas emissions, energy production and sludge production. So, we've got the water, the liquid side, and we've got the solids side as well. 
We've got the final effluent going out into the river, of course. So, we started to think about how we could do this using a digital twin. At the start of the journey, we had a lot of 3D stuff, and we kind of went: that's great, but we really need to get to a predictive analytics and simulations capability. We've worked with a range of partners on this as well. Some of those are Siemens and, of course, Atkins, or Oliver will tell me off; if I've forgotten anyone, I'll try and shout them out later on. So, how are we doing? That's probably the next slide. Right. I gave you a vision for AIoT and where that's going; time to give you a vision for our net zero site. It works like this. We need to know what's coming into the works, so that's flow and load, and also additional load, which could be trade waste. We then need to figure out how we get it through the treatment process. But this isn't just a hydraulic process anymore; this is a biological process, so it becomes a little bit more complex. And that's why the foundation of this one is going to be a mechanistic model. Unlike AIoT, where we went pretty much all out with data-driven and are now starting to think about how we build in mechanistic models, this one starts with mechanistic models. Because, one, a lot of the mathematics is known; and two, it is so complex that we need to start with that mathematics. And then, when you think about scaling as well: if we want to scale out to other treatment works, and it's all learnt and all data-driven, then all you've ever learnt is what's happened at that site. If it's got a mathematical foundation, you can then start to apply it elsewhere with some tweaks. But, of course, we want to enhance that with data-driven approaches. 
And in a similar style, once we've optimized the processes at the works, we want to be able to optimize across the range of trade-offs we talked about. We want to be able to adjust set points both through our operation centre, but also automatically. Wouldn't it be great if we could have an automation side of this as well? So, the control loops we talked about: if the digital twin could actually influence those, that would be really, really good. And then, at the end, we're making sure that we're meeting our consents. So, again, customer commitments and commitments to the environment. But there's also a bit about resource recovery, which I probably should have mentioned earlier. As part of the Net Zero Hub and the programme that we've got, we're also looking to start to recover nutrients on the site. Think about all the tissue paper you put down the toilet: it's all recoverable. The heat that comes off these systems: that's recoverable. So, how can we get to a place where we're recovering all of this stuff and then recycling it? Right, a quick summary; I'm sure I've taken up more than my time. Data quality: going back to the data catalogue, through to analytics, through to visualization, reporting, alerts and alarms, it really requires good data, and it requires good data for all of the critical assets that run your plant. So, we really need to know our data gaps. We've realized for a while how important data is, but we're just starting to realize how much of a challenge it is to actually review it and do something about it. Mechanistic versus data-driven: I've shown you a data-driven example in the waste networks, and I've shown you a mechanistic-based model for the wastewater plant, enhanced through data-driven approaches. Insights, how far can we go: I didn't touch upon that at all, but that will come out of the discussions, I hope. 
And pretty much the same with the technology stack. One thing I will say on the technology stack before I finish, going back to this slide: when we implement everything you see on here, we forget there's a big world above it, which is the cloud. Severn Trent has gone to Azure; we've got an Azure cloud platform and we do a lot of analytics in it. We've got to ask ourselves: how does this all work with the cloud? Are you doing some of it locally, actually at the treatment works? Are you doing a lot of it in the cloud? Is it your cloud? Do you send data out to other people's clouds? It starts to get quite complex. So, again, maybe this will come out in the Q&A a bit, but it's another consideration: it really does get more complex when we think about how you implement it in the IT world as well as the OT world. Oh, back to Oliver. So, thank you, James. Funnily enough, on that last point, we do have a question: are all these DTs cloud-based or on-premise? They seem to require a mixture of plant data and Excel information too. That is just what you were saying, James. Yeah, absolutely. In our case, we absolutely want to recognize the power of the cloud, so cloud-based is the way we are going. Again, if you want to scale up the solution, think about having multiple sites: if you haven't got it in one cloud-based location, you might find it challenging to actually have a digital twin of multiple sites all running together as one system. So, we've got lots of lovely questions in the Q&A. Do keep adding to them, although we probably won't get through all of them. One that Wim wants to answer, from William Klassen: currently, digital twins and all related modelling and data handling work happen a lot in niche software, each with its own model and data formats, often with a software-first design. 
These tools are in the main closed source and bring a lot of implementation headaches in actual production contexts. Open modelling ecosystems exist, and progress in this space could significantly improve the adoption and implementation effort of digital twins, freeing time and resources to focus on the essence, the process, instead of reinventing the implementation wheel each time. What is the opinion of the whole panel on the implementation challenges and the progress in the DT space when it comes to overcoming the technical obstacles relating to DT usage? Now, Wim, I think you wanted to start on that one. Yeah, I think it's an interesting question from William. With this whole digital twin transformation, or digital transformation, I have the feeling that separate platforms from the past are gradually merging more and more. Each platform has unique strengths, and not a single tool can do everything. It's a very interesting question, and I'm specifically interested in, for example, how James looks at the implementation from a utility perspective, and also Pilar from a software perspective; I can imagine, for example, that Idrica has also been integrating software. That was my introduction, but I will now start listening. James, Pilar, do you want to have a go at that one? Yes, it's a very good question, because this is one of the challenging points that we have to face when implementing a digital twin. In our case, what we are using is an agnostic platform. What we are doing is implementing a platform that integrates all the information that resides in the different sources in the company. Once we have this platform where all the information is integrated, we connect it with hydraulic models. Regarding what modelling software we are using: in our case, we are using open source, EPANET. EPANET is a standard for water distribution networks, and it is open source. 
Most of the other packaged software can export to EPANET, because EPANET is like a reference in the industry. There is a huge community of researchers working on it, and it is open source. We try to be agnostic and use open modelling software. This is our solution. Yes, technical challenges: we are still learning. I have to dodge the question a bit with 'we are still learning', because it is going to be a challenge. What I love about the Ofwat projects we have got going on at the moment is that they give us a chance to explore as well. We are exploring some of the benefits and the challenges of each of the technical implementations that we are doing. What we have got to figure out for next time is how we are going to scale all of this out. Is it one platform? Is it ten platforms? How many different platforms do you need across each of the data parts? Do you need a platform for data ingestion? Do you need a platform for simulations? Then you have got waste, you have got water, you have got infra, you have got non-infra. How do you connect all that into one integrated system? I think we are going to be on quite a fast learning curve with this stuff. Yeah, that is also our experience. If we talk to our utility customers, the whole market is in a learning stage. I will give one example. Recently, a Dutch utility requested market input for a vision around digital twins. In that document, they stated: nobody can claim right now that they have the complete solution. We invite suppliers to write vision documents and come together; then we will interview and put together a consortium to see what works best. This is a little bit the stage we are at right now. There are a couple of very big established platforms, but there are still some gaps to be filled. I am curious how this will unfold. At that point, I have got Erin reminding me there is still something to do with the poll. Erin, do you want to bring that up? 
I am going to be really adventurous and ask one of the participants to answer one of the questions. I know you are on the call because I can see you. If you could type the answer into the chat, I will read it out. Has there been a pilot of a DT in an African country, especially Sub-Saharan Africa? I do not know, and I am not sure if our panelists know, but Morgan from Rand Water is on the call, and he is actually part of the Digital Water Programme steering committee. I will wait to see if Morgan answers it in the chat. In the meantime, we do have polls, so do answer the poll questions. Question one: do you think there is a consensual definition of a digital twin? Question two: are utilities ready to adopt this technology? Less than 20% of them, between 20 and 50%, or above 50%? Question three: what are the main barriers that stop digital twin adoption? Do answer that in the poll. Our hosts and panelists cannot vote, unfortunately. So, do you think there is a consensual definition of a digital twin? I think there are a lot of definitions of digital twins. I have seen in the last year that a lot of working groups have made a big effort trying to arrive at a single, unique definition of a digital twin. In the past, as I said in my presentation, was it a digital twin, or a BIM, or a GIS? We didn't know what a digital twin was. And in fact, this was a concept that was introduced in the industrial field, to optimize the life cycle of a product. It was introduced by Dr. Grieves, who is the father of this concept. So when we try to translate this concept from industry to a city management context, it is difficult. The good news is that in the last year there has been a lot of work trying to provide a single, common definition, and I think today we are getting there. Fantastic. I'm going to quickly interject there, as Morgan's come back to me. Fantastic. Thank you, Morgan. Yes, there are pilots currently happening in South Africa, in both water and wastewater. 
Quick question from William Klassen in the chat: Pilar, could you type the name of your modelling platform? Is that EPANET? So that's EPANET. Learning from other communities is always interesting, and I'm a fan of your agnostic, open-source approach. Fantastic.

So I'm going to briefly answer the consensual-definition question myself: no, I don't think there is one. And it also depends on which kind of digital twin you mean. What do I mean by this? To me, there are several different types of digital twin. We've mainly been talking about the operational digital twin, applied to wastewater treatment works and water networks. But before that, you've got what I call the construction digital twin. Some would call it AutoCAD 3D on steroids, maybe. And actually, in the work that Pilar's done in Valencia, you see some great construction models, which are a virtual representation that people have called a digital twin. You certainly see it in manufacturing facilities, where they build the facility digitally first and apply an instrumentation layer showing how that factory is going to work as a complex system. So those are discrete, construction-based digital twins, and there are lots of examples out there. I'm not going to say I'm old enough to remember draftsmen with onion paper on drawing boards, but in my years I've seen AutoCAD move to 3D, and AutoCAD move to BIM for water, et cetera. So yes, I will say it depends on the digital twin and what you want from it. But I'll let Vim and James answer as well.

Yeah, so Oliver, there was also a comment from somebody, I think from DHI China, that models have been used for decades already. What is changing now is just that every model that is being used in a design stage is suddenly called a digital twin.
Yes, it's interesting. When you take the Gartner hype cycle, we're very much in that peak of inflated expectations, where everything's being called a digital twin. And of course, with what we've seen in Valencia, we've come out the other end. I'm sure during the many years of development, Pilar, and I'm a big fan, you've probably gone through that stage with your boards and people within the organization saying, oh, this is going to solve our world, it's going to be a silver bullet. Would that be fair to say, Pilar?

Yeah, I think there are some questions about how we can convince the board, and what the return on investment of this kind of technology is. If you have some specific challenges that you want to face, and you start deploying or developing the digital twin focused on those objectives, it is going to be really easy to convince the board. It is sometimes difficult to measure the benefits that we are going to obtain with the digital twin, because some of them can be measured in terms of money, but others cannot. In our case, we were really very convinced, because we had the challenge, which I think most utilities have, where the knowledge is concentrated in very few people who have been working in the utility for a long time, 30 or even 40 years, and those people are about to retire. How do you train the new people who join the company? How do you manage an emergency condition when this knowledge is concentrated in a few people? So we wanted a tool that gives you security: for training the new people who are going to join the company, and for managing all the emergency situations that can happen. I think for our water utility the board had to be convinced, because, as I said before, we have to provide water 24 hours a day, and we can do it very well for years and years, but if we do it badly for one day, it will be a disaster.
So with only one day it is going to be a disaster for the board. So focusing on the objectives, in our case security and managing the generational change, was the key, and the board was very convinced by that.

James, Vim, do you want to add anything before I start to wrap up? No? Okay. So what I'm going to do is quickly go through the discussion polls. We've got absolutely tons of questions, which we can't possibly answer now because some of us have got meetings in about eight minutes' time, so I'll very quickly wrap up. Discussion polls: do you think there's a consensual definition of a digital twin? 60 said yes, 57 said no, so pretty much 50-50. Are utilities ready to adopt this technology? "Less than 20%" got 49% of the vote, "20 to 50%" got 42%, and "above 50%" got 10%, so almost 90% said less than half of utilities are ready. Main barriers: insufficient technology comes in, along with insufficient data quality, which is absolutely vital; I certainly saw multivariate process control basically fail on that about five years ago. Which phase of the water cycle can leverage the benefits of a digital twin most? Treatment plants; distribution and transport networks, which scored the most; sewage networks relatively little, and yet we're starting to see it there, and I'm certainly seeing it in treatment plants. Which profiles can benefit most from this technology? The winner there was control room operators, certainly, but field operators will too; everyone will, basically. Do you think this is a mature technology, or is it still developing? A whopping 94% said it's still developing. So absolutely, there's a lot more to go in this space.

So thank you to Kim, who has stayed on until 1am in Korea; that is dedication to the cause. I really want to say thank you very much to Vim, Pilar and James for informing us of some great, great things. I think we've got a lot more to say here. Erin is quickly flicking my slides forward, so do come along.
I think this recording is going to go on to the IWA network. If you're not a member of the IWA, Carla would crucify me if I didn't say that you must, of course, join the IWA if you're not part of it. There's International Women's Day coming up on the 8th of March, so do register for that on IWA Learn. Erin's reminding me about the World Water Congress and Exhibition in Toronto, on the 11th to the 15th of August; if you flip back there for me for a second, Erin. We are developing a whole digital aspect of the IWA World Water Congress in Toronto, and we're already starting to plan the digital aspect of the World Water Congress in Glasgow in 2026. Sorry to be so fast, but Erin, if you click forward for me: join our network with IWA Webinars 24, which is a 20% discount off new membership if you're not already a member, which I know you all are.

So thank you all very much for coming and participating. I'm really glad that so many of you have enjoyed it, and I can see some great stuff in the chat. Apologies that we couldn't answer all the questions, but I'm sure Erin will pass me the questions afterwards so that I can get some answers to you all; we'll do an article or something to answer your questions and see what we can do. So thank you very much to all our attendees. Thank you very much to Pilar, James and Vim; I hope you've enjoyed it, and see you soon on the next one. We will be having something on artificial intelligence in wastewater networks sometime in May, I believe; is that right, Erin? Do check out LinkedIn and the IWA websites for more information. Thank you all.